# test/ffmpeg-qsv/encode/hevc.py (wangzj0601/vaapi-fits, BSD-3-Clause)
###
### Copyright (C) 2018-2019 Intel Corporation
###
### SPDX-License-Identifier: BSD-3-Clause
###
from ....lib import *
from ..util import *
spec8 = load_test_spec("hevc", "encode", "8bit")
spec10 = load_test_spec("hevc", "encode", "10bit")
def check_bitrate(params):
# calculate actual bitrate
encsize = os.path.getsize(params["encoded"])
bitrate_actual = encsize * 8 * params["fps"] / 1024.0 / params["frames"]
bitrate_gap = abs(bitrate_actual - params["bitrate"]) / params["bitrate"]
get_media()._set_test_details(
size_encoded = encsize,
bitrate_actual = "{:-.2f}".format(bitrate_actual),
bitrate_gap = "{:.2%}".format(bitrate_gap))
  assert bitrate_gap <= 0.10
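`check_bitrate` converts the encoded file size to kbit/s as `size_bytes * 8 * fps / 1024 / frames` (total bits divided by seconds of video) and tolerates a 10% gap against the target. The arithmetic in isolation, with toy numbers and an illustrative helper name not taken from this file:

```python
def bitrate_kbps(encoded_size_bytes, fps, frames):
    # total bits / duration in seconds, reported in kbit/s
    return encoded_size_bytes * 8 * fps / 1024.0 / frames

# toy example: 12 KiB of bitstream at 30 fps over 60 frames (2 s of video)
actual = bitrate_kbps(12 * 1024, 30, 60)   # 48.0 kbit/s
gap = abs(actual - 50) / 50                # against a 50 kbit/s target
assert gap <= 0.10
```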
def check_bitrate_vbr(params):
# calculate actual bitrate
encsize = os.path.getsize(params["encoded"])
bitrate_actual = encsize * 8 * params["fps"] / 1024.0 / params["frames"]
get_media()._set_test_details(
size_encoded = encsize,
bitrate_actual = "{:-.2f}".format(bitrate_actual))
# acceptable bitrate within 25% of minrate and 10% of maxrate
  assert params["minrate"] * 0.75 <= bitrate_actual <= params["maxrate"] * 1.10
def check_psnr(params):
if "P010" == params["format"]:
hwformat = params["mformat"]
else:
hwformat = "nv12"
call(
"ffmpeg -hwaccel qsv -hwaccel_device /dev/dri/renderD128 -v verbose"
" -c:v hevc_qsv -load_plugin hevc_hw -i {encoded}"
" -vf 'hwdownload,format={hwformat}' -pix_fmt {mformat} -f rawvideo"
" -vsync passthrough -vframes {frames}"
" -y {decoded}".format(hwformat = hwformat, **params))
get_media().baseline.check_psnr(
psnr = calculate_psnr(
params["source"], params["decoded"],
params["width"], params["height"],
params["frames"], params["format"]),
context = params.get("refctx", []),
)
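`check_psnr` decodes the bitstream back to raw YUV with ffmpeg and scores it against the source via `calculate_psnr`, a helper from this repo's `lib` package whose interface is not shown here. The metric itself reduces to the standard formula; a numpy sketch for illustration only (function name and shapes are assumptions):

```python
import numpy as np

def psnr_db(ref, dec, peak=255.0):
    # peak signal-to-noise ratio between two 8-bit planes, in dB
    mse = np.mean((ref.astype(np.float64) - dec.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak * peak / mse)

ref = np.zeros((4, 4), dtype=np.uint8)
dec = ref.copy()
dec[0, 0] = 16                       # corrupt one pixel
print(round(psnr_db(ref, dec), 1))   # 36.1
```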
#-------------------------------------------------#
#---------------------- CQP 8 --------------------#
#-------------------------------------------------#
@slash.requires(have_ffmpeg)
@slash.requires(have_ffmpeg_qsv_accel)
@slash.requires(have_ffmpeg_hevc_qsv_encode)
@slash.requires(have_ffmpeg_hevc_qsv_decode)
@slash.requires(using_compatible_driver)
@slash.parametrize(*gen_hevc_cqp_parameters(spec8, ["main"]))
@platform_tags(HEVC_ENCODE_8BIT_PLATFORMS)
def test_8bit_cqp(case, gop, slices, bframes, qp, quality, profile):
params = spec8[case].copy()
mprofile = mapprofile("hevc-8", profile)
if mprofile is None:
slash.skip_test("{} profile is not supported".format(profile))
params.update(
profile = mprofile, gop = gop, slices = slices, bframes = bframes, qp = qp,
quality = quality, mformat = mapformat(params["format"]))
params["encoded"] = get_media()._test_artifact(
"{}-{gop}-{slices}-{bframes}-{qp}-{quality}-{profile}"
".h265".format(case, **params))
params["decoded"] = get_media()._test_artifact(
"{}-{gop}-{slices}-{bframes}-{qp}-{quality}-{profile}-{width}x{height}-{format}"
".yuv".format(case, **params))
if params["mformat"] is None:
slash.skip_test("{format} format not supported".format(**params))
call(
"ffmpeg -init_hw_device qsv=qsv:hw -hwaccel qsv -filter_hw_device qsv"
" -v debug -f rawvideo -pix_fmt {mformat} -s:v {width}x{height}"
" -i {source} -vf 'hwupload=extra_hw_frames=64' -an"
" -c:v hevc_qsv -profile:v {profile} -g {gop} -bf {bframes} -slices {slices}"
" -q {qp} -preset {quality} -load_plugin hevc_hw -vframes {frames}"
" -y {encoded}".format(**params))
check_psnr(params)
#-------------------------------------------------#
#---------------------- CBR 8 --------------------#
#-------------------------------------------------#
@slash.requires(have_ffmpeg)
@slash.requires(have_ffmpeg_qsv_accel)
@slash.requires(have_ffmpeg_hevc_qsv_encode)
@slash.requires(have_ffmpeg_hevc_qsv_decode)
@slash.requires(using_compatible_driver)
@slash.parametrize(*gen_hevc_cbr_parameters(spec8, ["main"]))
@platform_tags(HEVC_ENCODE_8BIT_PLATFORMS)
def test_8bit_cbr(case, gop, slices, bframes, bitrate, fps, profile):
params = spec8[case].copy()
mprofile = mapprofile("hevc-8", profile)
if mprofile is None:
slash.skip_test("{} profile is not supported".format(profile))
params.update(
profile = mprofile, fps = fps, bitrate = bitrate, gop = gop,
slices = slices, bframes = bframes, mformat = mapformat(params["format"]))
params["encoded"] = get_media()._test_artifact(
"{}-{profile}-{bitrate}-{gop}-{slices}-{bframes}-{fps}"
".h265".format(case, **params))
params["decoded"] = get_media()._test_artifact(
"{}-{profile}-{bitrate}-{gop}-{slices}-{bframes}-{fps}-{width}x{height}-{format}"
".yuv".format(case, **params))
if params["mformat"] is None:
slash.skip_test("{format} format not supported".format(**params))
call(
"ffmpeg -init_hw_device qsv=qsv:hw -hwaccel qsv -filter_hw_device qsv"
" -v debug -f rawvideo -pix_fmt {mformat} -s:v {width}x{height}"
" -r:v {fps} -i {source} -vf 'hwupload=extra_hw_frames=64' -an"
" -c:v hevc_qsv -profile:v {profile} -g {gop} -bf {bframes} -slices {slices}"
" -b:v {bitrate}k -maxrate {bitrate}k -load_plugin hevc_hw"
" -vframes {frames} -y {encoded}".format(**params))
check_bitrate(params)
check_psnr(params)
#-------------------------------------------------#
#---------------------- VBR 8 --------------------#
#-------------------------------------------------#
@slash.requires(have_ffmpeg)
@slash.requires(have_ffmpeg_qsv_accel)
@slash.requires(have_ffmpeg_hevc_qsv_encode)
@slash.requires(have_ffmpeg_hevc_qsv_decode)
@slash.requires(using_compatible_driver)
@slash.parametrize(*gen_hevc_vbr_parameters(spec8, ["main"]))
@platform_tags(HEVC_ENCODE_8BIT_PLATFORMS)
def test_8bit_vbr(case, gop, slices, bframes, bitrate, fps, quality, refs, profile):
params = spec8[case].copy()
mprofile = mapprofile("hevc-8", profile)
if mprofile is None:
slash.skip_test("{} profile is not supported".format(profile))
# target percentage 50%
minrate = bitrate
maxrate = bitrate * 2
params.update(
profile = mprofile, fps = fps, bitrate = bitrate, gop = gop, refs = refs,
slices = slices, bframes = bframes, quality = quality, minrate = minrate,
maxrate = maxrate, mformat = mapformat(params["format"]))
params["encoded"] = get_media()._test_artifact(
"{}-{profile}-{bitrate}-{gop}-{slices}-{bframes}-{fps}-{quality}-{refs}"
".h265".format(case, **params))
params["decoded"] = get_media()._test_artifact(
"{}-{profile}-{bitrate}-{gop}-{slices}-{bframes}-{fps}-{quality}-{refs}"
"-{width}x{height}-{format}"
".yuv".format(case, **params))
if params["mformat"] is None:
slash.skip_test("{format} format not supported".format(**params))
call(
"ffmpeg -init_hw_device qsv=qsv:hw -hwaccel qsv -filter_hw_device qsv"
" -v debug -f rawvideo -pix_fmt {mformat} -s:v {width}x{height}"
" -r:v {fps} -i {source} -vf 'hwupload=extra_hw_frames=64' -an"
" -c:v hevc_qsv -profile:v {profile} -g {gop} -bf {bframes}"
" -slices {slices} -refs {refs} -preset {quality}"
" -b:v {minrate}k -maxrate {maxrate}k -load_plugin hevc_hw"
" -vframes {frames} -y {encoded}".format(**params))
check_bitrate_vbr(params)
check_psnr(params)
#-------------------------------------------------#
#-------------------- CQP 10 ---------------------#
#-------------------------------------------------#
@slash.requires(have_ffmpeg)
@slash.requires(have_ffmpeg_qsv_accel)
@slash.requires(have_ffmpeg_hevc_qsv_encode)
@slash.requires(have_ffmpeg_hevc_qsv_decode)
@slash.requires(using_compatible_driver)
@slash.parametrize(*gen_hevc_cqp_parameters(spec10, ['main10']))
@platform_tags(HEVC_ENCODE_10BIT_PLATFORMS)
def test_10bit_cqp(case, gop, slices, bframes, qp, quality, profile):
params = spec10[case].copy()
params.update(
mprofile = mapprofile("hevc-10", profile), gop = gop, slices = slices,
bframes = bframes, qp = qp, profile = profile, quality = quality,
mformat = mapformat(params["format"]))
if params["mprofile"] is None:
slash.skip_test("{profile} profile is not supported".format(**params))
if params["mformat"] is None:
slash.skip_test("{format} format not supported".format(**params))
params["encoded"] = get_media()._test_artifact(
"{}-{gop}-{slices}-{bframes}-{qp}-{quality}-{profile}"
".h265".format(case, **params))
params["decoded"] = get_media()._test_artifact(
"{}-{gop}-{slices}-{bframes}-{qp}-{quality}-{profile}-{width}x{height}-{format}"
".yuv".format(case, **params))
call(
"ffmpeg -init_hw_device qsv=qsv:hw -hwaccel qsv -filter_hw_device qsv"
" -v debug -f rawvideo -pix_fmt {mformat} -s:v {width}x{height}"
" -i {source} -vf 'hwupload=extra_hw_frames=64' -an -c:v hevc_qsv"
" -profile:v {profile} -g {gop} -bf {bframes} -slices {slices}"
" -q {qp} -preset {quality} -load_plugin hevc_hw -vframes {frames}"
" -y {encoded}".format(**params))
check_psnr(params)
#-------------------------------------------------#
#-------------------- CBR 10 ---------------------#
#-------------------------------------------------#
@slash.requires(have_ffmpeg)
@slash.requires(have_ffmpeg_qsv_accel)
@slash.requires(have_ffmpeg_hevc_qsv_encode)
@slash.requires(have_ffmpeg_hevc_qsv_decode)
@slash.requires(using_compatible_driver)
@slash.parametrize(*gen_hevc_cbr_parameters(spec10, ['main10']))
@platform_tags(HEVC_ENCODE_10BIT_PLATFORMS)
def test_10bit_cbr(case, gop, slices, bframes, bitrate, fps, profile):
params = spec10[case].copy()
params.update(
mprofile = mapprofile("hevc-10", profile), fps = fps, bitrate = bitrate,
profile = profile, gop = gop, slices = slices, bframes = bframes,
mformat = mapformat(params["format"]))
if params["mprofile"] is None:
slash.skip_test("{profile} profile is not supported".format(**params))
if params["mformat"] is None:
slash.skip_test("{format} format not supported".format(**params))
params["encoded"] = get_media()._test_artifact(
"{}-{profile}-{bitrate}-{gop}-{slices}-{bframes}-{fps}"
".h265".format(case, **params))
params["decoded"] = get_media()._test_artifact(
"{}-{profile}-{bitrate}-{gop}-{slices}-{bframes}-{fps}-{width}x{height}-{format}"
".yuv".format(case, **params))
call(
"ffmpeg -init_hw_device qsv=qsv:hw -hwaccel qsv -filter_hw_device qsv"
" -v debug -f rawvideo -pix_fmt {mformat} -s:v {width}x{height}"
" -r:v {fps} -i {source} -vf 'hwupload=extra_hw_frames=64' -an"
" -c:v hevc_qsv -profile:v {profile} -g {gop} -bf {bframes} -slices {slices}"
" -b:v {bitrate}k -maxrate {bitrate}k -load_plugin hevc_hw"
" -vframes {frames} -y {encoded}".format(**params))
check_bitrate(params)
check_psnr(params)
#-------------------------------------------------#
#-------------------- VBR 10 ---------------------#
#-------------------------------------------------#
@slash.requires(have_ffmpeg)
@slash.requires(have_ffmpeg_qsv_accel)
@slash.requires(have_ffmpeg_hevc_qsv_encode)
@slash.requires(have_ffmpeg_hevc_qsv_decode)
@slash.requires(using_compatible_driver)
@slash.parametrize(*gen_hevc_vbr_parameters(spec10, ['main10']))
@platform_tags(HEVC_ENCODE_10BIT_PLATFORMS)
def test_10bit_vbr(case, gop, slices, bframes, bitrate, fps, quality, refs, profile):
params = spec10[case].copy()
# target percentage 50%
minrate = bitrate
maxrate = bitrate * 2
params.update(
mprofile = mapprofile("hevc-10", profile), fps = fps, bitrate = bitrate,
profile = profile, gop = gop, slices = slices, bframes = bframes,
refs = refs, quality = quality, minrate = minrate, maxrate = maxrate,
mformat = mapformat(params["format"]))
if params["mprofile"] is None:
slash.skip_test("{profile} profile is not supported".format(**params))
if params["mformat"] is None:
slash.skip_test("{format} format not supported".format(**params))
params["encoded"] = get_media()._test_artifact(
"{}-{profile}-{bitrate}-{gop}-{slices}-{bframes}-{fps}-{quality}-{refs}"
".h265".format(case, **params))
params["decoded"] = get_media()._test_artifact(
"{}-{profile}-{bitrate}-{gop}-{slices}-{bframes}-{fps}-{quality}-{refs}"
"-{width}x{height}-{format}"
".yuv".format(case, **params))
call(
"ffmpeg -init_hw_device qsv=qsv:hw -hwaccel qsv -filter_hw_device qsv"
" -v debug -f rawvideo -pix_fmt {mformat} -s:v {width}x{height}"
" -r:v {fps} -i {source} -vf 'hwupload=extra_hw_frames=64' -an"
" -c:v hevc_qsv -profile:v {profile} -g {gop} -bf {bframes}"
" -slices {slices} -refs {refs} -preset {quality}"
" -b:v {minrate}k -maxrate {maxrate}k -load_plugin hevc_hw"
" -vframes {frames} -y {encoded}".format(**params))
check_bitrate_vbr(params)
check_psnr(params)
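Every test above assembles its ffmpeg invocation (and its artifact file names) by expanding one template string with `str.format(**params)`. The pattern in isolation, using a toy parameter dict rather than a real spec entry:

```python
params = {"mformat": "nv12", "width": 176, "height": 144,
          "frames": 50, "source": "in.yuv", "encoded": "out.h265"}
cmd = (
    "ffmpeg -f rawvideo -pix_fmt {mformat} -s:v {width}x{height}"
    " -i {source} -vframes {frames} -y {encoded}".format(**params))
print(cmd)
# ffmpeg -f rawvideo -pix_fmt nv12 -s:v 176x144 -i in.yuv -vframes 50 -y out.h265
```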
# gym-dubins-airplane/gym_dubins_airplane/envs/config.py (Cenderme/super-octo-waddle, MIT)
import math
import numpy as np
class Config:
G = 9.8
EPISODES = 1000
vel_mps = 20 # velocity of aircrafts
action_time = 0.5
action_size = 15
red_health = 0
blue_health = 0
# input dim
window_width = 800 # pixels
window_height = 800 # pixels
window_z = 800 # pixels
diagonal = 800 # this one is used to normalize dist_to_intruder
tick = 30
scale = 30
d_min = 25 # minimum distance between aircrafts for gunfire (dangerous circle)
d_max = 300 # maximum distance between aircrafts for gunfire (outer circle)
# distance param
minimum_separation = 555 / scale
NMAC_dist = 150 / scale
horizon_dist = 4000 / scale
initial_min_dist = 3000 / scale
goal_radius = 600 / scale
dist_norm = window_width
deg_norm = np.pi
# speed
min_speed = 50 / scale
max_speed = 80 / scale
d_speed = 5 / scale
speed_sigma = 2 / scale
position_sigma = 10 / scale
# maximum training steps
max_steps = 1000
# reward setting
position_reward = 10. / 10.
heading_reward = 10 / 10.
collision_penalty = -5. / 10
outside_penalty = -1. / 10
step_penalty = -0.01 / 10
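`Config` pre-divides raw pixel distances by a shared `scale` at class-definition time, and `dist_norm`/`deg_norm` normalize the distance-to-intruder and heading observations (per the comments above). A trimmed, self-contained mirror of that pattern; `MiniConfig` and the observation tuple are illustrative, not from the environment code:

```python
import numpy as np

class MiniConfig:
    # trimmed mirror of Config: raw units pre-divided by a render scale
    scale = 30
    minimum_separation = 555 / scale  # 18.5
    NMAC_dist = 150 / scale           # 5.0
    dist_norm = 800                   # normalizer for distances
    deg_norm = np.pi                  # normalizer for angles

# normalizing a raw observation before it is consumed downstream
raw_dist, raw_heading = 400.0, np.pi / 2
obs = (raw_dist / MiniConfig.dist_norm, raw_heading / MiniConfig.deg_norm)
print(obs)  # (0.5, 0.5)
```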
# tests/test_demo.py (alex3142/Giovanni_Chat_Bot, Unlicense)
from unittest import TestCase
from giovanni.pipeline import Pipeline
from giovanni.generation import response_templates, ResponseType
class TestPipeline(TestCase):
def __init__(self, *args, **kwargs):
super(TestPipeline, self).__init__(*args, **kwargs)
self.pipeline = Pipeline('../resources/project_giovanni.0.2.ttl', )
def setUp(self):
self.pipeline.new_session(session_type='html')
def test_demo1(self):
response = self.pipeline.process_text("Hi.")
self.assertEqual(len(response.templates), 1)
self.assertIn(response.templates[0], response_templates[ResponseType.GREETING])
response = self.pipeline.process_text("Can you suggest a recipe?")
self.assertEqual(len(response.templates), 1)
self.assertIn(response.templates[0], response_templates[ResponseType.REQUEST_INGREDIENTS])
response = self.pipeline.process_text("Rice")
self.assertEqual(len(response.templates), 1)
self.assertIn(response.templates[0], response_templates[ResponseType.REQUEST_CUISINE])
response = self.pipeline.process_text("Italian")
self.assertEqual(len(response.templates), 1)
self.assertIn(response.templates[0], response_templates[ResponseType.SUGGEST_RECIPE_LINK])
self.assertIn("risotto", response.parts[0])
response = self.pipeline.process_text("Sure")
self.assertEqual(len(response.templates), 1)
self.assertIn(response.templates[0], response_templates[ResponseType.FAREWELL])
def test_demo2(self):
response = self.pipeline.process_text("Hi.")
self.assertEqual(len(response.templates), 1)
self.assertIn(response.templates[0], response_templates[ResponseType.GREETING])
response = self.pipeline.process_text("Can you suggest a recipe?")
self.assertEqual(len(response.templates), 1)
self.assertIn(response.templates[0], response_templates[ResponseType.REQUEST_INGREDIENTS])
response = self.pipeline.process_text("Rice, eggs, garlic, but not tomatoes or chicken.")
self.assertEqual(len(response.templates), 1)
self.assertIn(response.templates[0], response_templates[ResponseType.REQUEST_CUISINE])
response = self.pipeline.process_text("chinese or french but not italian or african")
self.assertEqual(len(response.templates), 1)
self.assertIn(response.templates[0], response_templates[ResponseType.SUGGEST_RECIPE_LINK])
self.assertIn("Fried rice with prawns and chorizo", response.parts[0])
response = self.pipeline.process_text("Sure")
self.assertEqual(len(response.templates), 1)
self.assertIn(response.templates[0], response_templates[ResponseType.FAREWELL])
def test_demo3(self):
response = self.pipeline.process_text("Hi, can you suggest something with tomatoes and onion but not chicken.")
self.assertEqual(len(response.templates), 2)
self.assertIn(response.templates[0], response_templates[ResponseType.GREETING])
self.assertIn(response.templates[1], response_templates[ResponseType.REQUEST_CUISINE])
response = self.pipeline.process_text("chinese or french but not italian or african")
self.assertEqual(len(response.templates), 1)
self.assertIn(response.templates[0], response_templates[ResponseType.SUGGEST_RECIPE_LINK])
self.assertIn("Fried rice with prawns and chorizo", response.parts[0])
response = self.pipeline.process_text("Sure")
self.assertEqual(len(response.templates), 1)
self.assertIn(response.templates[0], response_templates[ResponseType.FAREWELL])
def test_demo4(self):
response = self.pipeline.process_text("Hi, can you suggest something spanish with rice and onion but not beef or peas.")
self.assertEqual(len(response.templates), 2)
self.assertIn(response.templates[0], response_templates[ResponseType.GREETING])
self.assertIn(response.templates[1], response_templates[ResponseType.SUGGEST_RECIPE_LINK])
self.assertIn("chicken paella", response.parts[0])
response = self.pipeline.process_text("No")
self.assertEqual(len(response.templates), 1)
self.assertIn(response.templates[0], response_templates[ResponseType.SUGGEST_RECIPE_LINK])
self.assertIn("seafood paella", response.parts[0])
response = self.pipeline.process_text("ok, thanks")
self.assertEqual(len(response.templates), 1)
self.assertIn(response.templates[0], response_templates[ResponseType.FAREWELL])
| 44.311321 | 129 | 0.70577 | 531 | 4,697 | 6.120527 | 0.152542 | 0.277231 | 0.169538 | 0.160615 | 0.860923 | 0.860923 | 0.856615 | 0.856615 | 0.847692 | 0.816308 | 0 | 0.011728 | 0.183096 | 4,697 | 105 | 130 | 44.733333 | 0.835288 | 0 | 0 | 0.632353 | 0 | 0 | 0.11324 | 0.008057 | 0 | 0 | 0 | 0 | 0.573529 | 1 | 0.088235 | false | 0 | 0.044118 | 0 | 0.147059 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
# npsurvival_models.py (georgehc/dksa, MIT)
"""
Nonparametric survival estimators
Author: George H. Chen (georgechen [at symbol] cmu.edu)
This file contains the following classes and various helper functions (all of
these implement both Kaplan-Meier and Nelson-Aalen versions):
- BasicSurvival : basic Kaplan-Meier and Nelson-Aalen estimators that do not
account for feature vectors
- KNNSurvival : k-NN survival estimation
- KNNWeightedSurvival: weighted k-NN survival estimation
- KernelSurvival : kernel survival estimation
- RandomSurvivalForest : a heavily modified version of Wrymm's random survival
forest code (the version last updated Feb 28, 2017)
[https://github.com/Wrymm/Random-Survival-Forests]; changes are discussed
below
- RandomSurvivalForestANN : kernel survival estimation where the kernel is
learned using a random survival forest (ANN stands for "adaptive nearest
neighbors"; one can interpret this is either an adaptive kernel method or an
adaptive nearest neighbors method where the neighbors are weighted)
- CDFRegressionKNNWeightedSurvival : implements the "cdf-reg" two-step method
mentioned in the ICML paper:
George H. Chen. Nearest Neighbor and Kernel Survival Analysis:
Nonasymptotic Error Bounds and Strong Consistency Rates. ICML 2019.
Random survival forests are by Hemant Ishwaran, Udaya B. Kogalur, Eugene H.
Blackstone, and Michael S. Lauer: "Random survival forests" (Annals of Applied
Stats 2008); see also Ishwaran and Kogalur's "Random survival forests for R"
article in Rnews (2007) and their R package "randomForestSRC".
Setup
-----
Be sure to compile the cython code by running:
python setup_random_survival_forest_cython.py build_ext --inplace
* * * * *
Main changes to Wrymm's code (the version last updated Feb 28, 2017):
- the log-rank splitting score denominator calculation appeared to be missing a
Y_i factor (prior to taking the square root); this has been fixed
- the log-rank splitting score code is implemented in cython
- Wrymm's code only splits on medians of feature values rather than optimizing
for the best split; I have added both an exhaustive split option (tries
every split threshold among the observed feature values) and a random split
option (Ishwaran et al suggest in their Annals of Applied Stats paper that
this randomized strategy actually works quite well)
- Wrymm's code has `min_samples_split` refer to what scikit-learn calls
`min_samples_leaf`; I switched the variable name to match that of
scikit-learn and also introduced what scikit-learn calls `min_samples_split`
as a parameter
- many survival probabilities are computed at once for a given feature vector
(i.e., rather than computing the probability of a subject surviving beyond
one choice of time, compute the probabilities of a subject surviving beyond a
collection of different times)
- added code to predict subject-specific cumulative hazard functions
- randomization can now be made deterministic by providing either an integer
random seed or a numpy RandomState instance
- pandas has been removed to speed up the code
- parallelism is now supported both in fitting and prediction
"""
from collections import Counter
import functools
import pickle
import numpy as np
from joblib import Parallel, delayed
from lifelines.utils import concordance_index
from sklearn.neighbors import NearestNeighbors
from random_survival_forest_cython import logrank
class RandomSurvivalForest():
def __init__(self, n_estimators=100, max_features='sqrt', max_depth=None,
min_samples_split=2, min_samples_leaf=1, split='logrank',
split_threshold_mode='exhaustive', random_state=None,
n_jobs=None, oob_score=False, feature_importance=False):
"""
A random survival forest survival probability estimator. This is very
similar to the usual random forest that is used for regression and
classification. However, in a random survival forest, the prediction
task is to estimate the survival probability function for a test
feature vector. Training data can have right-censoring. For details,
see any introductory text on survival analysis.
Parameters
----------
n_estimators : int, optional (default=100)
Number of trees.
max_features : int, string, optional (default='sqrt')
Number of features chosen per tree. Allowable string choices are
'sqrt' (max_features=ceil(sqrt(n_features))) and 'log2'
(max_features=ceil(log2(n_features))).
max_depth : int, optional (default=None)
Maximum depth of each tree. If None, then each tree is grown
until other termination criteria are met (see `min_samples_split`
and `min_samples_leaf` parameters).
min_samples_split : int, optional (default=2)
A node must have at least this many samples to be split.
min_samples_leaf : int, float, optional (default=1)
Both sides of a split must have at least this many samples
(or in the case of a fraction, at least a fraction of samples)
for the split to happen. Otherwise, the node is turned into a
leaf node.
split : string, optional (default='logrank')
Currently only the log-rank splitting criterion is supported.
split_threshold_mode : string, optional (default='exhaustive')
If 'exhaustive', then we compute the split score for every observed
feature value as a possible threshold (this can be very expensive).
If 'median', then for any feature, we always split on the median
value observed for that feature (this is the only supported option
in Wrymm's original random survival analysis code).
If 'random', then for any feature, we randomly choose a split
threshold among the observed feature values (this is recommended by
the random survival forest authors if fast computation is desired).
random_state : int, numpy RandomState instance, None, optional
(default=None)
If an integer, then a new numpy RandomState is created with the
integer as the random seed. If a numpy RandomState instance is
provided, then it is used as the pseudorandom number generator. If
None is specified, then a new numpy RandomState is created without
providing a seed.
n_jobs : int, None, optional (default=None)
Number of cores to use with joblib's Parallel. This is the same
`n_jobs` parameter as for Parallel. Setting `n_jobs` to -1 uses all
the cores.
oob_score : boolean, optional (default=False)
            Whether to compute an out-of-bag (OOB) accuracy estimate (as in
            the original random survival forest paper, this is done using
c-index with cumulative hazard estimates). The OOB estimate is
computed during model fitting (via fit()), and the resulting
c-index estimate is stored in the attribute `oob_score_`.
        feature_importance : boolean, optional (default=False)
            Whether to compute feature importances (requires `oob_score` to
            be set to True). Feature importances are computed during model
            fitting (via fit()), and the resulting array is stored in the
            attribute `feature_importances_`.
"""
self.n_estimators = n_estimators
self.max_depth = max_depth
self.min_samples_split = min_samples_split
self.min_samples_leaf = min_samples_leaf
self.max_features = max_features
self.split_threshold_mode = split_threshold_mode
self.n_jobs = n_jobs
self.oob_score = oob_score
self.feature_importance = feature_importance
self.column_names = None
self.oob_score_ = None
self.feature_importances_ = None
if random_state is None:
self.random_state = np.random.RandomState()
elif type(random_state) == int:
self.random_state = np.random.RandomState(random_state)
else:
self.random_state = random_state
if split == 'logrank':
self.split_score_function = logrank
else:
raise NotImplementedError('Unsupported split criterion '
+ '"{0}"'.format(split))
def save(self, filename):
data = {'n_estimators': self.n_estimators,
'max_depth': self.max_depth,
'min_samples_split': self.min_samples_split,
'min_samples_leaf': self.min_samples_leaf,
'max_features': self.max_features,
'split_threshold_mode': self.split_threshold_mode,
'n_jobs': self.n_jobs,
'oob_score': self.oob_score,
'feature_importance': self.feature_importance,
'column_names': list(self.column_names),
'oob_score_': self.oob_score_}
if self.feature_importances_ is not None:
data['feature_importances_'] = self.feature_importances_.tolist()
else:
data['feature_importances_'] = None
data['trees'] = \
[_convert_to_not_use_numpy(tree) for tree in self.trees]
data['tree_bootstrap_indices'] = \
[indices.tolist() for indices in self.tree_bootstrap_indices]
with open(filename, 'wb') as f:
pickle.dump(data, f)
@staticmethod
def load(filename):
with open(filename, 'rb') as f:
data = pickle.load(f)
rsf = \
RandomSurvivalForest(n_estimators=data['n_estimators'],
max_features=data['max_features'],
max_depth=data['max_depth'],
min_samples_split=data['min_samples_split'],
min_samples_leaf=data['min_samples_leaf'],
split='logrank',
split_threshold_mode='exhaustive',
random_state=None,
n_jobs=data['n_jobs'],
oob_score=data['oob_score'],
feature_importance=data['feature_importance'])
rsf.column_names = data['column_names']
rsf.oob_score_ = data['oob_score_']
if data['feature_importances_'] is None:
rsf.feature_importances_ = None
else:
rsf.feature_importances_ = np.array(data['feature_importances_'])
rsf.trees = [_convert_to_use_numpy(tree) for tree in data['trees']]
        rsf.tree_bootstrap_indices = \
            np.array(data['tree_bootstrap_indices'])
for tree in rsf.trees:
_label_leaves(tree)
return rsf
def fit(self, X, y, column_names=None):
"""
Fits the random survival forest to training data.
Parameters
----------
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
y : 2D numpy array, shape = [n_samples, 2]
Survival labels (first column is for observed times, second column
is for event indicators). The i-th row corresponds to the i-th row
in `X`.
column_names : list, None, optional (default=None)
Names for features can be specified. This is only for display
purposes when using the `draw` method. If set to None, then
`column_names` is just set to be a range of integers indexing the
columns from 0.
Returns
-------
None
"""
if column_names is None:
self.column_names = list(range(X.shape[1]))
else:
self.column_names = column_names
assert len(column_names) == X.shape[1]
if type(self.max_features) == str:
if self.max_features == 'sqrt':
max_features = int(np.ceil(np.sqrt(X.shape[1])))
elif self.max_features == 'log2':
max_features = int(np.ceil(np.log2(X.shape[1])))
else:
raise NotImplementedError('Unsupported max features choice '
+ '"{0}"'.format(self.max_features))
else:
max_features = self.max_features
self.tree_bootstrap_indices = []
sort_indices = np.argsort(y[:, 0])
        X = X[sort_indices].astype(float)
        y = y[sort_indices].astype(float)
random_state = self.random_state
for tree_idx in range(self.n_estimators):
bootstrap_indices = np.sort(random_state.choice(X.shape[0],
X.shape[0],
replace=True))
self.tree_bootstrap_indices.append(bootstrap_indices)
with Parallel(n_jobs=self.n_jobs) as parallel:
self.trees = \
parallel(
delayed(_build_tree)(
X[self.tree_bootstrap_indices[tree_idx]],
y[self.tree_bootstrap_indices[tree_idx]],
0, self.max_depth, max_features,
self.split_score_function, self.min_samples_split,
self.min_samples_leaf, self.split_threshold_mode,
np.random.RandomState(random_state.randint(4294967296)))
for tree_idx in range(self.n_estimators))
if self.oob_score:
parallel_args = []
oob_masks = []
for tree_idx, bootstrap_indices \
in enumerate(self.tree_bootstrap_indices):
                    oob_mask = np.ones(X.shape[0], dtype=bool)
                    oob_mask[bootstrap_indices] = False
if oob_mask.sum() > 0:
X_oob = X[oob_mask]
if len(X_oob.shape) == 1:
X_oob = X_oob.reshape(1, -1)
parallel_args.append((tree_idx, X_oob))
oob_masks.append(
(oob_mask,
{original_idx: new_idx
for new_idx, original_idx
in enumerate(np.where(oob_mask)[0])}))
sorted_unique_times = np.unique(y[:, 0])
results = parallel(
delayed(_predict_tree)(
self.trees[tree_idx], 'cum_haz', X_oob,
sorted_unique_times, True)
for (tree_idx, X_oob) in parallel_args)
num_unique_times = len(sorted_unique_times)
cum_hazard_scores = []
oob_y = []
for idx in range(X.shape[0]):
num = 0.
den = 0.
for tree_idx2, (oob_mask, forward_map) \
in enumerate(oob_masks):
if oob_mask[idx]:
num += results[tree_idx2][forward_map[idx]].sum()
den += 1
if den > 0:
cum_hazard_scores.append(num / den)
oob_y.append(y[idx])
cum_hazard_scores = np.array(cum_hazard_scores)
oob_y = np.array(oob_y)
self.oob_score_ = concordance_index(oob_y[:, 0],
-cum_hazard_scores,
oob_y[:, 1])
if self.feature_importance:
self.feature_importances_ = []
for col_idx in range(X.shape[1]):
vimp_results = \
parallel(
delayed(_predict_tree_vimp)(
self.trees[tree_idx], 'cum_haz',
X_oob, sorted_unique_times, True,
col_idx,
np.random.RandomState(
random_state.randint(4294967296)))
for (tree_idx, X_oob)
in parallel_args)
cum_hazard_scores = []
oob_y = []
for idx in range(X.shape[0]):
num = 0.
den = 0.
for tree_idx2, (oob_mask, forward_map) \
in enumerate(oob_masks):
if oob_mask[idx]:
num += vimp_results[tree_idx2][
forward_map[idx]].sum()
den += 1
if den > 0:
cum_hazard_scores.append(num / den)
oob_y.append(y[idx])
if len(cum_hazard_scores) > 0:
cum_hazard_scores = np.array(cum_hazard_scores)
oob_y = np.array(oob_y)
vimp = self.oob_score_ - \
concordance_index(oob_y[:, 0],
-cum_hazard_scores,
oob_y[:, 1])
else:
vimp = np.nan
self.feature_importances_.append(vimp)
self.feature_importances_ \
= np.array(self.feature_importances_)
for tree in self.trees:
_label_leaves(tree)
def predict_leaf_ids(self, X):
results = Parallel(n_jobs=self.n_jobs)(
delayed(_predict_tree_leaf_id)(self.trees[tree_idx], X)
for tree_idx in range(self.n_estimators))
return np.array(results).T
def predict_surv(self, X, times, presorted_times=False,
use_kaplan_meier=True):
"""
Computes the forest's survival probability function estimate for each
feature vector evaluated at user-specified times.
Parameters
----------
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
times : 1D numpy array
Times to compute the survival probability function at.
presorted_times : boolean, optional (default=False)
Flag for whether `times` is already sorted.
use_kaplan_meier : boolean, optional (default=True)
In the original random survival forests paper, only the cumulative
            hazard function H(t|x) is predicted from the leaves rather than the
survival function S(t|x). One can back out the survival function
from the cumulative hazard function since S(t|x)=exp(-H(t|x)).
If this flag is set to True, then we have the forest predict S(t|x)
using Kaplan-Meier estimates at the leaves (instead of the
default of predicting H(t|x) with Nelson-Aalen estimates at the
leaves), and average the trees' S(t|x) estimates.
Returns
-------
output : 2D numpy array
Survival probability function evaluated at each of the times
specified in `times` for each feature vector. The i-th row
corresponds to the i-th feature vector.
"""
if use_kaplan_meier:
results = Parallel(n_jobs=self.n_jobs)(
delayed(_predict_tree)(self.trees[tree_idx], 'surv', X, times,
presorted_times)
for tree_idx in range(self.n_estimators))
return functools.reduce(lambda x, y: x + y, results) \
/ self.n_estimators
else:
return np.exp(-self.predict_cum_haz(X, times, presorted_times))
def predict_cum_haz(self, X, times, presorted_times=False,
use_kaplan_meier=False, surv_eps=1e-12):
"""
Computes the forest's cumulative hazard function estimate for each
feature vector evaluated at user-specified times.
Parameters
----------
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
times : 1D numpy array
Times to compute the cumulative hazard function at.
presorted_times : boolean, optional (default=False)
Flag for whether `times` is already sorted.
use_kaplan_meier : boolean, optional (default=False)
In the original random survival forests paper, only the cumulative
            hazard function H(t|x) is predicted from the leaves rather than the
survival function S(t|x). One can back out the cumulative hazard
function from the survival function since H(t|x)=-log(S(t|x)).
If this flag is set to True, then we have the forest predict S(t|x)
first using Kaplan-Meier estimates at the leaves (instead of the
default of predicting H(t|x) with Nelson-Aalen estimates at the
leaves), and then we back out an estimate for H(t|x).
surv_eps : float, optional (default=1e-12)
If `use_kaplan_meier` is set to True, then we clip the estimated
survival function so that any value less than `surv_eps` is set to
`surv_eps`. This makes it so that when we take the negative log of
the survival function, we don't take logs of 0.
Returns
-------
output : 2D numpy array
Cumulative hazard function evaluated at each of the times
specified in `times` for each feature vector. The i-th row
corresponds to the i-th feature vector.
"""
if use_kaplan_meier:
surv = self.predict_surv(X, times, presorted_times, True)
return -np.log(np.clip(surv, surv_eps, 1.))
else:
results = Parallel(n_jobs=self.n_jobs)(
delayed(_predict_tree)(self.trees[tree_idx], 'cum_haz', X, times,
presorted_times)
for tree_idx in range(self.n_estimators))
return functools.reduce(lambda x, y: x + y, results) \
/ self.n_estimators
def _print_with_depth(self, string, depth):
"""
Auxiliary function to print a string with indentation dependent on
depth.
"""
print("{0}{1}".format(" " * depth, string))
def _print_tree(self, tree, current_depth=0):
"""
Auxiliary function to print a survival tree.
"""
if 'surv' in tree:
self._print_with_depth(tree['times'], current_depth)
return
self._print_with_depth(
"{0} > {1}".format(self.column_names[tree['feature']],
tree['threshold']),
current_depth)
self._print_tree(tree['left'], current_depth + 1)
self._print_tree(tree['right'], current_depth + 1)
def draw(self):
"""
Prints out each tree of the random survival forest.
"""
for tree_idx, tree in enumerate(self.trees):
print("==========================================\nTree",
tree_idx)
self._print_tree(tree)
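The out-of-bag bookkeeping in `fit` can be sketched on toy sizes as follows. This mirrors the mechanism (bootstrap sample per tree, OOB mask, and a forward map from original row indices to rows of `X[oob_mask]`) but is illustrative code, not the estimator's internals:

```python
import numpy as np

# Each tree draws a bootstrap sample; rows never drawn are out-of-bag (OOB)
# for that tree. forward_map translates an original row index into its row
# position within the OOB-only matrix X[oob_mask].
rng = np.random.RandomState(0)
n_samples = 10
bootstrap_indices = np.sort(rng.choice(n_samples, n_samples, replace=True))

oob_mask = np.ones(n_samples, dtype=bool)
oob_mask[bootstrap_indices] = False  # every drawn row is in-bag

forward_map = {orig: new
               for new, orig in enumerate(np.where(oob_mask)[0])}
```

The OOB c-index is then computed by averaging, per sample, the cumulative hazard scores of only those trees for which that sample was out-of-bag.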
class BasicSurvival():
def __init__(self):
self.tree = None
def fit(self, y):
self.tree = _fit_leaf(y)
def predict_surv(self, times, presorted_times=False,
limit_from_left=False):
"""
Computes the Kaplan-Meier survival probability function estimate at
user-specified times.
Parameters
----------
        times : 1D numpy array
Times to compute the survival probability function at.
presorted_times : boolean, optional (default=False)
Flag for whether `times` is already sorted.
limit_from_left : boolean, optional (default=False)
Flag for whether to output the function evaluated at a time just to
the left, i.e., instead of outputting f(t) where f is the survival
probability function estimate, output:
f(t-) := limit as t' approaches t from the left of f(t').
Returns
-------
output : 1D numpy array
Survival probability function evaluated at each of the times
specified in `times`.
"""
return _predict_leaf(self.tree, 'surv', times, presorted_times,
limit_from_left)
def predict_cum_haz(self, times, presorted_times=False,
limit_from_left=False):
"""
Computes the Nelson-Aalen cumulative hazard function estimate at
user-specified times.
Parameters
----------
times : 1D numpy array
Times to compute the cumulative hazard function at.
presorted_times : boolean, optional (default=False)
Flag for whether `times` is already sorted.
limit_from_left : boolean, optional (default=False)
Flag for whether to output the function evaluated at a time just to
the left, i.e., instead of outputting f(t) where f is the
cumulative hazard function estimate, output:
f(t-) := limit as t' approaches t from the left of f(t').
Returns
-------
output : 1D numpy array
Cumulative hazard function evaluated at each of the times
specified in `times`.
"""
return _predict_leaf(self.tree, 'cum_haz', times, presorted_times,
limit_from_left)
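Conceptually, what `BasicSurvival` stores is a Kaplan-Meier curve and a Nelson-Aalen curve fit to the full training labels. A minimal sketch of both estimators on a toy right-censored sample (not the library's `_fit_leaf` implementation) also shows the S(t) ~ exp(-H(t)) relationship that the forest's `predict_surv`/`predict_cum_haz` methods exploit:

```python
import numpy as np

# Toy right-censored data: observed times (sorted) and event indicators
# (1 = event observed, 0 = censored).
times = np.array([2., 3., 3., 5., 8.])
events = np.array([1, 1, 0, 1, 0])

surv, cum_haz = 1.0, 0.0
for t in np.unique(times):
    at_risk = (times >= t).sum()
    deaths = ((times == t) & (events == 1)).sum()
    surv *= 1.0 - deaths / at_risk   # Kaplan-Meier product factor
    cum_haz += deaths / at_risk      # Nelson-Aalen increment

# exp(-cum_haz) approximates, but does not equal, the Kaplan-Meier estimate
print(surv, np.exp(-cum_haz))
```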
class KNNSurvival():
def __init__(self, *args, **kwargs):
"""
Arguments are the same as for `sklearn.neighbors.NearestNeighbors`.
The simplest usage of this class is to use a single argument, which is
`n_neighbors` for the number of nearest neighbors (Euclidean distance
is assumed in this case). If you want to parallelize across different
search queries, use the `n_jobs` keyword parameter (-1 to use all
cores). To use other distances and for other details, please refer to
the documentation for sklearn's `NearestNeighbors` class.
*Important:* The prediction methods for this class use unweighted
k-nearest neighbors, where "k" is set equal to the `n_neighbors`
parameter.
"""
self.NN_index_args = args
self.NN_index_kwargs = kwargs
self.NN_index = None
def fit(self, X, y):
"""
Constructs a nearest-neighbor index given training data (so that for
a future data point, we can use the nearest-neighbor index to quickly
find what the closest training data are to the future point).
Parameters
----------
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
y : 2D numpy array, shape = [n_samples, 2]
Survival labels (first column is for observed times, second column
is for event indicators). The i-th row corresponds to the i-th row
in `X`.
Returns
-------
None
"""
self.train_y = y
self.NN_index = NearestNeighbors(*self.NN_index_args,
**self.NN_index_kwargs)
self.NN_index.fit(X)
def predict_surv(self, X, times, presorted_times=False,
limit_from_left=False, n_neighbors=None):
"""
Computes the k-NN Kaplan-Meier survival probability function estimate
at user-specified times.
*Important:* The default number of nearest neighbors to use is whatever
was specified in `args` or `kwargs` when creating an instance of this
class (the "k" in k-NN)!
Parameters
----------
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
times : 1D numpy array
Times to compute the survival probability function at.
presorted_times : boolean, optional (default=False)
Flag for whether `times` is already sorted.
limit_from_left : boolean, optional (default=False)
Flag for whether to output the function evaluated at a time just to
the left, i.e., instead of outputting f(t) where f is the survival
probability function estimate, output:
f(t-) := limit as t' approaches t from the left of f(t').
n_neighbors : int, None, optional (default=None)
Number of nearest neighbors to use. If set to None then the number
used is whatever was passed into `args` or `kwargs` when creating
an instance of this class.
Returns
-------
output : 2D numpy array
Survival probability function evaluated at each of the times
specified in `times` for each feature vector. The i-th row
corresponds to the i-th feature vector.
"""
indices = self.NN_index.kneighbors(X, n_neighbors=n_neighbors,
return_distance=False)
train_y = self.train_y
return np.array([_predict_leaf(_fit_leaf(train_y[idx]), 'surv', times,
presorted_times, limit_from_left)
for idx in indices])
def predict_cum_haz(self, X, times, presorted_times=False,
limit_from_left=False, n_neighbors=None):
"""
Computes the k-NN Nelson-Aalen cumulative hazard function estimate at
user-specified times.
*Important:* The default number of nearest neighbors to use is whatever
was specified in `args` or `kwargs` when creating an instance of this
class (the "k" in k-NN)!
Parameters
----------
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
times : 1D numpy array
Times to compute the cumulative hazard function at.
presorted_times : boolean, optional (default=False)
Flag for whether `times` is already sorted.
limit_from_left : boolean, optional (default=False)
Flag for whether to output the function evaluated at a time just to
the left, i.e., instead of outputting f(t) where f is the
cumulative hazard function estimate, output:
f(t-) := limit as t' approaches t from the left of f(t').
n_neighbors : int, None, optional (default=None)
Number of nearest neighbors to use. If set to None then the number
used is whatever was passed into `args` or `kwargs` when creating
an instance of this class.
Returns
-------
output : 2D numpy array
Cumulative hazard function evaluated at each of the times
specified in `times` for each feature vector. The i-th row
corresponds to the i-th feature vector.
"""
indices = self.NN_index.kneighbors(X, n_neighbors=n_neighbors,
return_distance=False)
train_y = self.train_y
return np.array([_predict_leaf(_fit_leaf(train_y[idx]), 'cum_haz',
times, presorted_times, limit_from_left)
for idx in indices])
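The lookup behind `KNNSurvival.predict_surv` can be sketched with brute-force distances standing in for sklearn's `NearestNeighbors` index: take the k nearest training rows for each query, then fit a Kaplan-Meier curve to just those rows' labels. Array values here are illustrative:

```python
import numpy as np

# Brute-force k-NN: the class delegates this step to a NearestNeighbors
# index, then fits a leaf (Kaplan-Meier/Nelson-Aalen) on train_y[nearest].
train_X = np.array([[0.0], [1.0], [2.0], [10.0]])
query = np.array([[0.5]])
k = 2

dists = np.linalg.norm(train_X - query, axis=1)
nearest = np.argsort(dists)[:k]  # indices of the k closest training rows
```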
class KNNWeightedSurvival():
def __init__(self, *args, **kwargs):
"""
Arguments are the same as for `sklearn.neighbors.NearestNeighbors`.
The simplest usage of this class is to use a single argument, which is
`n_neighbors` for the number of nearest neighbors (Euclidean distance
is assumed in this case). If you want to parallelize across different
search queries, use the `n_jobs` keyword parameter (-1 to use all
cores). To use other distances and for other details, please refer to
the documentation for sklearn's `NearestNeighbors` class.
*Important:* The prediction methods for this class use weighted
k-nearest neighbors, where "k" is set equal to the `n_neighbors`
parameter. The weights are specified through a kernel function K. In
particular, the i-th nearest neighbor X_i for a test point x is given a
weight of:
K( (distance between x and X_i) / (distance between x and X_k) ).
"""
self.NN_index_args = args
self.NN_index_kwargs = kwargs
self.NN_index = None
def fit(self, X, y):
"""
Constructs a nearest-neighbor index given training data (so that for
a future data point, we can use the nearest-neighbor index to quickly
find what the closest training data are to the future point).
Parameters
----------
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
y : 2D numpy array, shape = [n_samples, 2]
Survival labels (first column is for observed times, second column
is for event indicators). The i-th row corresponds to the i-th row
in `X`.
Returns
-------
None
"""
self.train_y = y
self.NN_index = NearestNeighbors(*self.NN_index_args,
**self.NN_index_kwargs)
self.NN_index.fit(X)
def predict_surv(self, X, times, presorted_times=False,
limit_from_left=False, n_neighbors=None,
kernel_function=None):
"""
Computes the weighted k-NN Kaplan-Meier survival probability function
estimate at user-specified times.
*Important:* The default number of nearest neighbors to use is whatever
was specified in `args` or `kwargs` when creating an instance of this
class (the "k" in k-NN)!
Parameters
----------
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
times : 1D numpy array
Times to compute the survival probability function at.
presorted_times : boolean, optional (default=False)
Flag for whether `times` is already sorted.
limit_from_left : boolean, optional (default=False)
Flag for whether to output the function evaluated at a time just to
the left, i.e., instead of outputting f(t) where f is the survival
probability function estimate, output:
f(t-) := limit as t' approaches t from the left of f(t').
n_neighbors : int, None, optional (default=None)
Number of nearest neighbors to use. If set to None then the number
used is whatever was passed into `args` or `kwargs` when creating
an instance of this class.
kernel_function : function, None, optional (default=None)
Kernel function to use. None corresponds to unweighted k-NN
survival analysis. If a function is specified, then the weighting
function used is of the form
"kernel(distance / distance to k-th nearest neighbor)".
Returns
-------
output : 2D numpy array
Survival probability function evaluated at each of the times
specified in `times` for each feature vector. The i-th row
corresponds to the i-th feature vector.
"""
        if kernel_function is None:
            kernel_function = lambda s: 1  # box kernel (i.e., uniform weights)
dists, indices = self.NN_index.kneighbors(X, n_neighbors=n_neighbors,
return_distance=True)
train_y = self.train_y
output = []
n_times = len(times)
for dist, idx in zip(dists, indices):
max_dist = np.max(dist)
weights = np.array([kernel_function(d / max_dist) for d in dist])
zero_weight = (weights == 0)
if zero_weight.sum() > 0:
weights_subset = weights[~zero_weight]
if weights_subset.size > 0:
output.append(
_predict_leaf(
_fit_leaf_weighted(train_y[idx[~zero_weight]],
weights_subset),
'surv', times, presorted_times, limit_from_left))
else:
output.append(np.ones(n_times))
else:
output.append(
_predict_leaf(
_fit_leaf_weighted(train_y[idx],
weights),
'surv', times, presorted_times, limit_from_left))
return np.array(output)
def predict_cum_haz(self, X, times, presorted_times=False,
limit_from_left=False, n_neighbors=None,
kernel_function=None):
"""
Computes the weighted k-NN Nelson-Aalen cumulative hazard function
estimate at user-specified times.
*Important:* The default number of nearest neighbors to use is whatever
was specified in `args` or `kwargs` when creating an instance of this
class (the "k" in k-NN)!
Parameters
----------
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
times : 1D numpy array
Times to compute the cumulative hazard function at.
presorted_times : boolean, optional (default=False)
Flag for whether `times` is already sorted.
limit_from_left : boolean, optional (default=False)
Flag for whether to output the function evaluated at a time just to
the left, i.e., instead of outputting f(t) where f is the
cumulative hazard function estimate, output:
f(t-) := limit as t' approaches t from the left of f(t').
n_neighbors : int, None, optional (default=None)
Number of nearest neighbors to use. If set to None then the number
used is whatever was passed into `args` or `kwargs` when creating
an instance of this class.
kernel_function : function, None, optional (default=None)
Kernel function to use. None corresponds to unweighted k-NN
survival analysis. If a function is specified, then the weighting
function used is of the form
"kernel(distance / distance to k-th nearest neighbor)".
Returns
-------
output : 2D numpy array
Cumulative hazard function evaluated at each of the times
specified in `times` for each feature vector. The i-th row
corresponds to the i-th feature vector.
"""
        if kernel_function is None:
            kernel_function = lambda s: 1  # box kernel (i.e., uniform weights)
dists, indices = self.NN_index.kneighbors(X, n_neighbors=n_neighbors,
return_distance=True)
train_y = self.train_y
output = []
n_times = len(times)
for dist, idx in zip(dists, indices):
max_dist = np.max(dist)
weights = np.array([kernel_function(d / max_dist) for d in dist])
zero_weight = (weights == 0)
if zero_weight.sum() > 0:
weights_subset = weights[~zero_weight]
if weights_subset.size > 0:
output.append(
_predict_leaf(
_fit_leaf_weighted(train_y[idx[~zero_weight]],
weights_subset),
'cum_haz', times, presorted_times, limit_from_left))
else:
output.append(np.zeros(n_times))
else:
output.append(
_predict_leaf(
_fit_leaf_weighted(train_y[idx],
weights),
'cum_haz', times, presorted_times, limit_from_left))
return np.array(output)
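The weighted Kaplan-Meier update implied by `_fit_leaf_weighted` replaces unit counts with kernel weights w_i = K(d_i / d_k): both the at-risk set and the death counts become weight sums. A sketch under those assumptions (`weighted_km` is an illustrative helper, not the library's internals); note that with a triangular kernel the k-th neighbor gets weight 0, matching the zero-weight filtering in the methods above:

```python
import numpy as np

def weighted_km(times, events, weights, eval_t):
    """Kaplan-Meier estimate at eval_t with per-sample kernel weights."""
    surv = 1.0
    for t in np.unique(times[times <= eval_t]):
        at_risk = weights[times >= t].sum()
        deaths = weights[(times == t) & (events == 1)].sum()
        if at_risk > 0:
            surv *= 1.0 - deaths / at_risk
    return surv

times = np.array([2., 3., 5.])
events = np.array([1, 0, 1])
dists = np.array([0.5, 1.0, 2.0])
k_dist = dists.max()
# triangular kernel K(s) = max(1 - s, 0)
weights = np.maximum(1.0 - dists / k_dist, 0.0)
print(weighted_km(times, events, weights, eval_t=5.0))
```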
class KernelSurvival():
def __init__(self, *args, **kwargs):
"""
Arguments are the same as for `sklearn.neighbors.NearestNeighbors`.
The simplest usage of this class is to use a single argument, which is
`radius` for fixed-radius near-neighbor search (Euclidean distance is
assumed in this case). Put another way, any training data point farther
than `radius` away from a test point is assumed to contribute 0 weight
toward prediction for the test point. If you want to parallelize across
different search queries, use the `n_jobs` keyword parameter (-1 to use
all cores). To use other distances and for other details, please refer
to the documentation for sklearn's `NearestNeighbors` class.
"""
self.NN_index_args = args
self.NN_index_kwargs = kwargs
self.NN_index = None
def fit(self, X, y):
"""
Constructs a nearest-neighbor index given training data (so that for
a future data point, we can use the nearest-neighbor index to quickly
find what the closest training data are to the future point).
Parameters
----------
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
y : 2D numpy array, shape = [n_samples, 2]
Survival labels (first column is for observed times, second column
is for event indicators). The i-th row corresponds to the i-th row
in `X`.
Returns
-------
None
"""
self.train_y = y
self.NN_index = NearestNeighbors(*self.NN_index_args,
**self.NN_index_kwargs)
self.NN_index.fit(X)
def predict_surv(self, X, times, presorted_times=False,
limit_from_left=False, radius=None,
kernel_function=None):
"""
Computes the kernel Kaplan-Meier survival probability function estimate
at user-specified times.
*Important:* The default radius to use is whatever was specified in
`args` or `kwargs` when creating an instance of this class!
Parameters
----------
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
times : 1D numpy array
Times to compute the survival probability function at.
presorted_times : boolean, optional (default=False)
Flag for whether `times` is already sorted.
limit_from_left : boolean, optional (default=False)
Flag for whether to output the function evaluated at a time just to
the left, i.e., instead of outputting f(t) where f is the survival
probability function estimate, output:
f(t-) := limit as t' approaches t from the left of f(t').
radius : float, None, optional (default=None)
Neighbors farther than this distance from a test point have kernel
weight 0.
kernel_function : function, None, optional (default=None)
Kernel function to use. None corresponds to fixed-radius near
neighbors kernel survival analysis (i.e., a box kernel that
becomes 0 after `radius` distance away). If a function is
specified, then the weighting function used is of the form
"kernel(distance / radius)".
Returns
-------
output : 2D numpy array
Survival probability function evaluated at each of the times
specified in `times` for each feature vector. The i-th row
corresponds to the i-th feature vector.
"""
if radius is None:
radius = self.NN_index.radius
if kernel_function is None:
kernel_function = lambda s: 1 # box kernel (i.e., uniform weights)
dists, indices = self.NN_index.radius_neighbors(X, radius=radius,
return_distance=True)
train_y = self.train_y
output = []
n_times = len(times)
for dist, idx in zip(dists, indices):
if dist.size > 0:
weights = np.array([kernel_function(d / radius) for d in dist])
zero_weight = (weights == 0)
if zero_weight.sum() > 0:
weights_subset = weights[~zero_weight]
if weights_subset.size > 0:
output.append(
_predict_leaf(
_fit_leaf_weighted(train_y[idx[~zero_weight]],
weights_subset),
'surv', times, presorted_times,
limit_from_left))
else:
output.append(np.ones(n_times))
else:
output.append(
_predict_leaf(
_fit_leaf_weighted(train_y[idx],
weights),
'surv', times, presorted_times, limit_from_left))
else:
output.append(np.ones(n_times))
return np.array(output)
def predict_cum_haz(self, X, times, presorted_times=False,
limit_from_left=False, radius=None,
kernel_function=None):
"""
Computes the kernel Nelson-Aalen cumulative hazard function estimate at
user-specified times.
*Important:* The default radius to use is whatever was specified in
`args` or `kwargs` when creating an instance of this class!
Parameters
----------
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
times : 1D numpy array
Times to compute the cumulative hazard function at.
presorted_times : boolean, optional (default=False)
Flag for whether `times` is already sorted.
limit_from_left : boolean, optional (default=False)
Flag for whether to output the function evaluated at a time just to
the left, i.e., instead of outputting f(t) where f is the
cumulative hazard function estimate, output:
f(t-) := limit as t' approaches t from the left of f(t').
radius : float, None, optional (default=None)
Neighbors farther than this distance from a test point have kernel
weight 0.
kernel_function : function, None, optional (default=None)
Kernel function to use. None corresponds to fixed-radius near
neighbors kernel survival analysis (i.e., a box kernel that
becomes 0 after `radius` distance away). If a function is
specified, then the weighting function used is of the form
"kernel(distance / radius)".
Returns
-------
output : 2D numpy array
Cumulative hazard function evaluated at each of the times specified
in `times` for each feature vector. The i-th row corresponds to the
i-th feature vector.
"""
if radius is None:
radius = self.NN_index.radius
if kernel_function is None:
kernel_function = lambda s: 1 # box kernel (i.e., uniform weights)
dists, indices = self.NN_index.radius_neighbors(X, radius=radius,
return_distance=True)
train_y = self.train_y
output = []
n_times = len(times)
for dist, idx in zip(dists, indices):
if dist.size > 0:
weights = np.array([kernel_function(d / radius) for d in dist])
zero_weight = (weights == 0)
if zero_weight.sum() > 0:
weights_subset = weights[~zero_weight]
if weights_subset.size > 0:
output.append(
_predict_leaf(
_fit_leaf_weighted(train_y[idx[~zero_weight]],
weights_subset),
'cum_haz', times, presorted_times,
limit_from_left))
else:
output.append(np.zeros(n_times))
else:
output.append(
_predict_leaf(
_fit_leaf_weighted(train_y[idx],
weights),
'cum_haz', times, presorted_times,
limit_from_left))
else:
output.append(np.zeros(n_times))
return np.array(output)
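The fixed-radius weighting in `KernelSurvival` can be sketched as follows: with the default box kernel, every training point within `radius` of the query contributes weight 1 and anything farther contributes nothing (queries with no neighbors in range fall back to all-ones survival / all-zeros cumulative hazard, as in the methods above). Array values are illustrative:

```python
import numpy as np

# Fixed-radius near-neighbor weights with a box kernel: weight 1 inside the
# radius, 0 outside. A non-box kernel would instead evaluate
# kernel(distance / radius) for the in-range points.
radius = 1.0
train_X = np.array([[0.0], [0.4], [0.9], [3.0]])
query = np.array([0.0])

dists = np.linalg.norm(train_X - query, axis=1)
in_range = dists <= radius
box_weights = np.where(in_range, 1.0, 0.0)
```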
class RandomSurvivalForestANN():
def __init__(self, n_estimators=100, max_features='sqrt', max_depth=None,
min_samples_split=2, min_samples_leaf=1, split='logrank',
split_threshold_mode='exhaustive', random_state=None,
n_jobs=None):
"""
A modified version of the random survival forest survival probability
estimator. From a theoretical standpoint, tree construction works the
same way so each survival tree is associated with the same training
subjects as regular random survival forests. However, what is stored
at each leaf differs: instead of computing survival probability or
cumulative hazard function estimates per tree, we use each learned
tree only to identify a test point's adaptive nearest neighbors
(which carry weights!). The weighted nearest neighbors found per test
point are then used to make a survival probability or cumulative
hazard function estimate using kernel variants of the Kaplan-Meier
and Nelson-Aalen estimators.
Parameters
----------
n_estimators : int, optional (default=100)
Number of trees.
max_features : int, string, optional (default='sqrt')
Number of features chosen per tree. Allowable string choices are
'sqrt' (max_features=ceil(sqrt(n_features))) and 'log2'
(max_features=ceil(log2(n_features))).
max_depth : int, optional (default=None)
Maximum depth of each tree. If None, then each tree is grown
until other termination criteria are met (see `min_samples_split`
and `min_samples_leaf` parameters).
min_samples_split : int, optional (default=2)
A node must have at least this many samples to be split.
min_samples_leaf : int, float, optional (default=1)
Both sides of a split must have at least this many samples
(or in the case of a fraction, at least a fraction of samples)
for the split to happen. Otherwise, the node is turned into a
leaf node.
split : string, optional (default='logrank')
Currently only the log-rank splitting criterion is supported.
split_threshold_mode : string, optional (default='exhaustive')
If 'exhaustive', then we compute the split score for every observed
feature value as a possible threshold (this can be very expensive).
If 'median', then for any feature, we always split on the median
value observed for that feature (this is the only supported option
in Wrymm's original random survival analysis code).
If 'random', then for any feature, we randomly choose a split
threshold among the observed feature values (this is recommended by
the random survival forest authors if fast computation is desired).
random_state : int, numpy RandomState instance, None, optional
(default=None)
If an integer, then a new numpy RandomState is created with the
integer as the random seed. If a numpy RandomState instance is
provided, then it is used as the pseudorandom number generator. If
None is specified, then a new numpy RandomState is created without
providing a seed.
n_jobs : int, None, optional (default=None)
Number of cores to use with joblib's Parallel. This is the same
`n_jobs` parameter as for Parallel. Setting `n_jobs` to -1 uses all
the cores.
"""
self.n_estimators = n_estimators
self.max_depth = max_depth
self.min_samples_split = min_samples_split
self.min_samples_leaf = min_samples_leaf
self.max_features = max_features
self.split_threshold_mode = split_threshold_mode
self.n_jobs = n_jobs
self.column_names = None
if random_state is None:
self.random_state = np.random.RandomState()
elif type(random_state) == int:
self.random_state = np.random.RandomState(random_state)
else:
self.random_state = random_state
if split == 'logrank':
self.split_score_function = logrank
else:
raise NotImplementedError('Unsupported split criterion '
+ '"{0}"'.format(split))
def save(self, filename):
data = {'n_estimators': self.n_estimators,
'max_depth': self.max_depth,
'min_samples_split': self.min_samples_split,
'min_samples_leaf': self.min_samples_leaf,
'max_features': self.max_features,
'split_threshold_mode': self.split_threshold_mode,
'n_jobs': self.n_jobs,
'oob_score': self.oob_score,
'feature_importance': self.feature_importance,
'column_names': list(self.column_names),
'oob_score_': self.oob_score_}
if self.feature_importances_ is not None:
data['feature_importances_'] = self.feature_importances_.tolist()
else:
data['feature_importances_'] = None
data['trees'] = \
[_convert_to_not_use_numpy(tree) for tree in self.trees]
data['tree_bootstrap_indices'] = \
[indices.tolist() for indices in self.tree_bootstrap_indices]
with open(filename, 'wb') as f:
pickle.dump(data, f)
@staticmethod
def load(filename):
with open(filename, 'rb') as f:
data = pickle.load(f)
rsf = \
RandomSurvivalForestANN(n_estimators=data['n_estimators'],
max_features=data['max_features'],
max_depth=data['max_depth'],
min_samples_split=data['min_samples_split'],
min_samples_leaf=data['min_samples_leaf'],
split='logrank',
split_threshold_mode='exhaustive',
random_state=None,
n_jobs=data['n_jobs'])
rsf.oob_score = data['oob_score']
rsf.feature_importance = data['feature_importance']
rsf.column_names = data['column_names']
rsf.oob_score_ = data['oob_score_']
if data['feature_importances_'] is None:
rsf.feature_importances_ = None
else:
rsf.feature_importances_ = np.array(data['feature_importances_'])
rsf.trees = [_convert_to_use_numpy(tree) for tree in data['trees']]
rsf.tree_bootstrap_indices = \
np.array([indices for indices in data['tree_bootstrap_indices']])
return rsf
def fit(self, X, y, column_names=None):
"""
Fits the random survival forest to training data.
Parameters
----------
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
y : 2D numpy array, shape = [n_samples, 2]
Survival labels (first column is for observed times, second column
is for event indicators). The i-th row corresponds to the i-th row
in `X`.
column_names : list, None, optional (default=None)
Names for features can be specified. This is only for display
purposes when using the `draw` method. If set to None, then
`column_names` is just set to be a range of integers indexing the
columns from 0.
Returns
-------
None
"""
if column_names is None:
self.column_names = list(range(X.shape[1]))
else:
self.column_names = column_names
assert len(column_names) == X.shape[1]
if type(self.max_features) == str:
if self.max_features == 'sqrt':
max_features = int(np.ceil(np.sqrt(X.shape[1])))
elif self.max_features == 'log2':
max_features = int(np.ceil(np.log2(X.shape[1])))
else:
raise NotImplementedError('Unsupported max features choice '
+ '"{0}"'.format(self.max_features))
else:
max_features = self.max_features
self.tree_bootstrap_indices = []
sort_indices = np.argsort(y[:, 0])
X = X[sort_indices].astype(float)
y = y[sort_indices].astype(float)
self.train_y = y
random_state = self.random_state
for tree_idx in range(self.n_estimators):
bootstrap_indices = np.sort(random_state.choice(X.shape[0],
X.shape[0],
replace=True))
self.tree_bootstrap_indices.append(bootstrap_indices)
with Parallel(n_jobs=self.n_jobs) as parallel:
self.trees = \
parallel(
delayed(_build_tree_ANN)(
X[self.tree_bootstrap_indices[tree_idx]],
y[self.tree_bootstrap_indices[tree_idx]],
self.tree_bootstrap_indices[tree_idx],
0, self.max_depth, max_features,
self.split_score_function, self.min_samples_split,
self.min_samples_leaf, self.split_threshold_mode,
np.random.RandomState(random_state.randint(4294967296)))
for tree_idx in range(self.n_estimators))
def predict_surv(self, X, times, presorted_times=False,
use_kaplan_meier=True):
"""
Computes the forest's survival probability function estimate for each
feature vector evaluated at user-specified times.
Parameters
----------
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
times : 1D numpy array
Times to compute the survival probability function at.
presorted_times : boolean, optional (default=False)
Flag for whether `times` is already sorted.
use_kaplan_meier : boolean, optional (default=True)
If this flag is set to True, then we have the forest predict S(t|x)
using a conditional Kaplan-Meier estimator. Otherwise, we have the
forest predict H(t|x) using a conditional Nelson-Aalen estimator
and then back out an estimate of S(t|x) via S(t|x)=exp(-H(t|x)).
Returns
-------
output : 2D numpy array
Survival probability function evaluated at each of the times
specified in `times` for each feature vector. The i-th row
corresponds to the i-th feature vector.
"""
if use_kaplan_meier:
# step 1: find adaptive nearest neighbors
results = Parallel(n_jobs=self.n_jobs)(
delayed(_compute_tree_ANN)(self.trees[tree_idx], X)
for tree_idx in range(self.n_estimators))
# step 2: aggregate adaptive nearest neighbors
output = []
y = self.train_y
for i in range(len(X)):
histogram = Counter()
total = 0
for t in range(self.n_estimators):
for j in results[t][i]:
histogram[j] += 1
total += len(results[t][i])
nearest_neighbors = sorted(histogram.keys())
weights = [histogram[j] / total for j in nearest_neighbors]
output.append(
_predict_leaf(
_fit_leaf_weighted(y[np.array(nearest_neighbors,
dtype=int)],
np.array(weights)),
'surv', times, presorted_times))
return np.array(output)
else:
return np.exp(-self.predict_cum_haz(X, times, presorted_times))
def predict_cum_haz(self, X, times, presorted_times=False,
use_kaplan_meier=False, surv_eps=1e-12):
"""
Computes the forest's cumulative hazard function estimate for each
feature vector evaluated at user-specified times.
Parameters
----------
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
times : 1D numpy array
Times to compute the cumulative hazard function at.
presorted_times : boolean, optional (default=False)
Flag for whether `times` is already sorted.
use_kaplan_meier : boolean, optional (default=False)
If this flag is set to True, then we have the forest predict S(t|x)
first using a conditional Kaplan-Meier estimate and then back out
an estimate of H(t|x) via H(t|x)=-log(S(t|x)).
surv_eps : float, optional (default=1e-12)
If `use_kaplan_meier` is set to True, then we clip the estimated
survival function so that any value less than `surv_eps` is set to
`surv_eps`. This makes it so that when we take the negative log of
the survival function, we don't take logs of 0.
Returns
-------
output : 2D numpy array
Cumulative hazard function evaluated at each of the times
specified in `times` for each feature vector. The i-th row
corresponds to the i-th feature vector.
"""
if use_kaplan_meier:
surv = self.predict_surv(X, times, presorted_times, True)
return -np.log(np.clip(surv, surv_eps, 1.))
else:
# step 1: find adaptive nearest neighbors
results = Parallel(n_jobs=self.n_jobs)(
delayed(_compute_tree_ANN)(self.trees[tree_idx], X)
for tree_idx in range(self.n_estimators))
# step 2: aggregate adaptive nearest neighbors
output = []
y = self.train_y
for i in range(len(X)):
histogram = Counter()
total = 0
for t in range(self.n_estimators):
for j in results[t][i]:
histogram[j] += 1
total += len(results[t][i])
nearest_neighbors = sorted(histogram.keys())
weights = [histogram[j] / total for j in nearest_neighbors]
output.append(
_predict_leaf(
_fit_leaf_weighted(y[np.array(nearest_neighbors,
dtype=int)],
np.array(weights)),
'cum_haz', times, presorted_times))
return np.array(output)
def _print_with_depth(self, string, depth):
"""
Auxiliary function to print a string with indentation dependent on
depth.
"""
print("{0}{1}".format(" " * depth, string))
def _print_tree(self, tree, current_depth=0):
"""
Auxiliary function to print a survival tree.
"""
if 'surv' in tree:
self._print_with_depth(tree['times'], current_depth)
return
self._print_with_depth(
"{0} > {1}".format(self.column_names[tree['feature']],
tree['threshold']),
current_depth)
self._print_tree(tree['left'], current_depth + 1)
self._print_tree(tree['right'], current_depth + 1)
def draw(self):
"""
Prints out each tree of the random survival forest.
"""
for tree_idx, tree in enumerate(self.trees):
print("==========================================\nTree",
tree_idx)
self._print_tree(tree)
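# Example usage of RandomSurvivalForestANN (illustrative sketch; `X_train`,
# `y_train`, and `X_test` are assumed to be numpy arrays with the shapes
# documented in `fit`):
#
#     forest = RandomSurvivalForestANN(n_estimators=10, random_state=0)
#     forest.fit(X_train, y_train)
#     eval_times = np.linspace(0., y_train[:, 0].max(), num=100)
#     surv = forest.predict_surv(X_test, eval_times)        # [n_test, 100]
#     cum_haz = forest.predict_cum_haz(X_test, eval_times)  # [n_test, 100]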
def _find_best_feature_split(X, y, max_features, split_score_function,
min_samples_split, min_samples_leaf,
split_threshold_mode, random_state):
"""
Finds the best single feature to split on and the split threshold to use.
Parameters
----------
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
y : 2D numpy array, shape = [n_samples, 2]
Survival labels (first column is for observed times, second column is
for event indicators). The i-th row corresponds to the i-th row in `X`.
max_features : int
Number of randomly chosen features that we find a split for.
split_score_function : function
Function that computes a split score. Look at `logrank` for an example.
min_samples_split : int
See documentation for RandomSurvivalForest's `__init__` function.
min_samples_leaf : int, float
See documentation for RandomSurvivalForest's `__init__` function.
split_threshold_mode : string
See documentation for RandomSurvivalForest's `__init__` function.
random_state : numpy RandomState instance
Pseudorandom number generator.
*Warning*: for this function, `random_state` actually does have to be a
numpy RandomState instance. This is for computational efficiency, so
that we do not have to repeatedly sanity check the input.
Returns
-------
None, or (feature column index as integer, split threshold as float, mask
for which data go into the left branch)
"""
num_features = X.shape[1]
if max_features >= num_features:
candidate_features = list(range(num_features))
else:
candidate_features = list(random_state.choice(num_features,
max_features,
replace=False))
num_candidate_features = len(candidate_features)
X_slice = X[:, candidate_features].copy()
drop_features = []
keep_feature_mask = np.ones(num_candidate_features, dtype=bool)
for idx in range(num_candidate_features):
nan_mask = np.isnan(X_slice[:, idx])
num_nans = nan_mask.sum()
if num_nans > 0:
not_nan_mask = ~nan_mask
if np.any(not_nan_mask):
# impute
X_slice[nan_mask, idx] = \
random_state.choice(X_slice[not_nan_mask, idx],
num_nans)
else:
drop_features.append(idx)
keep_feature_mask[idx] = 0
num_drop_features = len(drop_features)
num_candidate_features -= num_drop_features
if num_candidate_features == 0:
return None
if num_drop_features > 0:
X_slice = X_slice[:, keep_feature_mask]
for idx in drop_features[::-1]:
del candidate_features[idx]
if split_threshold_mode == 'exhaustive':
score_arg_pairs \
= [(split_score_function(X_slice[:, idx], y, split_threshold,
min_samples_split, min_samples_leaf),
(col_idx,
split_threshold,
X_slice[:, idx] <= split_threshold))
for idx, col_idx in enumerate(candidate_features)
for split_threshold in np.sort(np.unique(X_slice[:, idx]))]
argmax = np.argmax([score for score, arg in score_arg_pairs])
best_score, best_arg = score_arg_pairs[argmax]
if best_score == 0:
return None
else:
return best_arg
elif split_threshold_mode == 'median':
max_score = -np.inf
best_arg = None
for idx, col_idx in enumerate(candidate_features):
split_threshold = np.median(X_slice[:, idx])
score = split_score_function(X_slice[:, idx], y, split_threshold,
min_samples_split, min_samples_leaf)
if score > max_score:
max_score = score
best_arg = (col_idx, split_threshold,
X_slice[:, idx] <= split_threshold)
if max_score == 0:
return None
else:
return best_arg
elif split_threshold_mode == 'random':
max_score = -np.inf
best_arg = None
for idx, col_idx in enumerate(candidate_features):
split_threshold = random_state.choice(X_slice[:, idx])
score = split_score_function(X_slice[:, idx], y, split_threshold,
min_samples_split, min_samples_leaf)
if score > max_score:
max_score = score
best_arg = (col_idx, split_threshold,
X_slice[:, idx] <= split_threshold)
if max_score == 0:
return None
else:
return best_arg
else:
raise NotImplementedError('Unsupported split threshold strategy '
+ '"{0}"'.format(split_threshold_mode))
def _fit_leaf(y):
"""
Computes leaf node information given survival labels (observed times and
event indicators).
Parameters
----------
y : 2D numpy array, shape=[n_samples, 2]
The two columns correspond to observed times and event indicators.
Returns
-------
tree : dictionary
The leaf node information stored as a dictionary. Specifically, the
key-value pairs of this dictionary are as follows:
- 'times': stores the sorted unique observed times
- 'event_counts': in the same order as `times`, the number of events
at each unique observed time
- 'at_risk_counts': in the same order as `times`, the number of
subjects at risk at each unique observed time
- 'surv': in the same order as `times`, the Kaplan-Meier survival
probability estimate at each unique observed time
- 'cum_haz': in the same order as `times`, the Nelson-Aalen cumulative
hazard estimate at each unique observed time
"""
if len(y.shape) == 1:
y = y.reshape(1, -1)
sorted_unique_observed_times = np.unique(y[:, 0])
num_unique_observed_times = len(sorted_unique_observed_times)
time_to_idx = {time: idx
for idx, time in enumerate(sorted_unique_observed_times)}
event_counts = np.zeros(num_unique_observed_times)
dropout_counts = np.zeros(num_unique_observed_times)
at_risk_counts = np.zeros(num_unique_observed_times)
at_risk_counts[0] = len(y)
for observed_time, event_ind in y:
idx = time_to_idx[observed_time]
if event_ind:
event_counts[idx] += 1
dropout_counts[idx] += 1
for idx in range(num_unique_observed_times - 1):
at_risk_counts[idx + 1] = at_risk_counts[idx] - dropout_counts[idx]
event_mask = (event_counts > 0)
if event_mask.sum() > 0:
sorted_unique_observed_times = sorted_unique_observed_times[event_mask]
event_counts = event_counts[event_mask]
at_risk_counts = at_risk_counts[event_mask]
else:
sorted_unique_observed_times = np.zeros((1,))
event_counts = np.zeros((1,))
at_risk_counts = np.array([len(y)])
hazard_func = event_counts / np.clip(at_risk_counts, 1e-12, None)
surv_func = np.exp(np.cumsum(np.log(1. - hazard_func + 1e-12)))
cum_haz_func = np.cumsum(hazard_func)
return {'times': sorted_unique_observed_times,
'event_counts': event_counts,
'at_risk_counts': at_risk_counts,
'surv': surv_func,
'cum_haz': cum_haz_func}
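# Worked example for `_fit_leaf` (values rounded). With observed times
# 1, 2, 3 where only times 1 and 3 are events:
#
#     >>> leaf = _fit_leaf(np.array([[1., 1.], [2., 0.], [3., 1.]]))
#     >>> leaf['times']           # only times with at least one event kept
#     array([1., 3.])
#     >>> leaf['at_risk_counts']  # 3 at risk at t=1, 1 at risk at t=3
#     array([3., 1.])
#     >>> leaf['cum_haz']         # Nelson-Aalen: [1/3, 1/3 + 1/1]
#     array([0.33333333, 1.33333333])
#
# and leaf['surv'] is the Kaplan-Meier estimate, approximately [0.667, 0.].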
def _fit_leaf_weighted(y, weights):
"""
Computes leaf node information given survival labels (observed times and
event indicators) that have weights. This is for computing kernel variants
of the Kaplan-Meier and Nelson-Aalen estimators.
Parameters
----------
y : 2D numpy array, shape=[n_samples, 2]
The two columns correspond to observed times and event indicators.
weights : 1D numpy array, shape=[n_samples]
Nonnegative weights; i-th weight corresponds to the i-th row in `y`.
Returns
-------
tree : dictionary
The leaf node information stored as a dictionary. Specifically, the
key-value pairs of this dictionary are as follows:
- 'times': stores the sorted unique observed times
- 'event_counts': in the same order as `times`, the number of events
at each unique observed time
- 'at_risk_counts': in the same order as `times`, the number of
subjects at risk at each unique observed time
- 'surv': in the same order as `times`, the Kaplan-Meier survival
probability estimate at each unique observed time
- 'cum_haz': in the same order as `times`, the Nelson-Aalen cumulative
hazard estimate at each unique observed time
"""
if y.size == 0:
# degenerate case (no data): return a trivial leaf whose survival
# probability estimate is 1 and whose cumulative hazard estimate is 0
# everywhere
return {'times': np.zeros(1),
'event_counts': np.zeros(1),
'at_risk_counts': np.zeros(1),
'surv': np.ones(1),
'cum_haz': np.zeros(1)}
if len(y.shape) == 1:
y = y.reshape(1, -1)
sorted_unique_observed_times = np.unique(y[:, 0])  # np.unique sorts
num_unique_observed_times = len(sorted_unique_observed_times)
time_to_idx = {time: idx
for idx, time in enumerate(sorted_unique_observed_times)}
event_counts = np.zeros(num_unique_observed_times)
dropout_counts = np.zeros(num_unique_observed_times)
at_risk_counts = np.zeros(num_unique_observed_times)
at_risk_counts[0] = np.sum(weights)
for (observed_time, event_ind), weight in zip(y, weights):
idx = time_to_idx[observed_time]
if event_ind:
event_counts[idx] += weight
dropout_counts[idx] += weight
for idx in range(num_unique_observed_times - 1):
at_risk_counts[idx + 1] = at_risk_counts[idx] - dropout_counts[idx]
event_mask = (event_counts > 0)
if event_mask.sum() > 0:
sorted_unique_observed_times = sorted_unique_observed_times[event_mask]
event_counts = event_counts[event_mask]
at_risk_counts = at_risk_counts[event_mask]
else:
sorted_unique_observed_times = np.zeros((1,))
event_counts = np.zeros((1,))
at_risk_counts = np.array([np.sum(weights)])  # total weight at risk
hazard_func = event_counts / np.clip(at_risk_counts, 1e-12, None)
surv_func = np.exp(np.cumsum(np.log(1. - hazard_func + 1e-12)))
cum_haz_func = np.cumsum(hazard_func)
return {'times': sorted_unique_observed_times,
'event_counts': event_counts,
'at_risk_counts': at_risk_counts,
'surv': surv_func,
'cum_haz': cum_haz_func}
def _build_tree(X, y, current_depth, max_depth, max_features,
split_score_function, min_samples_split, min_samples_leaf,
split_threshold_mode, random_state):
"""
Builds a survival tree.
Parameters
----------
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
y : 2D numpy array, shape = [n_samples, 2]
Survival labels (first column is for observed times, second column is
for event indicators). The i-th row corresponds to the i-th row in `X`.
current_depth : int
Current depth of the tree building progress (starts at 0).
max_depth : int
Maximum depth of tree building progress. If `current_depth` is equal to
`max_depth`, then we do not split any further and create a leaf node at
`tree`.
max_features : int
Number of randomly chosen features that we find a split for.
split_score_function : function
Function that computes a split score. Look at `logrank` for an example.
min_samples_split : int
See documentation for RandomSurvivalForest's `__init__` function.
min_samples_leaf : int, float
See documentation for RandomSurvivalForest's `__init__` function.
split_threshold_mode : string
See documentation for RandomSurvivalForest's `__init__` function.
random_state : numpy RandomState instance
Pseudorandom number generator.
*Warning*: for this function, `random_state` actually does have to be a
numpy RandomState instance. This is for computational efficiency, so
that we do not have to repeatedly sanity check the input.
Returns
-------
tree : dictionary
A tree built using the given data. If the tree is a leaf node, then its
key-value pairs are explained in the documentation for _fit_leaf (see
the return value). Otherwise, the key-value pairs are as follows:
- 'feature': which feature index to split on at the current node
- 'threshold': which feature threshold value to split on at the current
node; the splits are <= threshold (left branch), and > threshold
(right branch)
- 'left': the tree for the left branch, stored as a dictionary
- 'right': the tree for the right branch, stored as a dictionary
"""
if len(np.unique(y[:, 0])) == 1 or current_depth == max_depth:
return _fit_leaf(y)
best_arg = _find_best_feature_split(X, y, max_features,
split_score_function,
min_samples_split, min_samples_leaf,
split_threshold_mode, random_state)
if best_arg is None:
return _fit_leaf(y)
best_feature_idx, split_threshold, left_mask = best_arg
tree = {'feature': best_feature_idx,
'threshold': split_threshold}
tree['left'] = _build_tree(X[left_mask], y[left_mask], current_depth + 1,
max_depth, max_features, split_score_function,
min_samples_split, min_samples_leaf,
split_threshold_mode, random_state)
right_mask = ~left_mask
tree['right'] = _build_tree(X[right_mask], y[right_mask],
current_depth + 1, max_depth, max_features,
split_score_function, min_samples_split,
min_samples_leaf, split_threshold_mode,
random_state)
return tree
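# Schematic of the nested dictionary returned by `_build_tree` (illustrative
# values; `{...}` stands for either another internal node or a leaf):
#
#     {'feature': 2,
#      'threshold': 0.5,
#      'left':  {...},
#      'right': {'times': ..., 'event_counts': ..., 'at_risk_counts': ...,
#                'surv': ..., 'cum_haz': ...}}   # a leaf from _fit_leaf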
def _build_tree_ANN(X, y, train_indices, current_depth, max_depth,
max_features, split_score_function, min_samples_split,
min_samples_leaf, split_threshold_mode, random_state):
"""
Similar to `_build_tree()` but for the adaptive nearest neighbors variant
of random survival forests.
Parameters
----------
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
y : 2D numpy array, shape = [n_samples, 2]
Survival labels (first column is for observed times, second column is
for event indicators). The i-th row corresponds to the i-th row in `X`.
train_indices : 1D numpy array, shape = [n_samples]
Specifies which training subject index each row of `X` corresponds to.
current_depth : int
Current depth of the tree building progress (starts at 0).
max_depth : int
Maximum depth of tree building progress. If `current_depth` is equal to
`max_depth`, then we do not split any further and create a leaf node at
`tree`.
max_features : int
Number of randomly chosen features that we find a split for.
split_score_function : function
Function that computes a split score. Look at `logrank` for an example.
min_samples_split : int
See documentation for RandomSurvivalForest's `__init__` function.
min_samples_leaf : int, float
See documentation for RandomSurvivalForest's `__init__` function.
split_threshold_mode : string
See documentation for RandomSurvivalForest's `__init__` function.
random_state : numpy RandomState instance
Pseudorandom number generator.
*Warning*: for this function, `random_state` actually does have to be a
numpy RandomState instance. This is for computational efficiency, so
that we do not have to repeatedly sanity check the input.
Returns
-------
tree : dictionary
A tree built using the given data. If the tree is a leaf node, then its
key-value pairs are explained in the documentation for _fit_leaf (see
the return value). Otherwise, the key-value pairs are as follows:
- 'feature': which feature index to split on at the current node
- 'threshold': which feature threshold value to split on at the current
node; the splits are <= threshold (left branch), and > threshold
(right branch)
- 'left': the tree for the left branch, stored as a dictionary
- 'right': the tree for the right branch, stored as a dictionary
"""
if len(np.unique(y[:, 0])) == 1 or current_depth == max_depth:
return {'train_indices': train_indices}
best_arg = _find_best_feature_split(X, y, max_features,
split_score_function,
min_samples_split, min_samples_leaf,
split_threshold_mode, random_state)
if best_arg is None:
return {'train_indices': train_indices}
best_feature_idx, split_threshold, left_mask = best_arg
tree = {'feature': best_feature_idx,
'threshold': split_threshold}
tree['left'] = _build_tree_ANN(X[left_mask], y[left_mask],
train_indices[left_mask], current_depth + 1,
max_depth, max_features, split_score_function,
min_samples_split, min_samples_leaf,
split_threshold_mode, random_state)
right_mask = ~left_mask
tree['right'] = _build_tree_ANN(X[right_mask], y[right_mask],
train_indices[right_mask], current_depth + 1,
max_depth, max_features, split_score_function,
min_samples_split, min_samples_leaf,
split_threshold_mode, random_state)
return tree
def _predict_leaf(tree, mode, times, presorted_times, limit_from_left=False):
"""
Computes either the Kaplan-Meier survival function estimate or the
Nelson-Aalen cumulative hazard function estimate at user-specified times
using survival label data in a leaf node.
Parameters
----------
tree : dictionary
Leaf node of a decision tree where we pull survival label information
from.
mode : string
Either 'surv' for survival probabilities or 'cum_haz' for cumulative
hazard function.
times : 1D numpy array
Times to compute the survival probability or cumulative hazard function
at.
presorted_times : boolean
Flag for whether `times` is already sorted.
limit_from_left : boolean, optional (default=False)
Flag for whether to output the function evaluated at a time just to the
left, i.e., instead of outputting f(t) where f is either the survival
probability or cumulative hazard function estimate, output:
f(t-) := limit as t' approaches t from the left of f(t').
Returns
-------
output : 1D numpy array
Survival probability or cumulative hazard function evaluated at each of
the times specified in `times`.
"""
unique_observed_times = tree['times']
surv_func = tree[mode]
if times is None:
return surv_func
if presorted_times:
sort_indices = range(len(times))
else:
sort_indices = np.argsort(times)
num_leaf_times = len(unique_observed_times)
leaf_time_idx = 0
# before the first observed time, the survival probability estimate is 1
# and the cumulative hazard estimate is 0
last_seen_value = 1. if mode == 'surv' else 0.
output = np.zeros(len(times))
if limit_from_left:
# f(t-): only pick up jumps at observed times strictly less than t
for sort_idx in sort_indices:
time = times[sort_idx]
while leaf_time_idx < num_leaf_times:
if unique_observed_times[leaf_time_idx] < time:
last_seen_value = surv_func[leaf_time_idx]
leaf_time_idx += 1
else:
break
output[sort_idx] = last_seen_value
# return np.interp(times, unique_observed_times[1:], surv_func[:-1])
else:
# f(t): also pick up a jump occurring exactly at t
for sort_idx in sort_indices:
time = times[sort_idx]
while leaf_time_idx < num_leaf_times:
if unique_observed_times[leaf_time_idx] <= time:
last_seen_value = surv_func[leaf_time_idx]
leaf_time_idx += 1
else:
break
output[sort_idx] = last_seen_value
# return np.interp(times, unique_observed_times, surv_func)
return output
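# Worked example for `_predict_leaf` (illustrative leaf values): for a leaf
# with leaf['times'] = [1., 3.] and leaf['surv'] = [0.667, 0.333],
#
#     _predict_leaf(leaf, 'surv', np.array([0.5, 2., 3.5]), False)
#
# returns [1.0, 0.667, 0.333]: before the first observed event time the
# survival estimate is 1, and between/after event times the step function
# holds its most recent value.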
def _predict_row(tree, mode, x, times, presorted_times):
"""
For a given survival tree and a feature vector, compute the tree's
survival probability or cumulative hazard function estimate for the feature
vector evaluated at user-specified times.
Parameters
----------
tree : dictionary
Tree node of a decision tree. We traverse down the tree taking branches
that depend on the given feature vector's values.
mode : string
Either 'surv' for survival probabilities or 'cum_haz' for cumulative
hazard function.
x : 1D numpy array, shape = [n_features]
Feature vector.
times : 1D numpy array
Times to compute the survival probability or cumulative hazard function
at.
presorted_times : boolean
Flag for whether `times` is already sorted.
Returns
-------
output : 1D numpy array
Survival probability or cumulative hazard function evaluated at each of
the times specified in `times`.
"""
if 'surv' in tree:
return _predict_leaf(tree, mode, times, presorted_times)
if x[tree['feature']] <= tree['threshold']:
return _predict_row(tree['left'], mode, x, times, presorted_times)
else:
return _predict_row(tree['right'], mode, x, times, presorted_times)
def _predict_tree(tree, mode, X, times, presorted_times):
"""
For a given survival tree and many feature vectors, compute the tree's
survival probability function estimate for each feature vector evaluated at
user-specified times.
Parameters
----------
tree : dictionary
Tree node of a decision tree. We traverse down the tree taking branches
that depend on the given feature vector's values.
mode : string
Either 'surv' for survival probabilities or 'cum_haz' for cumulative
hazard function.
X : 2D numpy array, shape = [n_samples, n_features]
Feature vectors.
times : 1D numpy array
Times to compute the survival probability or cumulative hazard function
at.
presorted_times : boolean
Flag for whether `times` is already sorted.
Returns
-------
output : 2D numpy array
Survival probability or cumulative hazard function evaluated at each of
the times specified in `times` for each feature vector. The i-th row
corresponds to the i-th feature vector.
"""
return np.array([_predict_row(tree, mode, x, times, presorted_times)
for x in X])
def _predict_row_vimp(tree, mode, x, times, presorted_times, randomize_idx,
random_state):
"""
The same as _predict_row except for handling variable importance, i.e., a
single variable's branching decisions get randomized.
Parameters
----------
tree : dictionary
Tree node of a decision tree. We traverse down the tree taking branches
that depend on the given feature vector's values.
mode : string
Either 'surv' for survival probabilities or 'cum_haz' for cumulative
hazard function.
x : 1D numpy array, shape = [n_features]
Feature vector.
times : 1D numpy array
Times to compute the survival probability or cumulative hazard function
at.
presorted_times : boolean
Flag for whether `times` is already sorted.
randomize_idx : int
When this feature index is encountered, randomize the branching
decision.
random_state : numpy RandomState instance
Pseudorandom number generator.
*Warning*: for this function, `random_state` actually does have to be a
numpy RandomState instance. This is for computational efficiency, so
that we do not have to repeatedly sanity check the input.
Returns
-------
output : 1D numpy array
Survival probability or cumulative hazard function evaluated at each of
the times specified in `times`.
"""
if 'surv' in tree:
return _predict_leaf(tree, mode, times, presorted_times)
if tree['feature'] == randomize_idx:
if random_state.randint(2) == 0:
return _predict_row_vimp(tree['left'], mode, x, times,
presorted_times, randomize_idx,
random_state)
else:
return _predict_row_vimp(tree['right'], mode, x, times,
presorted_times, randomize_idx,
random_state)
elif x[tree['feature']] <= tree['threshold']:
return _predict_row_vimp(tree['left'], mode, x, times,
presorted_times, randomize_idx, random_state)
else:
return _predict_row_vimp(tree['right'], mode, x, times,
presorted_times, randomize_idx, random_state)
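The randomized-branch idea above can be sketched in isolation on a hypothetical toy tree (same dict encoding as this module; the helper name and numbers are made up for illustration):

```python
import numpy as np

# Toy tree: leaves are identified by the presence of 'surv'.
toy_tree = {
    'feature': 1, 'threshold': 0.0,
    'left':  {'surv': np.array([1.0, 0.9])},
    'right': {'surv': np.array([1.0, 0.4])},
}

def predict_row_vimp_sketch(tree, x, randomize_idx, random_state):
    if 'surv' in tree:
        return tree['surv']
    if tree['feature'] == randomize_idx:
        # At the feature under study, ignore x entirely and flip a fair
        # coin, as in the variable-importance scheme above.
        side = 'left' if random_state.randint(2) == 0 else 'right'
        return predict_row_vimp_sketch(tree[side], x, randomize_idx,
                                       random_state)
    side = 'left' if x[tree['feature']] <= tree['threshold'] else 'right'
    return predict_row_vimp_sketch(tree[side], x, randomize_idx, random_state)

rng = np.random.RandomState(0)
x = np.array([0.0, -1.0])  # would deterministically go left on feature 1
curve = predict_row_vimp_sketch(toy_tree, x, randomize_idx=1,
                                random_state=rng)
```

Comparing predictions with and without this randomization is what makes a feature's importance measurable: a feature whose splits can be coin-flipped without hurting accuracy carries little information.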


def _predict_tree_vimp(tree, mode, X, times, presorted_times, randomize_idx,
                       random_state):
    """
    The same as `_predict_tree` except that a single feature's branching
    decisions get randomized, which is used for computing variable
    importance.

    Parameters
    ----------
    tree : dictionary
        Tree node of a decision tree. We traverse down the tree taking
        branches that depend on the given feature vector's values.
    mode : string
        Either 'surv' for survival probabilities or 'cum_haz' for cumulative
        hazard function.
    X : 2D numpy array, shape = [n_samples, n_features]
        Feature vectors.
    times : 1D numpy array
        Times to compute the survival probability or cumulative hazard
        function at.
    presorted_times : boolean
        Flag for whether `times` is already sorted.
    randomize_idx : int
        When this feature index is encountered, randomize the branching
        decision.
    random_state : numpy RandomState instance
        Pseudorandom number generator. *Warning*: for this function,
        `random_state` actually does have to be a numpy RandomState
        instance. This is for computational efficiency, so that we do not
        keep having to sanity-check the input.

    Returns
    -------
    output : 2D numpy array
        Survival probability or cumulative hazard function evaluated at each
        of the times specified in `times` for each feature vector. The i-th
        row corresponds to the i-th feature vector.
    """
    return np.array([_predict_row_vimp(tree, mode, x, times,
                                       presorted_times, randomize_idx,
                                       random_state)
                     for x in X])


def _compute_tree_ANN(tree, X):
    """
    Finds the adaptive nearest neighbors for a collection of feature
    vectors.

    Parameters
    ----------
    tree : dictionary
        Tree node of a decision tree. We traverse down the tree taking
        branches that depend on the given feature vector's values.
    X : 2D numpy array, shape = [n_samples, n_features]
        Feature vectors.

    Returns
    -------
    output : list
        For each feature vector, a 1D numpy array of training subject
        indices that are the feature vector's adaptive nearest neighbors.
        The i-th entry corresponds to the i-th feature vector.
    """
    return [_compute_ANN_row(tree, x) for x in X]


def _compute_ANN_row(tree, x):
    """
    For a given survival tree and a feature vector, traverse down the tree
    to find the feature vector's adaptive nearest neighbors.

    Parameters
    ----------
    tree : dictionary
        Tree node of a decision tree. We traverse down the tree taking
        branches that depend on the given feature vector's values.
    x : 1D numpy array, shape = [n_features]
        Feature vector.

    Returns
    -------
    output : 1D numpy array
        Training subject indices that are the adaptive nearest neighbors of
        the input feature vector.
    """
    if 'train_indices' in tree:
        return tree['train_indices']
    if x[tree['feature']] <= tree['threshold']:
        return _compute_ANN_row(tree['left'], x)
    else:
        return _compute_ANN_row(tree['right'], x)
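The "adaptive nearest neighbors" of a point are simply the training points that landed in the same leaf, so the lookup is a plain tree descent. A minimal sketch on a hypothetical toy tree (same dict encoding; numbers made up):

```python
import numpy as np

# Toy tree whose leaves store the 'train_indices' that fell into them
# during training.
toy_tree = {
    'feature': 0, 'threshold': 1.5,
    'left':  {'train_indices': np.array([0, 3])},
    'right': {'train_indices': np.array([1, 2, 4])},
}

def compute_ann_row_sketch(tree, x):
    # Descend until a leaf (identified by 'train_indices') is reached; the
    # indices stored there are x's adaptive nearest neighbors.
    if 'train_indices' in tree:
        return tree['train_indices']
    side = 'left' if x[tree['feature']] <= tree['threshold'] else 'right'
    return compute_ann_row_sketch(tree[side], x)

ann = compute_ann_row_sketch(toy_tree, np.array([2.0]))  # falls right
```

Unlike ordinary k-NN, the neighborhood size here varies per query: it is however many training points the leaf happens to contain.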


class CDFRegressionKNNWeightedSurvival():
    def __init__(self, *args, **kwargs):
        """
        Arguments are the same as for `sklearn.neighbors.NearestNeighbors`.
        The simplest usage of this class is to use a single argument, which
        is `n_neighbors` for the number of nearest neighbors (Euclidean
        distance is assumed in this case). If you want to parallelize across
        different search queries, use the `n_jobs` keyword parameter (-1 to
        use all cores). To use other distances and for other details, please
        refer to the documentation for sklearn's `NearestNeighbors` class.

        *Important:* The prediction methods for this class use weighted
        k-nearest neighbors, where "k" is set equal to the `n_neighbors`
        parameter. The weights are specified through a kernel function K. In
        particular, the i-th nearest neighbor X_i for a test point x is
        given a weight of:

            K( (distance between x and X_i) / (distance between x and X_k) ).
        """
        self.NN_index_args = args
        self.NN_index_kwargs = kwargs
        self.NN_index = None

    def fit(self, X, y):
        """
        Constructs a nearest-neighbor index given training data (so that for
        a future data point, we can use the nearest-neighbor index to
        quickly find what the closest training data are to the future
        point).

        Parameters
        ----------
        X : 2D numpy array, shape = [n_samples, n_features]
            Feature vectors.
        y : 2D numpy array, shape = [n_samples, 2]
            Survival labels (first column is for observed times, second
            column is for event indicators). The i-th row corresponds to the
            i-th row in `X`.

        Returns
        -------
        None
        """
        self.train_y = y
        self.NN_index = NearestNeighbors(*self.NN_index_args,
                                         **self.NN_index_kwargs)
        self.NN_index.fit(X)
    def predict_surv(self, X, times, presorted_times=False,
                     limit_from_left=False, n_neighbors=None,
                     kernel_function=None):
        """
        Computes the survival probability function estimate at
        user-specified times, using weighted k-NN CDF estimation followed by
        k-NN regression.

        *Important:* The default number of nearest neighbors to use is
        whatever was specified in `args` or `kwargs` when creating an
        instance of this class (the "k" in k-NN)!

        Parameters
        ----------
        X : 2D numpy array, shape = [n_samples, n_features]
            Feature vectors.
        times : 1D numpy array
            Times to compute the survival probability function at.
        presorted_times : boolean, optional (default=False)
            Flag for whether `times` is already sorted.
        limit_from_left : boolean, optional (default=False)
            Flag for whether to output the function evaluated at a time just
            to the left, i.e., instead of outputting f(t) where f is the
            survival probability function estimate, output:
                f(t-) := limit as t' approaches t from the left of f(t').
        n_neighbors : int, None, optional (default=None)
            Number of nearest neighbors to use. If set to None then the
            number used is whatever was passed into `args` or `kwargs` when
            creating an instance of this class.
        kernel_function : function, None, optional (default=None)
            Kernel function to use. None corresponds to unweighted k-NN
            survival analysis. If a function is specified, then the
            weighting function used is of the form
            "kernel(distance / distance to k-th nearest neighbor)".

        Returns
        -------
        output : 2D numpy array
            Survival probability function evaluated at each of the times
            specified in `times` for each feature vector. The i-th row
            corresponds to the i-th feature vector.
        """
        if kernel_function is None:
            kernel_function = lambda s: 1
        dists, indices = self.NN_index.kneighbors(X, n_neighbors=n_neighbors,
                                                  return_distance=True)
        train_y = self.train_y
        output = []
        n_times = len(times)
        for dist, idx in zip(dists, indices):
            max_dist = np.max(dist)
            weights = np.array([kernel_function(d / max_dist)
                                for d in dist])
            zero_weight = (weights == 0)
            if zero_weight.sum() > 0:
                weights_subset = weights[~zero_weight]
                if weights_subset.size > 0:
                    labels_subset = train_y[idx[~zero_weight]]
                else:
                    # all neighbors received zero weight; fall back to a
                    # survival probability of 1 at every time
                    output.append(np.ones(n_times))
                    continue
            else:
                labels_subset = train_y[idx]
                weights_subset = weights
            # step 1: weighted empirical distribution of observed times
            weighted_edf_times, weighted_edf = \
                compute_weighted_edf(labels_subset[:, 0], weights_subset)
            one_minus_weighted_edf = 1 - weighted_edf
            if weighted_edf[0] < 1 and weighted_edf_times[0] > 0:
                weighted_edf_times = \
                    np.concatenate(([0.], weighted_edf_times))
                weighted_edf = \
                    np.concatenate(([1.], weighted_edf))
            # step 2: plug-in estimate of the negative log survival function
            denoms = np.interp(labels_subset[:, 0], weighted_edf_times,
                               weighted_edf)
            neg_log_S_est = np.zeros(len(times))
            for time_idx, t in enumerate(times):
                neg_log_S_est[time_idx] = \
                    np.inner(labels_subset[:, 1]
                             * (labels_subset[:, 0] <= t) / denoms,
                             weights_subset)
            output.append(np.exp(-neg_log_S_est))
        return np.array(output)
    def predict_cum_haz(self, X, times, presorted_times=False,
                        limit_from_left=False, n_neighbors=None,
                        kernel_function=None):
        """
        Computes the cumulative hazard function estimate at user-specified
        times, using weighted k-NN CDF estimation followed by k-NN
        regression.

        *Important:* The default number of nearest neighbors to use is
        whatever was specified in `args` or `kwargs` when creating an
        instance of this class (the "k" in k-NN)!

        Parameters
        ----------
        X : 2D numpy array, shape = [n_samples, n_features]
            Feature vectors.
        times : 1D numpy array
            Times to compute the cumulative hazard function at.
        presorted_times : boolean, optional (default=False)
            Flag for whether `times` is already sorted.
        limit_from_left : boolean, optional (default=False)
            Flag for whether to output the function evaluated at a time just
            to the left, i.e., instead of outputting f(t) where f is the
            cumulative hazard function estimate, output:
                f(t-) := limit as t' approaches t from the left of f(t').
        n_neighbors : int, None, optional (default=None)
            Number of nearest neighbors to use. If set to None then the
            number used is whatever was passed into `args` or `kwargs` when
            creating an instance of this class.
        kernel_function : function, None, optional (default=None)
            Kernel function to use. None corresponds to unweighted k-NN
            survival analysis. If a function is specified, then the
            weighting function used is of the form
            "kernel(distance / distance to k-th nearest neighbor)".

        Returns
        -------
        output : 2D numpy array
            Cumulative hazard function evaluated at each of the times
            specified in `times` for each feature vector. The i-th row
            corresponds to the i-th feature vector.
        """
        if kernel_function is None:
            kernel_function = lambda s: 1
        dists, indices = self.NN_index.kneighbors(X, n_neighbors=n_neighbors,
                                                  return_distance=True)
        train_y = self.train_y
        output = []
        n_times = len(times)
        for dist, idx in zip(dists, indices):
            max_dist = np.max(dist)
            weights = np.array([kernel_function(d / max_dist)
                                for d in dist])
            zero_weight = (weights == 0)
            if zero_weight.sum() > 0:
                weights_subset = weights[~zero_weight]
                if weights_subset.size > 0:
                    labels_subset = train_y[idx[~zero_weight]]
                else:
                    # all neighbors received zero weight; fall back to a
                    # cumulative hazard of 1 at every time
                    output.append(np.ones(n_times))
                    continue
            else:
                labels_subset = train_y[idx]
                weights_subset = weights
            # step 1: weighted empirical distribution of observed times
            weighted_edf_times, weighted_edf = \
                compute_weighted_edf(labels_subset[:, 0], weights_subset)
            one_minus_weighted_edf = 1 - weighted_edf
            if weighted_edf[0] < 1 and weighted_edf_times[0] > 0:
                weighted_edf_times = \
                    np.concatenate(([0.], weighted_edf_times))
                weighted_edf = \
                    np.concatenate(([1.], weighted_edf))
            # step 2: plug-in estimate of the cumulative hazard function,
            # i.e., the negative log survival function
            denoms = np.interp(labels_subset[:, 0], weighted_edf_times,
                               weighted_edf)
            neg_log_S_est = np.zeros(len(times))
            for time_idx, t in enumerate(times):
                neg_log_S_est[time_idx] = \
                    np.inner(labels_subset[:, 1]
                             * (labels_subset[:, 0] <= t) / denoms,
                             weights_subset)
            output.append(neg_log_S_est)
        return np.array(output)
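The neighbor-weighting step shared by both prediction methods can be isolated into a small sketch. `kneighbors` returns each query's distances sorted in ascending order, so the last entry is the distance to the k-th (furthest) neighbor, and each neighbor i then receives weight K(d_i / d_k) as described in the class docstring; the helper name below is made up for illustration:

```python
import numpy as np

def knn_kernel_weights(dist, kernel_function=None):
    # `dist` is one row of the distances returned by kneighbors (sorted
    # ascending), so np.max(dist) is the distance to the k-th neighbor.
    if kernel_function is None:
        kernel_function = lambda s: 1  # unweighted k-NN
    max_dist = np.max(dist)
    return np.array([kernel_function(d / max_dist) for d in dist])

dist = np.array([0.0, 1.0, 2.0])
uniform = knn_kernel_weights(dist)
triangle = knn_kernel_weights(dist,
                              kernel_function=lambda s: max(0.0, 1.0 - s))
```

Note that a kernel like the triangular one assigns the k-th neighbor itself a weight of exactly zero, which is why the prediction methods above filter out zero-weight neighbors before forming the weighted EDF.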


def compute_weighted_edf(obs, weights=None):
    """
    Computes a weighted empirical distribution function.

    Parameters
    ----------
    obs : 1D numpy array
        Observations to construct the weighted empirical distribution from.
    weights : 1D numpy array, None, optional (default=None)
        Nonnegative weights for the observations. The i-th weight
        corresponds to the i-th value in `obs`. None refers to using uniform
        weights, i.e., each point has weight 1/len(obs).

    Returns
    -------
    sorted_unique_obs : 1D numpy array
        Sorted unique observations in ascending order.
    weighted_edf : 1D numpy array
        The weighted empirical distribution function evaluated at each of
        the values in `sorted_unique_obs`, in the same order.
    """
    if weights is None:
        weights = np.ones(len(obs))
        weights /= weights.shape[0]
    sorted_unique_obs = np.sort(np.unique(obs))
    obs_to_idx = {o: idx for idx, o in enumerate(sorted_unique_obs)}
    weighted_edf = np.zeros(len(sorted_unique_obs))
    for x, w in zip(obs, weights):
        weighted_edf[obs_to_idx[x]] += w
    weighted_edf = np.cumsum(weighted_edf)
    return sorted_unique_obs, weighted_edf
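The weighted EDF construction is short enough to walk through by hand: each observation deposits its weight at its own value, and a cumulative sum turns the per-value masses into a distribution function. A minimal self-contained re-implementation for illustration (the helper name is made up):

```python
import numpy as np

def weighted_edf_sketch(obs, weights):
    # Deposit each observation's weight at its value, then accumulate.
    sorted_unique = np.sort(np.unique(obs))
    idx = {o: i for i, o in enumerate(sorted_unique)}
    mass = np.zeros(len(sorted_unique))
    for o, w in zip(obs, weights):
        mass[idx[o]] += w
    return sorted_unique, np.cumsum(mass)

obs = np.array([3.0, 1.0, 3.0])
ts, edf = weighted_edf_sketch(obs, np.full(3, 1.0 / 3.0))
# ts -> [1., 3.]; edf -> [1/3, 1.]
```

With uniform weights 1/3, the value 1.0 carries mass 1/3 and the duplicated value 3.0 carries mass 2/3, so the EDF jumps to 1/3 and then to 1.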


def _convert_to_not_use_numpy(tree):
    """
    Returns a copy of the tree in which every numpy array has been replaced
    by a plain Python list (e.g., so the tree can be serialized to JSON).
    """
    if 'surv' in tree:
        new_leaf = {}
        for key in tree:
            if type(tree[key]) == np.ndarray:
                new_leaf[key] = tree[key].tolist()
            else:
                new_leaf[key] = tree[key]
        return new_leaf
    new_inner_node = {}
    for key in tree:
        if key == 'left':
            new_inner_node['left'] = _convert_to_not_use_numpy(tree['left'])
        elif key == 'right':
            new_inner_node['right'] = _convert_to_not_use_numpy(tree['right'])
        elif type(tree[key]) == np.ndarray:
            new_inner_node[key] = tree[key].tolist()
        else:
            new_inner_node[key] = tree[key]
    return new_inner_node


def _convert_to_use_numpy(tree):
    """
    Inverse of `_convert_to_not_use_numpy`: returns a copy of the tree in
    which every Python list has been converted back into a numpy array.
    """
    if 'surv' in tree:
        new_leaf = {}
        for key in tree:
            if type(tree[key]) == list:
                new_leaf[key] = np.array(tree[key])
            else:
                new_leaf[key] = tree[key]
        return new_leaf
    new_inner_node = {}
    for key in tree:
        if key == 'left':
            new_inner_node['left'] = _convert_to_use_numpy(tree['left'])
        elif key == 'right':
            new_inner_node['right'] = _convert_to_use_numpy(tree['right'])
        elif type(tree[key]) == list:
            new_inner_node[key] = np.array(tree[key])
        else:
            new_inner_node[key] = tree[key]
    return new_inner_node
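The point of the two converters is that numpy arrays are not JSON-serializable, while lists are. The round trip can be sketched on a single hypothetical leaf (toy values, using the same array-to-list and list-to-array substitutions as the functions above):

```python
import json
import numpy as np

# A leaf holding numpy arrays cannot go through json.dumps directly.
leaf = {'surv': np.array([1.0, 0.5]), 'times': np.array([0.0, 2.0])}

# Forward: arrays -> lists, making the dict JSON-serializable.
plain = {k: v.tolist() if type(v) == np.ndarray else v
         for k, v in leaf.items()}
encoded = json.dumps(plain)

# Backward: lists -> arrays, restoring the numpy representation.
restored = {k: np.array(v) if type(v) == list else v
            for k, v in json.loads(encoded).items()}
```

The same substitution applied recursively over 'left'/'right' children is exactly what the two functions above do for a whole tree.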


def _label_leaves(tree, cur_leaf_id=0):
    """
    Labels leaves with consecutive integer IDs via a depth-first traversal
    (left subtree first); returns the next unused leaf ID, i.e., the number
    of leaves labeled so far.
    """
    if 'surv' in tree:
        tree['leaf_id'] = cur_leaf_id
        return cur_leaf_id + 1
    cur_leaf_id = _label_leaves(tree['left'], cur_leaf_id)
    cur_leaf_id = _label_leaves(tree['right'], cur_leaf_id)
    return cur_leaf_id


def _predict_tree_leaf_id(tree, X):
    """Returns the leaf ID that each feature vector in `X` falls into."""
    return np.array([_predict_row_leaf_id(tree, x) for x in X])


def _predict_row_leaf_id(tree, x):
    """Returns the leaf ID that the feature vector `x` falls into."""
    if 'surv' in tree:
        return tree['leaf_id']
    if x[tree['feature']] <= tree['threshold']:
        return _predict_row_leaf_id(tree['left'], x)
    else:
        return _predict_row_leaf_id(tree['right'], x)
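The depth-first labeling above assigns leaves consecutive IDs reading left to right. A self-contained sketch on a hypothetical three-leaf tree (same recursion as `_label_leaves`; the tree itself is made up):

```python
import numpy as np

def label_leaves_sketch(tree, cur_leaf_id=0):
    # Leaves (marked by 'surv') get the next free ID; internal nodes label
    # their left subtree first, then their right subtree.
    if 'surv' in tree:
        tree['leaf_id'] = cur_leaf_id
        return cur_leaf_id + 1
    cur_leaf_id = label_leaves_sketch(tree['left'], cur_leaf_id)
    return label_leaves_sketch(tree['right'], cur_leaf_id)

toy_tree = {
    'feature': 0, 'threshold': 0.0,
    'left': {'surv': np.array([1.0])},
    'right': {'feature': 0, 'threshold': 1.0,
              'left': {'surv': np.array([1.0])},
              'right': {'surv': np.array([1.0])}},
}
n_leaves = label_leaves_sketch(toy_tree)
# leaf IDs now read 0, 1, 2 from left to right; n_leaves == 3
```

Once leaves are labeled, `_predict_row_leaf_id` turns a tree into a discrete feature map (feature vector -> leaf ID), which is useful e.g. for one-hot leaf encodings.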
# ==== file: moha/system/basis/__init__.py | repo: ZhaoYilin/moha | license: MIT ====
from __future__ import division, print_function
from __future__ import absolute_import
from moha.system.basis.gaussian_orbital import *
from moha.system.basis.slater_determinant import *
# ==== file: spaceone/api/identity/v1/user_pb2.py | repo: choonho/plugin-prometheus-mon-webhook | license: Apache-2.0 ====
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: spaceone/api/identity/v1/user.proto
"""Generated protocol buffer code."""
from google.protobuf.internal import enum_type_wrapper
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from google.protobuf import empty_pb2 as google_dot_protobuf_dot_empty__pb2
from google.protobuf import struct_pb2 as google_dot_protobuf_dot_struct__pb2
from google.api import annotations_pb2 as google_dot_api_dot_annotations__pb2
from spaceone.api.core.v1 import query_pb2 as spaceone_dot_api_dot_core_dot_v1_dot_query__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
  name='spaceone/api/identity/v1/user.proto',
  package='spaceone.api.identity.v1',
  syntax='proto3',
  serialized_options=None,
  create_key=_descriptor._internal_create_key,
  serialized_pb=b'\n#spaceone/api/identity/v1/user.proto\x12\x18spaceone.api.identity.v1\x1a\x1bgoogle/protobuf/empty.proto\x1a\x1cgoogle/protobuf/struct.proto\x1a\x1cgoogle/api/annotations.proto\x1a spaceone/api/core/v1/query.proto\"\xa0\x02\n\x11\x43reateUserRequest\x12\x0f\n\x07user_id\x18\x01 \x01(\t\x12\x10\n\x08password\x18\x02 \x01(\t\x12\x0c\n\x04name\x18\x03 \x01(\t\x12\r\n\x05\x65mail\x18\x04 \x01(\t\x12\x35\n\tuser_type\x18\x05 \x01(\x0e\x32\".spaceone.api.identity.v1.UserType\x12\x36\n\x07\x62\x61\x63kend\x18\x06 \x01(\x0e\x32%.spaceone.api.identity.v1.UserBackend\x12\x10\n\x08language\x18\x07 \x01(\t\x12\x10\n\x08timezone\x18\x08 \x01(\t\x12%\n\x04tags\x18\t \x01(\x0b\x32\x17.google.protobuf.Struct\x12\x11\n\tdomain_id\x18\n \x01(\t\"\xb1\x01\n\x11UpdateUserRequest\x12\x0f\n\x07user_id\x18\x01 \x01(\t\x12\x10\n\x08password\x18\x02 \x01(\t\x12\x0c\n\x04name\x18\x03 \x01(\t\x12\r\n\x05\x65mail\x18\x04 \x01(\t\x12\x10\n\x08language\x18\x07 \x01(\t\x12\x10\n\x08timezone\x18\x08 \x01(\t\x12%\n\x04tags\x18\t \x01(\x0b\x32\x17.google.protobuf.Struct\x12\x11\n\tdomain_id\x18\n \x01(\t\"1\n\x0bUserRequest\x12\x0f\n\x07user_id\x18\x01 \x01(\t\x12\x11\n\tdomain_id\x18\x02 \x01(\t\"B\n\x0eGetUserRequest\x12\x0f\n\x07user_id\x18\x01 \x01(\t\x12\x11\n\tdomain_id\x18\x02 \x01(\t\x12\x0c\n\x04only\x18\x03 \x03(\t\"\xf6\x01\n\tUserQuery\x12*\n\x05query\x18\x01 \x01(\x0b\x32\x1b.spaceone.api.core.v1.Query\x12\x0f\n\x07user_id\x18\x02 \x01(\t\x12\x0c\n\x04name\x18\x03 \x01(\t\x12\r\n\x05state\x18\x04 \x01(\t\x12\r\n\x05\x65mail\x18\x05 \x01(\t\x12\x35\n\tuser_type\x18\x06 \x01(\x0e\x32\".spaceone.api.identity.v1.UserType\x12\x36\n\x07\x62\x61\x63kend\x18\x07 \x01(\x0e\x32%.spaceone.api.identity.v1.UserBackend\x12\x11\n\tdomain_id\x18\x0b \x01(\t\"\xa7\x03\n\x08UserInfo\x12\x0f\n\x07user_id\x18\x01 \x01(\t\x12\x0c\n\x04name\x18\x02 \x01(\t\x12\x37\n\x05state\x18\x03 \x01(\x0e\x32(.spaceone.api.identity.v1.UserInfo.State\x12\r\n\x05\x65mail\x18\x04 '
  b'\x01(\t\x12\x35\n\tuser_type\x18\x05 \x01(\x0e\x32\".spaceone.api.identity.v1.UserType\x12\x36\n\x07\x62\x61\x63kend\x18\x06 \x01(\x0e\x32%.spaceone.api.identity.v1.UserBackend\x12\x10\n\x08language\x18\x07 \x01(\t\x12\x10\n\x08timezone\x18\x08 \x01(\t\x12%\n\x04tags\x18\n \x01(\x0b\x32\x17.google.protobuf.Struct\x12\x18\n\x10last_accessed_at\x18\x0b \x01(\t\x12\x12\n\ncreated_at\x18\x0c \x01(\t\x12\x11\n\tdomain_id\x18\r \x01(\t\"9\n\x05State\x12\x08\n\x04NONE\x10\x00\x12\x0b\n\x07\x45NABLED\x10\x01\x12\x0c\n\x08\x44ISABLED\x10\x02\x12\x0b\n\x07PENDING\x10\x03\"U\n\tUsersInfo\x12\x33\n\x07results\x18\x01 \x03(\x0b\x32\".spaceone.api.identity.v1.UserInfo\x12\x13\n\x0btotal_count\x18\x02 \x01(\x05\"X\n\rUserStatQuery\x12\x34\n\x05query\x18\x01 \x01(\x0b\x32%.spaceone.api.core.v1.StatisticsQuery\x12\x11\n\tdomain_id\x18\x02 \x01(\t\"F\n\x0e\x46indUserSearch\x12\x11\n\x07user_id\x18\x01 \x01(\tH\x00\x12\x11\n\x07keyword\x18\x02 \x01(\tH\x00\x42\x0e\n\x0csearch_alias\"\\\n\rFindUserQuery\x12\x38\n\x06search\x18\x01 \x01(\x0b\x32(.spaceone.api.identity.v1.FindUserSearch\x12\x11\n\tdomain_id\x18\x02 \x01(\t\"c\n\x0c\x46indUserInfo\x12\x0f\n\x07user_id\x18\x01 \x01(\t\x12\x0c\n\x04name\x18\x02 \x01(\t\x12\r\n\x05\x65mail\x18\x03 \x01(\t\x12%\n\x04tags\x18\x04 \x01(\x0b\x32\x17.google.protobuf.Struct\"]\n\rFindUsersInfo\x12\x37\n\x07results\x18\x01 \x03(\x0b\x32&.spaceone.api.identity.v1.FindUserInfo\x12\x13\n\x0btotal_count\x18\x02 '
  b'\x01(\x05*8\n\x0bUserBackend\x12\x10\n\x0cNONE_BACKEND\x10\x00\x12\t\n\x05LOCAL\x10\x01\x12\x0c\n\x08\x45XTERNAL\x10\x02*6\n\x08UserType\x12\x12\n\x0eNONE_USER_TYPE\x10\x00\x12\x08\n\x04USER\x10\x01\x12\x0c\n\x08\x41PI_USER\x10\x02\x32\xbe\t\n\x04User\x12u\n\x06\x63reate\x12+.spaceone.api.identity.v1.CreateUserRequest\x1a\".spaceone.api.identity.v1.UserInfo\"\x1a\x82\xd3\xe4\x93\x02\x14\"\x12/identity/v1/users\x12u\n\x06update\x12+.spaceone.api.identity.v1.UpdateUserRequest\x1a\".spaceone.api.identity.v1.UserInfo\"\x1a\x82\xd3\xe4\x93\x02\x14\x1a\x12/identity/v1/users\x12\x7f\n\x06\x65nable\x12%.spaceone.api.identity.v1.UserRequest\x1a\".spaceone.api.identity.v1.UserInfo\"*\x82\xd3\xe4\x93\x02$\x1a\"/identity/v1/user/{user_id}/enable\x12\x81\x01\n\x07\x64isable\x12%.spaceone.api.identity.v1.UserRequest\x1a\".spaceone.api.identity.v1.UserInfo\"+\x82\xd3\xe4\x93\x02%\x1a#/identity/v1/user/{user_id}/disable\x12\x63\n\x06\x64\x65lete\x12%.spaceone.api.identity.v1.UserRequest\x1a\x16.google.protobuf.Empty\"\x1a\x82\xd3\xe4\x93\x02\x14*\x12/identity/v1/users\x12x\n\x03get\x12(.spaceone.api.identity.v1.GetUserRequest\x1a\".spaceone.api.identity.v1.UserInfo\"#\x82\xd3\xe4\x93\x02\x1d\x12\x1b/identity/v1/user/{user_id}\x12\x89\x01\n\x04list\x12#.spaceone.api.identity.v1.UserQuery\x1a#.spaceone.api.identity.v1.UsersInfo\"7\x82\xd3\xe4\x93\x02\x31\x12\x12/identity/v1/usersZ\x1b\"\x19/identity/v1/users/search\x12i\n\x04stat\x12\'.spaceone.api.identity.v1.UserStatQuery\x1a\x17.google.protobuf.Struct\"\x1f\x82\xd3\xe4\x93\x02\x19\"\x17/identity/v1/users/stat\x12y\n\x04\x66ind\x12\'.spaceone.api.identity.v1.FindUserQuery\x1a\'.spaceone.api.identity.v1.FindUsersInfo\"\x1f\x82\xd3\xe4\x93\x02\x19\"\x17/identity/v1/users/find\x12r\n\x04sync\x12%.spaceone.api.identity.v1.UserRequest\x1a\".spaceone.api.identity.v1.UserInfo\"\x1f\x82\xd3\xe4\x93\x02\x19\"\x17/identity/v1/users/syncb\x06proto3'
  ,
  dependencies=[google_dot_protobuf_dot_empty__pb2.DESCRIPTOR,google_dot_protobuf_dot_struct__pb2.DESCRIPTOR,google_dot_api_dot_annotations__pb2.DESCRIPTOR,spaceone_dot_api_dot_core_dot_v1_dot_query__pb2.DESCRIPTOR,])

_USERBACKEND = _descriptor.EnumDescriptor(
  name='UserBackend',
  full_name='spaceone.api.identity.v1.UserBackend',
  filename=None,
  file=DESCRIPTOR,
  create_key=_descriptor._internal_create_key,
  values=[
    _descriptor.EnumValueDescriptor(
      name='NONE_BACKEND', index=0, number=0,
      serialized_options=None,
      type=None,
      create_key=_descriptor._internal_create_key),
    _descriptor.EnumValueDescriptor(
      name='LOCAL', index=1, number=1,
      serialized_options=None,
      type=None,
      create_key=_descriptor._internal_create_key),
    _descriptor.EnumValueDescriptor(
      name='EXTERNAL', index=2, number=2,
      serialized_options=None,
      type=None,
      create_key=_descriptor._internal_create_key),
  ],
  containing_type=None,
  serialized_options=None,
  serialized_start=1992,
  serialized_end=2048,
)
_sym_db.RegisterEnumDescriptor(_USERBACKEND)
UserBackend = enum_type_wrapper.EnumTypeWrapper(_USERBACKEND)

_USERTYPE = _descriptor.EnumDescriptor(
  name='UserType',
  full_name='spaceone.api.identity.v1.UserType',
  filename=None,
  file=DESCRIPTOR,
  create_key=_descriptor._internal_create_key,
  values=[
    _descriptor.EnumValueDescriptor(
      name='NONE_USER_TYPE', index=0, number=0,
      serialized_options=None,
      type=None,
      create_key=_descriptor._internal_create_key),
    _descriptor.EnumValueDescriptor(
      name='USER', index=1, number=1,
      serialized_options=None,
      type=None,
      create_key=_descriptor._internal_create_key),
    _descriptor.EnumValueDescriptor(
      name='API_USER', index=2, number=2,
      serialized_options=None,
      type=None,
      create_key=_descriptor._internal_create_key),
  ],
  containing_type=None,
  serialized_options=None,
  serialized_start=2050,
  serialized_end=2104,
)
_sym_db.RegisterEnumDescriptor(_USERTYPE)
UserType = enum_type_wrapper.EnumTypeWrapper(_USERTYPE)
NONE_BACKEND = 0
LOCAL = 1
EXTERNAL = 2
NONE_USER_TYPE = 0
USER = 1
API_USER = 2

_USERINFO_STATE = _descriptor.EnumDescriptor(
  name='State',
  full_name='spaceone.api.identity.v1.UserInfo.State',
  filename=None,
  file=DESCRIPTOR,
  create_key=_descriptor._internal_create_key,
  values=[
    _descriptor.EnumValueDescriptor(
      name='NONE', index=0, number=0,
      serialized_options=None,
      type=None,
      create_key=_descriptor._internal_create_key),
    _descriptor.EnumValueDescriptor(
      name='ENABLED', index=1, number=1,
      serialized_options=None,
      type=None,
      create_key=_descriptor._internal_create_key),
    _descriptor.EnumValueDescriptor(
      name='DISABLED', index=2, number=2,
      serialized_options=None,
      type=None,
      create_key=_descriptor._internal_create_key),
    _descriptor.EnumValueDescriptor(
      name='PENDING', index=3, number=3,
      serialized_options=None,
      type=None,
      create_key=_descriptor._internal_create_key),
  ],
  containing_type=None,
  serialized_options=None,
  serialized_start=1394,
  serialized_end=1451,
)
_sym_db.RegisterEnumDescriptor(_USERINFO_STATE)

_CREATEUSERREQUEST = _descriptor.Descriptor(
  name='CreateUserRequest',
  full_name='spaceone.api.identity.v1.CreateUserRequest',
  filename=None,
  file=DESCRIPTOR,
  containing_type=None,
  create_key=_descriptor._internal_create_key,
  fields=[
    _descriptor.FieldDescriptor(
      name='user_id', full_name='spaceone.api.identity.v1.CreateUserRequest.user_id', index=0,
      number=1, type=9, cpp_type=9, label=1,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='password', full_name='spaceone.api.identity.v1.CreateUserRequest.password', index=1,
      number=2, type=9, cpp_type=9, label=1,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='name', full_name='spaceone.api.identity.v1.CreateUserRequest.name', index=2,
      number=3, type=9, cpp_type=9, label=1,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='email', full_name='spaceone.api.identity.v1.CreateUserRequest.email', index=3,
      number=4, type=9, cpp_type=9, label=1,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='user_type', full_name='spaceone.api.identity.v1.CreateUserRequest.user_type', index=4,
      number=5, type=14, cpp_type=8, label=1,
      has_default_value=False, default_value=0,
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='backend', full_name='spaceone.api.identity.v1.CreateUserRequest.backend', index=5,
      number=6, type=14, cpp_type=8, label=1,
      has_default_value=False, default_value=0,
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='language', full_name='spaceone.api.identity.v1.CreateUserRequest.language', index=6,
      number=7, type=9, cpp_type=9, label=1,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='timezone', full_name='spaceone.api.identity.v1.CreateUserRequest.timezone', index=7,
      number=8, type=9, cpp_type=9, label=1,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='tags', full_name='spaceone.api.identity.v1.CreateUserRequest.tags', index=8,
      number=9, type=11, cpp_type=10, label=1,
      has_default_value=False, default_value=None,
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='domain_id', full_name='spaceone.api.identity.v1.CreateUserRequest.domain_id', index=9,
      number=10, type=9, cpp_type=9, label=1,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
  ],
  extensions=[
  ],
  nested_types=[],
  enum_types=[
  ],
  serialized_options=None,
  is_extendable=False,
  syntax='proto3',
  extension_ranges=[],
  oneofs=[
  ],
  serialized_start=189,
  serialized_end=477,
)

_UPDATEUSERREQUEST = _descriptor.Descriptor(
  name='UpdateUserRequest',
  full_name='spaceone.api.identity.v1.UpdateUserRequest',
  filename=None,
  file=DESCRIPTOR,
  containing_type=None,
  create_key=_descriptor._internal_create_key,
  fields=[
    _descriptor.FieldDescriptor(
      name='user_id', full_name='spaceone.api.identity.v1.UpdateUserRequest.user_id', index=0,
      number=1, type=9, cpp_type=9, label=1,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='password', full_name='spaceone.api.identity.v1.UpdateUserRequest.password', index=1,
      number=2, type=9, cpp_type=9, label=1,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='name', full_name='spaceone.api.identity.v1.UpdateUserRequest.name', index=2,
      number=3, type=9, cpp_type=9, label=1,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='email', full_name='spaceone.api.identity.v1.UpdateUserRequest.email', index=3,
      number=4, type=9, cpp_type=9, label=1,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='language', full_name='spaceone.api.identity.v1.UpdateUserRequest.language', index=4,
      number=7, type=9, cpp_type=9, label=1,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='timezone', full_name='spaceone.api.identity.v1.UpdateUserRequest.timezone', index=5,
      number=8, type=9, cpp_type=9, label=1,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='tags', full_name='spaceone.api.identity.v1.UpdateUserRequest.tags', index=6,
      number=9, type=11, cpp_type=10, label=1,
      has_default_value=False, default_value=None,
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='domain_id', full_name='spaceone.api.identity.v1.UpdateUserRequest.domain_id', index=7,
      number=10, type=9, cpp_type=9, label=1,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
  ],
  extensions=[
  ],
  nested_types=[],
  enum_types=[
  ],
  serialized_options=None,
  is_extendable=False,
  syntax='proto3',
  extension_ranges=[],
  oneofs=[
  ],
  serialized_start=480,
  serialized_end=657,
)
_USERREQUEST = _descriptor.Descriptor(
name='UserRequest',
full_name='spaceone.api.identity.v1.UserRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='user_id', full_name='spaceone.api.identity.v1.UserRequest.user_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='domain_id', full_name='spaceone.api.identity.v1.UserRequest.domain_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=659,
serialized_end=708,
)
_GETUSERREQUEST = _descriptor.Descriptor(
name='GetUserRequest',
full_name='spaceone.api.identity.v1.GetUserRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='user_id', full_name='spaceone.api.identity.v1.GetUserRequest.user_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='domain_id', full_name='spaceone.api.identity.v1.GetUserRequest.domain_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='only', full_name='spaceone.api.identity.v1.GetUserRequest.only', index=2,
number=3, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=710,
serialized_end=776,
)
_USERQUERY = _descriptor.Descriptor(
name='UserQuery',
full_name='spaceone.api.identity.v1.UserQuery',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='query', full_name='spaceone.api.identity.v1.UserQuery.query', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='user_id', full_name='spaceone.api.identity.v1.UserQuery.user_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='spaceone.api.identity.v1.UserQuery.name', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='state', full_name='spaceone.api.identity.v1.UserQuery.state', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='email', full_name='spaceone.api.identity.v1.UserQuery.email', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='user_type', full_name='spaceone.api.identity.v1.UserQuery.user_type', index=5,
number=6, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='backend', full_name='spaceone.api.identity.v1.UserQuery.backend', index=6,
number=7, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='domain_id', full_name='spaceone.api.identity.v1.UserQuery.domain_id', index=7,
number=11, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=779,
serialized_end=1025,
)
_USERINFO = _descriptor.Descriptor(
name='UserInfo',
full_name='spaceone.api.identity.v1.UserInfo',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='user_id', full_name='spaceone.api.identity.v1.UserInfo.user_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='spaceone.api.identity.v1.UserInfo.name', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='state', full_name='spaceone.api.identity.v1.UserInfo.state', index=2,
number=3, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='email', full_name='spaceone.api.identity.v1.UserInfo.email', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='user_type', full_name='spaceone.api.identity.v1.UserInfo.user_type', index=4,
number=5, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='backend', full_name='spaceone.api.identity.v1.UserInfo.backend', index=5,
number=6, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='language', full_name='spaceone.api.identity.v1.UserInfo.language', index=6,
number=7, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timezone', full_name='spaceone.api.identity.v1.UserInfo.timezone', index=7,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tags', full_name='spaceone.api.identity.v1.UserInfo.tags', index=8,
number=10, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='last_accessed_at', full_name='spaceone.api.identity.v1.UserInfo.last_accessed_at', index=9,
number=11, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='created_at', full_name='spaceone.api.identity.v1.UserInfo.created_at', index=10,
number=12, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='domain_id', full_name='spaceone.api.identity.v1.UserInfo.domain_id', index=11,
number=13, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_USERINFO_STATE,
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1028,
serialized_end=1451,
)
_USERSINFO = _descriptor.Descriptor(
name='UsersInfo',
full_name='spaceone.api.identity.v1.UsersInfo',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='results', full_name='spaceone.api.identity.v1.UsersInfo.results', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='total_count', full_name='spaceone.api.identity.v1.UsersInfo.total_count', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1453,
serialized_end=1538,
)
_USERSTATQUERY = _descriptor.Descriptor(
name='UserStatQuery',
full_name='spaceone.api.identity.v1.UserStatQuery',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='query', full_name='spaceone.api.identity.v1.UserStatQuery.query', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='domain_id', full_name='spaceone.api.identity.v1.UserStatQuery.domain_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1540,
serialized_end=1628,
)
_FINDUSERSEARCH = _descriptor.Descriptor(
name='FindUserSearch',
full_name='spaceone.api.identity.v1.FindUserSearch',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='user_id', full_name='spaceone.api.identity.v1.FindUserSearch.user_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='keyword', full_name='spaceone.api.identity.v1.FindUserSearch.keyword', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name='search_alias', full_name='spaceone.api.identity.v1.FindUserSearch.search_alias',
index=0, containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[]),
],
serialized_start=1630,
serialized_end=1700,
)
_FINDUSERQUERY = _descriptor.Descriptor(
name='FindUserQuery',
full_name='spaceone.api.identity.v1.FindUserQuery',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='search', full_name='spaceone.api.identity.v1.FindUserQuery.search', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='domain_id', full_name='spaceone.api.identity.v1.FindUserQuery.domain_id', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1702,
serialized_end=1794,
)
_FINDUSERINFO = _descriptor.Descriptor(
name='FindUserInfo',
full_name='spaceone.api.identity.v1.FindUserInfo',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='user_id', full_name='spaceone.api.identity.v1.FindUserInfo.user_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='spaceone.api.identity.v1.FindUserInfo.name', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='email', full_name='spaceone.api.identity.v1.FindUserInfo.email', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tags', full_name='spaceone.api.identity.v1.FindUserInfo.tags', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1796,
serialized_end=1895,
)
_FINDUSERSINFO = _descriptor.Descriptor(
name='FindUsersInfo',
full_name='spaceone.api.identity.v1.FindUsersInfo',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='results', full_name='spaceone.api.identity.v1.FindUsersInfo.results', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='total_count', full_name='spaceone.api.identity.v1.FindUsersInfo.total_count', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1897,
serialized_end=1990,
)
_CREATEUSERREQUEST.fields_by_name['user_type'].enum_type = _USERTYPE
_CREATEUSERREQUEST.fields_by_name['backend'].enum_type = _USERBACKEND
_CREATEUSERREQUEST.fields_by_name['tags'].message_type = google_dot_protobuf_dot_struct__pb2._STRUCT
_UPDATEUSERREQUEST.fields_by_name['tags'].message_type = google_dot_protobuf_dot_struct__pb2._STRUCT
_USERQUERY.fields_by_name['query'].message_type = spaceone_dot_api_dot_core_dot_v1_dot_query__pb2._QUERY
_USERQUERY.fields_by_name['user_type'].enum_type = _USERTYPE
_USERQUERY.fields_by_name['backend'].enum_type = _USERBACKEND
_USERINFO.fields_by_name['state'].enum_type = _USERINFO_STATE
_USERINFO.fields_by_name['user_type'].enum_type = _USERTYPE
_USERINFO.fields_by_name['backend'].enum_type = _USERBACKEND
_USERINFO.fields_by_name['tags'].message_type = google_dot_protobuf_dot_struct__pb2._STRUCT
_USERINFO_STATE.containing_type = _USERINFO
_USERSINFO.fields_by_name['results'].message_type = _USERINFO
_USERSTATQUERY.fields_by_name['query'].message_type = spaceone_dot_api_dot_core_dot_v1_dot_query__pb2._STATISTICSQUERY
_FINDUSERSEARCH.oneofs_by_name['search_alias'].fields.append(
_FINDUSERSEARCH.fields_by_name['user_id'])
_FINDUSERSEARCH.fields_by_name['user_id'].containing_oneof = _FINDUSERSEARCH.oneofs_by_name['search_alias']
_FINDUSERSEARCH.oneofs_by_name['search_alias'].fields.append(
_FINDUSERSEARCH.fields_by_name['keyword'])
_FINDUSERSEARCH.fields_by_name['keyword'].containing_oneof = _FINDUSERSEARCH.oneofs_by_name['search_alias']
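# Illustration only (hypothetical stand-in, not part of the generated module):
# the two statements above append 'user_id' and 'keyword' to the oneof
# 'search_alias', which makes them mutually exclusive -- setting one field
# clears the other, and only the last one set is reported as active. A minimal
# pure-Python sketch of that proto3 oneof behavior:

```python
class OneofSketch:
    """Toy model of a proto3 oneof with fields 'user_id' and 'keyword'."""

    _ONEOF_FIELDS = ('user_id', 'keyword')

    def __init__(self):
        # (field_name, value) for the currently set field, or None if unset.
        self._value = None

    def set_field(self, name, value):
        assert name in self._ONEOF_FIELDS
        # Setting any oneof member replaces whichever member was set before.
        self._value = (name, value)

    def which_oneof(self):
        # Mirrors Message.WhichOneof(): name of the set field, or None.
        return self._value[0] if self._value is not None else None


_sketch = OneofSketch()
_sketch.set_field('user_id', 'user-123')
_sketch.set_field('keyword', 'alice')
assert _sketch.which_oneof() == 'keyword'  # 'user_id' was cleared
```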
_FINDUSERQUERY.fields_by_name['search'].message_type = _FINDUSERSEARCH
_FINDUSERINFO.fields_by_name['tags'].message_type = google_dot_protobuf_dot_struct__pb2._STRUCT
_FINDUSERSINFO.fields_by_name['results'].message_type = _FINDUSERINFO
DESCRIPTOR.message_types_by_name['CreateUserRequest'] = _CREATEUSERREQUEST
DESCRIPTOR.message_types_by_name['UpdateUserRequest'] = _UPDATEUSERREQUEST
DESCRIPTOR.message_types_by_name['UserRequest'] = _USERREQUEST
DESCRIPTOR.message_types_by_name['GetUserRequest'] = _GETUSERREQUEST
DESCRIPTOR.message_types_by_name['UserQuery'] = _USERQUERY
DESCRIPTOR.message_types_by_name['UserInfo'] = _USERINFO
DESCRIPTOR.message_types_by_name['UsersInfo'] = _USERSINFO
DESCRIPTOR.message_types_by_name['UserStatQuery'] = _USERSTATQUERY
DESCRIPTOR.message_types_by_name['FindUserSearch'] = _FINDUSERSEARCH
DESCRIPTOR.message_types_by_name['FindUserQuery'] = _FINDUSERQUERY
DESCRIPTOR.message_types_by_name['FindUserInfo'] = _FINDUSERINFO
DESCRIPTOR.message_types_by_name['FindUsersInfo'] = _FINDUSERSINFO
DESCRIPTOR.enum_types_by_name['UserBackend'] = _USERBACKEND
DESCRIPTOR.enum_types_by_name['UserType'] = _USERTYPE
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
CreateUserRequest = _reflection.GeneratedProtocolMessageType('CreateUserRequest', (_message.Message,), {
'DESCRIPTOR' : _CREATEUSERREQUEST,
'__module__' : 'spaceone.api.identity.v1.user_pb2'
# @@protoc_insertion_point(class_scope:spaceone.api.identity.v1.CreateUserRequest)
})
_sym_db.RegisterMessage(CreateUserRequest)
UpdateUserRequest = _reflection.GeneratedProtocolMessageType('UpdateUserRequest', (_message.Message,), {
'DESCRIPTOR' : _UPDATEUSERREQUEST,
'__module__' : 'spaceone.api.identity.v1.user_pb2'
# @@protoc_insertion_point(class_scope:spaceone.api.identity.v1.UpdateUserRequest)
})
_sym_db.RegisterMessage(UpdateUserRequest)
UserRequest = _reflection.GeneratedProtocolMessageType('UserRequest', (_message.Message,), {
'DESCRIPTOR' : _USERREQUEST,
'__module__' : 'spaceone.api.identity.v1.user_pb2'
# @@protoc_insertion_point(class_scope:spaceone.api.identity.v1.UserRequest)
})
_sym_db.RegisterMessage(UserRequest)
GetUserRequest = _reflection.GeneratedProtocolMessageType('GetUserRequest', (_message.Message,), {
'DESCRIPTOR' : _GETUSERREQUEST,
'__module__' : 'spaceone.api.identity.v1.user_pb2'
# @@protoc_insertion_point(class_scope:spaceone.api.identity.v1.GetUserRequest)
})
_sym_db.RegisterMessage(GetUserRequest)
UserQuery = _reflection.GeneratedProtocolMessageType('UserQuery', (_message.Message,), {
'DESCRIPTOR' : _USERQUERY,
'__module__' : 'spaceone.api.identity.v1.user_pb2'
# @@protoc_insertion_point(class_scope:spaceone.api.identity.v1.UserQuery)
})
_sym_db.RegisterMessage(UserQuery)
UserInfo = _reflection.GeneratedProtocolMessageType('UserInfo', (_message.Message,), {
'DESCRIPTOR' : _USERINFO,
'__module__' : 'spaceone.api.identity.v1.user_pb2'
# @@protoc_insertion_point(class_scope:spaceone.api.identity.v1.UserInfo)
})
_sym_db.RegisterMessage(UserInfo)
UsersInfo = _reflection.GeneratedProtocolMessageType('UsersInfo', (_message.Message,), {
'DESCRIPTOR' : _USERSINFO,
'__module__' : 'spaceone.api.identity.v1.user_pb2'
# @@protoc_insertion_point(class_scope:spaceone.api.identity.v1.UsersInfo)
})
_sym_db.RegisterMessage(UsersInfo)
UserStatQuery = _reflection.GeneratedProtocolMessageType('UserStatQuery', (_message.Message,), {
'DESCRIPTOR' : _USERSTATQUERY,
'__module__' : 'spaceone.api.identity.v1.user_pb2'
# @@protoc_insertion_point(class_scope:spaceone.api.identity.v1.UserStatQuery)
})
_sym_db.RegisterMessage(UserStatQuery)
FindUserSearch = _reflection.GeneratedProtocolMessageType('FindUserSearch', (_message.Message,), {
'DESCRIPTOR' : _FINDUSERSEARCH,
'__module__' : 'spaceone.api.identity.v1.user_pb2'
# @@protoc_insertion_point(class_scope:spaceone.api.identity.v1.FindUserSearch)
})
_sym_db.RegisterMessage(FindUserSearch)
FindUserQuery = _reflection.GeneratedProtocolMessageType('FindUserQuery', (_message.Message,), {
'DESCRIPTOR' : _FINDUSERQUERY,
'__module__' : 'spaceone.api.identity.v1.user_pb2'
# @@protoc_insertion_point(class_scope:spaceone.api.identity.v1.FindUserQuery)
})
_sym_db.RegisterMessage(FindUserQuery)
FindUserInfo = _reflection.GeneratedProtocolMessageType('FindUserInfo', (_message.Message,), {
'DESCRIPTOR' : _FINDUSERINFO,
'__module__' : 'spaceone.api.identity.v1.user_pb2'
# @@protoc_insertion_point(class_scope:spaceone.api.identity.v1.FindUserInfo)
})
_sym_db.RegisterMessage(FindUserInfo)
FindUsersInfo = _reflection.GeneratedProtocolMessageType('FindUsersInfo', (_message.Message,), {
'DESCRIPTOR' : _FINDUSERSINFO,
'__module__' : 'spaceone.api.identity.v1.user_pb2'
# @@protoc_insertion_point(class_scope:spaceone.api.identity.v1.FindUsersInfo)
})
_sym_db.RegisterMessage(FindUsersInfo)
_USER = _descriptor.ServiceDescriptor(
name='User',
full_name='spaceone.api.identity.v1.User',
file=DESCRIPTOR,
index=0,
serialized_options=None,
create_key=_descriptor._internal_create_key,
serialized_start=2107,
serialized_end=3321,
methods=[
_descriptor.MethodDescriptor(
name='create',
full_name='spaceone.api.identity.v1.User.create',
index=0,
containing_service=None,
input_type=_CREATEUSERREQUEST,
output_type=_USERINFO,
serialized_options=b'\202\323\344\223\002\024\"\022/identity/v1/users',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='update',
full_name='spaceone.api.identity.v1.User.update',
index=1,
containing_service=None,
input_type=_UPDATEUSERREQUEST,
output_type=_USERINFO,
serialized_options=b'\202\323\344\223\002\024\032\022/identity/v1/users',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='enable',
full_name='spaceone.api.identity.v1.User.enable',
index=2,
containing_service=None,
input_type=_USERREQUEST,
output_type=_USERINFO,
serialized_options=b'\202\323\344\223\002$\032\"/identity/v1/user/{user_id}/enable',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='disable',
full_name='spaceone.api.identity.v1.User.disable',
index=3,
containing_service=None,
input_type=_USERREQUEST,
output_type=_USERINFO,
serialized_options=b'\202\323\344\223\002%\032#/identity/v1/user/{user_id}/disable',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='delete',
full_name='spaceone.api.identity.v1.User.delete',
index=4,
containing_service=None,
input_type=_USERREQUEST,
output_type=google_dot_protobuf_dot_empty__pb2._EMPTY,
serialized_options=b'\202\323\344\223\002\024*\022/identity/v1/users',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='get',
full_name='spaceone.api.identity.v1.User.get',
index=5,
containing_service=None,
input_type=_GETUSERREQUEST,
output_type=_USERINFO,
serialized_options=b'\202\323\344\223\002\035\022\033/identity/v1/user/{user_id}',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='list',
full_name='spaceone.api.identity.v1.User.list',
index=6,
containing_service=None,
input_type=_USERQUERY,
output_type=_USERSINFO,
serialized_options=b'\202\323\344\223\0021\022\022/identity/v1/usersZ\033\"\031/identity/v1/users/search',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='stat',
full_name='spaceone.api.identity.v1.User.stat',
index=7,
containing_service=None,
input_type=_USERSTATQUERY,
output_type=google_dot_protobuf_dot_struct__pb2._STRUCT,
serialized_options=b'\202\323\344\223\002\031\"\027/identity/v1/users/stat',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='find',
full_name='spaceone.api.identity.v1.User.find',
index=8,
containing_service=None,
input_type=_FINDUSERQUERY,
output_type=_FINDUSERSINFO,
serialized_options=b'\202\323\344\223\002\031\"\027/identity/v1/users/find',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='sync',
full_name='spaceone.api.identity.v1.User.sync',
index=9,
containing_service=None,
input_type=_USERREQUEST,
output_type=_USERINFO,
serialized_options=b'\202\323\344\223\002\031\"\027/identity/v1/users/sync',
create_key=_descriptor._internal_create_key,
),
])
_sym_db.RegisterServiceDescriptor(_USER)
DESCRIPTOR.services_by_name['User'] = _USER
# @@protoc_insertion_point(module_scope)
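# Illustration only: _reflection.GeneratedProtocolMessageType(name, bases, attrs),
# used throughout this module, behaves like Python's three-argument type() call,
# building each message class at import time from its descriptor. DemoBase and
# DemoMessage below are hypothetical stand-ins (the real classes such as
# UserInfo are created above); this sketch only shows the class-creation pattern.

```python
class DemoBase:
    """Stand-in for google.protobuf.message.Message."""


DemoMessage = type('DemoMessage', (DemoBase,), {
    '__module__': 'spaceone.api.identity.v1.user_pb2',
})

assert DemoMessage.__name__ == 'DemoMessage'
assert issubclass(DemoMessage, DemoBase)
```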
from .sel_commands import * # noqa
from .listener import * # noqa
| 26.25 | 36 | 0.714286 | 14 | 105 | 5.214286 | 0.5 | 0.410959 | 0.493151 | 0.60274 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 105 | 3 | 37 | 35 | 0.869048 | 0.133333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
29c478b3069d2a11cc04a949d934009f827f72a0 | 241 | py | Python | semantic_aware_models/models/language/bert/__init__.py | ITAINNOVA/SAME | d46dda98753fcb3606e04c3db2d20c9e700140e8 | [
"OML"
] | null | null | null | semantic_aware_models/models/language/bert/__init__.py | ITAINNOVA/SAME | d46dda98753fcb3606e04c3db2d20c9e700140e8 | [
"OML"
] | null | null | null | semantic_aware_models/models/language/bert/__init__.py | ITAINNOVA/SAME | d46dda98753fcb3606e04c3db2d20c9e700140e8 | [
"OML"
] | 1 | 2020-03-19T12:41:54.000Z | 2020-03-19T12:41:54.000Z | from __future__ import absolute_import
from semantic_aware_models.models.language.bert.data_processors import *
from semantic_aware_models.models.language.bert.inputs import *
from semantic_aware_models.models.language.bert.metrics import * | 48.2 | 72 | 0.871369 | 33 | 241 | 6 | 0.393939 | 0.151515 | 0.272727 | 0.348485 | 0.712121 | 0.712121 | 0.712121 | 0.712121 | 0 | 0 | 0 | 0 | 0.06639 | 241 | 5 | 73 | 48.2 | 0.88 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 8 |
29daffbd1541eb42b8cfab50ca7b89cea075b6cf | 7,081 | py | Python | test/test_certificates_api.py | cvent/octopus-deploy-api-client | 0e03e842e1beb29b132776aee077df570b88366a | [
"Apache-2.0"
] | null | null | null | test/test_certificates_api.py | cvent/octopus-deploy-api-client | 0e03e842e1beb29b132776aee077df570b88366a | [
"Apache-2.0"
] | null | null | null | test/test_certificates_api.py | cvent/octopus-deploy-api-client | 0e03e842e1beb29b132776aee077df570b88366a | [
"Apache-2.0"
] | null | null | null | # coding: utf-8
"""
Octopus Server API
No description provided (generated by Swagger Codegen https://github.com/swagger-api/swagger-codegen) # noqa: E501
OpenAPI spec version: 2019.6.7+Branch.tags-2019.6.7.Sha.aa18dc6809953218c66f57eff7d26481d9b23d6a
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import unittest
import octopus_deploy_swagger_client
from octopus_deploy_swagger_client.certificates_api import CertificatesApi # noqa: E501
from octopus_deploy_swagger_client.rest import ApiException
class TestCertificatesApi(unittest.TestCase):
"""CertificatesApi unit test stubs"""
def setUp(self):
self.api = octopus_deploy_swagger_client.certificates_api.CertificatesApi() # noqa: E501
def tearDown(self):
pass
def test_create_response_descriptor_certificate_certificate_resource(self):
"""Test case for create_response_descriptor_certificate_certificate_resource
Create a CertificateResource # noqa: E501
"""
pass
def test_create_response_descriptor_certificate_certificate_resource_spaces(self):
"""Test case for create_response_descriptor_certificate_certificate_resource_spaces
Create a CertificateResource # noqa: E501
"""
pass
def test_custom_action_response_descriptor_octopus_server_web_api_actions_certificates_certificate_archive_action(self):
"""Test case for custom_action_response_descriptor_octopus_server_web_api_actions_certificates_certificate_archive_action
"""
pass
def test_custom_action_response_descriptor_octopus_server_web_api_actions_certificates_certificate_archive_action_spaces(self):
"""Test case for custom_action_response_descriptor_octopus_server_web_api_actions_certificates_certificate_archive_action_spaces
"""
pass
def test_custom_action_response_descriptor_octopus_server_web_api_actions_certificates_certificate_export_action(self):
"""Test case for custom_action_response_descriptor_octopus_server_web_api_actions_certificates_certificate_export_action
"""
pass
def test_custom_action_response_descriptor_octopus_server_web_api_actions_certificates_certificate_export_action_spaces(self):
"""Test case for custom_action_response_descriptor_octopus_server_web_api_actions_certificates_certificate_export_action_spaces
"""
pass
def test_custom_action_response_descriptor_octopus_server_web_api_actions_certificates_certificate_replace_action(self):
"""Test case for custom_action_response_descriptor_octopus_server_web_api_actions_certificates_certificate_replace_action
"""
pass
def test_custom_action_response_descriptor_octopus_server_web_api_actions_certificates_certificate_replace_action_spaces(self):
"""Test case for custom_action_response_descriptor_octopus_server_web_api_actions_certificates_certificate_replace_action_spaces
"""
pass
def test_custom_action_response_descriptor_octopus_server_web_api_actions_certificates_certificate_un_archive_action(self):
"""Test case for custom_action_response_descriptor_octopus_server_web_api_actions_certificates_certificate_un_archive_action
"""
pass
def test_custom_action_response_descriptor_octopus_server_web_api_actions_certificates_certificate_un_archive_action_spaces(self):
"""Test case for custom_action_response_descriptor_octopus_server_web_api_actions_certificates_certificate_un_archive_action_spaces
"""
pass
def test_custom_query_response_descriptor_octopus_server_web_api_actions_certificates_certificate_by_id_or_thumbprint_responder(self):
"""Test case for custom_query_response_descriptor_octopus_server_web_api_actions_certificates_certificate_by_id_or_thumbprint_responder
"""
pass
def test_custom_query_response_descriptor_octopus_server_web_api_actions_certificates_certificate_by_id_or_thumbprint_responder_spaces(self):
"""Test case for custom_query_response_descriptor_octopus_server_web_api_actions_certificates_certificate_by_id_or_thumbprint_responder_spaces
"""
pass
def test_custom_query_response_descriptor_octopus_server_web_api_actions_certificates_certificate_usage_responder(self):
"""Test case for custom_query_response_descriptor_octopus_server_web_api_actions_certificates_certificate_usage_responder
"""
pass
def test_custom_query_response_descriptor_octopus_server_web_api_actions_certificates_certificate_usage_responder_spaces(self):
"""Test case for custom_query_response_descriptor_octopus_server_web_api_actions_certificates_certificate_usage_responder_spaces
"""
pass
def test_custom_query_response_descriptor_octopus_server_web_api_actions_certificates_certificates_list_responder(self):
"""Test case for custom_query_response_descriptor_octopus_server_web_api_actions_certificates_certificates_list_responder
"""
pass
def test_custom_query_response_descriptor_octopus_server_web_api_actions_certificates_certificates_list_responder_spaces(self):
"""Test case for custom_query_response_descriptor_octopus_server_web_api_actions_certificates_certificates_list_responder_spaces
"""
pass
def test_delete_on_background_response_descriptor_certificate_certificate_resource(self):
"""Test case for delete_on_background_response_descriptor_certificate_certificate_resource
Delete a CertificateResource by ID # noqa: E501
"""
pass
def test_delete_on_background_response_descriptor_certificate_certificate_resource_spaces(self):
"""Test case for delete_on_background_response_descriptor_certificate_certificate_resource_spaces
Delete a CertificateResource by ID # noqa: E501
"""
pass
def test_list_all_response_descriptor_certificate_certificate_resource(self):
"""Test case for list_all_response_descriptor_certificate_certificate_resource
Get a list of CertificateResources # noqa: E501
"""
pass
def test_list_all_response_descriptor_certificate_certificate_resource_spaces(self):
"""Test case for list_all_response_descriptor_certificate_certificate_resource_spaces
Get a list of CertificateResources # noqa: E501
"""
pass
def test_modify_response_descriptor_certificate_certificate_resource(self):
"""Test case for modify_response_descriptor_certificate_certificate_resource
Modify a CertificateResource by ID # noqa: E501
"""
pass
def test_modify_response_descriptor_certificate_certificate_resource_spaces(self):
"""Test case for modify_response_descriptor_certificate_certificate_resource_spaces
Modify a CertificateResource by ID # noqa: E501
"""
pass
if __name__ == '__main__':
unittest.main()
| 40.462857 | 150 | 0.802994 | 821 | 7,081 | 6.300853 | 0.105968 | 0.153103 | 0.135318 | 0.167794 | 0.906244 | 0.893099 | 0.893099 | 0.876087 | 0.852503 | 0.830853 | 0 | 0.012046 | 0.15591 | 7,081 | 174 | 151 | 40.695402 | 0.853438 | 0.46533 | 0 | 0.410714 | 1 | 0 | 0.002282 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0.410714 | 0.089286 | 0 | 0.535714 | 0.035714 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 10 |
d9a1665ba3a7e1275e3d039c81a3f7d1f4600d72 | 38 | py | Python | examples/tuple_basic.py | igfish/toyvm | bb1ab371a8c71ba01522556235fc9f017c9b6b8f | [
"MIT"
] | null | null | null | examples/tuple_basic.py | igfish/toyvm | bb1ab371a8c71ba01522556235fc9f017c9b6b8f | [
"MIT"
] | null | null | null | examples/tuple_basic.py | igfish/toyvm | bb1ab371a8c71ba01522556235fc9f017c9b6b8f | [
"MIT"
] | null | null | null | x = 1
t = (x, 1, 2, 3, 4, 5)
print(t)
| 9.5 | 22 | 0.394737 | 11 | 38 | 1.363636 | 0.727273 | 0.266667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.230769 | 0.315789 | 38 | 3 | 23 | 12.666667 | 0.346154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d9b408cdb0c032e38ea6f50f9b57c9611f5a5a06 | 1,978 | py | Python | local_api/serializers.py | Planthive/PlantHive_WebApp | 3ae9623406348981b4873b6d857ee9124188a4ee | [
"Apache-2.0"
] | null | null | null | local_api/serializers.py | Planthive/PlantHive_WebApp | 3ae9623406348981b4873b6d857ee9124188a4ee | [
"Apache-2.0"
] | null | null | null | local_api/serializers.py | Planthive/PlantHive_WebApp | 3ae9623406348981b4873b6d857ee9124188a4ee | [
"Apache-2.0"
] | null | null | null | from rest_framework import serializers
#from .models import Hero, upload
from .models import growth_schedule, manual_schedule
# class HeroSerializer(serializers.HyperlinkedModelSerializer):
# class Meta:
# model = Hero
# fields = ('name', 'alias')
class Growth_scheduleSerializer(serializers.HyperlinkedModelSerializer):
class Meta:
model = growth_schedule
fields = '__all__'
isactive = serializers.BooleanField()
timestamps = serializers.JSONField()
def create(self, validated_data):
"""
Create and return a new `growth_schedule` instance, given the validated data.
"""
return growth_schedule.objects.create(**validated_data)
def update(self, instance, validated_data):
"""
Update and return an existing `growth_schedule` instance, given the validated data.
"""
instance.isactive = validated_data.get('isactive', instance.isactive)
instance.timestamps = validated_data.get('timestamps', instance.timestamps)
instance.save()
return instance
class Manual_scheduleSerializer(serializers.HyperlinkedModelSerializer):
class Meta:
model = manual_schedule
fields = '__all__'
isactive = serializers.BooleanField()
timestamps = serializers.JSONField()
def create(self, validated_data):
"""
Create and return a new `manual_schedule` instance, given the validated data.
"""
return manual_schedule.objects.create(**validated_data)
def update(self, instance, validated_data):
"""
Update and return an existing `manual_schedule` instance, given the validated data.
"""
instance.isactive = validated_data.get('isactive', instance.isactive)
instance.timestamps = validated_data.get('timestamps', instance.timestamps)
instance.save()
return instance
| 36.62963 | 87 | 0.637007 | 182 | 1,978 | 6.785714 | 0.241758 | 0.147368 | 0.051822 | 0.080972 | 0.85749 | 0.816194 | 0.704453 | 0.704453 | 0.704453 | 0.704453 | 0 | 0 | 0.277553 | 1,978 | 53 | 88 | 37.320755 | 0.864241 | 0.230536 | 0 | 0.785714 | 0 | 0 | 0.035894 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.071429 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d9e467de1693cffa29cb6e57a6d52c188844c19c | 10,356 | py | Python | validations_libs/tests/test_validation_logs.py | openstack/validations-libs | 7d416acbe89a9ba23cabfd4e97c80affe57e06cb | [
"Apache-2.0"
] | 1 | 2020-03-11T09:13:28.000Z | 2020-03-11T09:13:28.000Z | validations_libs/tests/test_validation_logs.py | openstack/validations-libs | 7d416acbe89a9ba23cabfd4e97c80affe57e06cb | [
"Apache-2.0"
] | null | null | null | validations_libs/tests/test_validation_logs.py | openstack/validations-libs | 7d416acbe89a9ba23cabfd4e97c80affe57e06cb | [
"Apache-2.0"
] | 1 | 2021-03-23T08:31:43.000Z | 2021-03-23T08:31:43.000Z | # Copyright 2020 Red Hat, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
try:
from unittest import mock
except ImportError:
import mock
from unittest import TestCase
from validations_libs.validation_logs import ValidationLogs
from validations_libs.tests import fakes
class TestValidationLogs(TestCase):
def setUp(self):
super(TestValidationLogs, self).setUp()
@mock.patch('json.load', return_value=fakes.VALIDATIONS_LOGS_CONTENTS_LIST)
@mock.patch('six.moves.builtins.open')
def test_validation_log_file(self, mock_open, mock_json):
vlogs = ValidationLogs('/tmp/foo')
content = vlogs._get_content('/tmp/foo/bar.json')
self.assertEqual(content, fakes.VALIDATIONS_LOGS_CONTENTS_LIST)
@mock.patch('six.moves.builtins.open')
def test_log_not_found(self, mock_open):
mock_open.side_effect = IOError()
vlogs = ValidationLogs()
self.assertRaises(
IOError,
vlogs._get_content,
'/var/log/non-existing.json'
)
@mock.patch('glob.glob')
@mock.patch('json.load')
@mock.patch('six.moves.builtins.open')
def test_get_logfile_by_validation(self, mock_open, mock_json, mock_glob):
mock_glob.return_value = \
['/tmp/123_foo_2020-03-30T13:17:22.447857Z.json']
vlogs = ValidationLogs('/tmp/foo')
log = vlogs.get_logfile_by_validation('foo')
self.assertEqual(log,
['/tmp/123_foo_2020-03-30T13:17:22.447857Z.json'])
@mock.patch('glob.glob')
@mock.patch('json.load',
return_value=fakes.VALIDATIONS_LOGS_CONTENTS_LIST[0])
@mock.patch('six.moves.builtins.open')
def test_get_logfile_content_by_validation(self, mock_open, mock_json,
mock_glob):
mock_glob.return_value = \
['/tmp/123_foo_2020-03-30T13:17:22.447857Z.json']
vlogs = ValidationLogs('/tmp/foo')
content = vlogs.get_logfile_content_by_validation('foo')
self.assertEqual(content, fakes.VALIDATIONS_LOGS_CONTENTS_LIST)
@mock.patch('glob.glob')
@mock.patch('json.load')
@mock.patch('six.moves.builtins.open')
def test_get_logfile_by_uuid(self, mock_open, mock_json, mock_glob):
mock_glob.return_value = \
['/tmp/123_foo_2020-03-30T13:17:22.447857Z.json']
vlogs = ValidationLogs('/tmp/foo')
log = vlogs.get_logfile_by_uuid('123')
self.assertEqual(log,
['/tmp/123_foo_2020-03-30T13:17:22.447857Z.json'])
@mock.patch('glob.glob')
@mock.patch('json.load',
return_value=fakes.VALIDATIONS_LOGS_CONTENTS_LIST[0])
@mock.patch('six.moves.builtins.open')
def test_get_logfile_content_by_uuid(self, mock_open, mock_json,
mock_glob):
mock_glob.return_value = \
['/tmp/123_foo_2020-03-30T13:17:22.447857Z.json']
vlogs = ValidationLogs('/tmp/foo')
content = vlogs.get_logfile_content_by_uuid('123')
self.assertEqual(content, fakes.VALIDATIONS_LOGS_CONTENTS_LIST)
@mock.patch('glob.glob')
@mock.patch('json.load')
@mock.patch('six.moves.builtins.open')
def test_get_logfile_by_uuid_validation_id(self, mock_open, mock_json,
mock_glob):
mock_glob.return_value = \
['/tmp/123_foo_2020-03-30T13:17:22.447857Z.json']
vlogs = ValidationLogs('/tmp/foo')
log = vlogs.get_logfile_by_uuid_validation_id('123', 'foo')
self.assertEqual(log,
['/tmp/123_foo_2020-03-30T13:17:22.447857Z.json'])
@mock.patch('glob.glob')
@mock.patch('json.load',
return_value=fakes.VALIDATIONS_LOGS_CONTENTS_LIST[0])
@mock.patch('six.moves.builtins.open')
def test_get_logfile_content_by_uuid_validation_id(self, mock_open,
mock_json,
mock_glob):
mock_glob.return_value = \
['/tmp/123_foo_2020-03-30T13:17:22.447857Z.json']
vlogs = ValidationLogs('/tmp/foo')
content = vlogs.get_logfile_content_by_uuid_validation_id('123', 'foo')
self.assertEqual(content, fakes.VALIDATIONS_LOGS_CONTENTS_LIST)
@mock.patch('os.path.isfile')
@mock.patch('os.listdir')
@mock.patch('json.load',
return_value=fakes.VALIDATIONS_LOGS_CONTENTS_LIST[0])
@mock.patch('six.moves.builtins.open')
def test_get_all_logfiles(self, mock_open, mock_json,
mock_listdir, mock_isfile):
mock_listdir.return_value = \
['/tmp/123_foo_2020-03-30T13:17:22.447857Z.json']
mock_isfile.return_value = True
vlogs = ValidationLogs('/tmp/foo')
log = vlogs.get_all_logfiles()
self.assertEqual(log,
['/tmp/123_foo_2020-03-30T13:17:22.447857Z.json'])
@mock.patch('os.path.isfile')
@mock.patch('os.listdir')
@mock.patch('json.load',
return_value=fakes.VALIDATIONS_LOGS_CONTENTS_LIST[0])
@mock.patch('six.moves.builtins.open')
def test_get_all_logfiles_yaml(self, mock_open, mock_json,
mock_listdir, mock_isfile):
mock_listdir.return_value = \
['/tmp/123_foo_2020-03-30T13:17:22.447857Z.json',
'/tmp/123_foo_2020-03-30T13:17:22.447857Z.yaml']
mock_isfile.return_value = True
vlogs = ValidationLogs('/tmp/foo')
log = vlogs.get_all_logfiles(extension='yaml')
self.assertEqual(log,
['/tmp/123_foo_2020-03-30T13:17:22.447857Z.yaml'])
@mock.patch('os.path.isfile')
@mock.patch('os.listdir')
@mock.patch('json.load',
return_value=fakes.VALIDATIONS_LOGS_CONTENTS_LIST[0])
@mock.patch('six.moves.builtins.open')
def test_get_all_logfiles_bad_name(self, mock_open, mock_json,
mock_listdir, mock_isfile):
mock_listdir.return_value = \
['/tmp/123_foo_2020-03-30T13:17:22.447857Z.json',
'/tmp/fooo_json.py']
mock_isfile.return_value = True
vlogs = ValidationLogs('/tmp/foo')
log = vlogs.get_all_logfiles()
self.assertEqual(log,
['/tmp/123_foo_2020-03-30T13:17:22.447857Z.json'])
@mock.patch('os.path.isfile')
@mock.patch('os.listdir')
@mock.patch('json.load',
return_value=fakes.VALIDATIONS_LOGS_CONTENTS_LIST[0])
@mock.patch('six.moves.builtins.open')
def test_get_all_logfiles_content(self, mock_open, mock_json,
mock_listdir, mock_isfile):
mock_listdir.return_value = \
['/tmp/123_foo_2020-03-30T13:17:22.447857Z.json']
mock_isfile.return_value = True
vlogs = ValidationLogs('/tmp/foo')
content = vlogs.get_all_logfiles_content()
self.assertEqual(content, fakes.VALIDATIONS_LOGS_CONTENTS_LIST)
@mock.patch('json.load',
return_value=fakes.VALIDATIONS_LOGS_CONTENTS_LIST[0])
@mock.patch('six.moves.builtins.open')
def test_get_validations_stats(self, mock_open, mock_json):
vlogs = ValidationLogs('/tmp/foo')
content = vlogs.get_validations_stats(
fakes.VALIDATIONS_LOGS_CONTENTS_LIST)
self.assertEqual(content, fakes.VALIDATIONS_STATS)
@mock.patch('validations_libs.validation_logs.ValidationLogs.'
'get_logfile_by_uuid_validation_id')
@mock.patch('json.load',
return_value=fakes.VALIDATIONS_LOGS_CONTENTS_LIST[0])
@mock.patch('six.moves.builtins.open')
def test_get_results(self, mock_open, mock_json, mock_get_validation):
mock_get_validation.return_value = \
['/tmp/123_foo_2020-03-30T13:17:22.447857Z.json']
vlogs = ValidationLogs('/tmp/foo')
content = vlogs.get_results(uuid='123', validation_id='foo')
self.assertEqual(content, [{
'UUID': '123',
'Validations': 'foo',
'Status': 'PASSED',
'Status_by_Host': 'undercloud,PASSED',
'Host_Group': 'undercloud',
'Unreachable_Hosts': '',
'Duration': '0:00:03.753'}])
def test_get_results_none(self):
vlogs = ValidationLogs('/tmp/foo')
self.assertRaises(RuntimeError, vlogs.get_results, uuid=None)
@mock.patch('validations_libs.validation_logs.ValidationLogs.'
'get_logfile_by_uuid_validation_id')
@mock.patch('json.load',
return_value=fakes.VALIDATIONS_LOGS_CONTENTS_LIST[0])
@mock.patch('six.moves.builtins.open')
def test_get_results_list(self, mock_open, mock_json, mock_get_validation):
mock_get_validation.return_value = \
['/tmp/123_foo_2020-03-30T13:17:22.447857Z.json']
vlogs = ValidationLogs('/tmp/foo')
content = vlogs.get_results(uuid=['123', '123'], validation_id='foo')
self.assertEqual(content, [
{
'UUID': '123',
'Validations': 'foo',
'Status': 'PASSED',
'Status_by_Host': 'undercloud,PASSED',
'Host_Group': 'undercloud',
'Unreachable_Hosts': '',
'Duration': '0:00:03.753'},
{
'UUID': '123',
'Validations': 'foo',
'Status': 'PASSED',
'Status_by_Host': 'undercloud,PASSED',
'Host_Group': 'undercloud',
'Unreachable_Hosts': '',
'Duration': '0:00:03.753'}])
| 42.793388 | 79 | 0.618772 | 1,250 | 10,356 | 4.868 | 0.1192 | 0.066557 | 0.028102 | 0.040592 | 0.833361 | 0.813476 | 0.813476 | 0.811011 | 0.806081 | 0.805423 | 0 | 0.066061 | 0.255987 | 10,356 | 241 | 80 | 42.970954 | 0.723686 | 0.054944 | 0 | 0.718447 | 0 | 0 | 0.231627 | 0.142068 | 0 | 0 | 0 | 0 | 0.07767 | 1 | 0.082524 | false | 0.029126 | 0.029126 | 0 | 0.116505 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d9e8f634ae4a4e469bf8ee0f749ab84efd9981ff | 13,908 | py | Python | tests/handlers/test_presence.py | xsteadfastx/synapse | 9a8ae6f1bf833b58416fae1add1972ac3e9d2d59 | [
"Apache-2.0"
] | 1 | 2017-02-03T18:58:29.000Z | 2017-02-03T18:58:29.000Z | tests/handlers/test_presence.py | xsteadfastx/synapse | 9a8ae6f1bf833b58416fae1add1972ac3e9d2d59 | [
"Apache-2.0"
] | null | null | null | tests/handlers/test_presence.py | xsteadfastx/synapse | 9a8ae6f1bf833b58416fae1add1972ac3e9d2d59 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Copyright 2016 OpenMarket Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from tests import unittest
from mock import Mock, call
from synapse.api.constants import PresenceState
from synapse.handlers.presence import (
handle_update, handle_timeout,
IDLE_TIMER, SYNC_ONLINE_TIMEOUT, LAST_ACTIVE_GRANULARITY, FEDERATION_TIMEOUT,
FEDERATION_PING_INTERVAL,
)
from synapse.storage.presence import UserPresenceState
class PresenceUpdateTestCase(unittest.TestCase):
def test_offline_to_online(self):
wheel_timer = Mock()
user_id = "@foo:bar"
now = 5000000
prev_state = UserPresenceState.default(user_id)
new_state = prev_state.copy_and_replace(
state=PresenceState.ONLINE,
last_active_ts=now,
)
state, persist_and_notify, federation_ping = handle_update(
prev_state, new_state, is_mine=True, wheel_timer=wheel_timer, now=now
)
self.assertTrue(persist_and_notify)
self.assertTrue(state.currently_active)
self.assertEquals(new_state.state, state.state)
self.assertEquals(new_state.status_msg, state.status_msg)
self.assertEquals(state.last_federation_update_ts, now)
self.assertEquals(wheel_timer.insert.call_count, 3)
wheel_timer.insert.assert_has_calls([
call(
now=now,
obj=user_id,
then=new_state.last_active_ts + IDLE_TIMER
),
call(
now=now,
obj=user_id,
then=new_state.last_user_sync_ts + SYNC_ONLINE_TIMEOUT
),
call(
now=now,
obj=user_id,
then=new_state.last_active_ts + LAST_ACTIVE_GRANULARITY
),
], any_order=True)
def test_online_to_online(self):
wheel_timer = Mock()
user_id = "@foo:bar"
now = 5000000
prev_state = UserPresenceState.default(user_id)
prev_state = prev_state.copy_and_replace(
state=PresenceState.ONLINE,
last_active_ts=now,
currently_active=True,
)
new_state = prev_state.copy_and_replace(
state=PresenceState.ONLINE,
last_active_ts=now,
)
state, persist_and_notify, federation_ping = handle_update(
prev_state, new_state, is_mine=True, wheel_timer=wheel_timer, now=now
)
self.assertFalse(persist_and_notify)
self.assertTrue(federation_ping)
self.assertTrue(state.currently_active)
self.assertEquals(new_state.state, state.state)
self.assertEquals(new_state.status_msg, state.status_msg)
self.assertEquals(state.last_federation_update_ts, now)
self.assertEquals(wheel_timer.insert.call_count, 3)
wheel_timer.insert.assert_has_calls([
call(
now=now,
obj=user_id,
then=new_state.last_active_ts + IDLE_TIMER
),
call(
now=now,
obj=user_id,
then=new_state.last_user_sync_ts + SYNC_ONLINE_TIMEOUT
),
call(
now=now,
obj=user_id,
then=new_state.last_active_ts + LAST_ACTIVE_GRANULARITY
),
], any_order=True)
def test_online_to_online_last_active_noop(self):
wheel_timer = Mock()
user_id = "@foo:bar"
now = 5000000
prev_state = UserPresenceState.default(user_id)
prev_state = prev_state.copy_and_replace(
state=PresenceState.ONLINE,
last_active_ts=now - LAST_ACTIVE_GRANULARITY - 10,
currently_active=True,
)
new_state = prev_state.copy_and_replace(
state=PresenceState.ONLINE,
last_active_ts=now,
)
state, persist_and_notify, federation_ping = handle_update(
prev_state, new_state, is_mine=True, wheel_timer=wheel_timer, now=now
)
self.assertFalse(persist_and_notify)
self.assertTrue(federation_ping)
self.assertTrue(state.currently_active)
self.assertEquals(new_state.state, state.state)
self.assertEquals(new_state.status_msg, state.status_msg)
self.assertEquals(state.last_federation_update_ts, now)
self.assertEquals(wheel_timer.insert.call_count, 3)
wheel_timer.insert.assert_has_calls([
call(
now=now,
obj=user_id,
then=new_state.last_active_ts + IDLE_TIMER
),
call(
now=now,
obj=user_id,
then=new_state.last_user_sync_ts + SYNC_ONLINE_TIMEOUT
),
call(
now=now,
obj=user_id,
then=new_state.last_active_ts + LAST_ACTIVE_GRANULARITY
),
], any_order=True)
def test_online_to_online_last_active(self):
wheel_timer = Mock()
user_id = "@foo:bar"
now = 5000000
prev_state = UserPresenceState.default(user_id)
prev_state = prev_state.copy_and_replace(
state=PresenceState.ONLINE,
last_active_ts=now - LAST_ACTIVE_GRANULARITY - 1,
currently_active=True,
)
new_state = prev_state.copy_and_replace(
state=PresenceState.ONLINE,
)
state, persist_and_notify, federation_ping = handle_update(
prev_state, new_state, is_mine=True, wheel_timer=wheel_timer, now=now
)
self.assertTrue(persist_and_notify)
self.assertFalse(state.currently_active)
self.assertEquals(new_state.state, state.state)
self.assertEquals(new_state.status_msg, state.status_msg)
self.assertEquals(state.last_federation_update_ts, now)
self.assertEquals(wheel_timer.insert.call_count, 2)
wheel_timer.insert.assert_has_calls([
call(
now=now,
obj=user_id,
then=new_state.last_active_ts + IDLE_TIMER
),
call(
now=now,
obj=user_id,
then=new_state.last_user_sync_ts + SYNC_ONLINE_TIMEOUT
)
], any_order=True)
def test_remote_ping_timer(self):
wheel_timer = Mock()
user_id = "@foo:bar"
now = 5000000
prev_state = UserPresenceState.default(user_id)
prev_state = prev_state.copy_and_replace(
state=PresenceState.ONLINE,
last_active_ts=now,
)
new_state = prev_state.copy_and_replace(
state=PresenceState.ONLINE,
)
state, persist_and_notify, federation_ping = handle_update(
prev_state, new_state, is_mine=False, wheel_timer=wheel_timer, now=now
)
self.assertFalse(persist_and_notify)
self.assertFalse(federation_ping)
self.assertFalse(state.currently_active)
self.assertEquals(new_state.state, state.state)
self.assertEquals(new_state.status_msg, state.status_msg)
self.assertEquals(wheel_timer.insert.call_count, 1)
wheel_timer.insert.assert_has_calls([
call(
now=now,
obj=user_id,
then=new_state.last_federation_update_ts + FEDERATION_TIMEOUT
),
], any_order=True)
def test_online_to_offline(self):
wheel_timer = Mock()
user_id = "@foo:bar"
now = 5000000
prev_state = UserPresenceState.default(user_id)
prev_state = prev_state.copy_and_replace(
state=PresenceState.ONLINE,
last_active_ts=now,
currently_active=True,
)
new_state = prev_state.copy_and_replace(
state=PresenceState.OFFLINE,
)
state, persist_and_notify, federation_ping = handle_update(
prev_state, new_state, is_mine=True, wheel_timer=wheel_timer, now=now
)
self.assertTrue(persist_and_notify)
self.assertEquals(new_state.state, state.state)
self.assertEquals(state.last_federation_update_ts, now)
self.assertEquals(wheel_timer.insert.call_count, 0)
def test_online_to_idle(self):
wheel_timer = Mock()
user_id = "@foo:bar"
now = 5000000
prev_state = UserPresenceState.default(user_id)
prev_state = prev_state.copy_and_replace(
state=PresenceState.ONLINE,
last_active_ts=now,
currently_active=True,
)
new_state = prev_state.copy_and_replace(
state=PresenceState.UNAVAILABLE,
)
state, persist_and_notify, federation_ping = handle_update(
prev_state, new_state, is_mine=True, wheel_timer=wheel_timer, now=now
)
self.assertTrue(persist_and_notify)
self.assertEquals(new_state.state, state.state)
self.assertEquals(state.last_federation_update_ts, now)
self.assertEquals(new_state.status_msg, state.status_msg)
self.assertEquals(wheel_timer.insert.call_count, 1)
wheel_timer.insert.assert_has_calls([
call(
now=now,
obj=user_id,
then=new_state.last_user_sync_ts + SYNC_ONLINE_TIMEOUT
)
], any_order=True)
class PresenceTimeoutTestCase(unittest.TestCase):
def test_idle_timer(self):
user_id = "@foo:bar"
now = 5000000
state = UserPresenceState.default(user_id)
state = state.copy_and_replace(
state=PresenceState.ONLINE,
last_active_ts=now - IDLE_TIMER - 1,
last_user_sync_ts=now,
)
new_state = handle_timeout(
state, is_mine=True, syncing_user_ids=set(), now=now
)
self.assertIsNotNone(new_state)
self.assertEquals(new_state.state, PresenceState.UNAVAILABLE)
def test_sync_timeout(self):
user_id = "@foo:bar"
now = 5000000
state = UserPresenceState.default(user_id)
state = state.copy_and_replace(
state=PresenceState.ONLINE,
last_active_ts=now,
last_user_sync_ts=now - SYNC_ONLINE_TIMEOUT - 1,
)
new_state = handle_timeout(
state, is_mine=True, syncing_user_ids=set(), now=now
)
self.assertIsNotNone(new_state)
self.assertEquals(new_state.state, PresenceState.OFFLINE)
def test_sync_online(self):
user_id = "@foo:bar"
now = 5000000
state = UserPresenceState.default(user_id)
state = state.copy_and_replace(
state=PresenceState.ONLINE,
last_active_ts=now - SYNC_ONLINE_TIMEOUT - 1,
last_user_sync_ts=now - SYNC_ONLINE_TIMEOUT - 1,
)
new_state = handle_timeout(
state, is_mine=True, syncing_user_ids=set([user_id]), now=now
)
self.assertIsNotNone(new_state)
self.assertEquals(new_state.state, PresenceState.ONLINE)
def test_federation_ping(self):
user_id = "@foo:bar"
now = 5000000
state = UserPresenceState.default(user_id)
state = state.copy_and_replace(
state=PresenceState.ONLINE,
last_active_ts=now,
last_user_sync_ts=now,
last_federation_update_ts=now - FEDERATION_PING_INTERVAL - 1,
)
new_state = handle_timeout(
state, is_mine=True, syncing_user_ids=set(), now=now
)
self.assertIsNotNone(new_state)
self.assertEquals(state, new_state)
def test_no_timeout(self):
user_id = "@foo:bar"
now = 5000000
state = UserPresenceState.default(user_id)
state = state.copy_and_replace(
state=PresenceState.ONLINE,
last_active_ts=now,
last_user_sync_ts=now,
last_federation_update_ts=now,
)
new_state = handle_timeout(
state, is_mine=True, syncing_user_ids=set(), now=now
)
self.assertIsNone(new_state)
def test_federation_timeout(self):
user_id = "@foo:bar"
now = 5000000
state = UserPresenceState.default(user_id)
state = state.copy_and_replace(
state=PresenceState.ONLINE,
last_active_ts=now,
last_user_sync_ts=now,
last_federation_update_ts=now - FEDERATION_TIMEOUT - 1,
)
new_state = handle_timeout(
state, is_mine=False, syncing_user_ids=set(), now=now
)
self.assertIsNotNone(new_state)
self.assertEquals(new_state.state, PresenceState.OFFLINE)
def test_last_active(self):
user_id = "@foo:bar"
now = 5000000
state = UserPresenceState.default(user_id)
state = state.copy_and_replace(
state=PresenceState.ONLINE,
last_active_ts=now - LAST_ACTIVE_GRANULARITY - 1,
last_user_sync_ts=now,
last_federation_update_ts=now,
)
new_state = handle_timeout(
state, is_mine=True, syncing_user_ids=set(), now=now
)
self.assertIsNotNone(new_state)
self.assertEquals(state, new_state)
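The tests above all drive one pure function, `handle_timeout`. A simplified, self-contained model of the rules they encode (the constants, field names, and exact return behaviour here are illustrative assumptions, not Synapse's real implementation):

```python
from dataclasses import dataclass, replace

# Illustrative values; the real constants live alongside handle_timeout.
SYNC_ONLINE_TIMEOUT = 30 * 1000
FEDERATION_TIMEOUT = 30 * 60 * 1000


@dataclass(frozen=True)
class PresenceState:
    state: str = "offline"
    last_user_sync_ts: int = 0
    last_federation_update_ts: int = 0


def handle_timeout(state, is_mine, syncing_user_ids, now, user_id="@foo:bar"):
    """Return an updated state, or None if nothing needs to change."""
    if not is_mine:
        # Remote users: their home server must keep sending updates.
        if now - state.last_federation_update_ts > FEDERATION_TIMEOUT:
            return replace(state, state="offline")
        return None
    if state.state != "online":
        return None
    if user_id in syncing_user_ids:
        # A sync is in flight, so the user stays online.
        return replace(state, last_user_sync_ts=now)
    if now - state.last_user_sync_ts > SYNC_ONLINE_TIMEOUT:
        # No /sync for too long: drop to offline.
        return replace(state, state="offline")
    return None
```

Fed inputs shaped like the fixtures above, this reproduces the online/offline outcomes the tests assert.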
# ---------------------------------------------------------------------------
# File: ietf/dbtemplate/migrations/0005_adjust_assignment_email_summary_templates_2526.py
# Repo: hassanakbar4/ietfdb (BSD-3-Clause)
# ---------------------------------------------------------------------------
# Copyright The IETF Trust 2019-2020, All Rights Reserved
# -*- coding: utf-8 -*-
# Generated by Django 1.11.20 on 2019-03-13 13:41
from django.db import migrations
def forward(apps, schema_editor):
DBTemplate = apps.get_model('dbtemplate','DBTemplate')
DBTemplate.objects.filter(pk=182).update(content="""{% autoescape off %}Subject: Open review assignments in {{group.acronym}}
The following reviewers have assignments:{% for r in review_assignments %}{% ifchanged r.section %}
{{r.section}}
{% if r.section == 'Early review requests:' %}Reviewer Due Draft{% else %}Reviewer LC end Draft{% endif %}{% endifchanged %}
{{ r.reviewer.person.plain_name|ljust:"22" }} {% if r.section == 'Early review requests:' %}{{ r.review_request.deadline|date:"Y-m-d" }}{% else %}{{ r.lastcall_ends|default:"None " }}{% endif %} {{ r.review_request.doc.name }}-{% if r.review_request.requested_rev %}{{ r.review_request.requested_rev }}{% else %}{{ r.review_request.doc.rev }}{% endif %} {{ r.earlier_reviews }}{% endfor %}
{% if rotation_list %}Next in the reviewer rotation:
{% for p in rotation_list %} {{ p }}
{% endfor %}{% endif %}{% endautoescape %}
""")
DBTemplate.objects.filter(pk=183).update(content="""{% autoescape off %}Subject: Review Assignments
Hi all,
The following reviewers have assignments:{% for r in review_assignments %}{% ifchanged r.section %}
{{r.section}}
{% if r.section == 'Early review requests:' %}Reviewer Due Draft{% else %}Reviewer Type LC end Draft{% endif %}{% endifchanged %}
{{ r.reviewer.person.plain_name|ljust:"22" }} {% if r.section == 'Early review requests:' %}{{ r.review_request.deadline|date:"Y-m-d" }}{% else %}{{r.review_request.type.name|ljust:"10"}}{{ r.lastcall_ends|default:"None " }}{% endif %} {{ r.review_request.doc.name }}-{% if r.review_request.requested_rev %}{{ r.review_request.requested_rev }}{% else %}{{ r.review_request.doc.rev }}{% endif %}{% if r.earlier_reviews %} {{ r.earlier_reviews }}{% endif %}{% endfor %}
{% if rotation_list %}Next in the reviewer rotation:
{% for p in rotation_list %} {{ p }}
{% endfor %}{% endif %}
The LC and Telechat review templates are included below:
-------------------------------------------------------
-- Begin LC Template --
I am the assigned Gen-ART reviewer for this draft. The General Area
Review Team (Gen-ART) reviews all IETF documents being processed
by the IESG for the IETF Chair. Please treat these comments just
like any other last call comments.
For more information, please see the FAQ at
<https://trac.ietf.org/trac/gen/wiki/GenArtfaq>.
Document:
Reviewer:
Review Date:
IETF LC End Date:
IESG Telechat date: (if known)
Summary:
Major issues:
Minor issues:
Nits/editorial comments:
-- End LC Template --
-- Begin Telechat Template --
I am the assigned Gen-ART reviewer for this draft. The General Area
Review Team (Gen-ART) reviews all IETF documents being processed
by the IESG for the IETF Chair. Please wait for direction from your
document shepherd or AD before posting a new version of the draft.
For more information, please see the FAQ at
<https://trac.ietf.org/trac/gen/wiki/GenArtfaq>.
Document:
Reviewer:
Review Date:
IETF LC End Date:
IESG Telechat date: (if known)
Summary:
Major issues:
Minor issues:
Nits/editorial comments:
-- End Telechat Template --
{% endautoescape %}
""")
DBTemplate.objects.filter(pk=184).update(content="""{% autoescape off %}Subject: Assignments
Review instructions and related resources are at:
http://tools.ietf.org/area/sec/trac/wiki/SecDirReview{% for r in review_assignments %}{% ifchanged r.section %}
{{r.section}}
{% if r.section == 'Early review requests:' %}Reviewer Due Draft{% else %}Reviewer LC end Draft{% endif %}{% endifchanged %}
{{ r.reviewer.person.plain_name|ljust:"22" }}{{ r.earlier_review|yesno:'R, , ' }}{% if r.section == 'Early review requests:' %}{{ r.review_request.deadline|date:"Y-m-d" }}{% else %}{{ r.lastcall_ends|default:"None " }}{% endif %} {{ r.review_request.doc.name }}-{% if r.review_request.requested_rev %}{{ r.review_request.requested_rev }}{% else %}{{ r.review_request.doc.rev }}{% endif %}{% endfor %}
{% if rotation_list %}Next in the reviewer rotation:
{% for p in rotation_list %} {{ p }}
{% endfor %}{% endif %}{% endautoescape %}
""")
DBTemplate.objects.filter(pk=185).update(content="""{% autoescape off %}Subject: Open review assignments in {{group.acronym}}
Review instructions and related resources are at:
<https://trac.ietf.org/trac/ops/wiki/Directorates>
The following reviewers have assignments:{% for r in review_assignments %}{% ifchanged r.section %}
{{r.section}}
{% if r.section == 'Early review requests:' %}Reviewer Due Draft{% else %}Reviewer LC end Draft{% endif %}{% endifchanged %}
{{ r.reviewer.person.plain_name|ljust:"22" }} {% if r.section == 'Early review requests:' %}{{ r.review_request.deadline|date:"Y-m-d" }}{% else %}{{ r.lastcall_ends|default:"None " }}{% endif %} {{ r.review_request.doc.name }}-{% if r.review_request.requested_rev %}{{ r.review_request.requested_rev }}{% else %}{{ r.review_request.doc.rev }}{% endif %} {{ r.earlier_reviews }}{% endfor %}
{% if rotation_list %}Next in the reviewer rotation:
{% for p in rotation_list %} {{ p }}
{% endfor %}{% endif %}{% endautoescape %}
""")
def reverse(apps, schema_editor):
DBTemplate = apps.get_model('dbtemplate','DBTemplate')
DBTemplate.objects.filter(pk=182).update(content="""{% autoescape off %}Subject: Open review assignments in {{group.acronym}}
The following reviewers have assignments:{% for r in review_assignments %}{% ifchanged r.section %}
{{r.section}}
{% if r.section == 'Early review requests:' %}Reviewer Due Draft{% else %}Reviewer LC end Draft{% endif %}{% endifchanged %}
{{ r.reviewer.person.plain_name|ljust:"22" }} {% if r.section == 'Early review requests:' %}{{ r.review_request.deadline|date:"Y-m-d" }}{% else %}{{ r.lastcall_ends|default:"None      " }}{% endif %} {{ r.review_request.doc_id }}-{% if r.review_request.requested_rev %}{{ r.review_request.requested_rev }}{% else %}{{ r.review_request.doc.rev }}{% endif %} {{ r.earlier_review_mark }}{% endfor %}
* Other revision previously reviewed
** This revision already reviewed
{% if rotation_list %}Next in the reviewer rotation:
{% for p in rotation_list %} {{ p }}
{% endfor %}{% endif %}{% endautoescape %}
""")
DBTemplate.objects.filter(pk=183).update(content="""{% autoescape off %}Subject: Review Assignments
Hi all,
The following reviewers have assignments:{% for r in review_assignments %}{% ifchanged r.section %}
{{r.section}}
{% if r.section == 'Early review requests:' %}Reviewer Due Draft{% else %}Reviewer Type LC end Draft{% endif %}{% endifchanged %}
{{ r.reviewer.person.plain_name|ljust:"22" }} {% if r.section == 'Early review requests:' %}{{ r.review_request.deadline|date:"Y-m-d" }}{% else %}{{r.review_request.type.name|ljust:"10"}}{{ r.lastcall_ends|default:"None " }}{% endif %} {{ r.review_request.doc.name }}-{% if r.review_request.requested_rev %}{{ r.review_request.requested_rev }}{% else %}{{ r.review_request.doc.rev }}{% endif %}{% if r.earlier_review_mark %} {{ r.earlier_review_mark }}{% endif %}{% endfor %}
* Other revision previously reviewed
** This revision already reviewed
{% if rotation_list %}Next in the reviewer rotation:
{% for p in rotation_list %} {{ p }}
{% endfor %}{% endif %}
The LC and Telechat review templates are included below:
-------------------------------------------------------
-- Begin LC Template --
I am the assigned Gen-ART reviewer for this draft. The General Area
Review Team (Gen-ART) reviews all IETF documents being processed
by the IESG for the IETF Chair. Please treat these comments just
like any other last call comments.
For more information, please see the FAQ at
<https://trac.ietf.org/trac/gen/wiki/GenArtfaq>.
Document:
Reviewer:
Review Date:
IETF LC End Date:
IESG Telechat date: (if known)
Summary:
Major issues:
Minor issues:
Nits/editorial comments:
-- End LC Template --
-- Begin Telechat Template --
I am the assigned Gen-ART reviewer for this draft. The General Area
Review Team (Gen-ART) reviews all IETF documents being processed
by the IESG for the IETF Chair. Please wait for direction from your
document shepherd or AD before posting a new version of the draft.
For more information, please see the FAQ at
<https://trac.ietf.org/trac/gen/wiki/GenArtfaq>.
Document:
Reviewer:
Review Date:
IETF LC End Date:
IESG Telechat date: (if known)
Summary:
Major issues:
Minor issues:
Nits/editorial comments:
-- End Telechat Template --
{% endautoescape %}
""")
DBTemplate.objects.filter(pk=184).update(content="""{% autoescape off %}Subject: Assignments
Review instructions and related resources are at:
http://tools.ietf.org/area/sec/trac/wiki/SecDirReview{% for r in review_assignments %}{% ifchanged r.section %}
{{r.section}}
{% if r.section == 'Early review requests:' %}Reviewer Due Draft{% else %}Reviewer LC end Draft{% endif %}{% endifchanged %}
{{ r.reviewer.person.plain_name|ljust:"22" }}{{ r.earlier_review|yesno:'R, , ' }}{% if r.section == 'Early review requests:' %}{{ r.review_request.deadline|date:"Y-m-d" }}{% else %}{{ r.lastcall_ends|default:"None " }}{% endif %} {{ r.review_request.doc.name }}-{% if r.review_request.requested_rev %}{{ r.review_request.requested_rev }}{% else %}{{ r.review_request.doc.rev }}{% endif %}{% endfor %}
{% if rotation_list %}Next in the reviewer rotation:
{% for p in rotation_list %} {{ p }}
{% endfor %}{% endif %}{% endautoescape %}
""")
DBTemplate.objects.filter(pk=185).update(content="""{% autoescape off %}Subject: Open review assignments in {{group.acronym}}
Review instructions and related resources are at:
<https://trac.ietf.org/trac/ops/wiki/Directorates>
The following reviewers have assignments:{% for r in review_assignments %}{% ifchanged r.section %}
{{r.section}}
{% if r.section == 'Early review requests:' %}Reviewer Due Draft{% else %}Reviewer LC end Draft{% endif %}{% endifchanged %}
{{ r.reviewer.person.plain_name|ljust:"22" }} {% if r.section == 'Early review requests:' %}{{ r.review_request.deadline|date:"Y-m-d" }}{% else %}{{ r.lastcall_ends|default:"None " }}{% endif %} {{ r.review_request.doc.name }}-{% if r.review_request.requested_rev %}{{ r.review_request.requested_rev }}{% else %}{{ r.review_request.doc.rev }}{% endif %} {{ r.earlier_review_mark }}{% endfor %}
* Other revision previously reviewed
** This revision already reviewed
{% if rotation_list %}Next in the reviewer rotation:
{% for p in rotation_list %} {{ p }}
{% endfor %}{% endif %}{% endautoescape %}
""")
class Migration(migrations.Migration):
dependencies = [
('dbtemplate', '0004_adjust_assignment_email_summary_templates'),
]
operations = [
migrations.RunPython(forward, reverse),
]
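A small aside on the templates this migration installs: the fixed-width columns come from Django's `ljust` filter, which pads exactly like Python's `str.ljust`. A standalone check of the 22-character reviewer column (the reviewer name is a made-up example):

```python
# The templates write {{ r.reviewer.person.plain_name|ljust:"22" }} so the
# deadline column starts at a fixed offset; plain str.ljust does the same.
name_col = "Jane Doe".ljust(22)  # hypothetical reviewer name
assert len(name_col) == 22
assert name_col + "2019-03-13" == "Jane Doe" + " " * 14 + "2019-03-13"
```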
# ---------------------------------------------------------------------------
# File: realtimenet/feature_extractors/efficientnet.py
# Repo: floriandotpy/20bn-realtimenet (MIT)
# ---------------------------------------------------------------------------
import torch.nn as nn
from .mobilenet import StridedInflatedMobileNetV2, InvertedResidual, ConvReLU
class StridedInflatedEfficientNet(StridedInflatedMobileNetV2):
def __init__(self):
super().__init__()
self.cnn = nn.Sequential(
ConvReLU(3, 32, 3, stride=2),
InvertedResidual(32, 24, 3, spatial_stride=1),
InvertedResidual(24, 32, 3, spatial_stride=2, expand_ratio=6),
InvertedResidual(32, 32, 3, spatial_stride=1, expand_ratio=6, temporal_shift=True),
InvertedResidual(32, 32, 3, spatial_stride=1, expand_ratio=6),
InvertedResidual(32, 32, 3, spatial_stride=1, expand_ratio=6),
InvertedResidual(32, 56, 5, spatial_stride=2, expand_ratio=6),
InvertedResidual(56, 56, 5, spatial_stride=1, expand_ratio=6, temporal_shift=True, temporal_stride=True),
InvertedResidual(56, 56, 5, spatial_stride=1, expand_ratio=6),
InvertedResidual(56, 56, 5, spatial_stride=1, expand_ratio=6),
InvertedResidual(56, 112, 3, spatial_stride=2, expand_ratio=6),
InvertedResidual(112, 112, 3, spatial_stride=1, expand_ratio=6, temporal_shift=True),
InvertedResidual(112, 112, 3, spatial_stride=1, expand_ratio=6),
InvertedResidual(112, 112, 3, spatial_stride=1, expand_ratio=6),
InvertedResidual(112, 112, 3, spatial_stride=1, expand_ratio=6, temporal_shift=True, temporal_stride=True),
InvertedResidual(112, 112, 3, spatial_stride=1, expand_ratio=6),
InvertedResidual(112, 160, 5, spatial_stride=1, expand_ratio=6),
InvertedResidual(160, 160, 5, spatial_stride=1, expand_ratio=6, temporal_shift=True),
InvertedResidual(160, 160, 5, spatial_stride=1, expand_ratio=6),
InvertedResidual(160, 160, 5, spatial_stride=1, expand_ratio=6),
InvertedResidual(160, 160, 5, spatial_stride=1, expand_ratio=6, temporal_shift=True),
InvertedResidual(160, 160, 5, spatial_stride=1, expand_ratio=6),
InvertedResidual(160, 272, 5, spatial_stride=2, expand_ratio=6),
InvertedResidual(272, 272, 5, spatial_stride=1, expand_ratio=6, temporal_shift=True),
InvertedResidual(272, 272, 5, spatial_stride=1, expand_ratio=6),
InvertedResidual(272, 272, 5, spatial_stride=1, expand_ratio=6, temporal_shift=True),
InvertedResidual(272, 272, 5, spatial_stride=1, expand_ratio=6),
InvertedResidual(272, 272, 5, spatial_stride=1, expand_ratio=6),
InvertedResidual(272, 272, 5, spatial_stride=1, expand_ratio=6),
InvertedResidual(272, 272, 5, spatial_stride=1, expand_ratio=6),
InvertedResidual(272, 448, 3, spatial_stride=1, expand_ratio=6),
ConvReLU(448, 1280, 1)
)
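Tallying the configuration above: the stride-2 stem `ConvReLU` plus the 24→32, 32→56, 56→112 and 160→272 blocks each halve the spatial resolution, and two blocks set `temporal_stride=True`. A quick sketch of the resulting downsampling factors (the counts are read off the layer list; the input size is an arbitrary example):

```python
# Five layers halve height/width: the stride-2 stem conv plus four
# inverted-residual blocks with spatial_stride=2.
spatial_halvings = 5
# Two inverted-residual blocks run with temporal_stride=True.
temporal_halvings = 2

spatial_factor = 2 ** spatial_halvings    # overall spatial downsampling: 32
temporal_factor = 2 ** temporal_halvings  # overall temporal downsampling: 4

# e.g. a 256x256 frame comes out as an 8x8 feature map with 1280 channels
assert 256 // spatial_factor == 8
assert spatial_factor == 32 and temporal_factor == 4
```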
# ---------------------------------------------------------------------------
# File: core/jobs/transforms/validation/collection_validation_test.py
# Repo: yash10019coder/oppia (Apache-2.0)
# ---------------------------------------------------------------------------
# coding: utf-8
#
# Copyright 2021 The Oppia Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS-IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unit tests for jobs.transforms.collection_validation."""
from __future__ import annotations
from core.domain import rights_domain
from core.jobs import job_test_utils
from core.jobs.decorators import validation_decorators
from core.jobs.transforms.validation import collection_validation
from core.jobs.types import base_validation_errors
from core.platform import models
from core.tests import test_utils
import apache_beam as beam
(base_models, collection_models) = models.Registry.import_models(
[models.NAMES.base_model, models.NAMES.collection])
class ValidateCollectionSnapshotMetadataModelTests(
job_test_utils.PipelinedTestBase):
def test_validate_change_domain_implemented(self):
invalid_commit_cmd_model = (
collection_models.CollectionSnapshotMetadataModel(
id='model_id-1',
committer_id='committer_id',
commit_type='delete',
created_on=self.YEAR_AGO,
last_updated=self.NOW,
commit_cmds=[{
'cmd': base_models.VersionedModel.CMD_DELETE_COMMIT}])
)
output = (
self.pipeline
| beam.Create([invalid_commit_cmd_model])
| beam.ParDo(
collection_validation.ValidateCollectionSnapshotMetadataModel())
)
self.assert_pcoll_equal(output, [])
def test_collection_change_object_with_missing_cmd(self):
invalid_commit_cmd_model = (
collection_models.CollectionSnapshotMetadataModel(
id='123',
committer_id='committer_id',
commit_type='create',
created_on=self.YEAR_AGO,
last_updated=self.NOW,
commit_cmds=[{'invalid': 'data'}])
)
output = (
self.pipeline
| beam.Create([invalid_commit_cmd_model])
| beam.ParDo(
collection_validation.ValidateCollectionSnapshotMetadataModel())
)
self.assert_pcoll_equal(output, [
base_validation_errors.CommitCmdsValidateError(
invalid_commit_cmd_model,
{'invalid': 'data'},
'Missing cmd key in change dict')
])
def test_collection_change_object_with_invalid_cmd(self):
invalid_commit_cmd_model = (
collection_models.CollectionSnapshotMetadataModel(
id='123',
committer_id='committer_id',
commit_type='create',
created_on=self.YEAR_AGO,
last_updated=self.NOW,
commit_cmds=[{'cmd': 'invalid'}])
)
output = (
self.pipeline
| beam.Create([invalid_commit_cmd_model])
| beam.ParDo(
collection_validation.ValidateCollectionSnapshotMetadataModel())
)
self.assert_pcoll_equal(output, [
base_validation_errors.CommitCmdsValidateError(
invalid_commit_cmd_model,
{'cmd': 'invalid'},
'Command invalid is not allowed')
])
def test_collection_change_object_with_missing_attribute_in_cmd(self):
invalid_commit_cmd_model = (
collection_models.CollectionSnapshotMetadataModel(
id='123',
committer_id='committer_id',
commit_type='create',
created_on=self.YEAR_AGO,
last_updated=self.NOW,
commit_cmds=[{
'cmd': 'edit_collection_node_property',
'property_name': 'category',
'old_value': 'old_value'
}])
)
output = (
self.pipeline
| beam.Create([invalid_commit_cmd_model])
| beam.ParDo(
collection_validation.ValidateCollectionSnapshotMetadataModel())
)
self.assert_pcoll_equal(output, [
base_validation_errors.CommitCmdsValidateError(
invalid_commit_cmd_model,
{
'cmd': 'edit_collection_node_property',
'property_name': 'category',
'old_value': 'old_value'
},
'The following required attributes are missing: '
'exploration_id, new_value')
])
def test_collection_change_object_with_extra_attribute_in_cmd(self):
invalid_commit_cmd_model = (
collection_models.CollectionSnapshotMetadataModel(
id='123',
committer_id='committer_id',
commit_type='create',
created_on=self.YEAR_AGO,
last_updated=self.NOW,
commit_cmds=[{
'cmd': 'edit_collection_node_property',
'exploration_id': 'exploration_id',
'property_name': 'category',
'old_value': 'old_value',
'new_value': 'new_value',
'invalid': 'invalid'
}])
)
output = (
self.pipeline
| beam.Create([invalid_commit_cmd_model])
| beam.ParDo(
collection_validation.ValidateCollectionSnapshotMetadataModel())
)
self.assert_pcoll_equal(output, [
base_validation_errors.CommitCmdsValidateError(
invalid_commit_cmd_model,
{
'cmd': 'edit_collection_node_property',
'exploration_id': 'exploration_id',
'property_name': 'category',
'old_value': 'old_value',
'new_value': 'new_value',
'invalid': 'invalid'
},
'The following extra attributes are present: invalid')
])
def test_collection_change_object_with_invalid_collection_property(self):
invalid_commit_cmd_model = (
collection_models.CollectionSnapshotMetadataModel(
id='123',
committer_id='committer_id',
commit_type='create',
created_on=self.YEAR_AGO,
last_updated=self.NOW,
commit_cmds=[{
'cmd': 'edit_collection_property',
'property_name': 'invalid',
'old_value': 'old_value',
'new_value': 'new_value',
}])
)
output = (
self.pipeline
| beam.Create([invalid_commit_cmd_model])
| beam.ParDo(
collection_validation.ValidateCollectionSnapshotMetadataModel())
)
self.assert_pcoll_equal(output, [
base_validation_errors.CommitCmdsValidateError(
invalid_commit_cmd_model,
{
'cmd': 'edit_collection_property',
'property_name': 'invalid',
'old_value': 'old_value',
'new_value': 'new_value',
},
'Value for property_name in cmd edit_collection_property: '
'invalid is not allowed')
])
class RelationshipsOfTests(test_utils.TestBase):
def test_collection_summary_model_relationships(self):
self.assertItemsEqual(
validation_decorators.RelationshipsOf.get_model_kind_references(
'CollectionSummaryModel', 'id'),
['CollectionModel', 'CollectionRightsModel'])
class ValidateCollectionRightsSnapshotMetadataModelTests(
job_test_utils.PipelinedTestBase):
def test_collection_rights_change_object_with_missing_cmd(self):
commit_dict = {'invalid': 'data'}
invalid_commit_cmd_model = (
collection_models.CollectionRightsSnapshotMetadataModel(
id='123',
created_on=self.YEAR_AGO,
last_updated=self.NOW,
committer_id='committer_id',
commit_type='create',
commit_cmds=[commit_dict])
)
output = (
self.pipeline
| beam.Create([invalid_commit_cmd_model])
| beam.ParDo(
collection_validation
.ValidateCollectionRightsSnapshotMetadataModel())
)
self.assert_pcoll_equal(output, [
base_validation_errors.CommitCmdsValidateError(
invalid_commit_cmd_model,
commit_dict,
'Missing cmd key in change dict')
])
def test_collection_rights_change_object_with_invalid_cmd(self):
commit_dict = {'cmd': 'invalid'}
invalid_commit_cmd_model = (
collection_models.CollectionRightsSnapshotMetadataModel(
id='123',
created_on=self.YEAR_AGO,
last_updated=self.NOW,
committer_id='committer_id',
commit_type='create',
commit_cmds=[commit_dict])
)
output = (
self.pipeline
| beam.Create([invalid_commit_cmd_model])
| beam.ParDo(
collection_validation
.ValidateCollectionRightsSnapshotMetadataModel())
)
self.assert_pcoll_equal(output, [
base_validation_errors.CommitCmdsValidateError(
invalid_commit_cmd_model,
commit_dict,
'Command invalid is not allowed')
])
def test_collection_rights_change_object_with_missing_attribute_in_cmd(
self):
commit_dict = {
'cmd': 'change_role',
'assignee_id': 'assignee_id',
}
invalid_commit_cmd_model = (
collection_models.CollectionRightsSnapshotMetadataModel(
id='123',
created_on=self.YEAR_AGO,
last_updated=self.NOW,
committer_id='committer_id',
commit_type='edit',
commit_cmds=[commit_dict])
)
output = (
self.pipeline
| beam.Create([invalid_commit_cmd_model])
| beam.ParDo(
collection_validation
.ValidateCollectionRightsSnapshotMetadataModel())
)
self.assert_pcoll_equal(output, [
base_validation_errors.CommitCmdsValidateError(
invalid_commit_cmd_model,
commit_dict,
'The following required attributes are missing: '
'new_role, old_role')
])
def test_collection_rights_change_object_with_extra_attribute_in_cmd(self):
commit_dict = {
'cmd': 'change_private_viewability',
'old_viewable_if_private': 'old_viewable_if_private',
'new_viewable_if_private': 'new_viewable_if_private',
'invalid': 'invalid'
}
invalid_commit_cmd_model = (
collection_models.CollectionRightsSnapshotMetadataModel(
id='123',
created_on=self.YEAR_AGO,
last_updated=self.NOW,
committer_id='committer_id',
commit_type='edit',
commit_cmds=[commit_dict])
)
output = (
self.pipeline
| beam.Create([invalid_commit_cmd_model])
| beam.ParDo(
collection_validation
.ValidateCollectionRightsSnapshotMetadataModel())
)
self.assert_pcoll_equal(output, [
base_validation_errors.CommitCmdsValidateError(
invalid_commit_cmd_model,
commit_dict,
'The following extra attributes are present: invalid')
])
def test_collection_rights_change_object_with_invalid_role(self):
commit_dict = {
'cmd': 'change_role',
'assignee_id': 'assignee_id',
'old_role': rights_domain.ROLE_OWNER,
'new_role': 'invalid',
}
invalid_commit_cmd_model = (
collection_models.CollectionRightsSnapshotMetadataModel(
id='123',
created_on=self.YEAR_AGO,
last_updated=self.NOW,
committer_id='committer_id',
commit_type='edit',
commit_cmds=[commit_dict])
)
output = (
self.pipeline
| beam.Create([invalid_commit_cmd_model])
| beam.ParDo(
collection_validation
.ValidateCollectionRightsSnapshotMetadataModel())
)
self.assert_pcoll_equal(output, [
base_validation_errors.CommitCmdsValidateError(
invalid_commit_cmd_model,
commit_dict,
'Value for new_role in cmd change_role: '
'invalid is not allowed')
])
def test_collection_rights_change_object_with_invalid_status(self):
commit_dict = {
'cmd': 'change_collection_status',
'old_status': rights_domain.ACTIVITY_STATUS_PRIVATE,
'new_status': 'invalid'
}
invalid_commit_cmd_model = (
collection_models.CollectionRightsSnapshotMetadataModel(
id='123',
created_on=self.YEAR_AGO,
last_updated=self.NOW,
committer_id='committer_id',
commit_type='edit',
commit_cmds=[commit_dict])
)
output = (
self.pipeline
| beam.Create([invalid_commit_cmd_model])
| beam.ParDo(
collection_validation
.ValidateCollectionRightsSnapshotMetadataModel())
)
self.assert_pcoll_equal(output, [
base_validation_errors.CommitCmdsValidateError(
invalid_commit_cmd_model,
commit_dict,
'Value for new_status in cmd change_collection_status: '
'invalid is not allowed')
])
class ValidateCollectionCommitLogEntryModelTests(
job_test_utils.PipelinedTestBase):
def test_validate_rights_model(self):
invalid_commit_cmd_model = (
collection_models.CollectionCommitLogEntryModel(
id='rights_id123',
created_on=self.YEAR_AGO,
last_updated=self.NOW,
collection_id='collection_id',
user_id='',
commit_type='test-type',
post_commit_status='private',
commit_cmds=[{'cmd': 'create_new'}])
)
output = (
self.pipeline
| beam.Create([invalid_commit_cmd_model])
| beam.ParDo(
collection_validation.ValidateCollectionCommitLogEntryModel())
)
self.assert_pcoll_equal(output, [])
def test_validate_collection_model(self):
invalid_commit_cmd_model = (
collection_models.CollectionCommitLogEntryModel(
id='collection_id123',
created_on=self.YEAR_AGO,
last_updated=self.NOW,
collection_id='collection_id',
user_id='',
commit_type='test-type',
post_commit_status='private',
commit_cmds=[{
'cmd': base_models.VersionedModel.CMD_DELETE_COMMIT}])
)
output = (
self.pipeline
| beam.Create([invalid_commit_cmd_model])
| beam.ParDo(
collection_validation.ValidateCollectionCommitLogEntryModel())
)
self.assert_pcoll_equal(output, [])
def test_raises_commit_cmd_none_error(self):
invalid_commit_cmd_model = (
collection_models.CollectionCommitLogEntryModel(
id='model_id123',
created_on=self.YEAR_AGO,
last_updated=self.NOW,
collection_id='collection_id',
user_id='',
commit_type='test-type',
post_commit_status='private',
commit_cmds=[{'cmd': 'create_new'}])
)
output = (
self.pipeline
| beam.Create([invalid_commit_cmd_model])
| beam.ParDo(
collection_validation.ValidateCollectionCommitLogEntryModel())
)
self.assert_pcoll_equal(output, [
base_validation_errors.CommitCmdsNoneError(invalid_commit_cmd_model)
])
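The error messages these tests expect all follow one pattern: a commit-command dict is checked against the attributes its `cmd` allows. A minimal self-contained version of that check (the schema dict below is a made-up illustration, not Oppia's real command registry):

```python
def validate_commit_cmd(commit_cmd, allowed_cmds):
    """Return an error string, or None if the command dict is valid."""
    if "cmd" not in commit_cmd:
        return "Missing cmd key in change dict"
    cmd = commit_cmd["cmd"]
    if cmd not in allowed_cmds:
        return "Command %s is not allowed" % cmd
    required = set(allowed_cmds[cmd])
    present = set(commit_cmd) - {"cmd"}
    missing = sorted(required - present)
    if missing:
        return (
            "The following required attributes are missing: "
            + ", ".join(missing))
    extra = sorted(present - required)
    if extra:
        return (
            "The following extra attributes are present: " + ", ".join(extra))
    return None


# Hypothetical schema mirroring the edit_collection_node_property command.
SCHEMA = {
    "edit_collection_node_property": [
        "exploration_id", "property_name", "old_value", "new_value",
    ],
}

assert validate_commit_cmd({"invalid": "data"}, SCHEMA) == (
    "Missing cmd key in change dict")
assert validate_commit_cmd({"cmd": "invalid"}, SCHEMA) == (
    "Command invalid is not allowed")
```

With `property_name` and `old_value` present but `exploration_id` and `new_value` absent, this yields the same "required attributes are missing" message the tests assert.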
# ---------------------------------------------------------------------------
# File: tests/integration/validation/test_petstore.py
# Repo: Yarn-e/openapi-core (BSD-3-Clause)
# ---------------------------------------------------------------------------
import json
from base64 import b64encode
from datetime import datetime
from uuid import UUID
import pytest
from isodate.tzinfo import UTC
from openapi_core.casting.schemas.exceptions import CastError
from openapi_core.deserializing.exceptions import DeserializeError
from openapi_core.deserializing.parameters.exceptions import (
EmptyQueryParameterValue,
)
from openapi_core.exceptions import MissingRequiredHeader
from openapi_core.exceptions import MissingRequiredParameter
from openapi_core.extensions.models.models import BaseModel
from openapi_core.shortcuts import create_spec
from openapi_core.shortcuts import spec_validate_body
from openapi_core.shortcuts import spec_validate_data
from openapi_core.shortcuts import spec_validate_headers
from openapi_core.shortcuts import spec_validate_parameters
from openapi_core.shortcuts import spec_validate_security
from openapi_core.templating.media_types.exceptions import MediaTypeNotFound
from openapi_core.templating.paths.exceptions import ServerNotFound
from openapi_core.testing import MockRequest
from openapi_core.testing import MockResponse
from openapi_core.unmarshalling.schemas.exceptions import InvalidSchemaValue
from openapi_core.validation.request.datatypes import Parameters
from openapi_core.validation.request.validators import RequestValidator
from openapi_core.validation.response.validators import ResponseValidator
class TestPetstore:
api_key = "12345"
@property
def api_key_encoded(self):
api_key_bytes = self.api_key.encode("utf8")
api_key_bytes_enc = b64encode(api_key_bytes)
return str(api_key_bytes_enc, "utf8")
@pytest.fixture(scope="module")
def spec_uri(self):
return "file://tests/integration/data/v3.0/petstore.yaml"
@pytest.fixture(scope="module")
def spec_dict(self, factory):
return factory.spec_from_file("data/v3.0/petstore.yaml")
@pytest.fixture(scope="module")
def spec(self, spec_dict, spec_uri):
return create_spec(spec_dict, spec_uri)
@pytest.fixture(scope="module")
def request_validator(self, spec):
return RequestValidator(spec)
@pytest.fixture(scope="module")
def response_validator(self, spec):
return ResponseValidator(spec)
def test_get_pets(self, spec, response_validator):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/pets"
query_params = {
"limit": "20",
}
request = MockRequest(
host_url,
"GET",
"/pets",
path_pattern=path_pattern,
args=query_params,
)
with pytest.warns(DeprecationWarning):
parameters = spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
assert parameters == Parameters(
query={
"limit": 20,
"page": 1,
"search": "",
}
)
assert body is None
data_json = {
"data": [],
}
data = json.dumps(data_json)
headers = {
"Content-Type": "application/json",
"x-next": "next-url",
}
response = MockResponse(data, headers=headers)
response_result = response_validator.validate(request, response)
assert response_result.errors == []
assert isinstance(response_result.data, BaseModel)
assert response_result.data.data == []
assert response_result.headers == {
"x-next": "next-url",
}
def test_get_pets_response(self, spec, response_validator):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/pets"
query_params = {
"limit": "20",
}
request = MockRequest(
host_url,
"GET",
"/pets",
path_pattern=path_pattern,
args=query_params,
)
with pytest.warns(DeprecationWarning):
parameters = spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
assert parameters == Parameters(
query={
"limit": 20,
"page": 1,
"search": "",
}
)
assert body is None
data_json = {
"data": [
{
"id": 1,
"name": "Cat",
"ears": {
"healthy": True,
},
}
],
}
data = json.dumps(data_json)
response = MockResponse(data)
response_result = response_validator.validate(request, response)
assert response_result.errors == []
assert isinstance(response_result.data, BaseModel)
assert len(response_result.data.data) == 1
assert response_result.data.data[0].id == 1
assert response_result.data.data[0].name == "Cat"
def test_get_pets_response_no_schema(self, spec, response_validator):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/pets"
query_params = {
"limit": "20",
}
request = MockRequest(
host_url,
"GET",
"/pets",
path_pattern=path_pattern,
args=query_params,
)
with pytest.warns(DeprecationWarning):
parameters = spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
assert parameters == Parameters(
query={
"limit": 20,
"page": 1,
"search": "",
}
)
assert body is None
data = "<html></html>"
response = MockResponse(data, status_code=404, mimetype="text/html")
with pytest.warns(UserWarning):
response_result = response_validator.validate(request, response)
assert response_result.errors == []
assert response_result.data == data
def test_get_pets_invalid_response(self, spec, response_validator):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/pets"
query_params = {
"limit": "20",
}
request = MockRequest(
host_url,
"GET",
"/pets",
path_pattern=path_pattern,
args=query_params,
)
with pytest.warns(DeprecationWarning):
parameters = spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
assert parameters == Parameters(
query={
"limit": 20,
"page": 1,
"search": "",
}
)
assert body is None
response_data_json = {
"data": [
{
"id": 1,
"name": {
"first_name": "Cat",
},
}
],
}
response_data = json.dumps(response_data_json)
response = MockResponse(response_data)
with pytest.raises(InvalidSchemaValue):
spec_validate_data(spec, request, response)
response_result = response_validator.validate(request, response)
schema_errors = response_result.errors[0].schema_errors
assert response_result.errors == [
InvalidSchemaValue(
type="object",
value=response_data_json,
schema_errors=schema_errors,
),
]
assert response_result.data is None
def test_get_pets_ids_param(self, spec, response_validator):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/pets"
query_params = {
"limit": "20",
"ids": ["12", "13"],
}
request = MockRequest(
host_url,
"GET",
"/pets",
path_pattern=path_pattern,
args=query_params,
)
with pytest.warns(DeprecationWarning):
parameters = spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
assert parameters == Parameters(
query={
"limit": 20,
"page": 1,
"search": "",
"ids": [12, 13],
}
)
assert body is None
data_json = {
"data": [],
}
data = json.dumps(data_json)
response = MockResponse(data)
response_result = response_validator.validate(request, response)
assert response_result.errors == []
assert isinstance(response_result.data, BaseModel)
assert response_result.data.data == []
def test_get_pets_tags_param(self, spec, response_validator):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/pets"
query_params = [
("limit", "20"),
("tags", "cats,dogs"),
]
request = MockRequest(
host_url,
"GET",
"/pets",
path_pattern=path_pattern,
args=query_params,
)
with pytest.warns(DeprecationWarning):
parameters = spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
assert parameters == Parameters(
query={
"limit": 20,
"page": 1,
"search": "",
"tags": ["cats", "dogs"],
}
)
assert body is None
data_json = {
"data": [],
}
data = json.dumps(data_json)
response = MockResponse(data)
response_result = response_validator.validate(request, response)
assert response_result.errors == []
assert isinstance(response_result.data, BaseModel)
assert response_result.data.data == []
def test_get_pets_parameter_deserialization_error(self, spec):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/pets"
query_params = {
"limit": 1,
"tags": 12,
}
request = MockRequest(
host_url,
"GET",
"/pets",
path_pattern=path_pattern,
args=query_params,
)
with pytest.warns(DeprecationWarning):
with pytest.raises(DeserializeError):
spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
assert body is None
def test_get_pets_wrong_parameter_type(self, spec):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/pets"
query_params = {
"limit": "twenty",
}
request = MockRequest(
host_url,
"GET",
"/pets",
path_pattern=path_pattern,
args=query_params,
)
with pytest.warns(DeprecationWarning):
with pytest.raises(CastError):
spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
assert body is None
def test_get_pets_raises_missing_required_param(self, spec):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/pets"
request = MockRequest(
host_url,
"GET",
"/pets",
path_pattern=path_pattern,
)
with pytest.warns(DeprecationWarning):
with pytest.raises(MissingRequiredParameter):
spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
assert body is None
def test_get_pets_empty_value(self, spec):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/pets"
query_params = {
"limit": "",
}
request = MockRequest(
host_url,
"GET",
"/pets",
path_pattern=path_pattern,
args=query_params,
)
with pytest.warns(DeprecationWarning):
with pytest.raises(EmptyQueryParameterValue):
spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
assert body is None
def test_get_pets_allow_empty_value(self, spec):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/pets"
query_params = {
"limit": 20,
"search": "",
}
request = MockRequest(
host_url,
"GET",
"/pets",
path_pattern=path_pattern,
args=query_params,
)
with pytest.warns(DeprecationWarning):
parameters = spec_validate_parameters(spec, request)
assert parameters == Parameters(
query={
"page": 1,
"limit": 20,
"search": "",
}
)
body = spec_validate_body(spec, request)
assert body is None
def test_get_pets_none_value(self, spec):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/pets"
query_params = {
"limit": None,
}
request = MockRequest(
host_url,
"GET",
"/pets",
path_pattern=path_pattern,
args=query_params,
)
with pytest.warns(DeprecationWarning):
parameters = spec_validate_parameters(spec, request)
assert parameters == Parameters(
query={
"limit": None,
"page": 1,
"search": "",
}
)
body = spec_validate_body(spec, request)
assert body is None
def test_get_pets_param_order(self, spec):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/pets"
query_params = {
"limit": None,
"order": "desc",
}
request = MockRequest(
host_url,
"GET",
"/pets",
path_pattern=path_pattern,
args=query_params,
)
with pytest.warns(DeprecationWarning):
parameters = spec_validate_parameters(spec, request)
assert parameters == Parameters(
query={
"limit": None,
"order": "desc",
"page": 1,
"search": "",
}
)
body = spec_validate_body(spec, request)
assert body is None
def test_get_pets_param_coordinates(self, spec):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/pets"
coordinates = {
"lat": 1.12,
"lon": 32.12,
}
query_params = {
"limit": None,
"coordinates": json.dumps(coordinates),
}
request = MockRequest(
host_url,
"GET",
"/pets",
path_pattern=path_pattern,
args=query_params,
)
with pytest.warns(DeprecationWarning):
parameters = spec_validate_parameters(spec, request)
assert parameters == Parameters(
query={
"limit": None,
"page": 1,
"search": "",
"coordinates": coordinates,
}
)
body = spec_validate_body(spec, request)
assert body is None
def test_post_birds(self, spec, spec_dict):
host_url = "https://staging.gigantic-server.com/v1"
path_pattern = "/v1/pets"
pet_name = "Cat"
pet_tag = "cats"
pet_street = "Piekna"
pet_city = "Warsaw"
pet_healthy = False
data_json = {
"name": pet_name,
"tag": pet_tag,
"position": 2,
"address": {
"street": pet_street,
"city": pet_city,
},
"healthy": pet_healthy,
"wings": {
"healthy": pet_healthy,
},
}
data = json.dumps(data_json)
headers = {
"api-key": self.api_key_encoded,
}
userdata = {
"name": "user1",
}
userdata_json = json.dumps(userdata)
cookies = {
"user": "123",
"userdata": userdata_json,
}
request = MockRequest(
host_url,
"POST",
"/pets",
path_pattern=path_pattern,
data=data,
headers=headers,
cookies=cookies,
)
parameters = spec_validate_parameters(spec, request)
assert parameters == Parameters(
header={
"api-key": self.api_key,
},
cookie={
"user": 123,
"userdata": {
"name": "user1",
},
},
)
body = spec_validate_body(spec, request)
schemas = spec_dict["components"]["schemas"]
pet_model = schemas["PetCreate"]["x-model"]
address_model = schemas["Address"]["x-model"]
assert body.__class__.__name__ == pet_model
assert body.name == pet_name
assert body.tag == pet_tag
assert body.position == 2
assert body.address.__class__.__name__ == address_model
assert body.address.street == pet_street
assert body.address.city == pet_city
assert body.healthy == pet_healthy
security = spec_validate_security(spec, request)
assert security == {}
def test_post_cats(self, spec, spec_dict):
host_url = "https://staging.gigantic-server.com/v1"
path_pattern = "/v1/pets"
pet_name = "Cat"
pet_tag = "cats"
pet_street = "Piekna"
pet_city = "Warsaw"
pet_healthy = False
data_json = {
"name": pet_name,
"tag": pet_tag,
"position": 2,
"address": {
"street": pet_street,
"city": pet_city,
},
"healthy": pet_healthy,
"ears": {
"healthy": pet_healthy,
},
}
data = json.dumps(data_json)
headers = {
"api-key": self.api_key_encoded,
}
cookies = {
"user": "123",
}
request = MockRequest(
host_url,
"POST",
"/pets",
path_pattern=path_pattern,
data=data,
headers=headers,
cookies=cookies,
)
parameters = spec_validate_parameters(spec, request)
assert parameters == Parameters(
header={
"api-key": self.api_key,
},
cookie={
"user": 123,
},
)
body = spec_validate_body(spec, request)
schemas = spec_dict["components"]["schemas"]
pet_model = schemas["PetCreate"]["x-model"]
address_model = schemas["Address"]["x-model"]
assert body.__class__.__name__ == pet_model
assert body.name == pet_name
assert body.tag == pet_tag
assert body.position == 2
assert body.address.__class__.__name__ == address_model
assert body.address.street == pet_street
assert body.address.city == pet_city
assert body.healthy == pet_healthy
def test_post_cats_boolean_string(self, spec, spec_dict):
host_url = "https://staging.gigantic-server.com/v1"
path_pattern = "/v1/pets"
pet_name = "Cat"
pet_tag = "cats"
pet_street = "Piekna"
pet_city = "Warsaw"
pet_healthy = False
data_json = {
"name": pet_name,
"tag": pet_tag,
"position": 2,
"address": {
"street": pet_street,
"city": pet_city,
},
"healthy": pet_healthy,
"ears": {
"healthy": pet_healthy,
},
}
data = json.dumps(data_json)
headers = {
"api-key": self.api_key_encoded,
}
cookies = {
"user": "123",
}
request = MockRequest(
host_url,
"POST",
"/pets",
path_pattern=path_pattern,
data=data,
headers=headers,
cookies=cookies,
)
parameters = spec_validate_parameters(spec, request)
assert parameters == Parameters(
header={
"api-key": self.api_key,
},
cookie={
"user": 123,
},
)
body = spec_validate_body(spec, request)
schemas = spec_dict["components"]["schemas"]
pet_model = schemas["PetCreate"]["x-model"]
address_model = schemas["Address"]["x-model"]
assert body.__class__.__name__ == pet_model
assert body.name == pet_name
assert body.tag == pet_tag
assert body.position == 2
assert body.address.__class__.__name__ == address_model
assert body.address.street == pet_street
assert body.address.city == pet_city
assert body.healthy is False
def test_post_no_one_of_schema(self, spec, spec_dict):
host_url = "https://staging.gigantic-server.com/v1"
path_pattern = "/v1/pets"
pet_name = "Cat"
alias = "kitty"
data_json = {
"name": pet_name,
"alias": alias,
}
data = json.dumps(data_json)
headers = {
"api-key": self.api_key_encoded,
}
cookies = {
"user": "123",
}
request = MockRequest(
host_url,
"POST",
"/pets",
path_pattern=path_pattern,
data=data,
headers=headers,
cookies=cookies,
)
parameters = spec_validate_parameters(spec, request)
assert parameters == Parameters(
header={
"api-key": self.api_key,
},
cookie={
"user": 123,
},
)
with pytest.raises(InvalidSchemaValue):
spec_validate_body(spec, request)
def test_post_cats_only_required_body(self, spec, spec_dict):
host_url = "https://staging.gigantic-server.com/v1"
path_pattern = "/v1/pets"
pet_name = "Cat"
pet_healthy = True
data_json = {
"name": pet_name,
"ears": {
"healthy": pet_healthy,
},
}
data = json.dumps(data_json)
headers = {
"api-key": self.api_key_encoded,
}
cookies = {
"user": "123",
}
request = MockRequest(
host_url,
"POST",
"/pets",
path_pattern=path_pattern,
data=data,
headers=headers,
cookies=cookies,
)
parameters = spec_validate_parameters(spec, request)
assert parameters == Parameters(
header={
"api-key": self.api_key,
},
cookie={
"user": 123,
},
)
body = spec_validate_body(spec, request)
schemas = spec_dict["components"]["schemas"]
pet_model = schemas["PetCreate"]["x-model"]
assert body.__class__.__name__ == pet_model
assert body.name == pet_name
assert not hasattr(body, "tag")
assert not hasattr(body, "address")
def test_post_pets_raises_invalid_mimetype(self, spec):
host_url = "https://staging.gigantic-server.com/v1"
path_pattern = "/v1/pets"
data_json = {
"name": "Cat",
"tag": "cats",
}
data = json.dumps(data_json)
headers = {
"api-key": self.api_key_encoded,
}
cookies = {
"user": "123",
}
request = MockRequest(
host_url,
"POST",
"/pets",
path_pattern=path_pattern,
data=data,
mimetype="text/html",
headers=headers,
cookies=cookies,
)
parameters = spec_validate_parameters(spec, request)
assert parameters == Parameters(
header={
"api-key": self.api_key,
},
cookie={
"user": 123,
},
)
with pytest.raises(MediaTypeNotFound):
spec_validate_body(spec, request)
def test_post_pets_missing_cookie(self, spec, spec_dict):
host_url = "https://staging.gigantic-server.com/v1"
path_pattern = "/v1/pets"
pet_name = "Cat"
pet_healthy = True
data_json = {
"name": pet_name,
"ears": {
"healthy": pet_healthy,
},
}
data = json.dumps(data_json)
headers = {
"api-key": self.api_key_encoded,
}
request = MockRequest(
host_url,
"POST",
"/pets",
path_pattern=path_pattern,
data=data,
headers=headers,
)
with pytest.raises(MissingRequiredParameter):
spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
schemas = spec_dict["components"]["schemas"]
pet_model = schemas["PetCreate"]["x-model"]
assert body.__class__.__name__ == pet_model
assert body.name == pet_name
assert not hasattr(body, "tag")
assert not hasattr(body, "address")
def test_post_pets_missing_header(self, spec, spec_dict):
host_url = "https://staging.gigantic-server.com/v1"
path_pattern = "/v1/pets"
pet_name = "Cat"
pet_healthy = True
data_json = {
"name": pet_name,
"ears": {
"healthy": pet_healthy,
},
}
data = json.dumps(data_json)
cookies = {
"user": "123",
}
request = MockRequest(
host_url,
"POST",
"/pets",
path_pattern=path_pattern,
data=data,
cookies=cookies,
)
with pytest.raises(MissingRequiredParameter):
spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
schemas = spec_dict["components"]["schemas"]
pet_model = schemas["PetCreate"]["x-model"]
assert body.__class__.__name__ == pet_model
assert body.name == pet_name
assert not hasattr(body, "tag")
assert not hasattr(body, "address")
def test_post_pets_raises_invalid_server_error(self, spec):
host_url = "http://flowerstore.swagger.io/v1"
path_pattern = "/v1/pets"
data_json = {
"name": "Cat",
"tag": "cats",
}
data = json.dumps(data_json)
headers = {
"api-key": "12345",
}
cookies = {
"user": "123",
}
request = MockRequest(
host_url,
"POST",
"/pets",
path_pattern=path_pattern,
data=data,
mimetype="text/html",
headers=headers,
cookies=cookies,
)
with pytest.raises(ServerNotFound):
spec_validate_parameters(spec, request)
with pytest.raises(ServerNotFound):
spec_validate_body(spec, request)
data_id = 1
data_name = "test"
data_json = {
"data": {
"id": data_id,
"name": data_name,
"ears": {
"healthy": True,
},
},
}
data = json.dumps(data_json)
response = MockResponse(data)
with pytest.raises(ServerNotFound):
spec_validate_data(spec, request, response)
def test_get_pet(self, spec, response_validator):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/pets/{petId}"
view_args = {
"petId": "1",
}
auth = "authuser"
headers = {
"Authorization": f"Basic {auth}",
}
request = MockRequest(
host_url,
"GET",
"/pets/1",
path_pattern=path_pattern,
view_args=view_args,
headers=headers,
)
parameters = spec_validate_parameters(spec, request)
assert parameters == Parameters(
path={
"petId": 1,
}
)
body = spec_validate_body(spec, request)
assert body is None
security = spec_validate_security(spec, request)
assert security == {
"petstore_auth": auth,
}
data_id = 1
data_name = "test"
data_json = {
"data": {
"id": data_id,
"name": data_name,
"ears": {
"healthy": True,
},
},
}
data = json.dumps(data_json)
response = MockResponse(data)
response_result = response_validator.validate(request, response)
assert response_result.errors == []
assert isinstance(response_result.data, BaseModel)
assert isinstance(response_result.data.data, BaseModel)
assert response_result.data.data.id == data_id
assert response_result.data.data.name == data_name
def test_get_pet_not_found(self, spec, response_validator):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/pets/{petId}"
view_args = {
"petId": "1",
}
request = MockRequest(
host_url,
"GET",
"/pets/1",
path_pattern=path_pattern,
view_args=view_args,
)
parameters = spec_validate_parameters(spec, request)
assert parameters == Parameters(
path={
"petId": 1,
}
)
body = spec_validate_body(spec, request)
assert body is None
code = 404
message = "Not found"
rootCause = "Pet not found"
data_json = {
"code": 404,
"message": message,
"rootCause": rootCause,
}
data = json.dumps(data_json)
response = MockResponse(data, status_code=404)
response_result = response_validator.validate(request, response)
assert response_result.errors == []
assert isinstance(response_result.data, BaseModel)
assert response_result.data.code == code
assert response_result.data.message == message
assert response_result.data.rootCause == rootCause
def test_get_pet_wildcard(self, spec, response_validator):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/pets/{petId}"
view_args = {
"petId": "1",
}
request = MockRequest(
host_url,
"GET",
"/pets/1",
path_pattern=path_pattern,
view_args=view_args,
)
parameters = spec_validate_parameters(spec, request)
assert parameters == Parameters(
path={
"petId": 1,
}
)
body = spec_validate_body(spec, request)
assert body is None
data = b"imagedata"
response = MockResponse(data, mimetype="image/png")
with pytest.warns(UserWarning):
response_result = response_validator.validate(request, response)
assert response_result.errors == []
assert response_result.data == data
def test_get_tags(self, spec, response_validator):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/tags"
request = MockRequest(
host_url,
"GET",
"/tags",
path_pattern=path_pattern,
)
parameters = spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
assert parameters == Parameters()
assert body is None
data_json = ["cats", "birds"]
data = json.dumps(data_json)
response = MockResponse(data)
response_result = response_validator.validate(request, response)
assert response_result.errors == []
assert response_result.data == data_json
def test_post_tags_extra_body_properties(self, spec, spec_dict):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/tags"
pet_name = "Dog"
alias = "kitty"
data_json = {
"name": pet_name,
"alias": alias,
}
data = json.dumps(data_json)
request = MockRequest(
host_url,
"POST",
"/tags",
path_pattern=path_pattern,
data=data,
)
parameters = spec_validate_parameters(spec, request)
assert parameters == Parameters()
with pytest.raises(InvalidSchemaValue):
spec_validate_body(spec, request)
def test_post_tags_empty_body(self, spec, spec_dict):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/tags"
data_json = {}
data = json.dumps(data_json)
request = MockRequest(
host_url,
"POST",
"/tags",
path_pattern=path_pattern,
data=data,
)
parameters = spec_validate_parameters(spec, request)
assert parameters == Parameters()
with pytest.raises(InvalidSchemaValue):
spec_validate_body(spec, request)
def test_post_tags_wrong_property_type(self, spec):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/tags"
tag_name = 123
data = json.dumps(tag_name)
request = MockRequest(
host_url,
"POST",
"/tags",
path_pattern=path_pattern,
data=data,
)
parameters = spec_validate_parameters(spec, request)
assert parameters == Parameters()
with pytest.raises(InvalidSchemaValue):
spec_validate_body(spec, request)
def test_post_tags_additional_properties(self, spec, response_validator):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/tags"
pet_name = "Dog"
data_json = {
"name": pet_name,
}
data = json.dumps(data_json)
request = MockRequest(
host_url,
"POST",
"/tags",
path_pattern=path_pattern,
data=data,
)
parameters = spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
assert parameters == Parameters()
assert isinstance(body, BaseModel)
assert body.name == pet_name
code = 400
message = "Bad request"
rootCause = "Tag already exist"
additionalinfo = "Tag Dog already exist"
data_json = {
"code": code,
"message": message,
"rootCause": rootCause,
"additionalinfo": additionalinfo,
}
data = json.dumps(data_json)
response = MockResponse(data, status_code=404)
response_result = response_validator.validate(request, response)
assert response_result.errors == []
assert isinstance(response_result.data, BaseModel)
assert response_result.data.code == code
assert response_result.data.message == message
assert response_result.data.rootCause == rootCause
assert response_result.data.additionalinfo == additionalinfo
def test_post_tags_created_now(self, spec, response_validator):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/tags"
created = "now"
pet_name = "Dog"
data_json = {
"created": created,
"name": pet_name,
}
data = json.dumps(data_json)
request = MockRequest(
host_url,
"POST",
"/tags",
path_pattern=path_pattern,
data=data,
)
parameters = spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
assert parameters == Parameters()
assert isinstance(body, BaseModel)
assert body.created == created
assert body.name == pet_name
code = 400
message = "Bad request"
rootCause = "Tag already exist"
additionalinfo = "Tag Dog already exist"
data_json = {
"code": 400,
"message": "Bad request",
"rootCause": "Tag already exist",
"additionalinfo": "Tag Dog already exist",
}
data = json.dumps(data_json)
response = MockResponse(data, status_code=404)
response_result = response_validator.validate(request, response)
assert response_result.errors == []
assert isinstance(response_result.data, BaseModel)
assert response_result.data.code == code
assert response_result.data.message == message
assert response_result.data.rootCause == rootCause
assert response_result.data.additionalinfo == additionalinfo
def test_post_tags_created_datetime(self, spec, response_validator):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/tags"
created = "2016-04-16T16:06:05Z"
pet_name = "Dog"
data_json = {
"created": created,
"name": pet_name,
}
data = json.dumps(data_json)
request = MockRequest(
host_url,
"POST",
"/tags",
path_pattern=path_pattern,
data=data,
)
parameters = spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
assert parameters == Parameters()
assert isinstance(body, BaseModel)
assert body.created == datetime(2016, 4, 16, 16, 6, 5, tzinfo=UTC)
assert body.name == pet_name
code = 400
message = "Bad request"
rootCause = "Tag already exist"
additionalinfo = "Tag Dog already exist"
response_data_json = {
"code": code,
"message": message,
"rootCause": rootCause,
"additionalinfo": additionalinfo,
}
response_data = json.dumps(response_data_json)
response = MockResponse(response_data, status_code=404)
data = spec_validate_data(spec, request, response)
assert isinstance(data, BaseModel)
assert data.code == code
assert data.message == message
assert data.rootCause == rootCause
assert data.additionalinfo == additionalinfo
response_result = response_validator.validate(request, response)
assert response_result.errors == []
assert isinstance(response_result.data, BaseModel)
assert response_result.data.code == code
assert response_result.data.message == message
assert response_result.data.rootCause == rootCause
assert response_result.data.additionalinfo == additionalinfo
def test_post_tags_created_invalid_type(self, spec, response_validator):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/tags"
created = "long time ago"
pet_name = "Dog"
data_json = {
"created": created,
"name": pet_name,
}
data = json.dumps(data_json)
request = MockRequest(
host_url,
"POST",
"/tags",
path_pattern=path_pattern,
data=data,
)
parameters = spec_validate_parameters(spec, request)
with pytest.raises(InvalidSchemaValue):
spec_validate_body(spec, request)
assert parameters == Parameters()
code = 400
message = "Bad request"
correlationId = UUID("a8098c1a-f86e-11da-bd1a-00112444be1e")
rootCause = "Tag already exist"
additionalinfo = "Tag Dog already exist"
data_json = {
"message": message,
"correlationId": str(correlationId),
"rootCause": rootCause,
"additionalinfo": additionalinfo,
}
data = json.dumps(data_json)
response = MockResponse(data, status_code=404)
response_result = response_validator.validate(request, response)
assert response_result.errors == []
assert isinstance(response_result.data, BaseModel)
assert response_result.data.code == code
assert response_result.data.message == message
assert response_result.data.correlationId == correlationId
assert response_result.data.rootCause == rootCause
assert response_result.data.additionalinfo == additionalinfo
def test_delete_tags_with_requestbody(self, spec, response_validator):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/tags"
ids = [1, 2, 3]
data_json = {
"ids": ids,
}
data = json.dumps(data_json)
request = MockRequest(
host_url,
"DELETE",
"/tags",
path_pattern=path_pattern,
data=data,
)
parameters = spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
assert parameters == Parameters()
assert isinstance(body, BaseModel)
assert body.ids == ids
data = None
headers = {
"x-delete-confirm": "true",
}
response = MockResponse(data, status_code=200, headers=headers)
with pytest.warns(DeprecationWarning):
response_result = response_validator.validate(request, response)
assert response_result.errors == []
assert response_result.data is None
with pytest.warns(DeprecationWarning):
response_headers = spec_validate_headers(spec, request, response)
assert response_headers == {
"x-delete-confirm": True,
}
def test_delete_tags_no_requestbody(self, spec, response_validator):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/tags"
request = MockRequest(
host_url,
"DELETE",
"/tags",
path_pattern=path_pattern,
)
parameters = spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
assert parameters == Parameters()
assert body is None
def test_delete_tags_raises_missing_required_response_header(
self, spec, response_validator
):
host_url = "http://petstore.swagger.io/v1"
path_pattern = "/v1/tags"
request = MockRequest(
host_url,
"DELETE",
"/tags",
path_pattern=path_pattern,
)
parameters = spec_validate_parameters(spec, request)
body = spec_validate_body(spec, request)
assert parameters == Parameters()
assert body is None
data = None
response = MockResponse(data, status_code=200)
with pytest.warns(DeprecationWarning):
response_result = response_validator.validate(request, response)
assert response_result.errors == [
MissingRequiredHeader(name="x-delete-confirm"),
]
assert response_result.data is None
# wagtailmenus/tests/test_menu_rendering.py
from bs4 import BeautifulSoup
from django.test import TestCase, override_settings
from wagtail.core.models import Site
from wagtailmenus.errors import SubMenuUsageError
from wagtailmenus.models import MainMenu, FlatMenu
from wagtailmenus.templatetags.menu_tags import validate_supplied_values
class TestTemplateTags(TestCase):
fixtures = ['test.json']
maxDiff = None
def test_main_menu_created_when_not_exists(self):
menu = MainMenu.objects.get(pk=1)
self.assertEqual(menu.__str__(), 'Main menu for wagtailmenus (co.uk)')
menu.delete()
response = self.client.get('/')
self.assertEqual(response.status_code, 200)
menu = MainMenu.objects.first()
self.assertTrue(menu)
self.assertEqual(menu.__str__(), 'Main menu for wagtailmenus (co.uk)')
def test_flat_menu_get_for_site_with_default_fallback(self):
site_one = Site.objects.get(pk=1)
site_two = Site.objects.get(pk=2)
        # Site one (default) definitely has a menu defined with the handle
        # `footer`
        menu = FlatMenu.get_for_site('footer', site_one)
        self.assertIsNotNone(menu)
        site_one_menu_pk = menu.pk
# Site two doesn't have any menus defined, so this should return None
menu = FlatMenu.get_for_site('footer', site_two)
self.assertIsNone(menu)
# But if we use the `use_default_site_menu_as_fallback` flag to fetch
# from the default site, we should get the one defined for site_one
menu = FlatMenu.get_for_site('footer', site_two, True)
self.assertIsNotNone(menu)
self.assertEqual(menu.pk, site_one_menu_pk)
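The fallback behaviour exercised above can be sketched as a plain lookup. This is a hypothetical stand-in for illustration only, not the wagtailmenus implementation; the `menus` dict keyed by `(site_id, handle)` stands in for the real queryset:

```python
# Hypothetical sketch of the fallback rule tested above: try the requested
# site first, then fall back to the default site if the flag is set.
def get_for_site_sketch(handle, site_id, menus, default_site_id,
                        fall_back_to_default=False):
    menu = menus.get((site_id, handle))
    if menu is None and fall_back_to_default:
        menu = menus.get((default_site_id, handle))
    return menu
```

With a menu defined only for site 1, a lookup for site 2 returns nothing unless the fallback flag is passed, mirroring the three assertions in the test.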
def test_validate_supplied_values(self):
with self.assertRaisesMessage(ValueError, 'The `main_menu` tag expects `max_levels` to be an integer value between 1 and 5. Please review your template.'):
validate_supplied_values(tag='main_menu', max_levels=9)
with self.assertRaisesMessage(ValueError, 'The `main_menu` tag expects `max_levels` to be an integer value between 1 and 5. Please review your template.'):
validate_supplied_values(tag='main_menu', max_levels='1')
with self.assertRaisesMessage(ValueError, 'The `main_menu` tag expects `use_specific` to be an integer value between 0 and 3. Please review your template.'):
validate_supplied_values(tag='main_menu', use_specific=5)
with self.assertRaisesMessage(ValueError, 'The `main_menu` tag expects `use_specific` to be an integer value between 0 and 3. Please review your template.'):
validate_supplied_values(tag='main_menu', use_specific='2')
with self.assertRaises(ValueError):
validate_supplied_values(tag='main_menu', parent_page=False)
with self.assertRaises(ValueError):
validate_supplied_values(tag='main_menu', menuitem_or_page=5)
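The rules the assertions above exercise can be summarised in a standalone sketch. This is a hypothetical re-implementation for illustration only, not the wagtailmenus code: `max_levels` must be an integer from 1 to 5, `use_specific` an integer from 0 to 3, and string values are rejected.

```python
# Hypothetical re-implementation of the validation rules asserted above;
# illustrative only, not the wagtailmenus implementation.
def validate_values_sketch(tag, max_levels=None, use_specific=None):
    if max_levels is not None and (
            not isinstance(max_levels, int) or not 1 <= max_levels <= 5):
        raise ValueError(
            "The `%s` tag expects `max_levels` to be an integer value "
            "between 1 and 5. Please review your template." % tag)
    if use_specific is not None and (
            not isinstance(use_specific, int) or not 0 <= use_specific <= 3):
        raise ValueError(
            "The `%s` tag expects `use_specific` to be an integer value "
            "between 0 and 3. Please review your template." % tag)
    return True
```

Note that the `isinstance` check is what rejects `'1'` and `'2'` in the string-value cases: a string never reaches the range comparison.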
def test_homepage(self):
"""
Test that homepage (based on `MenuPage`) renders without errors.
"""
response = self.client.get('/')
self.assertEqual(response.status_code, 200)
@override_settings(WAGTAILMENUS_SITE_SPECIFIC_TEMPLATE_DIRS=True,)
def test_about_us(self):
"""
Test that 'About us' page (based on `MenuPage`), with
`repeat_in_subnav=True`, renders without errors.
The `WAGTAILMENUS_SITE_SPECIFIC_TEMPLATE_DIRS` setting is also
applied to increase coverage in get_template() and
get_sub_menu_template() methods.
"""
response = self.client.get('/about-us/')
self.assertEqual(response.status_code, 200)
def test_meet_the_team(self):
"""
Test that 'Meet the team' page (based on `Page`), and within a
section with subnav, renders without errors.
"""
response = self.client.get('/about-us/meet-the-team/')
self.assertEqual(response.status_code, 200)
def test_marvel_comics(self):
"""
Test that 'Marvel comics' page (based on `Page`), and within a
section with subnav, renders without errors.
"""
response = self.client.get('/superheroes/marvel-comics/')
self.assertEqual(response.status_code, 200)
def test_staff_vacancies(self):
"""
Test that 'Staff vacancies' page (based on `Page`), with
`show_in_menus=False`, and within a section with subnav, renders
without errors.
"""
response = self.client.get('/about-us/staff-vacancies/')
self.assertEqual(response.status_code, 200)
def test_non_page(self):
"""
Test that there are no errors when rendering page template without
the `wagtailmenus.wagtail_hooks.wagtailmenu_params_helper()` method
having run to add helpful bits to the context.
"""
response = self.client.get('/custom-url/')
self.assertEqual(response.status_code, 200)
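The HTML-comparison tests below all isolate one menu's markup with `soup.find(id=...).decode()` before handing it to `assertHTMLEqual`. As a stdlib-only illustration of that extraction step (the tests themselves use BeautifulSoup with the html5lib parser; this class is a hypothetical sketch, not part of the suite):

```python
# Stdlib-only sketch of extracting the markup of the element with a given
# `id`, roughly what `soup.find(id=...).decode()` does in the tests below.
from html.parser import HTMLParser

class ElementById(HTMLParser):
    """Capture the markup of the first element with a matching `id`."""

    def __init__(self, target_id):
        super().__init__()
        self.target_id = target_id
        self.depth = 0  # > 0 while inside the target element
        self.parts = []

    def handle_starttag(self, tag, attrs):
        if self.depth:
            self.parts.append(self.get_starttag_text())
            self.depth += 1
        elif dict(attrs).get('id') == self.target_id:
            self.parts.append(self.get_starttag_text())
            self.depth = 1

    def handle_endtag(self, tag):
        if self.depth:
            self.depth -= 1
            self.parts.append('</%s>' % tag)

    def handle_data(self, data):
        if self.depth:
            self.parts.append(data)
```

Feeding it a document and joining `parts` yields just the target element's subtree; BeautifulSoup additionally normalises the markup, which is why the tests pair it with `assertHTMLEqual` rather than exact string comparison.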
def test_homepage_main_menu_two_levels(self):
"""
        Test '{% main_menu %}' output for homepage
"""
response = self.client.get('/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='main-menu-two-levels').decode()
expected_menu_html = """
<div id="main-menu-two-levels">
<ul class="nav navbar-nav">
<li class="active"><a href="/">Home</a></li>
<li class=" dropdown top-level">
<a href="/about-us/" class="dropdown-toggle" id="ddtoggle_6" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">About <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_6">
<li class=" top-level"><a href="/about-us/">Section home</a></li>
<li class=" low-level"><a href="/about-us/meet-the-team/">Meet the team</a></li>
<li class=" low-level"><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=" low-level"><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</li>
<li class=" dropdown top-level">
<a href="/news-and-events/" class="dropdown-toggle" id="ddtoggle_14" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">News & events <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_14">
<li class=" low-level"><a href="/news-and-events/latest-news/">Latest news</a></li>
<li class=" low-level"><a href="/news-and-events/upcoming-events/">Upcoming events</a></li>
<li class=" low-level"><a href="/news-and-events/press/">In the press</a></li>
</ul>
</li>
<li class=""><a href="http://google.co.uk">Google</a></li>
<li class=" dropdown">
<a href="/contact-us/" class="dropdown-toggle" id="ddtoggle_18" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">Contact us <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_18">
<li class="support"><a href="/contact-us/#support">Get support</a></li>
<li class="call"><a href="/contact-us/#call">Speak to someone</a></li>
<li class="map"><a href="/contact-us/#map">Map & directions</a></li>
</ul>
</li>
</ul>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_homepage_main_menu_three_levels(self):
"""
        Test '{% main_menu max_levels=3 %}' output for homepage
"""
response = self.client.get('/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='main-menu-three-levels').decode()
expected_menu_html = """
<div id="main-menu-three-levels">
<ul class="nav navbar-nav">
<li class="active"><a href="/">Home</a></li>
<li class=" dropdown top-level">
<a href="/about-us/" class="dropdown-toggle" id="ddtoggle_6" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">About <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_6">
<li class=" top-level"><a href="/about-us/">Section home</a></li>
<li class=" dropdown">
<a href="/about-us/meet-the-team/" class="dropdown-toggle" id="ddtoggle_7" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">Meet the team <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_7">
<li class=""><a href="/about-us/meet-the-team/staff-member-one/">Staff member one</a></li>
<li class=""><a href="/about-us/meet-the-team/staff-member-two/">Staff member two</a></li>
<li class=""><a href="/about-us/meet-the-team/staff-member-three/">Staff member three</a></li>
</ul>
</li>
<li class=""><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=""><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</li>
<li class=" dropdown top-level">
<a href="/news-and-events/" class="dropdown-toggle" id="ddtoggle_14" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">News & events <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_14">
<li class=""><a href="/news-and-events/latest-news/">Latest news</a></li>
<li class=""><a href="/news-and-events/upcoming-events/">Upcoming events</a></li>
<li class=""><a href="/news-and-events/press/">In the press</a></li>
</ul>
</li>
<li class=""><a href="http://google.co.uk">Google</a></li>
<li class=" dropdown">
<a href="/contact-us/" class="dropdown-toggle" id="ddtoggle_18" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">Contact us <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_18">
<li class="support"><a href="/contact-us/#support">Get support</a></li>
<li class="call"><a href="/contact-us/#call">Speak to someone</a></li>
<li class="map"><a href="/contact-us/#map">Map & directions</a></li>
</ul>
</li>
</ul>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_homepage_main_menu_absolute_urls(self):
"""
        Test '{% main_menu use_absolute_page_urls=True %}' output for homepage
"""
response = self.client.get('/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='main-menu-absolute-url').decode()
expected_menu_html = """
<div id="main-menu-absolute-url">
<ul class="nav navbar-nav">
<li class="active">
<a href="http://www.wagtailmenus.co.uk:8000/">Home</a>
</li>
<li class=" dropdown top-level">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/" class="dropdown-toggle" id="ddtoggle_6" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">About <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_6">
<li class=" top-level">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/">Section home</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/meet-the-team/">Meet the team</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/our-heritage/">Our heritage</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/mission-and-values/">Our mission and values</a>
</li>
</ul>
</li>
<li class=" dropdown top-level">
<a href="http://www.wagtailmenus.co.uk:8000/news-and-events/" class="dropdown-toggle" id="ddtoggle_14" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">News & events <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_14">
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/news-and-events/latest-news/">Latest news</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/news-and-events/upcoming-events/">Upcoming events</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/news-and-events/press/">In the press</a>
</li>
</ul>
</li>
<li class="">
<a href="http://google.co.uk">Google</a>
</li>
<li class=" dropdown">
<a href="http://www.wagtailmenus.co.uk:8000/contact-us/" class="dropdown-toggle" id="ddtoggle_18" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">Contact us <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_18">
<li class="support">
<a href="/contact-us/#support">Get support</a>
</li>
<li class="call">
<a href="/contact-us/#call">Speak to someone</a>
</li>
<li class="map">
<a href="/contact-us/#map">Map & directions</a>
</li>
</ul>
</li>
</ul>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_homepage_children_menu_one_level(self):
"""
Test '{% children_menu %}' output for homepage
"""
response = self.client.get('/')
soup = BeautifulSoup(response.content, 'html5lib')
menu_html = soup.find(id='children-menu-one-level').decode()
expected_menu_html = """
<div id="children-menu-one-level">
<ul>
<li class=""><a href="/about-us/">About us</a></li>
<li class=""><a href="/news-and-events/">News & events</a></li>
<li class=""><a href="/contact-us/">Contact us</a></li>
<li class=""><a href="/legal/">Legal</a></li>
</ul>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_homepage_children_menu_three_levels(self):
"""
Test '{% children_menu max_levels=3 allow_repeating_parents=False %}' output for homepage
"""
response = self.client.get('/')
soup = BeautifulSoup(response.content, 'html5lib')
menu_html = soup.find(id='children-menu-three-levels').decode()
expected_menu_html = """
<div id="children-menu-three-levels">
<ul>
<li class=""><a href="/about-us/">About us</a>
<ul>
<li class="">
<a href="/about-us/meet-the-team/">Meet the team</a>
<ul>
<li class=""><a href="/about-us/meet-the-team/staff-member-one/">Staff member one</a></li>
<li class=""><a href="/about-us/meet-the-team/staff-member-two/">Staff member two</a></li>
<li class=""><a href="/about-us/meet-the-team/staff-member-three/">Staff member three</a></li>
</ul>
</li>
<li class=""><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=""><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</li>
<li class="">
<a href="/news-and-events/">News & events</a>
<ul>
<li class=""><a href="/news-and-events/latest-news/">Latest news</a></li>
<li class=""><a href="/news-and-events/upcoming-events/">Upcoming events</a></li>
<li class=""><a href="/news-and-events/press/">In the press</a></li>
</ul>
</li>
<li class=""><a href="/contact-us/">Contact us</a></li>
<li class="">
<a href="/legal/">Legal</a>
<ul>
<li class=""><a href="/legal/accessibility/">Accessibility</a></li>
<li class=""><a href="/legal/privacy-policy/">Privacy policy</a></li>
<li class=""><a href="/legal/terms-and-conditions/">Terms and conditions</a></li>
</ul>
</li>
</ul>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_homepage_children_absolute_urls(self):
"""
Test '{% children_menu use_absolute_page_urls=True %}' output for homepage
"""
response = self.client.get('/')
soup = BeautifulSoup(response.content, 'html5lib')
menu_html = soup.find(id='children-menu-absolute-url').decode()
expected_menu_html = """
<div id="children-menu-absolute-url">
<ul>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/">About us</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/news-and-events/">News & events</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/contact-us/">Contact us</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/legal/">Legal</a>
</li>
</ul>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_about_us_main_menu_two_levels(self):
"""
Test '{% main_menu %}' output for 'About us' page
"""
response = self.client.get('/about-us/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='main-menu-two-levels').decode()
expected_menu_html = """
<div id="main-menu-two-levels">
<ul class="nav navbar-nav">
<li class=""><a href="/">Home</a></li>
<li class="ancestor dropdown top-level">
<a href="/about-us/" class="dropdown-toggle" id="ddtoggle_6" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">About <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_6">
<li class="active top-level"><a href="/about-us/">Section home</a></li>
<li class=" low-level"><a href="/about-us/meet-the-team/">Meet the team</a></li>
<li class=" low-level"><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=" low-level"><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</li>
<li class=" dropdown top-level">
<a href="/news-and-events/" class="dropdown-toggle" id="ddtoggle_14" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">News & events <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_14">
<li class=" low-level"><a href="/news-and-events/latest-news/">Latest news</a></li>
<li class=" low-level"><a href="/news-and-events/upcoming-events/">Upcoming events</a></li>
<li class=" low-level"><a href="/news-and-events/press/">In the press</a></li>
</ul>
</li>
<li class=""><a href="http://google.co.uk">Google</a></li>
<li class=" dropdown">
<a href="/contact-us/" class="dropdown-toggle" id="ddtoggle_18" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">Contact us <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_18">
<li class="support"><a href="/contact-us/#support">Get support</a></li>
<li class="call"><a href="/contact-us/#call">Speak to someone</a></li>
<li class="map"><a href="/contact-us/#map">Map & directions</a></li>
</ul>
</li>
</ul>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_about_us_main_menu_three_levels(self):
"""
Test '{% main_menu max_levels=3 %}' output for 'About us' page
"""
response = self.client.get('/about-us/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='main-menu-three-levels').decode()
expected_menu_html = """
<div id="main-menu-three-levels">
<ul class="nav navbar-nav">
<li class=""><a href="/">Home</a></li>
<li class="ancestor dropdown top-level">
<a href="/about-us/" class="dropdown-toggle" id="ddtoggle_6" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">About <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_6">
<li class="active top-level">
<a href="/about-us/">Section home</a>
</li>
<li class=" dropdown">
<a href="/about-us/meet-the-team/" class="dropdown-toggle" id="ddtoggle_7" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">Meet the team <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_7">
<li class=""><a href="/about-us/meet-the-team/staff-member-one/">Staff member one</a></li>
<li class=""><a href="/about-us/meet-the-team/staff-member-two/">Staff member two</a></li>
<li class=""><a href="/about-us/meet-the-team/staff-member-three/">Staff member three</a></li>
</ul>
</li>
<li class=""><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=""><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</li>
<li class=" dropdown top-level">
<a href="/news-and-events/" class="dropdown-toggle" id="ddtoggle_14" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">News & events <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_14">
<li class=""><a href="/news-and-events/latest-news/">Latest news</a></li>
<li class=""><a href="/news-and-events/upcoming-events/">Upcoming events</a></li>
<li class=""><a href="/news-and-events/press/">In the press</a></li>
</ul>
</li>
<li class=""><a href="http://google.co.uk">Google</a></li>
<li class=" dropdown">
<a href="/contact-us/" class="dropdown-toggle" id="ddtoggle_18" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">Contact us <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_18">
<li class="support"><a href="/contact-us/#support">Get support</a></li>
<li class="call"><a href="/contact-us/#call">Speak to someone</a></li>
<li class="map"><a href="/contact-us/#map">Map & directions</a></li>
</ul>
</li>
</ul>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_about_us_main_menu_absolute_urls(self):
"""
        Test '{% main_menu use_absolute_page_urls=True %}' output for homepage
"""
response = self.client.get('/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='main-menu-absolute-url').decode()
expected_menu_html = """
<div id="main-menu-absolute-url">
<ul class="nav navbar-nav">
<li class="active">
<a href="http://www.wagtailmenus.co.uk:8000/">Home</a>
</li>
<li class=" dropdown top-level">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/" class="dropdown-toggle" id="ddtoggle_6" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">About <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_6">
<li class=" top-level">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/">Section home</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/meet-the-team/">Meet the team</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/our-heritage/">Our heritage</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/mission-and-values/">Our mission and values</a>
</li>
</ul>
</li>
<li class=" dropdown top-level">
<a href="http://www.wagtailmenus.co.uk:8000/news-and-events/" class="dropdown-toggle" id="ddtoggle_14" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">News & events <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_14">
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/news-and-events/latest-news/">Latest news</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/news-and-events/upcoming-events/">Upcoming events</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/news-and-events/press/">In the press</a>
</li>
</ul>
</li>
<li class="">
<a href="http://google.co.uk">Google</a>
</li>
<li class=" dropdown">
<a href="http://www.wagtailmenus.co.uk:8000/contact-us/" class="dropdown-toggle" id="ddtoggle_18" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">Contact us <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_18">
<li class="support">
<a href="/contact-us/#support">Get support</a>
</li>
<li class="call">
<a href="/contact-us/#call">Speak to someone</a>
</li>
<li class="map">
<a href="/contact-us/#map">Map & directions</a>
</li>
</ul>
</li>
</ul>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_about_us_section_menu_two_levels(self):
"""
Test '{% section_menu %}' output for 'About us' page
"""
response = self.client.get('/about-us/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='section-menu-two-levels').decode()
expected_menu_html = """
<div id="section-menu-two-levels">
<nav class="nav-section" role="navigation">
<a href="/about-us/" class="ancestor section_root">About us</a>
<ul>
<li class="active"><a href="/about-us/">Section home</a></li>
<li class="">
<a href="/about-us/meet-the-team/">Meet the team</a>
<ul>
<li class=""><a href="/about-us/meet-the-team/staff-member-one/">Staff member one</a></li>
<li class=""><a href="/about-us/meet-the-team/staff-member-two/">Staff member two</a></li>
<li class=""><a href="/about-us/meet-the-team/staff-member-three/">Staff member three</a></li>
</ul>
</li>
<li class=""><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=""><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</nav>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_about_us_section_menu_one_level(self):
"""
Test '{% section_menu max_levels=1 %}' output for 'About us' page
"""
response = self.client.get('/about-us/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='section-menu-one-level').decode()
expected_menu_html = """
<div id="section-menu-one-level">
<nav class="nav-section" role="navigation">
<a href="/about-us/" class="ancestor section_root">About us</a>
<ul>
<li class="active"><a href="/about-us/">Section home</a></li>
<li class=""><a href="/about-us/meet-the-team/">Meet the team</a></li>
<li class=""><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=""><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</nav>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_about_us_section_menu_absolute_urls(self):
"""
Test '{% section_menu use_absolute_page_urls=True %}' output for 'About us' page
"""
response = self.client.get('/about-us/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='section-menu-absolute-url').decode()
expected_menu_html = """
<div id="section-menu-absolute-url">
<nav class="nav-section" role="navigation">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/" class="ancestor section_root">About us</a>
<ul>
<li class="active">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/">Section home</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/meet-the-team/">Meet the team</a>
<ul>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/meet-the-team/staff-member-one/">Staff member one</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/meet-the-team/staff-member-two/">Staff member two</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/meet-the-team/staff-member-three/">Staff member three</a>
</li>
</ul>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/our-heritage/">Our heritage</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/mission-and-values/">Our mission and values</a>
</li>
</ul>
</nav>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_about_us_children_menu_one_level(self):
"""
        Test '{% children_menu %}' output for 'About us' page
"""
response = self.client.get('/about-us/')
soup = BeautifulSoup(response.content, 'html5lib')
menu_html = soup.find(id='children-menu-one-level').decode()
expected_menu_html = """
<div id="children-menu-one-level">
<ul>
<li class=""><a href="/about-us/">Section home</a></li>
<li class=""><a href="/about-us/meet-the-team/">Meet the team</a></li>
<li class=""><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=""><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_about_us_children_menu_three_levels(self):
"""
Test '{% children_menu max_levels=3 allow_repeating_parents=False %}' output for 'About us' page
"""
response = self.client.get('/about-us/')
soup = BeautifulSoup(response.content, 'html5lib')
menu_html = soup.find(id='children-menu-three-levels').decode()
expected_menu_html = """
<div id="children-menu-three-levels">
<ul>
<li class="">
<a href="/about-us/meet-the-team/">Meet the team</a>
<ul>
<li class=""><a href="/about-us/meet-the-team/staff-member-one/">Staff member one</a></li>
<li class=""><a href="/about-us/meet-the-team/staff-member-two/">Staff member two</a></li>
<li class=""><a href="/about-us/meet-the-team/staff-member-three/">Staff member three</a></li>
</ul>
</li>
<li class=""><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=""><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_about_us_children_absolute_urls(self):
"""
        Test '{% children_menu use_absolute_page_urls=True %}' output for 'About us' page
"""
response = self.client.get('/about-us/')
soup = BeautifulSoup(response.content, 'html5lib')
menu_html = soup.find(id='children-menu-absolute-urls').decode()
expected_menu_html = """
<div id="children-menu-absolute-urls">
<ul>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/">Section home</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/meet-the-team/">Meet the team</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/our-heritage/">Our heritage</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/about-us/mission-and-values/">Our mission and values</a>
</li>
</ul>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_marvel_comics_section_menu_two_levels(self):
"""
Test '{% section_menu %}' output for 'Marvel comics' page
"""
response = self.client.get('/superheroes/marvel-comics/')
soup = BeautifulSoup(response.content, 'html5lib')
menu_html = soup.find(id='section-menu-two-levels').decode()
expected_menu_html = """
<div id="section-menu-two-levels">
<nav class="nav-section" role="navigation">
<a href="/superheroes/" class="ancestor section_root">Superheroes</a>
<ul>
<li class="active">
<a href="/superheroes/marvel-comics/">Marvel Comics</a>
<ul>
<li class=""><a href="/superheroes/marvel-comics/iron-man/">Iron Man</a></li>
<li class=""><a href="/superheroes/marvel-comics/spiderman/">Spiderman</a></li>
</ul>
</li>
<li class="">
<a href="/superheroes/dc-comics/">D.C. Comics</a>
<ul>
<li class=""><a href="/superheroes/dc-comics/batman/">Batman</a></li>
<li class="">
<a href="/superheroes/dc-comics/wonder-woman/">Wonder Woman</a>
</li>
</ul>
</li>
</ul>
</nav>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_marvel_comics_section_menu_one_level(self):
"""
Test '{% section_menu max_levels=1 %}' output for 'Marvel comics' page
"""
response = self.client.get('/superheroes/marvel-comics/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='section-menu-one-level').decode()
expected_menu_html = """
<div id="section-menu-one-level">
<nav class="nav-section" role="navigation">
<a href="/superheroes/" class="ancestor section_root">Superheroes</a>
<ul>
<li class="active"><a href="/superheroes/marvel-comics/">Marvel Comics</a></li>
<li class=""><a href="/superheroes/dc-comics/">D.C. Comics</a></li>
</ul>
</nav>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_marvel_comics_section_absolute_urls(self):
"""
Test '{% section_menu use_absolute_page_urls=True %}' output for 'Marvel comics' page
"""
response = self.client.get('/superheroes/marvel-comics/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='section-menu-absolute-url').decode()
expected_menu_html = """
<div id="section-menu-absolute-url">
<nav class="nav-section" role="navigation">
<a href="http://www.wagtailmenus.co.uk:8000/superheroes/" class="ancestor section_root">Superheroes</a>
<ul>
<li class="active">
<a href="http://www.wagtailmenus.co.uk:8000/superheroes/marvel-comics/">Marvel Comics</a>
<ul>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/superheroes/marvel-comics/iron-man/">Iron Man</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/superheroes/marvel-comics/spiderman/">Spiderman</a>
</li>
</ul>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/superheroes/dc-comics/">D.C. Comics</a>
<ul>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/superheroes/dc-comics/batman/">Batman</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/superheroes/dc-comics/wonder-woman/">Wonder Woman</a>
</li>
</ul>
</li>
</ul>
</nav>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_contact_flat_menu_output(self):
"""
Test that the HTML output by the 'flat_menu' tag (when using the handle 'contact') renders as expected.
"""
response = self.client.get('/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='nav-contact').decode()
expected_menu_html = """<div id="nav-contact"><div class="flat-menu contact no_heading"><ul><li class=""><a href="/contact-us/#offices">Call us</a></li><li class=""><a href="#advisor-chat">Chat to an advisor</a></li><li class=""><a href="#request-callback">Request a callback</a></li></ul></div></div>"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_footer_flat_menu_output(self):
"""
Test that the HTML output by the 'flat_menu' tag (when using the handle 'footer') renders as expected.
"""
response = self.client.get('/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='nav-footer').decode()
expected_menu_html = """
<div id="nav-footer">
<div class="flat-menu footer with_heading">
<h4>Important links</h4>
<ul>
<li class=""><a href="/legal/accessibility/">Accessibility</a></li>
<li class=""><a href="/legal/privacy-policy/">Privacy policy</a></li>
<li class=""><a href="/legal/terms-and-conditions/">Terms and conditions</a></li>
<li class=""><a href="/about-us/meet-the-team/custom-url/">Meet the team's pets</a></li>
</ul>
</div>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
response = self.client.get('/legal/privacy-policy/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='nav-footer').decode()
expected_menu_html = """
<div id="nav-footer">
<div class="flat-menu footer with_heading">
<h4>Important links</h4>
<ul>
<li class=""><a href="/legal/accessibility/">Accessibility</a></li>
<li class="active"><a href="/legal/privacy-policy/">Privacy policy</a></li>
<li class=""><a href="/legal/terms-and-conditions/">Terms and conditions</a></li>
<li class=""><a href="/about-us/meet-the-team/custom-url/">Meet the team's pets</a></li>
</ul>
</div>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
response = self.client.get('/about-us/meet-the-team/custom-url/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='nav-footer').decode()
expected_menu_html = """
<div id="nav-footer">
<div class="flat-menu footer with_heading">
<h4>Important links</h4>
<ul>
<li class=""><a href="/legal/accessibility/">Accessibility</a></li>
<li class=""><a href="/legal/privacy-policy/">Privacy policy</a></li>
<li class=""><a href="/legal/terms-and-conditions/">Terms and conditions</a></li>
<li class="active"><a href="/about-us/meet-the-team/custom-url/">Meet the team's pets</a></li>
</ul>
</div>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
response = self.client.get('/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='nav-footer-absolute-urls').decode()
expected_menu_html = """
<div id="nav-footer-absolute-urls">
<div class="flat-menu footer with_heading">
<h4>Important links</h4>
<ul>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/legal/accessibility/">Accessibility</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/legal/privacy-policy/">Privacy policy</a>
</li>
<li class="">
<a href="http://www.wagtailmenus.co.uk:8000/legal/terms-and-conditions/">Terms and conditions</a>
</li>
<li class="">
<a href="/about-us/meet-the-team/custom-url/">Meet the team's pets</a>
</li>
</ul>
</div>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_custom_page_menu_output(self):
response = self.client.get('/custom-url/')
soup = BeautifulSoup(response.content, 'html5lib')
main_menu_html = soup.find(id='main-menu-two-levels').decode()
expected_menu_html = """
<div id="main-menu-two-levels">
<ul class="nav navbar-nav">
<li class=""><a href="/">Home</a></li>
<li class=" dropdown top-level">
<a href="/about-us/" class="dropdown-toggle" id="ddtoggle_6" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">About <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_6">
<li class=" top-level"><a href="/about-us/">Section home</a></li>
<li class=" low-level"><a href="/about-us/meet-the-team/">Meet the team</a></li>
<li class=" low-level"><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=" low-level"><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</li>
<li class=" dropdown top-level">
<a href="/news-and-events/" class="dropdown-toggle" id="ddtoggle_14" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">News & events <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_14">
<li class=" low-level"><a href="/news-and-events/latest-news/">Latest news</a></li>
<li class=" low-level"><a href="/news-and-events/upcoming-events/">Upcoming events</a></li>
<li class=" low-level"><a href="/news-and-events/press/">In the press</a></li>
</ul>
</li>
<li class=""><a href="http://google.co.uk">Google</a></li>
<li class=" dropdown">
<a href="/contact-us/" class="dropdown-toggle" id="ddtoggle_18" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">Contact us <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_18">
<li class="support"><a href="/contact-us/#support">Get support</a></li>
<li class="call"><a href="/contact-us/#call">Speak to someone</a></li>
<li class="map"><a href="/contact-us/#map">Map & directions</a></li>
</ul>
</li>
</ul>
</div>
"""
self.assertHTMLEqual(main_menu_html, expected_menu_html)
section_menu_html = soup.find(id='section-menu-two-levels').decode()
expected_menu_html = """<div id="section-menu-two-levels"></div>"""
self.assertHTMLEqual(section_menu_html, expected_menu_html)
def test_custom_about_us_url_section_menu_two_levels(self):
"""
Test '{% section_menu max_levels=2 %}' output for a custom url that
looks like a page from the 'about us' section, but isn't.
'about-us' and 'meet-the-team' items should be identified as
'ancestors', as indicated by the request path.
"""
response = self.client.get('/about-us/meet-the-team/custom-url/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='section-menu-two-levels').decode()
expected_menu_html = """
<div id="section-menu-two-levels">
<nav class="nav-section" role="navigation">
<a href="/about-us/" class="ancestor section_root">About us</a>
<ul>
<li class=""><a href="/about-us/">Section home</a></li>
<li class="ancestor">
<a href="/about-us/meet-the-team/">Meet the team</a>
<ul>
<li class=""><a href="/about-us/meet-the-team/staff-member-one/">Staff member one</a></li>
<li class=""><a href="/about-us/meet-the-team/staff-member-two/">Staff member two</a></li>
<li class=""><a href="/about-us/meet-the-team/staff-member-three/">Staff member three</a></li>
</ul>
</li>
<li class=""><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=""><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</nav>
</div>
"""
self.assertEqual(response.status_code, 200)
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_custom_about_us_url_main_menu_two_levels(self):
"""
Test '{% main_menu max_levels=2 %}' output for a custom url that
looks like a page from the 'about us' section, but isn't.
'about-us' and 'meet-the-team' items should be identified as
'ancestors', as indicated by the request path.
"""
response = self.client.get('/about-us/meet-the-team/custom-url/')
self.assertEqual(response.status_code, 200)
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='main-menu-two-levels').decode()
expected_menu_html = """
<div id="main-menu-two-levels">
<ul class="nav navbar-nav">
<li class=""><a href="/">Home</a></li>
<li class="ancestor dropdown top-level">
<a href="/about-us/" class="dropdown-toggle" id="ddtoggle_6" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">About <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_6">
<li class=" top-level"><a href="/about-us/">Section home</a></li>
<li class="ancestor low-level"><a href="/about-us/meet-the-team/">Meet the team</a></li>
<li class=" low-level"><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=" low-level"><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</li>
<li class=" dropdown top-level">
<a href="/news-and-events/" class="dropdown-toggle" id="ddtoggle_14" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">News & events <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_14">
<li class=" low-level"><a href="/news-and-events/latest-news/">Latest news</a></li>
<li class=" low-level"><a href="/news-and-events/upcoming-events/">Upcoming events</a></li>
<li class=" low-level"><a href="/news-and-events/press/">In the press</a></li>
</ul>
</li>
<li class=""><a href="http://google.co.uk">Google</a></li>
<li class=" dropdown">
<a href="/contact-us/" class="dropdown-toggle" id="ddtoggle_18" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">Contact us <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_18">
<li class="support"><a href="/contact-us/#support">Get support</a></li>
<li class="call"><a href="/contact-us/#call">Speak to someone</a></li>
<li class="map"><a href="/contact-us/#map">Map & directions</a></li>
</ul>
</li>
</ul>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_custom_superheroes_url_section_menu_two_levels(self):
"""
Test '{% section_menu max_levels=2 %}' output for a custom url that
looks like a page from the superheroes section, but isn't.
'superheroes' and 'marvel-comics' items should be identified as
'ancestors', as indicated by the request path.
"""
response = self.client.get('/superheroes/marvel-comics/custom-man/about/')
self.assertEqual(response.status_code, 200)
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='section-menu-two-levels').decode()
expected_menu_html = """
<div id="section-menu-two-levels">
<nav class="nav-section" role="navigation">
<a href="/superheroes/" class="ancestor section_root">Superheroes</a>
<ul>
<li class="ancestor">
<a href="/superheroes/marvel-comics/">Marvel Comics</a>
<ul>
<li class=""><a href="/superheroes/marvel-comics/iron-man/">Iron Man</a></li>
<li class=""><a href="/superheroes/marvel-comics/spiderman/">Spiderman</a></li>
</ul>
</li>
<li class="">
<a href="/superheroes/dc-comics/">D.C. Comics</a>
<ul>
<li class=""><a href="/superheroes/dc-comics/batman/">Batman</a></li>
<li class="">
<a href="/superheroes/dc-comics/wonder-woman/">Wonder Woman</a>
</li>
</ul>
</li>
</ul>
</nav>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_staffmember_direct_url_main_menu(self):
"""
Test '{% main_menu max_levels=3 %}' when serving the following URL:
/about-us/meet-the-team/staff-member-one/
It's a real page in the tree, so we want to identify it and highlight
it as active, but it's not being served via Wagtail's `serve_page`, so
the page is identified using the request path.
"""
response = self.client.get('/about-us/meet-the-team/staff-member-one/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='main-menu-three-levels').decode()
expected_menu_html = """
<div id="main-menu-three-levels">
<ul class="nav navbar-nav">
<li class=""><a href="/">Home</a></li>
<li class="ancestor dropdown top-level">
<a href="/about-us/" class="dropdown-toggle" id="ddtoggle_6" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">About <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_6">
<li class=" top-level">
<a href="/about-us/">Section home</a>
</li>
<li class="ancestor dropdown">
<a href="/about-us/meet-the-team/" class="dropdown-toggle" id="ddtoggle_7" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">Meet the team <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_7">
<li class="active"><a href="/about-us/meet-the-team/staff-member-one/">Staff member one</a></li>
<li class=""><a href="/about-us/meet-the-team/staff-member-two/">Staff member two</a></li>
<li class=""><a href="/about-us/meet-the-team/staff-member-three/">Staff member three</a></li>
</ul>
</li>
<li class=""><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=""><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</li>
<li class=" dropdown top-level">
<a href="/news-and-events/" class="dropdown-toggle" id="ddtoggle_14" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">News & events <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_14">
<li class=""><a href="/news-and-events/latest-news/">Latest news</a></li>
<li class=""><a href="/news-and-events/upcoming-events/">Upcoming events</a></li>
<li class=""><a href="/news-and-events/press/">In the press</a></li>
</ul>
</li>
<li class=""><a href="http://google.co.uk">Google</a></li>
<li class=" dropdown">
<a href="/contact-us/" class="dropdown-toggle" id="ddtoggle_18" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">Contact us <span class="caret"></span></a>
<ul class="dropdown-menu" aria-labelledby="ddtoggle_18">
<li class="support"><a href="/contact-us/#support">Get support</a></li>
<li class="call"><a href="/contact-us/#call">Speak to someone</a></li>
<li class="map"><a href="/contact-us/#map">Map & directions</a></li>
</ul>
</li>
</ul>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_staffmember_direct_url_section_menu(self):
"""
Test '{% section_menu max_levels=2 %}' when serving the following URL:
/about-us/meet-the-team/staff-member-one/
It's a real page in the tree, so we want to identify it and highlight
it as active, but it's not being served via Wagtail's `serve_page`, so
the page is identified using the request path.
"""
response = self.client.get('/about-us/meet-the-team/staff-member-one/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='section-menu-two-levels').decode()
expected_menu_html = """
<div id="section-menu-two-levels">
<nav class="nav-section" role="navigation">
<a href="/about-us/" class="ancestor section_root">About us</a>
<ul>
<li class=""><a href="/about-us/">Section home</a></li>
<li class="ancestor">
<a href="/about-us/meet-the-team/">Meet the team</a>
<ul>
<li class="active"><a href="/about-us/meet-the-team/staff-member-one/">Staff member one</a></li>
<li class=""><a href="/about-us/meet-the-team/staff-member-two/">Staff member two</a></li>
<li class=""><a href="/about-us/meet-the-team/staff-member-three/">Staff member three</a></li>
</ul>
</li>
<li class=""><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=""><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</nav>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
def test_news_and_events_section_menu(self):
"""
Test '{% section_menu max_levels=2 %}' when serving the following URL:
/news-and-events/
It's a real page in the tree, so we want to identify it and highlight
it as active, but it's not being served via Wagtail's `serve_page`, so
the page is identified using the request path.
"""
response = self.client.get('/news-and-events/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='section-menu-two-levels').decode()
expected_menu_html = """
<div id="section-menu-two-levels">
<nav class="nav-section" role="navigation">
<a href="/news-and-events/" class="active section_root">News & events</a>
<ul>
<li class=""><a href="/news-and-events/latest-news/">Latest news</a></li>
<li class=""><a href="/news-and-events/upcoming-events/">Upcoming events</a></li>
<li class=""><a href="/news-and-events/press/">In the press</a></li>
</ul>
</nav>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
@override_settings(WAGTAILMENUS_SITE_SPECIFIC_TEMPLATE_DIRS=True)
def test_use_specific_off(self):
"""
The below URL is a custom URL, but the URL matches a real page,
which will be indicated in the menus being output. It's using a
template where use_specific=0 is supplied to all menu tags, so
there should be no repeating items, no programmatically added
items, and no additional classes present on <li> elements.
"""
response = self.client.get('/superheroes/marvel-comics/iron-man/')
soup = BeautifulSoup(response.content, 'html5lib')
main_menu_html = soup.find(id='main-menu').decode()
expected_main_menu_html = """
<div id="main-menu">
<ul class="nav navbar-nav">
<li class="">
<a href="/">Home</a>
</li>
<li class=" dropdown">
<a href="/about-us/" class="dropdown-toggle" id="ddtoggle_6" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">About <span class="caret"></span></a>
<ul aria-labelledby="ddtoggle_6" class="dropdown-menu">
<li class=""><a href="/about-us/meet-the-team/">Meet the team</a></li>
<li class=""><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=""><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</li>
<li class=" dropdown">
<a href="/news-and-events/" class="dropdown-toggle" id="ddtoggle_14" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">News & events <span class="caret"></span></a>
<ul aria-labelledby="ddtoggle_14" class="dropdown-menu">
<li class=""><a href="/news-and-events/latest-news/">Latest news</a></li>
<li class=""><a href="/news-and-events/upcoming-events/">Upcoming events</a></li>
<li class=""><a href="/news-and-events/press/">In the press</a></li>
</ul>
</li>
<li class=""><a href="http://google.co.uk">Google</a></li>
<li class=""><a href="/contact-us/">Contact us</a></li>
</ul>
</div>
"""
self.assertHTMLEqual(main_menu_html, expected_main_menu_html)
secondary_nav_html = soup.find(id='secondary-nav').decode()
expected_secondary_nav_html = """
<div id="secondary-nav">
<ul>
<li class="">
<a href="/about-us/">About us</a>
<ul class="sub-menu" data-level="2">
<li class=""><a href="/about-us/meet-the-team/">Meet the team</a></li>
<li class=""><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=""><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</li>
<li class=""><a href="/superheroes/marvel-comics/">Marvel Comics</a></li>
<li class=""><a href="/superheroes/dc-comics/">D.C. Comics</a></li>
</ul>
</div>
"""
self.assertHTMLEqual(secondary_nav_html, expected_secondary_nav_html)
section_menu_html = soup.find(id='section-menu').decode()
expected_section_menu_html = """
<div id="section-menu">
<a href="/superheroes/" class="ancestor section_root">Superheroes</a>
<ul>
<li class="ancestor">
<a href="/superheroes/marvel-comics/">Marvel Comics</a>
<ul class="sub-menu" data-level="2">
<li class="active"><a href="/superheroes/marvel-comics/iron-man/">Iron Man</a></li>
<li class=""><a href="/superheroes/marvel-comics/spiderman/">Spiderman</a></li>
</ul>
</li>
<li class="">
<a href="/superheroes/dc-comics/">D.C. Comics</a>
<ul class="sub-menu" data-level="2">
<li class=""><a href="/superheroes/dc-comics/batman/">Batman</a></li>
<li class=""><a href="/superheroes/dc-comics/wonder-woman/">Wonder Woman</a></li>
</ul>
</li>
</ul>
</div>
"""
self.assertHTMLEqual(section_menu_html, expected_section_menu_html)
def test_use_specific_top_level(self):
"""
The below URL is a custom URL, but the URL matches a real page,
which will be indicated in the menus being output. It's using a
template where use_specific=2 is supplied to all menu tags, so most of
the first-level <li> elements should have additional classes from their
respective specific models, and repeated items and programmatically
added items should appear too.
"""
response = self.client.get('/superheroes/dc-comics/batman/')
soup = BeautifulSoup(response.content, 'html5lib')
main_menu_html = soup.find(id='main-menu').decode()
expected_main_menu_html = """
<div id="main-menu">
<ul class="nav navbar-nav">
<li class=""><a href="/">Home</a></li>
<li class=" dropdown top-level">
<a aria-expanded="false" aria-haspopup="true" class="dropdown-toggle" data-toggle="dropdown" href="/about-us/" id="ddtoggle_6">About <span class="caret"></span></a>
<ul aria-labelledby="ddtoggle_6" class="dropdown-menu">
<li class=" top-level"><a href="/about-us/">Section home</a></li>
<li class=""><a href="/about-us/meet-the-team/">Meet the team</a></li>
<li class=""><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=""><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</li>
<li class=" dropdown top-level">
<a aria-expanded="false" aria-haspopup="true" class="dropdown-toggle" data-toggle="dropdown" href="/news-and-events/" id="ddtoggle_14">News & events <span class="caret"></span></a>
<ul aria-labelledby="ddtoggle_14" class="dropdown-menu">
<li class=""><a href="/news-and-events/latest-news/">Latest news</a></li>
<li class=""><a href="/news-and-events/upcoming-events/">Upcoming events</a></li>
<li class=""><a href="/news-and-events/press/">In the press</a></li>
</ul>
</li>
<li class=""><a href="http://google.co.uk">Google</a></li>
<li class=" dropdown">
<a aria-expanded="false" aria-haspopup="true" class="dropdown-toggle" data-toggle="dropdown" href="/contact-us/" id="ddtoggle_18">Contact us <span class="caret"></span></a>
<ul aria-labelledby="ddtoggle_18" class="dropdown-menu">
<li class="support"><a href="/contact-us/#support">Get support</a></li>
<li class="call"><a href="/contact-us/#call">Speak to someone</a></li>
<li class="map"><a href="/contact-us/#map">Map & directions</a></li>
</ul>
</li>
</ul>
</div>
"""
self.assertHTMLEqual(main_menu_html, expected_main_menu_html)
secondary_nav_html = soup.find(id='secondary-nav').decode()
expected_secondary_nav_html = """
<div id="secondary-nav">
<ul>
<li class=" top-level">
<a href="/about-us/">About us</a>
<ul class="sub-menu" data-level="2">
<li class=" top-level"><a href="/about-us/">Section home</a></li>
<li class=""><a href="/about-us/meet-the-team/">Meet the team</a></li>
<li class=""><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=""><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</li>
<li class=" low-level"><a href="/superheroes/marvel-comics/">Marvel Comics</a></li>
<li class=" low-level"><a href="/superheroes/dc-comics/">D.C. Comics</a></li>
</ul>
</div>
"""
self.assertHTMLEqual(secondary_nav_html, expected_secondary_nav_html)
section_menu_html = soup.find(id='section-menu').decode()
expected_section_menu_html = """
<div id="section-menu">
<a href="/superheroes/" class="ancestor section_root top-level">Superheroes</a>
<ul>
<li class="">
<a href="/superheroes/marvel-comics/">Marvel Comics</a>
<ul class="sub-menu" data-level="2">
<li class=""><a href="/superheroes/marvel-comics/iron-man/">Iron Man</a></li>
<li class=""><a href="/superheroes/marvel-comics/spiderman/">Spiderman</a></li>
</ul>
</li>
<li class="ancestor">
<a href="/superheroes/dc-comics/">D.C. Comics</a>
<ul class="sub-menu" data-level="2">
<li class="active"><a href="/superheroes/dc-comics/batman/">Batman</a></li>
<li class=""><a href="/superheroes/dc-comics/wonder-woman/">Wonder Woman</a></li>
</ul>
</li>
</ul>
</div>
"""
self.assertHTMLEqual(section_menu_html, expected_section_menu_html)
def test_use_specific_always(self):
"""
The below URL is a custom URL, but the URL matches a real page,
which will be indicated in the menus being output. It's using a
template where use_specific=3 is supplied to all menu tags, so all
<li> elements should have additional classes from their
respective specific model.
"""
response = self.client.get('/superheroes/dc-comics/wonder-woman/')
soup = BeautifulSoup(response.content, 'html5lib')
main_menu_html = soup.find(id='main-menu').decode()
expected_main_menu_html = """
<div id="main-menu">
<ul class="nav navbar-nav">
<li class=""><a href="/">Home</a></li>
<li class=" dropdown top-level">
<a aria-expanded="false" aria-haspopup="true" class="dropdown-toggle" data-toggle="dropdown" href="/about-us/" id="ddtoggle_6">About <span class="caret"></span></a>
<ul aria-labelledby="ddtoggle_6" class="dropdown-menu">
<li class=" top-level"><a href="/about-us/">Section home</a></li>
<li class=" low-level"><a href="/about-us/meet-the-team/">Meet the team</a></li>
<li class=" low-level"><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=" low-level"><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</li>
<li class=" dropdown top-level">
<a aria-expanded="false" aria-haspopup="true" class="dropdown-toggle" data-toggle="dropdown" href="/news-and-events/" id="ddtoggle_14">News & events <span class="caret"></span></a>
<ul aria-labelledby="ddtoggle_14" class="dropdown-menu">
<li class=" low-level"><a href="/news-and-events/latest-news/">Latest news</a></li>
<li class=" low-level"><a href="/news-and-events/upcoming-events/">Upcoming events</a></li>
<li class=" low-level"><a href="/news-and-events/press/">In the press</a></li>
</ul>
</li>
<li class=""><a href="http://google.co.uk">Google</a></li>
<li class=" dropdown">
<a aria-expanded="false" aria-haspopup="true" class="dropdown-toggle" data-toggle="dropdown" href="/contact-us/" id="ddtoggle_18">Contact us <span class="caret"></span></a>
<ul aria-labelledby="ddtoggle_18" class="dropdown-menu">
<li class="support"><a href="/contact-us/#support">Get support</a></li>
<li class="call"><a href="/contact-us/#call">Speak to someone</a></li>
<li class="map"><a href="/contact-us/#map">Map & directions</a></li>
</ul>
</li>
</ul>
</div>
"""
self.assertHTMLEqual(main_menu_html, expected_main_menu_html)
secondary_nav_html = soup.find(id='secondary-nav').decode()
expected_secondary_nav_html = """
<div id="secondary-nav">
<ul>
<li class=" top-level">
<a href="/about-us/">About us</a>
<ul class="sub-menu" data-level="2">
<li class=" top-level"><a href="/about-us/">Section home</a></li>
<li class=" low-level"><a href="/about-us/meet-the-team/">Meet the team</a></li>
<li class=" low-level"><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=" low-level"><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</li>
<li class=" low-level"><a href="/superheroes/marvel-comics/">Marvel Comics</a></li>
<li class=" low-level"><a href="/superheroes/dc-comics/">D.C. Comics</a></li>
</ul>
</div>
"""
self.assertHTMLEqual(secondary_nav_html, expected_secondary_nav_html)
section_menu_html = soup.find(id='section-menu').decode()
expected_section_menu_html = """
<div id="section-menu">
<a href="/superheroes/" class="ancestor section_root top-level">Superheroes</a>
<ul>
<li class=" low-level">
<a href="/superheroes/marvel-comics/">Marvel Comics</a>
<ul class="sub-menu" data-level="2">
<li class=" low-level"><a href="/superheroes/marvel-comics/iron-man/">Iron Man</a></li>
<li class=" low-level"><a href="/superheroes/marvel-comics/spiderman/">Spiderman</a></li>
</ul>
</li>
<li class="ancestor low-level">
<a href="/superheroes/dc-comics/">D.C. Comics</a>
<ul class="sub-menu" data-level="2">
<li class=" low-level"><a href="/superheroes/dc-comics/batman/">Batman</a></li>
<li class="active low-level"><a href="/superheroes/dc-comics/wonder-woman/">Wonder Woman</a></li>
</ul>
</li>
</ul>
</div>
"""
self.assertHTMLEqual(section_menu_html, expected_section_menu_html)
def test_sub_menu_tag_usage_in_non_menu_template_raises_submenuusageerror(self):
"""
The 'sub_menu' tag should raise an error if used directly (not from
within another menu template).
"""
with self.assertRaises(SubMenuUsageError):
self.client.get('/sub_menu-tag-used-directly/')
def test_main_menu_with_sub_menu_templates(self):
"""
Test '{% main_menu %}' output for 'Home' page when 'sub_menu_templates'
is used to specify different templates for each level
"""
response = self.client.get('/')
soup = BeautifulSoup(response.content, 'html5lib')
# Assertions to compare rendered HTML against expected HTML
menu_html = soup.find(id='main-menu-sub-menu-templates').decode()
expected_menu_html = """
<div id="main-menu-sub-menu-templates">
<ul class="nav navbar-nav">
<li class="active"><a href="/">Home</a></li>
<li class=" dropdown top-level">
<a href="/about-us/" class="dropdown-toggle" id="ddtoggle_6" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">About <span class="caret"></span></a>
<ul class="sub-menu-level-2" data-level="2">
<li class=" top-level"><a href="/about-us/">Section home</a></li>
<li class="">
<a href="/about-us/meet-the-team/">Meet the team</a>
<ul class="sub-menu-level-3" data-level="3">
<li class=""><a href="/about-us/meet-the-team/staff-member-one/">Staff member one</a></li>
<li class=""><a href="/about-us/meet-the-team/staff-member-two/">Staff member two</a></li>
<li class=""><a href="/about-us/meet-the-team/staff-member-three/">Staff member three</a></li>
</ul>
</li>
<li class=""><a href="/about-us/our-heritage/">Our heritage</a></li>
<li class=""><a href="/about-us/mission-and-values/">Our mission and values</a></li>
</ul>
</li>
<li class=" dropdown top-level">
<a href="/news-and-events/" class="dropdown-toggle" id="ddtoggle_14" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">News & events <span class="caret"></span></a>
<ul class="sub-menu-level-2" data-level="2">
<li class=""><a href="/news-and-events/latest-news/">Latest news</a></li>
<li class=""><a href="/news-and-events/upcoming-events/">Upcoming events</a></li>
<li class=""><a href="/news-and-events/press/">In the press</a></li>
</ul>
</li>
<li class=""><a href="http://google.co.uk">Google</a></li>
<li class=" dropdown">
<a href="/contact-us/" class="dropdown-toggle" id="ddtoggle_18" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">Contact us <span class="caret"></span></a>
<ul class="sub-menu-level-2" data-level="2">
<li class="support"><a href="/contact-us/#support">Get support</a></li>
<li class="call"><a href="/contact-us/#call">Speak to someone</a></li>
<li class="map"><a href="/contact-us/#map">Map & directions</a></li>
</ul>
</li>
</ul>
</div>
"""
self.assertHTMLEqual(menu_html, expected_menu_html)
QUEST_CONTRACT_ADDRESS = '0xe4154B6E5D240507F9699C730a496790A722DF19'
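`QUEST_CONTRACT_ADDRESS` is a hex-encoded, Ethereum-style contract address. A small sketch of sanity-checking such a constant before handing it to a web3 client (the `is_hex_address` helper is hypothetical, not part of the repo):

```python
import binascii

QUEST_CONTRACT_ADDRESS = '0xe4154B6E5D240507F9699C730a496790A722DF19'

def is_hex_address(addr):
    """Return True if addr looks like a 0x-prefixed 20-byte hex address."""
    if not addr.startswith('0x') or len(addr) != 42:
        return False
    try:
        # 40 hex characters must decode to exactly 20 bytes.
        return len(binascii.unhexlify(addr[2:])) == 20
    except (binascii.Error, ValueError):
        return False
```

A check like this catches truncated or mistyped addresses early, before any on-chain call is attempted.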
# -*- coding: utf-8 -*-
# WPA2-Enterprise tests
# Copyright (c) 2013-2015, Jouni Malinen <j@w1.fi>
#
# This software may be distributed under the terms of the BSD license.
# See README for more details.
import base64
import binascii
import time
import subprocess
import logging
logger = logging.getLogger()
import os
import socket
import SocketServer
import struct
import tempfile
import hwsim_utils
import hostapd
from utils import HwsimSkip, alloc_fail, fail_test, skip_with_fips, wait_fail_trigger
from wpasupplicant import WpaSupplicant
from test_ap_psk import check_mib, find_wpas_process, read_process_memory, verify_not_present, get_key_locations, set_test_assoc_ie
try:
import OpenSSL
openssl_imported = True
except ImportError:
openssl_imported = False
def check_hlr_auc_gw_support():
    if not os.path.exists("/tmp/hlr_auc_gw.sock"):
        raise HwsimSkip("No hlr_auc_gw available")

def check_eap_capa(dev, method):
    res = dev.get_capability("eap")
    if method not in res:
        raise HwsimSkip("EAP method %s not supported in the build" % method)

def check_subject_match_support(dev):
    tls = dev.request("GET tls_library")
    if not tls.startswith("OpenSSL"):
        raise HwsimSkip("subject_match not supported with this TLS library: " + tls)

def check_altsubject_match_support(dev):
    tls = dev.request("GET tls_library")
    if not tls.startswith("OpenSSL"):
        raise HwsimSkip("altsubject_match not supported with this TLS library: " + tls)

def check_domain_match(dev):
    tls = dev.request("GET tls_library")
    if tls.startswith("internal"):
        raise HwsimSkip("domain_match not supported with this TLS library: " + tls)

def check_domain_suffix_match(dev):
    tls = dev.request("GET tls_library")
    if tls.startswith("internal"):
        raise HwsimSkip("domain_suffix_match not supported with this TLS library: " + tls)

def check_domain_match_full(dev):
    tls = dev.request("GET tls_library")
    if not tls.startswith("OpenSSL"):
        raise HwsimSkip("domain_suffix_match requires full match with this TLS library: " + tls)

def check_cert_probe_support(dev):
    tls = dev.request("GET tls_library")
    if not tls.startswith("OpenSSL") and not tls.startswith("internal"):
        raise HwsimSkip("Certificate probing not supported with this TLS library: " + tls)

def check_ext_cert_check_support(dev):
    tls = dev.request("GET tls_library")
    if not tls.startswith("OpenSSL"):
        raise HwsimSkip("ext_cert_check not supported with this TLS library: " + tls)

def check_ocsp_support(dev):
    tls = dev.request("GET tls_library")
    #if tls.startswith("internal"):
    #    raise HwsimSkip("OCSP not supported with this TLS library: " + tls)
    #if "BoringSSL" in tls:
    #    raise HwsimSkip("OCSP not supported with this TLS library: " + tls)

def check_pkcs5_v15_support(dev):
    tls = dev.request("GET tls_library")
    if "BoringSSL" in tls:
        raise HwsimSkip("PKCS#5 v1.5 not supported with this TLS library: " + tls)

def check_ocsp_multi_support(dev):
    tls = dev.request("GET tls_library")
    if not tls.startswith("internal"):
        raise HwsimSkip("OCSP-multi not supported with this TLS library: " + tls)
    as_hapd = hostapd.Hostapd("as")
    res = as_hapd.request("GET tls_library")
    del as_hapd
    if not res.startswith("internal"):
        raise HwsimSkip("Authentication server does not support ocsp_multi")

def check_pkcs12_support(dev):
    tls = dev.request("GET tls_library")
    #if tls.startswith("internal"):
    #    raise HwsimSkip("PKCS#12 not supported with this TLS library: " + tls)

def check_dh_dsa_support(dev):
    tls = dev.request("GET tls_library")
    if tls.startswith("internal"):
        raise HwsimSkip("DH DSA not supported with this TLS library: " + tls)

def read_pem(fname):
    with open(fname, "r") as f:
        lines = f.readlines()
        copy = False
        cert = ""
        for l in lines:
            if "-----END" in l:
                break
            if copy:
                cert = cert + l
            if "-----BEGIN" in l:
                copy = True
    return base64.b64decode(cert)
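# The read_pem() helper above strips the BEGIN/END armor lines and decodes
# the base64 body between them. A minimal sketch of the same logic operating
# on an in-memory string (hypothetical helper, not used by these tests):
def _pem_to_der(pem_text):
    copy = False
    b64 = ""
    for l in pem_text.splitlines(True):
        if "-----END" in l:
            break
        if copy:
            b64 = b64 + l
        if "-----BEGIN" in l:
            copy = True
    return base64.b64decode(b64)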
def eap_connect(dev, hapd, method, identity,
                sha256=False, expect_failure=False, local_error_report=False,
                maybe_local_error=False, **kwargs):
    id = dev.connect("test-wpa2-eap", key_mgmt="WPA-EAP WPA-EAP-SHA256",
                     eap=method, identity=identity,
                     wait_connect=False, scan_freq="2412", ieee80211w="1",
                     **kwargs)
    eap_check_auth(dev, method, True, sha256=sha256,
                   expect_failure=expect_failure,
                   local_error_report=local_error_report,
                   maybe_local_error=maybe_local_error)
    if expect_failure:
        return id
    ev = hapd.wait_event([ "AP-STA-CONNECTED" ], timeout=5)
    if ev is None:
        raise Exception("No connection event received from hostapd")
    return id

def eap_check_auth(dev, method, initial, rsn=True, sha256=False,
                   expect_failure=False, local_error_report=False,
                   maybe_local_error=False):
    ev = dev.wait_event(["CTRL-EVENT-EAP-STARTED"], timeout=16)
    if ev is None:
        raise Exception("Association and EAP start timed out")
    ev = dev.wait_event(["CTRL-EVENT-EAP-METHOD",
                         "CTRL-EVENT-EAP-FAILURE"], timeout=10)
    if ev is None:
        raise Exception("EAP method selection timed out")
    if "CTRL-EVENT-EAP-FAILURE" in ev:
        if maybe_local_error:
            return
        raise Exception("Could not select EAP method")
    if method not in ev:
        raise Exception("Unexpected EAP method")
    if expect_failure:
        ev = dev.wait_event(["CTRL-EVENT-EAP-FAILURE"])
        if ev is None:
            raise Exception("EAP failure timed out")
        ev = dev.wait_disconnected(timeout=10)
        if maybe_local_error and "locally_generated=1" in ev:
            return
        if not local_error_report:
            if "reason=23" not in ev:
                raise Exception("Proper reason code for disconnection not reported")
        return
    ev = dev.wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
    if ev is None:
        raise Exception("EAP success timed out")
    if initial:
        ev = dev.wait_event(["CTRL-EVENT-CONNECTED"], timeout=10)
    else:
        ev = dev.wait_event(["WPA: Key negotiation completed"], timeout=10)
    if ev is None:
        raise Exception("Association with the AP timed out")
    status = dev.get_status()
    if status["wpa_state"] != "COMPLETED":
        raise Exception("Connection not completed")
    if status["suppPortStatus"] != "Authorized":
        raise Exception("Port not authorized")
    if "selectedMethod" not in status:
        logger.info("Status: " + str(status))
        raise Exception("No selectedMethod in status")
    if method not in status["selectedMethod"]:
        raise Exception("Incorrect EAP method status")
    if sha256:
        e = "WPA2-EAP-SHA256"
    elif rsn:
        e = "WPA2/IEEE 802.1X/EAP"
    else:
        e = "WPA/IEEE 802.1X/EAP"
    if status["key_mgmt"] != e:
        raise Exception("Unexpected key_mgmt status: " + status["key_mgmt"])
    return status

def eap_reauth(dev, method, rsn=True, sha256=False, expect_failure=False):
    dev.request("REAUTHENTICATE")
    return eap_check_auth(dev, method, False, rsn=rsn, sha256=sha256,
                          expect_failure=expect_failure)
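# The EAP-SIM/AKA tests below configure the simulated SIM credentials as a
# password of colon-separated hex fields (Ki:OPc for EAP-SIM, Ki:OPc:SQN for
# EAP-AKA); several negative tests pass malformed variants. A minimal format
# check sketching what the authentication server's parser rejects
# (hypothetical helper, not part of the original tests):
def _valid_milenage_password(pw, num_fields=2):
    fields = pw.split(':')
    if len(fields) != num_fields:
        return False
    for f in fields:
        try:
            # Each field must be valid even-length hex
            binascii.unhexlify(f)
        except (binascii.Error, TypeError):
            return False
    return True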
def test_ap_wpa2_eap_sim(dev, apdev):
    """WPA2-Enterprise connection using EAP-SIM"""
    check_hlr_auc_gw_support()
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hapd = hostapd.add_ap(apdev[0], params)
    eap_connect(dev[0], hapd, "SIM", "1232010000000000",
                password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581")
    hwsim_utils.test_connectivity(dev[0], hapd)

    eap_reauth(dev[0], "SIM")

    eap_connect(dev[1], hapd, "SIM", "1232010000000001",
                password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581")
    eap_connect(dev[2], hapd, "SIM", "1232010000000002",
                password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581",
                expect_failure=True)

    logger.info("Negative test with incorrect key")
    dev[0].request("REMOVE_NETWORK all")
    eap_connect(dev[0], hapd, "SIM", "1232010000000000",
                password="ffdca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581",
                expect_failure=True)

    logger.info("Invalid GSM-Milenage key")
    dev[0].request("REMOVE_NETWORK all")
    eap_connect(dev[0], hapd, "SIM", "1232010000000000",
                password="ffdca4eda45b53cf0f12d7c9c3bc6a",
                expect_failure=True)

    logger.info("Invalid GSM-Milenage key(2)")
    dev[0].request("REMOVE_NETWORK all")
    eap_connect(dev[0], hapd, "SIM", "1232010000000000",
                password="ffdca4eda45b53cf0f12d7c9c3bc6a8q:cb9cccc4b9258e6dca4760379fb82581",
                expect_failure=True)

    logger.info("Invalid GSM-Milenage key(3)")
    dev[0].request("REMOVE_NETWORK all")
    eap_connect(dev[0], hapd, "SIM", "1232010000000000",
                password="ffdca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb8258q",
                expect_failure=True)

    logger.info("Invalid GSM-Milenage key(4)")
    dev[0].request("REMOVE_NETWORK all")
    eap_connect(dev[0], hapd, "SIM", "1232010000000000",
                password="ffdca4eda45b53cf0f12d7c9c3bc6a89qcb9cccc4b9258e6dca4760379fb82581",
                expect_failure=True)

    logger.info("Missing key configuration")
    dev[0].request("REMOVE_NETWORK all")
    eap_connect(dev[0], hapd, "SIM", "1232010000000000",
                expect_failure=True)
def test_ap_wpa2_eap_sim_sql(dev, apdev, params):
    """WPA2-Enterprise connection using EAP-SIM (SQL)"""
    check_hlr_auc_gw_support()
    try:
        import sqlite3
    except ImportError:
        raise HwsimSkip("No sqlite3 module available")
    con = sqlite3.connect(os.path.join(params['logdir'], "hostapd.db"))
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    params['auth_server_port'] = "1814"
    hapd = hostapd.add_ap(apdev[0], params)
    eap_connect(dev[0], hapd, "SIM", "1232010000000000",
                password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581")

    logger.info("SIM fast re-authentication")
    eap_reauth(dev[0], "SIM")

    logger.info("SIM full auth with pseudonym")
    with con:
        cur = con.cursor()
        cur.execute("DELETE FROM reauth WHERE permanent='1232010000000000'")
    eap_reauth(dev[0], "SIM")

    logger.info("SIM full auth with permanent identity")
    with con:
        cur = con.cursor()
        cur.execute("DELETE FROM reauth WHERE permanent='1232010000000000'")
        cur.execute("DELETE FROM pseudonyms WHERE permanent='1232010000000000'")
    eap_reauth(dev[0], "SIM")

    logger.info("SIM reauth with mismatching MK")
    with con:
        cur = con.cursor()
        cur.execute("UPDATE reauth SET mk='0000000000000000000000000000000000000000' WHERE permanent='1232010000000000'")
    eap_reauth(dev[0], "SIM", expect_failure=True)
    dev[0].request("REMOVE_NETWORK all")

    eap_connect(dev[0], hapd, "SIM", "1232010000000000",
                password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581")
    with con:
        cur = con.cursor()
        cur.execute("UPDATE reauth SET counter='10' WHERE permanent='1232010000000000'")
    eap_reauth(dev[0], "SIM")
    with con:
        cur = con.cursor()
        cur.execute("UPDATE reauth SET counter='10' WHERE permanent='1232010000000000'")
    logger.info("SIM reauth with mismatching counter")
    eap_reauth(dev[0], "SIM")
    dev[0].request("REMOVE_NETWORK all")

    eap_connect(dev[0], hapd, "SIM", "1232010000000000",
                password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581")
    with con:
        cur = con.cursor()
        cur.execute("UPDATE reauth SET counter='1001' WHERE permanent='1232010000000000'")
    logger.info("SIM reauth with max reauth count reached")
    eap_reauth(dev[0], "SIM")
def test_ap_wpa2_eap_sim_config(dev, apdev):
    """EAP-SIM configuration options"""
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hapd = hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="SIM",
                   identity="1232010000000000",
                   password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581",
                   phase1="sim_min_num_chal=1",
                   wait_connect=False, scan_freq="2412")
    ev = dev[0].wait_event(["EAP: Failed to initialize EAP method: vendor 0 method 18 (SIM)"], timeout=10)
    if ev is None:
        raise Exception("No EAP error message seen")
    dev[0].request("REMOVE_NETWORK all")

    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="SIM",
                   identity="1232010000000000",
                   password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581",
                   phase1="sim_min_num_chal=4",
                   wait_connect=False, scan_freq="2412")
    ev = dev[0].wait_event(["EAP: Failed to initialize EAP method: vendor 0 method 18 (SIM)"], timeout=10)
    if ev is None:
        raise Exception("No EAP error message seen (2)")
    dev[0].request("REMOVE_NETWORK all")

    eap_connect(dev[0], hapd, "SIM", "1232010000000000",
                password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581",
                phase1="sim_min_num_chal=2")
    eap_connect(dev[1], hapd, "SIM", "1232010000000000",
                password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581",
                anonymous_identity="345678")
def test_ap_wpa2_eap_sim_ext(dev, apdev):
    """WPA2-Enterprise connection using EAP-SIM and external GSM auth"""
    try:
        _test_ap_wpa2_eap_sim_ext(dev, apdev)
    finally:
        dev[0].request("SET external_sim 0")

def _test_ap_wpa2_eap_sim_ext(dev, apdev):
    check_hlr_auc_gw_support()
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hostapd.add_ap(apdev[0], params)
    dev[0].request("SET external_sim 1")
    id = dev[0].connect("test-wpa2-eap", eap="SIM", key_mgmt="WPA-EAP",
                        identity="1232010000000000",
                        wait_connect=False, scan_freq="2412")
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-METHOD"], timeout=15)
    if ev is None:
        raise Exception("Network connected timed out")

    ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
    if ev is None:
        raise Exception("Wait for external SIM processing request timed out")
    p = ev.split(':', 2)
    if p[1] != "GSM-AUTH":
        raise Exception("Unexpected CTRL-REQ-SIM type")
    rid = p[0].split('-')[3]

    # IK:CK:RES
    resp = "00112233445566778899aabbccddeeff:00112233445566778899aabbccddeeff:0011223344"
    # This will fail during processing, but the ctrl_iface command succeeds
    dev[0].request("CTRL-RSP-SIM-" + rid + ":UMTS-AUTH:" + resp)
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=15)
    if ev is None:
        raise Exception("EAP failure not reported")
    dev[0].request("DISCONNECT")
    dev[0].wait_disconnected()
    time.sleep(0.1)

    dev[0].select_network(id, freq="2412")
    ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
    if ev is None:
        raise Exception("Wait for external SIM processing request timed out")
    p = ev.split(':', 2)
    if p[1] != "GSM-AUTH":
        raise Exception("Unexpected CTRL-REQ-SIM type")
    rid = p[0].split('-')[3]
    # This will fail during GSM auth validation
    if "OK" not in dev[0].request("CTRL-RSP-SIM-" + rid + ":GSM-AUTH:q"):
        raise Exception("CTRL-RSP-SIM failed")
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=15)
    if ev is None:
        raise Exception("EAP failure not reported")
    dev[0].request("DISCONNECT")
    dev[0].wait_disconnected()
    time.sleep(0.1)

    dev[0].select_network(id, freq="2412")
    ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
    if ev is None:
        raise Exception("Wait for external SIM processing request timed out")
    p = ev.split(':', 2)
    if p[1] != "GSM-AUTH":
        raise Exception("Unexpected CTRL-REQ-SIM type")
    rid = p[0].split('-')[3]
    # This will fail during GSM auth validation
    if "OK" not in dev[0].request("CTRL-RSP-SIM-" + rid + ":GSM-AUTH:34"):
        raise Exception("CTRL-RSP-SIM failed")
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=15)
    if ev is None:
        raise Exception("EAP failure not reported")
    dev[0].request("DISCONNECT")
    dev[0].wait_disconnected()
    time.sleep(0.1)

    dev[0].select_network(id, freq="2412")
    ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
    if ev is None:
        raise Exception("Wait for external SIM processing request timed out")
    p = ev.split(':', 2)
    if p[1] != "GSM-AUTH":
        raise Exception("Unexpected CTRL-REQ-SIM type")
    rid = p[0].split('-')[3]
    # This will fail during GSM auth validation
    if "OK" not in dev[0].request("CTRL-RSP-SIM-" + rid + ":GSM-AUTH:0011223344556677"):
        raise Exception("CTRL-RSP-SIM failed")
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=15)
    if ev is None:
        raise Exception("EAP failure not reported")
    dev[0].request("DISCONNECT")
    dev[0].wait_disconnected()
    time.sleep(0.1)

    dev[0].select_network(id, freq="2412")
    ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
    if ev is None:
        raise Exception("Wait for external SIM processing request timed out")
    p = ev.split(':', 2)
    if p[1] != "GSM-AUTH":
        raise Exception("Unexpected CTRL-REQ-SIM type")
    rid = p[0].split('-')[3]
    # This will fail during GSM auth validation
    if "OK" not in dev[0].request("CTRL-RSP-SIM-" + rid + ":GSM-AUTH:0011223344556677:q"):
        raise Exception("CTRL-RSP-SIM failed")
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=15)
    if ev is None:
        raise Exception("EAP failure not reported")
    dev[0].request("DISCONNECT")
    dev[0].wait_disconnected()
    time.sleep(0.1)

    dev[0].select_network(id, freq="2412")
    ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
    if ev is None:
        raise Exception("Wait for external SIM processing request timed out")
    p = ev.split(':', 2)
    if p[1] != "GSM-AUTH":
        raise Exception("Unexpected CTRL-REQ-SIM type")
    rid = p[0].split('-')[3]
    # This will fail during GSM auth validation
    if "OK" not in dev[0].request("CTRL-RSP-SIM-" + rid + ":GSM-AUTH:0011223344556677:00112233"):
        raise Exception("CTRL-RSP-SIM failed")
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=15)
    if ev is None:
        raise Exception("EAP failure not reported")
    dev[0].request("DISCONNECT")
    dev[0].wait_disconnected()
    time.sleep(0.1)

    dev[0].select_network(id, freq="2412")
    ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
    if ev is None:
        raise Exception("Wait for external SIM processing request timed out")
    p = ev.split(':', 2)
    if p[1] != "GSM-AUTH":
        raise Exception("Unexpected CTRL-REQ-SIM type")
    rid = p[0].split('-')[3]
    # This will fail during GSM auth validation
    if "OK" not in dev[0].request("CTRL-RSP-SIM-" + rid + ":GSM-AUTH:0011223344556677:00112233:q"):
        raise Exception("CTRL-RSP-SIM failed")
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=15)
    if ev is None:
        raise Exception("EAP failure not reported")
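# Each retry above repeats the same parsing of the CTRL-REQ-SIM event: split
# off the request id and auth type, then take the space-separated parameters.
# A sketch of that parsing as a standalone helper (hypothetical, not used by
# the tests in this file):
def _parse_ctrl_req_sim(ev):
    # Event format: "CTRL-REQ-SIM-<id>:<TYPE>:<param1> <param2> ..."
    p = ev.split(':', 2)
    rid = p[0].split('-')[3]
    auth_type = p[1]
    params = p[2].split(' ') if len(p) > 2 else []
    return rid, auth_type, params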
def test_ap_wpa2_eap_sim_ext_replace_sim(dev, apdev):
    """EAP-SIM with external GSM auth and replacing SIM without clearing pseudonym id"""
    try:
        _test_ap_wpa2_eap_sim_ext_replace_sim(dev, apdev)
    finally:
        dev[0].request("SET external_sim 0")

def _test_ap_wpa2_eap_sim_ext_replace_sim(dev, apdev):
    check_hlr_auc_gw_support()
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hostapd.add_ap(apdev[0], params)
    dev[0].request("SET external_sim 1")
    id = dev[0].connect("test-wpa2-eap", eap="SIM", key_mgmt="WPA-EAP",
                        identity="1232010000000000",
                        wait_connect=False, scan_freq="2412")
    ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
    if ev is None:
        raise Exception("Wait for external SIM processing request timed out")
    p = ev.split(':', 2)
    if p[1] != "GSM-AUTH":
        raise Exception("Unexpected CTRL-REQ-SIM type")
    rid = p[0].split('-')[3]
    rand = p[2].split(' ')[0]
    res = subprocess.check_output(["../../hostapd/hlr_auc_gw",
                                   "-m",
                                   "auth_serv/hlr_auc_gw.milenage_db",
                                   "GSM-AUTH-REQ 232010000000000 " + rand])
    if "GSM-AUTH-RESP" not in res:
        raise Exception("Unexpected hlr_auc_gw response")
    resp = res.split(' ')[2].rstrip()
    dev[0].request("CTRL-RSP-SIM-" + rid + ":GSM-AUTH:" + resp)
    dev[0].wait_connected(timeout=15)
    dev[0].request("DISCONNECT")
    dev[0].wait_disconnected()

    # Replace SIM, but forget to drop the previous pseudonym identity
    dev[0].set_network_quoted(id, "identity", "1232010000000009")
    dev[0].select_network(id, freq="2412")

    ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
    if ev is None:
        raise Exception("Wait for external SIM processing request timed out")
    p = ev.split(':', 2)
    if p[1] != "GSM-AUTH":
        raise Exception("Unexpected CTRL-REQ-SIM type")
    rid = p[0].split('-')[3]
    rand = p[2].split(' ')[0]
    res = subprocess.check_output(["../../hostapd/hlr_auc_gw",
                                   "-m",
                                   "auth_serv/hlr_auc_gw.milenage_db",
                                   "GSM-AUTH-REQ 232010000000009 " + rand])
    if "GSM-AUTH-RESP" not in res:
        raise Exception("Unexpected hlr_auc_gw response")
    resp = res.split(' ')[2].rstrip()
    dev[0].request("CTRL-RSP-SIM-" + rid + ":GSM-AUTH:" + resp)
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=15)
    if ev is None:
        raise Exception("EAP-Failure not reported")
    dev[0].request("DISCONNECT")
    dev[0].wait_disconnected()
def test_ap_wpa2_eap_sim_ext_replace_sim2(dev, apdev):
    """EAP-SIM with external GSM auth and replacing SIM and clearing pseudonym identity"""
    try:
        _test_ap_wpa2_eap_sim_ext_replace_sim2(dev, apdev)
    finally:
        dev[0].request("SET external_sim 0")

def _test_ap_wpa2_eap_sim_ext_replace_sim2(dev, apdev):
    check_hlr_auc_gw_support()
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hostapd.add_ap(apdev[0], params)
    dev[0].request("SET external_sim 1")
    id = dev[0].connect("test-wpa2-eap", eap="SIM", key_mgmt="WPA-EAP",
                        identity="1232010000000000",
                        wait_connect=False, scan_freq="2412")
    ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
    if ev is None:
        raise Exception("Wait for external SIM processing request timed out")
    p = ev.split(':', 2)
    if p[1] != "GSM-AUTH":
        raise Exception("Unexpected CTRL-REQ-SIM type")
    rid = p[0].split('-')[3]
    rand = p[2].split(' ')[0]
    res = subprocess.check_output(["../../hostapd/hlr_auc_gw",
                                   "-m",
                                   "auth_serv/hlr_auc_gw.milenage_db",
                                   "GSM-AUTH-REQ 232010000000000 " + rand])
    if "GSM-AUTH-RESP" not in res:
        raise Exception("Unexpected hlr_auc_gw response")
    resp = res.split(' ')[2].rstrip()
    dev[0].request("CTRL-RSP-SIM-" + rid + ":GSM-AUTH:" + resp)
    dev[0].wait_connected(timeout=15)
    dev[0].request("DISCONNECT")
    dev[0].wait_disconnected()

    # Replace SIM and drop the previous pseudonym identity
    dev[0].set_network_quoted(id, "identity", "1232010000000009")
    dev[0].set_network(id, "anonymous_identity", "NULL")
    dev[0].select_network(id, freq="2412")

    ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
    if ev is None:
        raise Exception("Wait for external SIM processing request timed out")
    p = ev.split(':', 2)
    if p[1] != "GSM-AUTH":
        raise Exception("Unexpected CTRL-REQ-SIM type")
    rid = p[0].split('-')[3]
    rand = p[2].split(' ')[0]
    res = subprocess.check_output(["../../hostapd/hlr_auc_gw",
                                   "-m",
                                   "auth_serv/hlr_auc_gw.milenage_db",
                                   "GSM-AUTH-REQ 232010000000009 " + rand])
    if "GSM-AUTH-RESP" not in res:
        raise Exception("Unexpected hlr_auc_gw response")
    resp = res.split(' ')[2].rstrip()
    dev[0].request("CTRL-RSP-SIM-" + rid + ":GSM-AUTH:" + resp)
    dev[0].wait_connected()
    dev[0].request("DISCONNECT")
    dev[0].wait_disconnected()
def test_ap_wpa2_eap_sim_ext_replace_sim3(dev, apdev):
    """EAP-SIM with external GSM auth, replacing SIM, and no identity in config"""
    try:
        _test_ap_wpa2_eap_sim_ext_replace_sim3(dev, apdev)
    finally:
        dev[0].request("SET external_sim 0")

def _test_ap_wpa2_eap_sim_ext_replace_sim3(dev, apdev):
    check_hlr_auc_gw_support()
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hostapd.add_ap(apdev[0], params)
    dev[0].request("SET external_sim 1")
    id = dev[0].connect("test-wpa2-eap", eap="SIM", key_mgmt="WPA-EAP",
                        wait_connect=False, scan_freq="2412")
    ev = dev[0].wait_event(["CTRL-REQ-IDENTITY"])
    if ev is None:
        raise Exception("Request for identity timed out")
    rid = ev.split(':')[0].split('-')[-1]
    dev[0].request("CTRL-RSP-IDENTITY-" + rid + ":1232010000000000")
    ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
    if ev is None:
        raise Exception("Wait for external SIM processing request timed out")
    p = ev.split(':', 2)
    if p[1] != "GSM-AUTH":
        raise Exception("Unexpected CTRL-REQ-SIM type")
    rid = p[0].split('-')[3]
    rand = p[2].split(' ')[0]
    res = subprocess.check_output(["../../hostapd/hlr_auc_gw",
                                   "-m",
                                   "auth_serv/hlr_auc_gw.milenage_db",
                                   "GSM-AUTH-REQ 232010000000000 " + rand])
    if "GSM-AUTH-RESP" not in res:
        raise Exception("Unexpected hlr_auc_gw response")
    resp = res.split(' ')[2].rstrip()
    dev[0].request("CTRL-RSP-SIM-" + rid + ":GSM-AUTH:" + resp)
    dev[0].wait_connected(timeout=15)
    dev[0].request("DISCONNECT")
    dev[0].wait_disconnected()

    # Replace SIM and drop the previous permanent and pseudonym identities
    dev[0].set_network(id, "identity", "NULL")
    dev[0].set_network(id, "anonymous_identity", "NULL")
    dev[0].select_network(id, freq="2412")

    ev = dev[0].wait_event(["CTRL-REQ-IDENTITY"])
    if ev is None:
        raise Exception("Request for identity timed out")
    rid = ev.split(':')[0].split('-')[-1]
    dev[0].request("CTRL-RSP-IDENTITY-" + rid + ":1232010000000009")
    ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
    if ev is None:
        raise Exception("Wait for external SIM processing request timed out")
    p = ev.split(':', 2)
    if p[1] != "GSM-AUTH":
        raise Exception("Unexpected CTRL-REQ-SIM type")
    rid = p[0].split('-')[3]
    rand = p[2].split(' ')[0]
    res = subprocess.check_output(["../../hostapd/hlr_auc_gw",
                                   "-m",
                                   "auth_serv/hlr_auc_gw.milenage_db",
                                   "GSM-AUTH-REQ 232010000000009 " + rand])
    if "GSM-AUTH-RESP" not in res:
        raise Exception("Unexpected hlr_auc_gw response")
    resp = res.split(' ')[2].rstrip()
    dev[0].request("CTRL-RSP-SIM-" + rid + ":GSM-AUTH:" + resp)
    dev[0].wait_connected()
    dev[0].request("DISCONNECT")
    dev[0].wait_disconnected()
def test_ap_wpa2_eap_sim_ext_auth_fail(dev, apdev):
    """EAP-SIM with external GSM auth and auth failing"""
    try:
        _test_ap_wpa2_eap_sim_ext_auth_fail(dev, apdev)
    finally:
        dev[0].request("SET external_sim 0")

def _test_ap_wpa2_eap_sim_ext_auth_fail(dev, apdev):
    check_hlr_auc_gw_support()
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hostapd.add_ap(apdev[0], params)
    dev[0].request("SET external_sim 1")
    id = dev[0].connect("test-wpa2-eap", eap="SIM", key_mgmt="WPA-EAP",
                        identity="1232010000000000",
                        wait_connect=False, scan_freq="2412")
    ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
    if ev is None:
        raise Exception("Wait for external SIM processing request timed out")
    p = ev.split(':', 2)
    rid = p[0].split('-')[3]
    dev[0].request("CTRL-RSP-SIM-" + rid + ":GSM-FAIL")
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=5)
    if ev is None:
        raise Exception("EAP failure not reported")
    dev[0].request("REMOVE_NETWORK all")
    dev[0].wait_disconnected()
def test_ap_wpa2_eap_sim_change_bssid(dev, apdev):
    """EAP-SIM and external GSM auth to check fast reauth with bssid change"""
    try:
        _test_ap_wpa2_eap_sim_change_bssid(dev, apdev)
    finally:
        dev[0].request("SET external_sim 0")

def _test_ap_wpa2_eap_sim_change_bssid(dev, apdev):
    check_hlr_auc_gw_support()
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hostapd.add_ap(apdev[0], params)
    dev[0].request("SET external_sim 1")
    id = dev[0].connect("test-wpa2-eap", eap="SIM", key_mgmt="WPA-EAP",
                        identity="1232010000000000",
                        wait_connect=False, scan_freq="2412")
    ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
    if ev is None:
        raise Exception("Wait for external SIM processing request timed out")
    p = ev.split(':', 2)
    if p[1] != "GSM-AUTH":
        raise Exception("Unexpected CTRL-REQ-SIM type")
    rid = p[0].split('-')[3]
    rand = p[2].split(' ')[0]
    res = subprocess.check_output(["../../hostapd/hlr_auc_gw",
                                   "-m",
                                   "auth_serv/hlr_auc_gw.milenage_db",
                                   "GSM-AUTH-REQ 232010000000000 " + rand])
    if "GSM-AUTH-RESP" not in res:
        raise Exception("Unexpected hlr_auc_gw response")
    resp = res.split(' ')[2].rstrip()
    dev[0].request("CTRL-RSP-SIM-" + rid + ":GSM-AUTH:" + resp)
    dev[0].wait_connected(timeout=15)

    # Verify that EAP-SIM Reauthentication can be used after a profile change
    # that does not affect EAP parameters.
    dev[0].set_network(id, "bssid", "any")
    eap_reauth(dev[0], "SIM")
def test_ap_wpa2_eap_sim_no_change_set(dev, apdev):
    """EAP-SIM and external GSM auth to check fast reauth with no-change SET_NETWORK"""
    try:
        _test_ap_wpa2_eap_sim_no_change_set(dev, apdev)
    finally:
        dev[0].request("SET external_sim 0")

def _test_ap_wpa2_eap_sim_no_change_set(dev, apdev):
    check_hlr_auc_gw_support()
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hostapd.add_ap(apdev[0], params)
    dev[0].request("SET external_sim 1")
    id = dev[0].connect("test-wpa2-eap", eap="SIM", key_mgmt="WPA-EAP",
                        identity="1232010000000000",
                        wait_connect=False, scan_freq="2412")
    ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
    if ev is None:
        raise Exception("Wait for external SIM processing request timed out")
    p = ev.split(':', 2)
    if p[1] != "GSM-AUTH":
        raise Exception("Unexpected CTRL-REQ-SIM type")
    rid = p[0].split('-')[3]
    rand = p[2].split(' ')[0]
    res = subprocess.check_output(["../../hostapd/hlr_auc_gw",
                                   "-m",
                                   "auth_serv/hlr_auc_gw.milenage_db",
                                   "GSM-AUTH-REQ 232010000000000 " + rand])
    if "GSM-AUTH-RESP" not in res:
        raise Exception("Unexpected hlr_auc_gw response")
    resp = res.split(' ')[2].rstrip()
    dev[0].request("CTRL-RSP-SIM-" + rid + ":GSM-AUTH:" + resp)
    dev[0].wait_connected(timeout=15)

    # Verify that EAP-SIM Reauthentication can be used after network profile
    # SET_NETWORK commands that do not actually change previously set
    # parameter values.
    dev[0].set_network(id, "key_mgmt", "WPA-EAP")
    dev[0].set_network(id, "eap", "SIM")
    dev[0].set_network_quoted(id, "identity", "1232010000000000")
    dev[0].set_network_quoted(id, "ssid", "test-wpa2-eap")
    eap_reauth(dev[0], "SIM")
def test_ap_wpa2_eap_sim_oom(dev, apdev):
    """EAP-SIM and OOM"""
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hostapd.add_ap(apdev[0], params)
    tests = [ (1, "milenage_f2345"),
              (2, "milenage_f2345"),
              (3, "milenage_f2345"),
              (4, "milenage_f2345"),
              (5, "milenage_f2345"),
              (6, "milenage_f2345"),
              (7, "milenage_f2345"),
              (8, "milenage_f2345"),
              (9, "milenage_f2345"),
              (10, "milenage_f2345"),
              (11, "milenage_f2345"),
              (12, "milenage_f2345") ]
    for count, func in tests:
        with fail_test(dev[0], count, func):
            dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="SIM",
                           identity="1232010000000000",
                           password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581",
                           wait_connect=False, scan_freq="2412")
            ev = dev[0].wait_event(["CTRL-EVENT-EAP-METHOD"], timeout=5)
            if ev is None:
                raise Exception("EAP method not selected")
            dev[0].wait_disconnected()
            dev[0].request("REMOVE_NETWORK all")
def test_ap_wpa2_eap_aka(dev, apdev):
    """WPA2-Enterprise connection using EAP-AKA"""
    check_hlr_auc_gw_support()
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hapd = hostapd.add_ap(apdev[0], params)
    eap_connect(dev[0], hapd, "AKA", "0232010000000000",
                password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581:000000000123")
    hwsim_utils.test_connectivity(dev[0], hapd)

    eap_reauth(dev[0], "AKA")

    logger.info("Negative test with incorrect key")
    dev[0].request("REMOVE_NETWORK all")
    eap_connect(dev[0], hapd, "AKA", "0232010000000000",
                password="ffdca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581:000000000123",
                expect_failure=True)

    logger.info("Invalid Milenage key")
    dev[0].request("REMOVE_NETWORK all")
    eap_connect(dev[0], hapd, "AKA", "0232010000000000",
                password="ffdca4eda45b53cf0f12d7c9c3bc6a",
                expect_failure=True)

    logger.info("Invalid Milenage key(2)")
    eap_connect(dev[0], hapd, "AKA", "0232010000000000",
                password="ffdca4eda45b53cf0f12d7c9c3bc6a8q:cb9cccc4b9258e6dca4760379fb82581:000000000123",
                expect_failure=True)

    logger.info("Invalid Milenage key(3)")
    eap_connect(dev[0], hapd, "AKA", "0232010000000000",
                password="ffdca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb8258q:000000000123",
                expect_failure=True)

    logger.info("Invalid Milenage key(4)")
    eap_connect(dev[0], hapd, "AKA", "0232010000000000",
                password="ffdca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581:00000000012q",
                expect_failure=True)

    logger.info("Invalid Milenage key(5)")
    dev[0].request("REMOVE_NETWORK all")
    eap_connect(dev[0], hapd, "AKA", "0232010000000000",
                password="ffdca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581q000000000123",
                expect_failure=True)

    logger.info("Invalid Milenage key(6)")
    dev[0].request("REMOVE_NETWORK all")
    eap_connect(dev[0], hapd, "AKA", "0232010000000000",
                password="ffdca4eda45b53cf0f12d7c9c3bc6a89qcb9cccc4b9258e6dca4760379fb82581q000000000123",
                expect_failure=True)

    logger.info("Missing key configuration")
    dev[0].request("REMOVE_NETWORK all")
    eap_connect(dev[0], hapd, "AKA", "0232010000000000",
                expect_failure=True)
def test_ap_wpa2_eap_aka_sql(dev, apdev, params):
"""WPA2-Enterprise connection using EAP-AKA (SQL)"""
check_hlr_auc_gw_support()
try:
import sqlite3
except ImportError:
raise HwsimSkip("No sqlite3 module available")
con = sqlite3.connect(os.path.join(params['logdir'], "hostapd.db"))
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
params['auth_server_port'] = "1814"
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "AKA", "0232010000000000",
password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581:000000000123")
logger.info("AKA fast re-authentication")
eap_reauth(dev[0], "AKA")
logger.info("AKA full auth with pseudonym")
with con:
cur = con.cursor()
cur.execute("DELETE FROM reauth WHERE permanent='0232010000000000'")
eap_reauth(dev[0], "AKA")
logger.info("AKA full auth with permanent identity")
with con:
cur = con.cursor()
cur.execute("DELETE FROM reauth WHERE permanent='0232010000000000'")
cur.execute("DELETE FROM pseudonyms WHERE permanent='0232010000000000'")
eap_reauth(dev[0], "AKA")
logger.info("AKA reauth with mismatching MK")
with con:
cur = con.cursor()
cur.execute("UPDATE reauth SET mk='0000000000000000000000000000000000000000' WHERE permanent='0232010000000000'")
eap_reauth(dev[0], "AKA", expect_failure=True)
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "AKA", "0232010000000000",
password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581:000000000123")
with con:
cur = con.cursor()
cur.execute("UPDATE reauth SET counter='10' WHERE permanent='0232010000000000'")
eap_reauth(dev[0], "AKA")
with con:
cur = con.cursor()
cur.execute("UPDATE reauth SET counter='10' WHERE permanent='0232010000000000'")
logger.info("AKA reauth with mismatching counter")
eap_reauth(dev[0], "AKA")
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "AKA", "0232010000000000",
password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581:000000000123")
with con:
cur = con.cursor()
cur.execute("UPDATE reauth SET counter='1001' WHERE permanent='0232010000000000'")
logger.info("AKA reauth with max reauth count reached")
eap_reauth(dev[0], "AKA")
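The DELETE/UPDATE statements above drive hostapd's EAP-SIM/AKA SQL database into specific states to exercise each re-authentication path. The pattern can be captured in a small stand-alone helper; this is a minimal sketch (the helper name and the schema subset are assumptions, not part of the hwsim test API):

```python
import sqlite3

def force_full_auth(con, permanent):
    """Drop fast re-auth state and pseudonyms for one permanent identity.

    After this, the next EAP-AKA exchange for that identity has to run
    full authentication. 'con' is an sqlite3 connection to the database
    hostapd was configured with (eap_sim_db), 'permanent' is the
    IMSI-based permanent identity used as the key.
    """
    with con:
        cur = con.cursor()
        cur.execute("DELETE FROM reauth WHERE permanent=?", (permanent,))
        cur.execute("DELETE FROM pseudonyms WHERE permanent=?", (permanent,))
```

Parameterized queries avoid quoting issues; the test code above inlines the identity literal instead, which is equivalent here since the value is a fixed test constant.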
def test_ap_wpa2_eap_aka_config(dev, apdev):
"""EAP-AKA configuration options"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "AKA", "0232010000000000",
password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581:000000000123",
anonymous_identity="2345678")
def test_ap_wpa2_eap_aka_ext(dev, apdev):
"""WPA2-Enterprise connection using EAP-AKA and external UMTS auth"""
try:
_test_ap_wpa2_eap_aka_ext(dev, apdev)
finally:
dev[0].request("SET external_sim 0")
def _test_ap_wpa2_eap_aka_ext(dev, apdev):
check_hlr_auc_gw_support()
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
dev[0].request("SET external_sim 1")
id = dev[0].connect("test-wpa2-eap", eap="AKA", key_mgmt="WPA-EAP",
identity="0232010000000000",
password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581:000000000123",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-METHOD"], timeout=15)
if ev is None:
raise Exception("Wait for EAP method selection timed out")
ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
if ev is None:
raise Exception("Wait for external SIM processing request timed out")
p = ev.split(':', 2)
if p[1] != "UMTS-AUTH":
raise Exception("Unexpected CTRL-REQ-SIM type")
rid = p[0].split('-')[3]
# IK:CK:RES
resp = "00112233445566778899aabbccddeeff:00112233445566778899aabbccddeeff:0011223344"
# Respond with GSM-AUTH to a UMTS-AUTH request; this fails during EAP
# processing, but the ctrl_iface command itself succeeds
dev[0].request("CTRL-RSP-SIM-" + rid + ":GSM-AUTH:" + resp)
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=15)
if ev is None:
raise Exception("EAP failure not reported")
dev[0].request("DISCONNECT")
dev[0].wait_disconnected()
time.sleep(0.1)
dev[0].dump_monitor()
dev[0].select_network(id, freq="2412")
ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
if ev is None:
raise Exception("Wait for external SIM processing request timed out")
p = ev.split(':', 2)
if p[1] != "UMTS-AUTH":
raise Exception("Unexpected CTRL-REQ-SIM type")
rid = p[0].split('-')[3]
# This will fail during UMTS auth validation
if "OK" not in dev[0].request("CTRL-RSP-SIM-" + rid + ":UMTS-AUTS:112233445566778899aabbccddee"):
raise Exception("CTRL-RSP-SIM failed")
ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
if ev is None:
raise Exception("Wait for external SIM processing request timed out")
p = ev.split(':', 2)
if p[1] != "UMTS-AUTH":
raise Exception("Unexpected CTRL-REQ-SIM type")
rid = p[0].split('-')[3]
# This will fail during UMTS auth validation
if "OK" not in dev[0].request("CTRL-RSP-SIM-" + rid + ":UMTS-AUTS:12"):
raise Exception("CTRL-RSP-SIM failed")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=15)
if ev is None:
raise Exception("EAP failure not reported")
dev[0].request("DISCONNECT")
dev[0].wait_disconnected()
time.sleep(0.1)
dev[0].dump_monitor()
tests = [ ":UMTS-AUTH:00112233445566778899aabbccddeeff:00112233445566778899aabbccddeeff:0011223344",
":UMTS-AUTH:34",
":UMTS-AUTH:00112233445566778899aabbccddeeff.00112233445566778899aabbccddeeff:0011223344",
":UMTS-AUTH:00112233445566778899aabbccddeeff:00112233445566778899aabbccddee:0011223344",
":UMTS-AUTH:00112233445566778899aabbccddeeff:00112233445566778899aabbccddeeff.0011223344",
":UMTS-AUTH:00112233445566778899aabbccddeeff:00112233445566778899aabbccddeeff:00112233445566778899aabbccddeeff0011223344",
":UMTS-AUTH:00112233445566778899aabbccddeeff:00112233445566778899aabbccddeeff:001122334q" ]
for t in tests:
dev[0].select_network(id, freq="2412")
ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
if ev is None:
raise Exception("Wait for external SIM processing request timed out")
p = ev.split(':', 2)
if p[1] != "UMTS-AUTH":
raise Exception("Unexpected CTRL-REQ-SIM type")
rid = p[0].split('-')[3]
# This will fail during UMTS auth validation
if "OK" not in dev[0].request("CTRL-RSP-SIM-" + rid + t):
raise Exception("CTRL-RSP-SIM failed")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=15)
if ev is None:
raise Exception("EAP failure not reported")
dev[0].request("DISCONNECT")
dev[0].wait_disconnected()
time.sleep(0.1)
dev[0].dump_monitor()
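The malformed responses in the loop above probe the UMTS-AUTH response parser: the expected shape is IK:CK:RES with hex-encoded fields of bounded length. A rough validator capturing the format-level constraints being tested might look like the sketch below (an illustration, not wpa_supplicant's actual parser; note it cannot reject the first vector, which is well-formed but cryptographically wrong):

```python
def valid_umts_auth_resp(resp):
    """Shape check for a UMTS-AUTH control response of the form IK:CK:RES.

    Assumed constraints, mirroring the malformed vectors above: three
    colon-separated hex fields, IK and CK exactly 16 octets each, RES
    between 4 and 16 octets.
    """
    parts = resp.split(':')
    if len(parts) != 3:
        return False
    try:
        # bytes.fromhex() raises ValueError on non-hex or odd-length input
        ik, ck, res = (bytes.fromhex(p) for p in parts)
    except ValueError:
        return False
    return len(ik) == 16 and len(ck) == 16 and 4 <= len(res) <= 16
```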
def test_ap_wpa2_eap_aka_ext_auth_fail(dev, apdev):
"""EAP-AKA with external UMTS auth and auth failing"""
try:
_test_ap_wpa2_eap_aka_ext_auth_fail(dev, apdev)
finally:
dev[0].request("SET external_sim 0")
def _test_ap_wpa2_eap_aka_ext_auth_fail(dev, apdev):
check_hlr_auc_gw_support()
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
dev[0].request("SET external_sim 1")
id = dev[0].connect("test-wpa2-eap", eap="AKA", key_mgmt="WPA-EAP",
identity="0232010000000000",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
if ev is None:
raise Exception("Wait for external SIM processing request timed out")
p = ev.split(':', 2)
rid = p[0].split('-')[3]
dev[0].request("CTRL-RSP-SIM-" + rid + ":UMTS-FAIL")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=5)
if ev is None:
raise Exception("EAP failure not reported")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
def test_ap_wpa2_eap_aka_prime(dev, apdev):
"""WPA2-Enterprise connection using EAP-AKA'"""
check_hlr_auc_gw_support()
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "AKA'", "6555444333222111",
password="5122250214c33e723a5dd523fc145fc0:981d464c7c52eb6e5036234984ad0bcf:000000000123")
hwsim_utils.test_connectivity(dev[0], hapd)
eap_reauth(dev[0], "AKA'")
logger.info("EAP-AKA' bidding protection when EAP-AKA enabled as well")
dev[1].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="AKA' AKA",
identity="6555444333222111@both",
password="5122250214c33e723a5dd523fc145fc0:981d464c7c52eb6e5036234984ad0bcf:000000000123",
wait_connect=False, scan_freq="2412")
dev[1].wait_connected(timeout=15)
logger.info("Negative test with incorrect key")
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "AKA'", "6555444333222111",
password="ff22250214c33e723a5dd523fc145fc0:981d464c7c52eb6e5036234984ad0bcf:000000000123",
expect_failure=True)
def test_ap_wpa2_eap_aka_prime_sql(dev, apdev, params):
"""WPA2-Enterprise connection using EAP-AKA' (SQL)"""
check_hlr_auc_gw_support()
try:
import sqlite3
except ImportError:
raise HwsimSkip("No sqlite3 module available")
con = sqlite3.connect(os.path.join(params['logdir'], "hostapd.db"))
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
params['auth_server_port'] = "1814"
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "AKA'", "6555444333222111",
password="5122250214c33e723a5dd523fc145fc0:981d464c7c52eb6e5036234984ad0bcf:000000000123")
logger.info("AKA' fast re-authentication")
eap_reauth(dev[0], "AKA'")
logger.info("AKA' full auth with pseudonym")
with con:
cur = con.cursor()
cur.execute("DELETE FROM reauth WHERE permanent='6555444333222111'")
eap_reauth(dev[0], "AKA'")
logger.info("AKA' full auth with permanent identity")
with con:
cur = con.cursor()
cur.execute("DELETE FROM reauth WHERE permanent='6555444333222111'")
cur.execute("DELETE FROM pseudonyms WHERE permanent='6555444333222111'")
eap_reauth(dev[0], "AKA'")
logger.info("AKA' reauth with mismatching k_aut")
with con:
cur = con.cursor()
cur.execute("UPDATE reauth SET k_aut='0000000000000000000000000000000000000000000000000000000000000000' WHERE permanent='6555444333222111'")
eap_reauth(dev[0], "AKA'", expect_failure=True)
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "AKA'", "6555444333222111",
password="5122250214c33e723a5dd523fc145fc0:981d464c7c52eb6e5036234984ad0bcf:000000000123")
with con:
cur = con.cursor()
cur.execute("UPDATE reauth SET counter='10' WHERE permanent='6555444333222111'")
eap_reauth(dev[0], "AKA'")
with con:
cur = con.cursor()
cur.execute("UPDATE reauth SET counter='10' WHERE permanent='6555444333222111'")
logger.info("AKA' reauth with mismatching counter")
eap_reauth(dev[0], "AKA'")
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "AKA'", "6555444333222111",
password="5122250214c33e723a5dd523fc145fc0:981d464c7c52eb6e5036234984ad0bcf:000000000123")
with con:
cur = con.cursor()
cur.execute("UPDATE reauth SET counter='1001' WHERE permanent='6555444333222111'")
logger.info("AKA' reauth with max reauth count reached")
eap_reauth(dev[0], "AKA'")
def test_ap_wpa2_eap_aka_prime_ext_auth_fail(dev, apdev):
"""EAP-AKA' with external UMTS auth and auth failing"""
try:
_test_ap_wpa2_eap_aka_prime_ext_auth_fail(dev, apdev)
finally:
dev[0].request("SET external_sim 0")
def _test_ap_wpa2_eap_aka_prime_ext_auth_fail(dev, apdev):
check_hlr_auc_gw_support()
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
dev[0].request("SET external_sim 1")
id = dev[0].connect("test-wpa2-eap", eap="AKA'", key_mgmt="WPA-EAP",
identity="6555444333222111",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-REQ-SIM"], timeout=15)
if ev is None:
raise Exception("Wait for external SIM processing request timed out")
p = ev.split(':', 2)
rid = p[0].split('-')[3]
dev[0].request("CTRL-RSP-SIM-" + rid + ":UMTS-FAIL")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=5)
if ev is None:
raise Exception("EAP failure not reported")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
def test_ap_wpa2_eap_ttls_pap(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/PAP"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
key_mgmt = hapd.get_config()['key_mgmt']
if key_mgmt.split(' ')[0] != "WPA-EAP":
raise Exception("Unexpected GET_CONFIG(key_mgmt): " + key_mgmt)
eap_connect(dev[0], hapd, "TTLS", "pap user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=PAP")
hwsim_utils.test_connectivity(dev[0], hapd)
eap_reauth(dev[0], "TTLS")
check_mib(dev[0], [ ("dot11RSNAAuthenticationSuiteRequested", "00-0f-ac-1"),
("dot11RSNAAuthenticationSuiteSelected", "00-0f-ac-1") ])
def test_ap_wpa2_eap_ttls_pap_subject_match(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/PAP and (alt)subject_match"""
check_subject_match_support(dev[0])
check_altsubject_match_support(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "pap user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=PAP",
subject_match="/C=FI/O=w1.fi/CN=server.w1.fi",
altsubject_match="EMAIL:noone@example.com;DNS:server.w1.fi;URI:http://example.com/")
eap_reauth(dev[0], "TTLS")
def test_ap_wpa2_eap_ttls_pap_incorrect_password(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/PAP - incorrect password"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "pap user",
anonymous_identity="ttls", password="wrong",
ca_cert="auth_serv/ca.pem", phase2="auth=PAP",
expect_failure=True)
eap_connect(dev[1], hapd, "TTLS", "user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=PAP",
expect_failure=True)
def test_ap_wpa2_eap_ttls_chap(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/CHAP"""
skip_with_fips(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "chap user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.der", phase2="auth=CHAP")
hwsim_utils.test_connectivity(dev[0], hapd)
eap_reauth(dev[0], "TTLS")
def test_ap_wpa2_eap_ttls_chap_altsubject_match(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/CHAP and altsubject_match"""
skip_with_fips(dev[0])
check_altsubject_match_support(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "chap user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.der", phase2="auth=CHAP",
altsubject_match="EMAIL:noone@example.com;URI:http://example.com/;DNS:server.w1.fi")
eap_reauth(dev[0], "TTLS")
def test_ap_wpa2_eap_ttls_chap_incorrect_password(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/CHAP - incorrect password"""
skip_with_fips(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "chap user",
anonymous_identity="ttls", password="wrong",
ca_cert="auth_serv/ca.pem", phase2="auth=CHAP",
expect_failure=True)
eap_connect(dev[1], hapd, "TTLS", "user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=CHAP",
expect_failure=True)
def test_ap_wpa2_eap_ttls_mschap(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/MSCHAP"""
skip_with_fips(dev[0])
check_domain_suffix_match(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "mschap user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAP",
domain_suffix_match="server.w1.fi")
hwsim_utils.test_connectivity(dev[0], hapd)
eap_reauth(dev[0], "TTLS")
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "TTLS", "mschap user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAP",
fragment_size="200")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
eap_connect(dev[0], hapd, "TTLS", "mschap user",
anonymous_identity="ttls",
password_hex="hash:8846f7eaee8fb117ad06bdd830b7586c",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAP")
def test_ap_wpa2_eap_ttls_mschap_incorrect_password(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/MSCHAP - incorrect password"""
skip_with_fips(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "mschap user",
anonymous_identity="ttls", password="wrong",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAP",
expect_failure=True)
eap_connect(dev[1], hapd, "TTLS", "user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAP",
expect_failure=True)
eap_connect(dev[2], hapd, "TTLS", "no such user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAP",
expect_failure=True)
def test_ap_wpa2_eap_ttls_mschapv2(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/MSCHAPv2"""
check_domain_suffix_match(dev[0])
check_eap_capa(dev[0], "MSCHAPV2")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "DOMAIN\\mschapv2 user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
domain_suffix_match="server.w1.fi")
hwsim_utils.test_connectivity(dev[0], hapd)
sta1 = hapd.get_sta(dev[0].p2p_interface_addr())
eapol1 = hapd.get_sta(dev[0].p2p_interface_addr(), info="eapol")
eap_reauth(dev[0], "TTLS")
sta2 = hapd.get_sta(dev[0].p2p_interface_addr())
eapol2 = hapd.get_sta(dev[0].p2p_interface_addr(), info="eapol")
if int(sta2['dot1xAuthEapolFramesRx']) <= int(sta1['dot1xAuthEapolFramesRx']):
raise Exception("dot1xAuthEapolFramesRx did not increase")
if int(eapol2['authAuthEapStartsWhileAuthenticated']) < 1:
raise Exception("authAuthEapStartsWhileAuthenticated did not increase")
if int(eapol2['backendAuthSuccesses']) <= int(eapol1['backendAuthSuccesses']):
raise Exception("backendAuthSuccesses did not increase")
logger.info("Password as hash value")
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "TTLS", "DOMAIN\\mschapv2 user",
anonymous_identity="ttls",
password_hex="hash:8846f7eaee8fb117ad06bdd830b7586c",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2")
def test_ap_wpa2_eap_ttls_invalid_phase2(dev, apdev):
"""EAP-TTLS with invalid phase2 parameter values"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
tests = [ "auth=MSCHAPv2", "auth=MSCHAPV2 autheap=MD5",
"autheap=MD5 auth=MSCHAPV2", "auth=PAP auth=CHAP",
"autheap=MD5 autheap=FOO autheap=MSCHAPV2" ]
for t in tests:
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="DOMAIN\\mschapv2 user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2=t,
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-PROPOSED-METHOD"], timeout=10)
if ev is None or "method=21" not in ev:
raise Exception("EAP-TTLS not started")
ev = dev[0].wait_event(["EAP: Failed to initialize EAP method",
"CTRL-EVENT-CONNECTED"], timeout=5)
if ev is None or "CTRL-EVENT-CONNECTED" in ev:
raise Exception("No EAP-TTLS failure reported for phase2=" + t)
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
dev[0].dump_monitor()
def test_ap_wpa2_eap_ttls_mschapv2_suffix_match(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/MSCHAPv2 (domain_suffix_match)"""
check_domain_match_full(dev[0])
skip_with_fips(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "DOMAIN\\mschapv2 user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
domain_suffix_match="w1.fi")
hwsim_utils.test_connectivity(dev[0], hapd)
eap_reauth(dev[0], "TTLS")
def test_ap_wpa2_eap_ttls_mschapv2_domain_match(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/MSCHAPv2 (domain_match)"""
check_domain_match(dev[0])
skip_with_fips(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "DOMAIN\\mschapv2 user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
domain_match="Server.w1.fi")
hwsim_utils.test_connectivity(dev[0], hapd)
eap_reauth(dev[0], "TTLS")
def test_ap_wpa2_eap_ttls_mschapv2_incorrect_password(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/MSCHAPv2 - incorrect password"""
skip_with_fips(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "DOMAIN\\mschapv2 user",
anonymous_identity="ttls", password="password1",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
expect_failure=True)
eap_connect(dev[1], hapd, "TTLS", "user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
expect_failure=True)
def test_ap_wpa2_eap_ttls_mschapv2_utf8(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/MSCHAPv2 and UTF-8 password"""
skip_with_fips(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "utf8-user-hash",
anonymous_identity="ttls", password="secret-åäö-€-password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2")
eap_connect(dev[1], hapd, "TTLS", "utf8-user",
anonymous_identity="ttls",
password_hex="hash:bd5844fad2489992da7fe8c5a01559cf",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2")
for p in [ "80", "41c041e04141e041", 257*"41" ]:
dev[2].connect("test-wpa2-eap", key_mgmt="WPA-EAP",
eap="TTLS", identity="utf8-user-hash",
anonymous_identity="ttls", password_hex=p,
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
wait_connect=False, scan_freq="2412")
ev = dev[2].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=1)
if ev is None:
raise Exception("No failure reported")
dev[2].request("REMOVE_NETWORK all")
dev[2].wait_disconnected()
def test_ap_wpa2_eap_ttls_eap_gtc(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/EAP-GTC"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="autheap=GTC")
hwsim_utils.test_connectivity(dev[0], hapd)
eap_reauth(dev[0], "TTLS")
def test_ap_wpa2_eap_ttls_eap_gtc_incorrect_password(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/EAP-GTC - incorrect password"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "user",
anonymous_identity="ttls", password="wrong",
ca_cert="auth_serv/ca.pem", phase2="autheap=GTC",
expect_failure=True)
def test_ap_wpa2_eap_ttls_eap_gtc_no_password(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/EAP-GTC - no password"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "user-no-passwd",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="autheap=GTC",
expect_failure=True)
def test_ap_wpa2_eap_ttls_eap_gtc_server_oom(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/EAP-GTC - server OOM"""
params = int_eap_server_params()
hapd = hostapd.add_ap(apdev[0], params)
with alloc_fail(hapd, 1, "eap_gtc_init"):
eap_connect(dev[0], hapd, "TTLS", "user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="autheap=GTC",
expect_failure=True)
dev[0].request("REMOVE_NETWORK all")
with alloc_fail(hapd, 1, "eap_gtc_buildReq"):
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP WPA-EAP-SHA256",
eap="TTLS", identity="user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="autheap=GTC",
wait_connect=False, scan_freq="2412")
# This would eventually time out, but we can stop after having reached
# the allocation failure.
for i in range(20):
time.sleep(0.1)
if hapd.request("GET_ALLOC_FAIL").startswith('0'):
break
def test_ap_wpa2_eap_ttls_eap_gtc_oom(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/EAP-GTC (OOM)"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
tests = [ "eap_gtc_init",
"eap_msg_alloc;eap_gtc_process" ]
for func in tests:
with alloc_fail(dev[0], 1, func):
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP",
scan_freq="2412",
eap="TTLS", identity="user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="autheap=GTC",
wait_connect=False)
wait_fail_trigger(dev[0], "GET_ALLOC_FAIL")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
def test_ap_wpa2_eap_ttls_eap_md5(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/EAP-MD5"""
check_eap_capa(dev[0], "MD5")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="autheap=MD5")
hwsim_utils.test_connectivity(dev[0], hapd)
eap_reauth(dev[0], "TTLS")
def test_ap_wpa2_eap_ttls_eap_md5_incorrect_password(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/EAP-MD5 - incorrect password"""
check_eap_capa(dev[0], "MD5")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "user",
anonymous_identity="ttls", password="wrong",
ca_cert="auth_serv/ca.pem", phase2="autheap=MD5",
expect_failure=True)
def test_ap_wpa2_eap_ttls_eap_md5_no_password(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/EAP-MD5 - no password"""
check_eap_capa(dev[0], "MD5")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "user-no-passwd",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="autheap=MD5",
expect_failure=True)
def test_ap_wpa2_eap_ttls_eap_md5_server_oom(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/EAP-MD5 - server OOM"""
check_eap_capa(dev[0], "MD5")
params = int_eap_server_params()
hapd = hostapd.add_ap(apdev[0], params)
with alloc_fail(hapd, 1, "eap_md5_init"):
eap_connect(dev[0], hapd, "TTLS", "user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="autheap=MD5",
expect_failure=True)
dev[0].request("REMOVE_NETWORK all")
with alloc_fail(hapd, 1, "eap_md5_buildReq"):
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP WPA-EAP-SHA256",
eap="TTLS", identity="user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="autheap=MD5",
wait_connect=False, scan_freq="2412")
# This would eventually time out, but we can stop after having reached
# the allocation failure.
for i in range(20):
time.sleep(0.1)
if hapd.request("GET_ALLOC_FAIL").startswith('0'):
break
def test_ap_wpa2_eap_ttls_eap_mschapv2(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/EAP-MSCHAPv2"""
check_eap_capa(dev[0], "MSCHAPV2")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="autheap=MSCHAPV2")
hwsim_utils.test_connectivity(dev[0], hapd)
eap_reauth(dev[0], "TTLS")
logger.info("Negative test with incorrect password")
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "TTLS", "user",
anonymous_identity="ttls", password="password1",
ca_cert="auth_serv/ca.pem", phase2="autheap=MSCHAPV2",
expect_failure=True)
def test_ap_wpa2_eap_ttls_eap_mschapv2_no_password(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/EAP-MSCHAPv2 - no password"""
check_eap_capa(dev[0], "MSCHAPV2")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "user-no-passwd",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="autheap=MSCHAPV2",
expect_failure=True)
def test_ap_wpa2_eap_ttls_eap_mschapv2_server_oom(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/EAP-MSCHAPv2 - server OOM"""
check_eap_capa(dev[0], "MSCHAPV2")
params = int_eap_server_params()
hapd = hostapd.add_ap(apdev[0], params)
with alloc_fail(hapd, 1, "eap_mschapv2_init"):
eap_connect(dev[0], hapd, "TTLS", "user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="autheap=MSCHAPV2",
expect_failure=True)
dev[0].request("REMOVE_NETWORK all")
with alloc_fail(hapd, 1, "eap_mschapv2_build_challenge"):
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP WPA-EAP-SHA256",
eap="TTLS", identity="user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="autheap=MSCHAPV2",
wait_connect=False, scan_freq="2412")
# This would eventually time out, but we can stop after having reached
# the allocation failure.
for i in range(20):
time.sleep(0.1)
if hapd.request("GET_ALLOC_FAIL").startswith('0'):
break
dev[0].request("REMOVE_NETWORK all")
with alloc_fail(hapd, 1, "eap_mschapv2_build_success_req"):
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP WPA-EAP-SHA256",
eap="TTLS", identity="user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="autheap=MSCHAPV2",
wait_connect=False, scan_freq="2412")
# This would eventually time out, but we can stop after having reached
# the allocation failure.
for i in range(20):
time.sleep(0.1)
if hapd.request("GET_ALLOC_FAIL").startswith('0'):
break
dev[0].request("REMOVE_NETWORK all")
with alloc_fail(hapd, 1, "eap_mschapv2_build_failure_req"):
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP WPA-EAP-SHA256",
eap="TTLS", identity="user",
anonymous_identity="ttls", password="wrong",
ca_cert="auth_serv/ca.pem", phase2="autheap=MSCHAPV2",
wait_connect=False, scan_freq="2412")
# This would eventually time out, but we can stop after having reached
# the allocation failure.
for i in range(20):
time.sleep(0.1)
if hapd.request("GET_ALLOC_FAIL").startswith('0'):
break
dev[0].request("REMOVE_NETWORK all")
def test_ap_wpa2_eap_ttls_eap_sim(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/EAP-SIM"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "1232010000000000",
anonymous_identity="1232010000000000@ttls",
password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581",
ca_cert="auth_serv/ca.pem", phase2="autheap=SIM")
eap_reauth(dev[0], "TTLS")
def run_ext_sim_auth(dev):
ev = dev.wait_event(["CTRL-REQ-SIM"], timeout=15)
if ev is None:
raise Exception("Wait for external SIM processing request timed out")
p = ev.split(':', 2)
if p[1] != "GSM-AUTH":
raise Exception("Unexpected CTRL-REQ-SIM type")
rid = p[0].split('-')[3]
rand = p[2].split(' ')[0]
res = subprocess.check_output(["../../hostapd/hlr_auc_gw",
"-m",
"auth_serv/hlr_auc_gw.milenage_db",
"GSM-AUTH-REQ 232010000000000 " + rand]).decode()
if "GSM-AUTH-RESP" not in res:
raise Exception("Unexpected hlr_auc_gw response")
resp = res.split(' ')[2].rstrip()
dev.request("CTRL-RSP-SIM-" + rid + ":GSM-AUTH:" + resp)
dev.wait_connected(timeout=15)
dev.dump_monitor()
dev.request("REAUTHENTICATE")
ev = dev.wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=5)
if ev is None:
raise Exception("EAP reauthentication did not succeed")
ev = dev.wait_event(["WPA: Key negotiation completed"], timeout=5)
if ev is None:
raise Exception("Key negotiation did not complete")
dev.dump_monitor()
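run_ext_sim_auth() above splits each CTRL-REQ-SIM event by hand; the event shape it assumes is `CTRL-REQ-SIM-<id>:<type>:<payload>`, with the RAND value as the first space-separated token of the payload. A small stand-alone parser sketch (the helper name is hypothetical, not part of the hwsim API):

```python
def parse_ctrl_req_sim(ev):
    """Split a CTRL-REQ-SIM event into (request id, auth type, payload).

    Example shape, matching what the test code above consumes:
        CTRL-REQ-SIM-<id>:GSM-AUTH:<rand> ... needed for SSID <ssid>
    """
    head, auth_type, payload = ev.split(':', 2)
    # head is "CTRL-REQ-SIM-<id>"; the id is the fourth '-'-separated token
    rid = head.split('-')[3]
    return rid, auth_type, payload
```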
def test_ap_wpa2_eap_ttls_eap_sim_ext(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/EAP-SIM and external GSM auth"""
check_hlr_auc_gw_support()
try:
run_ap_wpa2_eap_ttls_eap_sim_ext(dev, apdev)
finally:
dev[0].request("SET external_sim 0")
def run_ap_wpa2_eap_ttls_eap_sim_ext(dev, apdev):
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
dev[0].request("SET external_sim 1")
dev[0].connect("test-wpa2-eap", eap="TTLS", key_mgmt="WPA-EAP",
identity="1232010000000000",
anonymous_identity="1232010000000000@ttls",
password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581",
ca_cert="auth_serv/ca.pem", phase2="autheap=SIM",
wait_connect=False, scan_freq="2412")
run_ext_sim_auth(dev[0])
def test_ap_wpa2_eap_peap_eap_sim(dev, apdev):
"""WPA2-Enterprise connection using EAP-PEAP/EAP-SIM"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "PEAP", "1232010000000000",
anonymous_identity="1232010000000000@peap",
password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581",
ca_cert="auth_serv/ca.pem", phase2="auth=SIM")
eap_reauth(dev[0], "PEAP")
def test_ap_wpa2_eap_peap_eap_sim_ext(dev, apdev):
"""WPA2-Enterprise connection using EAP-PEAP/EAP-SIM and external GSM auth"""
check_hlr_auc_gw_support()
try:
run_ap_wpa2_eap_peap_eap_sim_ext(dev, apdev)
finally:
dev[0].request("SET external_sim 0")
def run_ap_wpa2_eap_peap_eap_sim_ext(dev, apdev):
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
dev[0].request("SET external_sim 1")
dev[0].connect("test-wpa2-eap", eap="PEAP", key_mgmt="WPA-EAP",
identity="1232010000000000",
anonymous_identity="1232010000000000@peap",
password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581",
ca_cert="auth_serv/ca.pem", phase2="auth=SIM",
wait_connect=False, scan_freq="2412")
run_ext_sim_auth(dev[0])
def test_ap_wpa2_eap_fast_eap_sim(dev, apdev):
"""WPA2-Enterprise connection using EAP-FAST/EAP-SIM"""
check_eap_capa(dev[0], "FAST")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "FAST", "1232010000000000",
anonymous_identity="1232010000000000@fast",
password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581",
phase1="fast_provisioning=2",
pac_file="blob://fast_pac_auth_sim",
ca_cert="auth_serv/ca.pem", phase2="auth=SIM")
eap_reauth(dev[0], "FAST")
def test_ap_wpa2_eap_fast_eap_sim_ext(dev, apdev):
"""WPA2-Enterprise connection using EAP-FAST/EAP-SIM and external GSM auth"""
check_hlr_auc_gw_support()
try:
run_ap_wpa2_eap_fast_eap_sim_ext(dev, apdev)
finally:
dev[0].request("SET external_sim 0")
def run_ap_wpa2_eap_fast_eap_sim_ext(dev, apdev):
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
dev[0].request("SET external_sim 1")
dev[0].connect("test-wpa2-eap", eap="PEAP", key_mgmt="WPA-EAP",
identity="1232010000000000",
anonymous_identity="1232010000000000@peap",
password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581",
phase1="fast_provisioning=2",
pac_file="blob://fast_pac_auth_sim",
ca_cert="auth_serv/ca.pem", phase2="auth=SIM",
wait_connect=False, scan_freq="2412")
run_ext_sim_auth(dev[0])
def test_ap_wpa2_eap_ttls_eap_aka(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/EAP-AKA"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "0232010000000000",
anonymous_identity="0232010000000000@ttls",
password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581:000000000123",
ca_cert="auth_serv/ca.pem", phase2="autheap=AKA")
eap_reauth(dev[0], "TTLS")
def test_ap_wpa2_eap_peap_eap_aka(dev, apdev):
"""WPA2-Enterprise connection using EAP-PEAP/EAP-AKA"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "PEAP", "0232010000000000",
anonymous_identity="0232010000000000@peap",
password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581:000000000123",
ca_cert="auth_serv/ca.pem", phase2="auth=AKA")
eap_reauth(dev[0], "PEAP")
def test_ap_wpa2_eap_fast_eap_aka(dev, apdev):
"""WPA2-Enterprise connection using EAP-FAST/EAP-AKA"""
check_eap_capa(dev[0], "FAST")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "FAST", "0232010000000000",
anonymous_identity="0232010000000000@fast",
password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581:000000000123",
phase1="fast_provisioning=2",
pac_file="blob://fast_pac_auth_aka",
ca_cert="auth_serv/ca.pem", phase2="auth=AKA")
eap_reauth(dev[0], "FAST")
def test_ap_wpa2_eap_peap_eap_mschapv2(dev, apdev):
"""WPA2-Enterprise connection using EAP-PEAP/EAP-MSCHAPv2"""
check_eap_capa(dev[0], "MSCHAPV2")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "PEAP", "user",
anonymous_identity="peap", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2")
hwsim_utils.test_connectivity(dev[0], hapd)
eap_reauth(dev[0], "PEAP")
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "PEAP", "user",
anonymous_identity="peap", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
fragment_size="200")
logger.info("Password as hash value")
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "PEAP", "user",
anonymous_identity="peap",
password_hex="hash:8846f7eaee8fb117ad06bdd830b7586c",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2")
logger.info("Negative test with incorrect password")
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "PEAP", "user",
anonymous_identity="peap", password="password1",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
expect_failure=True)
def test_ap_wpa2_eap_peap_eap_mschapv2_domain(dev, apdev):
"""WPA2-Enterprise connection using EAP-PEAP/EAP-MSCHAPv2 with domain"""
check_eap_capa(dev[0], "MSCHAPV2")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "PEAP", "DOMAIN\user3",
anonymous_identity="peap", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2")
hwsim_utils.test_connectivity(dev[0], hapd)
eap_reauth(dev[0], "PEAP")
def test_ap_wpa2_eap_peap_eap_mschapv2_incorrect_password(dev, apdev):
"""WPA2-Enterprise connection using EAP-PEAP/EAP-MSCHAPv2 - incorrect password"""
check_eap_capa(dev[0], "MSCHAPV2")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "PEAP", "user",
anonymous_identity="peap", password="wrong",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
expect_failure=True)
def test_ap_wpa2_eap_peap_crypto_binding(dev, apdev):
"""WPA2-Enterprise connection using EAP-PEAPv0/EAP-MSCHAPv2 and crypto binding"""
check_eap_capa(dev[0], "MSCHAPV2")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "PEAP", "user", password="password",
ca_cert="auth_serv/ca.pem",
phase1="peapver=0 crypto_binding=2",
phase2="auth=MSCHAPV2")
hwsim_utils.test_connectivity(dev[0], hapd)
eap_reauth(dev[0], "PEAP")
eap_connect(dev[1], hapd, "PEAP", "user", password="password",
ca_cert="auth_serv/ca.pem",
phase1="peapver=0 crypto_binding=1",
phase2="auth=MSCHAPV2")
eap_connect(dev[2], hapd, "PEAP", "user", password="password",
ca_cert="auth_serv/ca.pem",
phase1="peapver=0 crypto_binding=0",
phase2="auth=MSCHAPV2")
def test_ap_wpa2_eap_peap_crypto_binding_server_oom(dev, apdev):
"""WPA2-Enterprise connection using EAP-PEAPv0/EAP-MSCHAPv2 and crypto binding with server OOM"""
check_eap_capa(dev[0], "MSCHAPV2")
params = int_eap_server_params()
hapd = hostapd.add_ap(apdev[0], params)
with alloc_fail(hapd, 1, "eap_mschapv2_getKey"):
eap_connect(dev[0], hapd, "PEAP", "user", password="password",
ca_cert="auth_serv/ca.pem",
phase1="peapver=0 crypto_binding=2",
phase2="auth=MSCHAPV2",
expect_failure=True, local_error_report=True)
def test_ap_wpa2_eap_peap_params(dev, apdev):
"""WPA2-Enterprise connection using EAP-PEAPv0/EAP-MSCHAPv2 and various parameters"""
check_eap_capa(dev[0], "MSCHAPV2")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "PEAP", "user",
anonymous_identity="peap", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="peapver=0 peaplabel=1",
expect_failure=True)
dev[0].request("REMOVE_NETWORK all")
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="PEAP",
identity="user",
anonymous_identity="peap", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="peap_outer_success=0",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=15)
if ev is None:
raise Exception("No EAP success seen")
# The connection will not complete with peap_outer_success=0, so stop here.
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
eap_connect(dev[1], hapd, "PEAP", "user", password="password",
ca_cert="auth_serv/ca.pem",
phase1="peap_outer_success=1",
phase2="auth=MSCHAPV2")
eap_connect(dev[2], hapd, "PEAP", "user", password="password",
ca_cert="auth_serv/ca.pem",
phase1="peap_outer_success=2",
phase2="auth=MSCHAPV2")
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="PEAP",
identity="user",
anonymous_identity="peap", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="peapver=1 peaplabel=1",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=15)
if ev is None:
raise Exception("No EAP success seen")
ev = dev[0].wait_event(["CTRL-EVENT-CONNECTED"], timeout=1)
if ev is not None:
raise Exception("Unexpected connection")
tests = [ ("peap-ver0", ""),
("peap-ver1", ""),
("peap-ver0", "peapver=0"),
("peap-ver1", "peapver=1") ]
for anon,phase1 in tests:
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="PEAP",
identity="user", anonymous_identity=anon,
password="password", phase1=phase1,
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
scan_freq="2412")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
tests = [ ("peap-ver0", "peapver=1"),
("peap-ver1", "peapver=0") ]
for anon,phase1 in tests:
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="PEAP",
identity="user", anonymous_identity=anon,
password="password", phase1=phase1,
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=15)
if ev is None:
raise Exception("No EAP-Failure seen")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
eap_connect(dev[0], hapd, "PEAP", "user", password="password",
ca_cert="auth_serv/ca.pem",
phase1="tls_allow_md5=1 tls_disable_session_ticket=1 tls_disable_tlsv1_0=0 tls_disable_tlsv1_1=0 tls_disable_tlsv1_2=0 tls_ext_cert_check=0",
phase2="auth=MSCHAPV2")
def test_ap_wpa2_eap_peap_eap_tls(dev, apdev):
"""WPA2-Enterprise connection using EAP-PEAP/EAP-TLS"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "PEAP", "cert user",
ca_cert="auth_serv/ca.pem", phase2="auth=TLS",
ca_cert2="auth_serv/ca.pem",
client_cert2="auth_serv/user.pem",
private_key2="auth_serv/user.key")
eap_reauth(dev[0], "PEAP")
def test_ap_wpa2_eap_tls(dev, apdev):
"""WPA2-Enterprise connection using EAP-TLS"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TLS", "tls user", ca_cert="auth_serv/ca.pem",
client_cert="auth_serv/user.pem",
private_key="auth_serv/user.key")
eap_reauth(dev[0], "TLS")
def test_eap_tls_pkcs8_pkcs5_v2_des3(dev, apdev):
"""WPA2-Enterprise connection using EAP-TLS and PKCS #8, PKCS #5 v2 DES3 key"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TLS", "tls user", ca_cert="auth_serv/ca.pem",
client_cert="auth_serv/user.pem",
private_key="auth_serv/user.key.pkcs8",
private_key_passwd="whatever")
def test_eap_tls_pkcs8_pkcs5_v15(dev, apdev):
"""WPA2-Enterprise connection using EAP-TLS and PKCS #8, PKCS #5 v1.5 key"""
check_pkcs5_v15_support(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TLS", "tls user", ca_cert="auth_serv/ca.pem",
client_cert="auth_serv/user.pem",
private_key="auth_serv/user.key.pkcs8.pkcs5v15",
private_key_passwd="whatever")
def test_ap_wpa2_eap_tls_blob(dev, apdev):
"""WPA2-Enterprise connection using EAP-TLS and config blobs"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
cert = read_pem("auth_serv/ca.pem")
if "OK" not in dev[0].request("SET blob cacert " + cert.encode("hex")):
raise Exception("Could not set cacert blob")
cert = read_pem("auth_serv/user.pem")
if "OK" not in dev[0].request("SET blob usercert " + cert.encode("hex")):
raise Exception("Could not set usercert blob")
key = read_pem("auth_serv/user.rsa-key")
if "OK" not in dev[0].request("SET blob userkey " + key.encode("hex")):
raise Exception("Could not set cacert blob")
eap_connect(dev[0], hapd, "TLS", "tls user", ca_cert="blob://cacert",
client_cert="blob://usercert",
private_key="blob://userkey")
def test_ap_wpa2_eap_tls_blob_missing(dev, apdev):
"""EAP-TLS and config blob missing"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user",
ca_cert="blob://testing-blob-does-not-exist",
client_cert="blob://testing-blob-does-not-exist",
private_key="blob://testing-blob-does-not-exist",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["EAP: Failed to initialize EAP method"], timeout=10)
if ev is None:
raise Exception("EAP failure not reported")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
def test_ap_wpa2_eap_tls_with_tls_len(dev, apdev):
"""EAP-TLS and TLS Message Length in unfragmented packets"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TLS", "tls user", ca_cert="auth_serv/ca.pem",
phase1="include_tls_length=1",
client_cert="auth_serv/user.pem",
private_key="auth_serv/user.key")
def test_ap_wpa2_eap_tls_pkcs12(dev, apdev):
"""WPA2-Enterprise connection using EAP-TLS and PKCS#12"""
check_pkcs12_support(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TLS", "tls user", ca_cert="auth_serv/ca.pem",
private_key="auth_serv/user.pkcs12",
private_key_passwd="whatever")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user",
ca_cert="auth_serv/ca.pem",
private_key="auth_serv/user.pkcs12",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-REQ-PASSPHRASE"])
if ev is None:
raise Exception("Request for private key passphrase timed out")
id = ev.split(':')[0].split('-')[-1]
dev[0].request("CTRL-RSP-PASSPHRASE-" + id + ":whatever")
dev[0].wait_connected(timeout=10)
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
# Run this twice to verify certificate chain handling with OpenSSL. Use two
# different files to cover both cases of the extra certificate being the
# one that signed the client certificate and it being unrelated to the
# client certificate.
for pkcs12 in "auth_serv/user2.pkcs12", "auth_serv/user3.pkcs12":
for i in range(2):
eap_connect(dev[0], hapd, "TLS", "tls user",
ca_cert="auth_serv/ca.pem",
private_key=pkcs12,
private_key_passwd="whatever")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
def test_ap_wpa2_eap_tls_pkcs12_blob(dev, apdev):
"""WPA2-Enterprise connection using EAP-TLS and PKCS#12 from configuration blob"""
check_pkcs12_support(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
cert = read_pem("auth_serv/ca.pem")
if "OK" not in dev[0].request("SET blob cacert " + cert.encode("hex")):
raise Exception("Could not set cacert blob")
with open("auth_serv/user.pkcs12", "rb") as f:
if "OK" not in dev[0].request("SET blob pkcs12 " + f.read().encode("hex")):
raise Exception("Could not set pkcs12 blob")
eap_connect(dev[0], hapd, "TLS", "tls user", ca_cert="blob://cacert",
private_key="blob://pkcs12",
private_key_passwd="whatever")
def test_ap_wpa2_eap_tls_neg_incorrect_trust_root(dev, apdev):
"""WPA2-Enterprise negative test - incorrect trust root"""
check_eap_capa(dev[0], "MSCHAPV2")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
cert = read_pem("auth_serv/ca-incorrect.pem")
if "OK" not in dev[0].request("SET blob cacert " + cert.encode("hex")):
raise Exception("Could not set cacert blob")
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="DOMAIN\mschapv2 user", anonymous_identity="ttls",
password="password", phase2="auth=MSCHAPV2",
ca_cert="blob://cacert",
wait_connect=False, scan_freq="2412")
dev[1].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="DOMAIN\mschapv2 user", anonymous_identity="ttls",
password="password", phase2="auth=MSCHAPV2",
ca_cert="auth_serv/ca-incorrect.pem",
wait_connect=False, scan_freq="2412")
for dev in (dev[0], dev[1]):
ev = dev.wait_event(["CTRL-EVENT-EAP-STARTED"], timeout=16)
if ev is None:
raise Exception("Association and EAP start timed out")
ev = dev.wait_event(["CTRL-EVENT-EAP-METHOD"], timeout=10)
if ev is None:
raise Exception("EAP method selection timed out")
if "TTLS" not in ev:
raise Exception("Unexpected EAP method")
ev = dev.wait_event(["CTRL-EVENT-EAP-TLS-CERT-ERROR",
"CTRL-EVENT-EAP-SUCCESS",
"CTRL-EVENT-EAP-FAILURE",
"CTRL-EVENT-CONNECTED",
"CTRL-EVENT-DISCONNECTED"], timeout=10)
if ev is None:
raise Exception("EAP result timed out")
if "CTRL-EVENT-EAP-TLS-CERT-ERROR" not in ev:
raise Exception("TLS certificate error not reported")
ev = dev.wait_event(["CTRL-EVENT-EAP-SUCCESS",
"CTRL-EVENT-EAP-FAILURE",
"CTRL-EVENT-CONNECTED",
"CTRL-EVENT-DISCONNECTED"], timeout=10)
if ev is None:
raise Exception("EAP result(2) timed out")
if "CTRL-EVENT-EAP-FAILURE" not in ev:
raise Exception("EAP failure not reported")
ev = dev.wait_event(["CTRL-EVENT-CONNECTED",
"CTRL-EVENT-DISCONNECTED"], timeout=10)
if ev is None:
raise Exception("EAP result(3) timed out")
if "CTRL-EVENT-DISCONNECTED" not in ev:
raise Exception("Disconnection not reported")
ev = dev.wait_event(["CTRL-EVENT-SSID-TEMP-DISABLED"], timeout=10)
if ev is None:
raise Exception("Network block disabling not reported")
def test_ap_wpa2_eap_tls_diff_ca_trust(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/PAP and different CA trust"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="pap user", anonymous_identity="ttls",
password="password", phase2="auth=PAP",
ca_cert="auth_serv/ca.pem",
wait_connect=True, scan_freq="2412")
id = dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="pap user", anonymous_identity="ttls",
password="password", phase2="auth=PAP",
ca_cert="auth_serv/ca-incorrect.pem",
only_add_network=True, scan_freq="2412")
dev[0].request("DISCONNECT")
dev[0].wait_disconnected()
dev[0].dump_monitor()
dev[0].select_network(id, freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-PROPOSED-METHOD vendor=0 method=21"], timeout=15)
if ev is None:
raise Exception("EAP-TTLS not re-started")
ev = dev[0].wait_disconnected(timeout=15)
if "reason=23" not in ev:
raise Exception("Proper reason code for disconnection not reported")
def test_ap_wpa2_eap_tls_diff_ca_trust2(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/PAP and different CA trust"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="pap user", anonymous_identity="ttls",
password="password", phase2="auth=PAP",
wait_connect=True, scan_freq="2412")
id = dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="pap user", anonymous_identity="ttls",
password="password", phase2="auth=PAP",
ca_cert="auth_serv/ca-incorrect.pem",
only_add_network=True, scan_freq="2412")
dev[0].request("DISCONNECT")
dev[0].wait_disconnected()
dev[0].dump_monitor()
dev[0].select_network(id, freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-PROPOSED-METHOD vendor=0 method=21"], timeout=15)
if ev is None:
raise Exception("EAP-TTLS not re-started")
ev = dev[0].wait_disconnected(timeout=15)
if "reason=23" not in ev:
raise Exception("Proper reason code for disconnection not reported")
def test_ap_wpa2_eap_tls_diff_ca_trust3(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS/PAP and different CA trust"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
id = dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="pap user", anonymous_identity="ttls",
password="password", phase2="auth=PAP",
ca_cert="auth_serv/ca.pem",
wait_connect=True, scan_freq="2412")
dev[0].request("DISCONNECT")
dev[0].wait_disconnected()
dev[0].dump_monitor()
dev[0].set_network_quoted(id, "ca_cert", "auth_serv/ca-incorrect.pem")
dev[0].select_network(id, freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-PROPOSED-METHOD vendor=0 method=21"], timeout=15)
if ev is None:
raise Exception("EAP-TTLS not re-started")
ev = dev[0].wait_disconnected(timeout=15)
if "reason=23" not in ev:
raise Exception("Proper reason code for disconnection not reported")
def test_ap_wpa2_eap_tls_neg_suffix_match(dev, apdev):
"""WPA2-Enterprise negative test - domain suffix mismatch"""
check_domain_suffix_match(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="DOMAIN\mschapv2 user", anonymous_identity="ttls",
password="password", phase2="auth=MSCHAPV2",
ca_cert="auth_serv/ca.pem",
domain_suffix_match="incorrect.example.com",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STARTED"], timeout=16)
if ev is None:
raise Exception("Association and EAP start timed out")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-METHOD"], timeout=10)
if ev is None:
raise Exception("EAP method selection timed out")
if "TTLS" not in ev:
raise Exception("Unexpected EAP method")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-TLS-CERT-ERROR",
"CTRL-EVENT-EAP-SUCCESS",
"CTRL-EVENT-EAP-FAILURE",
"CTRL-EVENT-CONNECTED",
"CTRL-EVENT-DISCONNECTED"], timeout=10)
if ev is None:
raise Exception("EAP result timed out")
if "CTRL-EVENT-EAP-TLS-CERT-ERROR" not in ev:
raise Exception("TLS certificate error not reported")
if "Domain suffix mismatch" not in ev:
raise Exception("Domain suffix mismatch not reported")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS",
"CTRL-EVENT-EAP-FAILURE",
"CTRL-EVENT-CONNECTED",
"CTRL-EVENT-DISCONNECTED"], timeout=10)
if ev is None:
raise Exception("EAP result(2) timed out")
if "CTRL-EVENT-EAP-FAILURE" not in ev:
raise Exception("EAP failure not reported")
ev = dev[0].wait_event(["CTRL-EVENT-CONNECTED",
"CTRL-EVENT-DISCONNECTED"], timeout=10)
if ev is None:
raise Exception("EAP result(3) timed out")
if "CTRL-EVENT-DISCONNECTED" not in ev:
raise Exception("Disconnection not reported")
ev = dev[0].wait_event(["CTRL-EVENT-SSID-TEMP-DISABLED"], timeout=10)
if ev is None:
raise Exception("Network block disabling not reported")
def test_ap_wpa2_eap_tls_neg_domain_match(dev, apdev):
"""WPA2-Enterprise negative test - domain mismatch"""
check_domain_match(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="DOMAIN\mschapv2 user", anonymous_identity="ttls",
password="password", phase2="auth=MSCHAPV2",
ca_cert="auth_serv/ca.pem",
domain_match="w1.fi",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STARTED"], timeout=16)
if ev is None:
raise Exception("Association and EAP start timed out")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-METHOD"], timeout=10)
if ev is None:
raise Exception("EAP method selection timed out")
if "TTLS" not in ev:
raise Exception("Unexpected EAP method")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-TLS-CERT-ERROR",
"CTRL-EVENT-EAP-SUCCESS",
"CTRL-EVENT-EAP-FAILURE",
"CTRL-EVENT-CONNECTED",
"CTRL-EVENT-DISCONNECTED"], timeout=10)
if ev is None:
raise Exception("EAP result timed out")
if "CTRL-EVENT-EAP-TLS-CERT-ERROR" not in ev:
raise Exception("TLS certificate error not reported")
if "Domain mismatch" not in ev:
raise Exception("Domain mismatch not reported")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS",
"CTRL-EVENT-EAP-FAILURE",
"CTRL-EVENT-CONNECTED",
"CTRL-EVENT-DISCONNECTED"], timeout=10)
if ev is None:
raise Exception("EAP result(2) timed out")
if "CTRL-EVENT-EAP-FAILURE" not in ev:
raise Exception("EAP failure not reported")
ev = dev[0].wait_event(["CTRL-EVENT-CONNECTED",
"CTRL-EVENT-DISCONNECTED"], timeout=10)
if ev is None:
raise Exception("EAP result(3) timed out")
if "CTRL-EVENT-DISCONNECTED" not in ev:
raise Exception("Disconnection not reported")
ev = dev[0].wait_event(["CTRL-EVENT-SSID-TEMP-DISABLED"], timeout=10)
if ev is None:
raise Exception("Network block disabling not reported")
def test_ap_wpa2_eap_tls_neg_subject_match(dev, apdev):
"""WPA2-Enterprise negative test - subject mismatch"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="DOMAIN\mschapv2 user", anonymous_identity="ttls",
password="password", phase2="auth=MSCHAPV2",
ca_cert="auth_serv/ca.pem",
subject_match="/C=FI/O=w1.fi/CN=example.com",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STARTED"], timeout=16)
if ev is None:
raise Exception("Association and EAP start timed out")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-METHOD",
"EAP: Failed to initialize EAP method"], timeout=10)
if ev is None:
raise Exception("EAP method selection timed out")
if "EAP: Failed to initialize EAP method" in ev:
tls = dev[0].request("GET tls_library")
if tls.startswith("OpenSSL"):
raise Exception("Failed to select EAP method")
logger.info("subject_match not supported - connection failed, so test succeeded")
return
if "TTLS" not in ev:
raise Exception("Unexpected EAP method")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-TLS-CERT-ERROR",
"CTRL-EVENT-EAP-SUCCESS",
"CTRL-EVENT-EAP-FAILURE",
"CTRL-EVENT-CONNECTED",
"CTRL-EVENT-DISCONNECTED"], timeout=10)
if ev is None:
raise Exception("EAP result timed out")
if "CTRL-EVENT-EAP-TLS-CERT-ERROR" not in ev:
raise Exception("TLS certificate error not reported")
if "Subject mismatch" not in ev:
raise Exception("Subject mismatch not reported")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS",
"CTRL-EVENT-EAP-FAILURE",
"CTRL-EVENT-CONNECTED",
"CTRL-EVENT-DISCONNECTED"], timeout=10)
if ev is None:
raise Exception("EAP result(2) timed out")
if "CTRL-EVENT-EAP-FAILURE" not in ev:
raise Exception("EAP failure not reported")
ev = dev[0].wait_event(["CTRL-EVENT-CONNECTED",
"CTRL-EVENT-DISCONNECTED"], timeout=10)
if ev is None:
raise Exception("EAP result(3) timed out")
if "CTRL-EVENT-DISCONNECTED" not in ev:
raise Exception("Disconnection not reported")
ev = dev[0].wait_event(["CTRL-EVENT-SSID-TEMP-DISABLED"], timeout=10)
if ev is None:
raise Exception("Network block disabling not reported")
def test_ap_wpa2_eap_tls_neg_altsubject_match(dev, apdev):
"""WPA2-Enterprise negative test - altsubject mismatch"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
tests = [ "incorrect.example.com",
"DNS:incorrect.example.com",
"DNS:w1.fi",
"DNS:erver.w1.fi" ]
for match in tests:
_test_ap_wpa2_eap_tls_neg_altsubject_match(dev, apdev, match)
def _test_ap_wpa2_eap_tls_neg_altsubject_match(dev, apdev, match):
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="DOMAIN\mschapv2 user", anonymous_identity="ttls",
password="password", phase2="auth=MSCHAPV2",
ca_cert="auth_serv/ca.pem",
altsubject_match=match,
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STARTED"], timeout=16)
if ev is None:
raise Exception("Association and EAP start timed out")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-METHOD",
"EAP: Failed to initialize EAP method"], timeout=10)
if ev is None:
raise Exception("EAP method selection timed out")
if "EAP: Failed to initialize EAP method" in ev:
tls = dev[0].request("GET tls_library")
if tls.startswith("OpenSSL"):
raise Exception("Failed to select EAP method")
logger.info("altsubject_match not supported - connection failed, so test succeeded")
return
if "TTLS" not in ev:
raise Exception("Unexpected EAP method")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-TLS-CERT-ERROR",
"CTRL-EVENT-EAP-SUCCESS",
"CTRL-EVENT-EAP-FAILURE",
"CTRL-EVENT-CONNECTED",
"CTRL-EVENT-DISCONNECTED"], timeout=10)
if ev is None:
raise Exception("EAP result timed out")
if "CTRL-EVENT-EAP-TLS-CERT-ERROR" not in ev:
raise Exception("TLS certificate error not reported")
if "AltSubject mismatch" not in ev:
raise Exception("altsubject mismatch not reported")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS",
"CTRL-EVENT-EAP-FAILURE",
"CTRL-EVENT-CONNECTED",
"CTRL-EVENT-DISCONNECTED"], timeout=10)
if ev is None:
raise Exception("EAP result(2) timed out")
if "CTRL-EVENT-EAP-FAILURE" not in ev:
raise Exception("EAP failure not reported")
ev = dev[0].wait_event(["CTRL-EVENT-CONNECTED",
"CTRL-EVENT-DISCONNECTED"], timeout=10)
if ev is None:
raise Exception("EAP result(3) timed out")
if "CTRL-EVENT-DISCONNECTED" not in ev:
raise Exception("Disconnection not reported")
ev = dev[0].wait_event(["CTRL-EVENT-SSID-TEMP-DISABLED"], timeout=10)
if ev is None:
raise Exception("Network block disabling not reported")
dev[0].request("REMOVE_NETWORK all")
def test_ap_wpa2_eap_unauth_tls(dev, apdev):
"""WPA2-Enterprise connection using UNAUTH-TLS"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "UNAUTH-TLS", "unauth-tls",
ca_cert="auth_serv/ca.pem")
eap_reauth(dev[0], "UNAUTH-TLS")
def test_ap_wpa2_eap_ttls_server_cert_hash(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS and server certificate hash"""
check_cert_probe_support(dev[0])
skip_with_fips(dev[0])
srv_cert_hash = "bdb9cb55d3df278e52a071abf58e7f0238fbec3ad8fb2c254742f63562628272"
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="probe", ca_cert="probe://",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STARTED"], timeout=16)
if ev is None:
raise Exception("Association and EAP start timed out")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-PEER-CERT depth=0"], timeout=10)
if ev is None:
raise Exception("No peer server certificate event seen")
if "hash=" + srv_cert_hash not in ev:
raise Exception("Expected server certificate hash not reported")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-TLS-CERT-ERROR"], timeout=10)
if ev is None:
raise Exception("EAP result timed out")
if "Server certificate chain probe" not in ev:
raise Exception("Server certificate probe not reported")
dev[0].wait_disconnected(timeout=10)
dev[0].request("REMOVE_NETWORK all")
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="DOMAIN\mschapv2 user", anonymous_identity="ttls",
password="password", phase2="auth=MSCHAPV2",
ca_cert="hash://server/sha256/5a1bc1296205e6fdbe3979728efe3920798885c1c4590b5f90f43222d239ca6a",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STARTED"], timeout=16)
if ev is None:
raise Exception("Association and EAP start timed out")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-TLS-CERT-ERROR"], timeout=10)
if ev is None:
raise Exception("EAP result timed out")
if "Server certificate mismatch" not in ev:
raise Exception("Server certificate mismatch not reported")
dev[0].wait_disconnected(timeout=10)
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "TTLS", "DOMAIN\mschapv2 user",
anonymous_identity="ttls", password="password",
ca_cert="hash://server/sha256/" + srv_cert_hash,
phase2="auth=MSCHAPV2")
def test_ap_wpa2_eap_ttls_server_cert_hash_invalid(dev, apdev):
"""WPA2-Enterprise connection using EAP-TTLS and server certificate hash (invalid config)"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="DOMAIN\mschapv2 user", anonymous_identity="ttls",
password="password", phase2="auth=MSCHAPV2",
ca_cert="hash://server/md5/5a1bc1296205e6fdbe3979728efe3920798885c1c4590b5f90f43222d239ca6a",
wait_connect=False, scan_freq="2412")
dev[1].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="DOMAIN\mschapv2 user", anonymous_identity="ttls",
password="password", phase2="auth=MSCHAPV2",
ca_cert="hash://server/sha256/5a1bc1296205e6fdbe3979728efe3920798885c1c4590b5f90f43222d239ca",
wait_connect=False, scan_freq="2412")
dev[2].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="DOMAIN\mschapv2 user", anonymous_identity="ttls",
password="password", phase2="auth=MSCHAPV2",
ca_cert="hash://server/sha256/5a1bc1296205e6fdbe3979728efe3920798885c1c4590b5f90f43222d239ca6Q",
wait_connect=False, scan_freq="2412")
for i in range(3):
ev = dev[i].wait_event(["CTRL-EVENT-EAP-STARTED"], timeout=16)
if ev is None:
raise Exception("Association and EAP start timed out")
ev = dev[i].wait_event(["EAP: Failed to initialize EAP method: vendor 0 method 21 (TTLS)"], timeout=5)
if ev is None:
raise Exception("Did not report EAP method initialization failure")
def test_ap_wpa2_eap_pwd(dev, apdev):
"""WPA2-Enterprise connection using EAP-pwd"""
check_eap_capa(dev[0], "PWD")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "PWD", "pwd user", password="secret password")
eap_reauth(dev[0], "PWD")
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[1], hapd, "PWD",
"pwd.user@test123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890.example.com",
password="secret password",
fragment_size="90")
logger.info("Negative test with incorrect password")
eap_connect(dev[2], hapd, "PWD", "pwd user", password="secret-password",
expect_failure=True, local_error_report=True)
eap_connect(dev[0], hapd, "PWD",
"pwd.user@test123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890.example.com",
password="secret password",
fragment_size="31")
def test_ap_wpa2_eap_pwd_nthash(dev, apdev):
"""WPA2-Enterprise connection using EAP-pwd and NTHash"""
check_eap_capa(dev[0], "PWD")
skip_with_fips(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "PWD", "pwd-hash", password="secret password")
eap_connect(dev[1], hapd, "PWD", "pwd-hash",
password_hex="hash:e3718ece8ab74792cbbfffd316d2d19a")
eap_connect(dev[2], hapd, "PWD", "pwd user",
password_hex="hash:e3718ece8ab74792cbbfffd316d2d19a",
expect_failure=True, local_error_report=True)
def test_ap_wpa2_eap_pwd_groups(dev, apdev):
"""WPA2-Enterprise connection using various EAP-pwd groups"""
check_eap_capa(dev[0], "PWD")
tls = dev[0].request("GET tls_library")
params = { "ssid": "test-wpa2-eap", "wpa": "2", "wpa_key_mgmt": "WPA-EAP",
"rsn_pairwise": "CCMP", "ieee8021x": "1",
"eap_server": "1", "eap_user_file": "auth_serv/eap_user.conf" }
groups = [ 19, 20, 21, 25, 26 ]
if tls.startswith("OpenSSL") and "build=OpenSSL 1.0.2" in tls and "run=OpenSSL 1.0.2" in tls:
logger.info("Add Brainpool EC groups since OpenSSL is new enough")
groups += [ 27, 28, 29, 30 ]
for i in groups:
logger.info("Group %d" % i)
params['pwd_group'] = str(i)
hapd = hostapd.add_ap(apdev[0], params)
try:
eap_connect(dev[0], hapd, "PWD", "pwd user",
password="secret password")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
dev[0].dump_monitor()
except:
if "BoringSSL" in tls and i in [ 25 ]:
logger.info("Ignore connection failure with group %d with BoringSSL" % i)
dev[0].request("DISCONNECT")
time.sleep(0.1)
dev[0].request("REMOVE_NETWORK all")
dev[0].dump_monitor()
continue
raise
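# The pwd_group numbers iterated above come from the IANA Group Description
# registry shared with IKE. For reference, the groups this test exercises
# map to the following curves (mapping taken from the IANA registry):

```python
# EAP-pwd group number -> elliptic curve, per the IANA IKE group registry.
PWD_GROUPS = {
    19: "NIST P-256",
    20: "NIST P-384",
    21: "NIST P-521",
    25: "192-bit random ECP (P-192)",
    26: "224-bit random ECP (P-224)",
    27: "Brainpool P-224r1",
    28: "Brainpool P-256r1",
    29: "Brainpool P-384r1",
    30: "Brainpool P-512r1",
}
```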
def test_ap_wpa2_eap_pwd_invalid_group(dev, apdev):
"""WPA2-Enterprise connection using invalid EAP-pwd group"""
check_eap_capa(dev[0], "PWD")
params = { "ssid": "test-wpa2-eap", "wpa": "2", "wpa_key_mgmt": "WPA-EAP",
"rsn_pairwise": "CCMP", "ieee8021x": "1",
"eap_server": "1", "eap_user_file": "auth_serv/eap_user.conf" }
params['pwd_group'] = "0"
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="PWD",
identity="pwd user", password="secret password",
scan_freq="2412", wait_connect=False)
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
if ev is None:
raise Exception("Timeout on EAP failure report")
def test_ap_wpa2_eap_pwd_as_frag(dev, apdev):
"""WPA2-Enterprise connection using EAP-pwd with server fragmentation"""
check_eap_capa(dev[0], "PWD")
params = { "ssid": "test-wpa2-eap", "wpa": "2", "wpa_key_mgmt": "WPA-EAP",
"rsn_pairwise": "CCMP", "ieee8021x": "1",
"eap_server": "1", "eap_user_file": "auth_serv/eap_user.conf",
"pwd_group": "19", "fragment_size": "40" }
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "PWD", "pwd user", password="secret password")
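# fragment_size caps the size of each EAP message on the sending side, so a
# long payload is delivered in several round trips. A simplified model of
# that splitting (hypothetical helper; the real EAP-pwd code also prepends a
# total-length field to the first fragment and sets M/L flag bits):

```python
def fragment(payload, fragment_size):
    """Split an EAP payload into chunks of at most fragment_size bytes
    (simplified model of EAP fragmentation; per-fragment headers omitted)."""
    return [payload[i:i + fragment_size]
            for i in range(0, len(payload), fragment_size)]
```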
def test_ap_wpa2_eap_gpsk(dev, apdev):
"""WPA2-Enterprise connection using EAP-GPSK"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
id = eap_connect(dev[0], hapd, "GPSK", "gpsk user",
password="abcdefghijklmnop0123456789abcdef")
eap_reauth(dev[0], "GPSK")
logger.info("Test forced algorithm selection")
for phase1 in [ "cipher=1", "cipher=2" ]:
dev[0].set_network_quoted(id, "phase1", phase1)
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
if ev is None:
raise Exception("EAP success timed out")
dev[0].wait_connected(timeout=10)
logger.info("Test failed algorithm negotiation")
dev[0].set_network_quoted(id, "phase1", "cipher=9")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=10)
if ev is None:
raise Exception("EAP failure timed out")
logger.info("Negative test with incorrect password")
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "GPSK", "gpsk user",
password="ffcdefghijklmnop0123456789abcdef",
expect_failure=True)
def test_ap_wpa2_eap_sake(dev, apdev):
"""WPA2-Enterprise connection using EAP-SAKE"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "SAKE", "sake user",
password_hex="0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef")
eap_reauth(dev[0], "SAKE")
logger.info("Negative test with incorrect password")
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "SAKE", "sake user",
password_hex="ff23456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef",
expect_failure=True)
def test_ap_wpa2_eap_eke(dev, apdev):
"""WPA2-Enterprise connection using EAP-EKE"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
id = eap_connect(dev[0], hapd, "EKE", "eke user", password="hello")
eap_reauth(dev[0], "EKE")
logger.info("Test forced algorithm selection")
for phase1 in [ "dhgroup=5 encr=1 prf=2 mac=2",
"dhgroup=4 encr=1 prf=2 mac=2",
"dhgroup=3 encr=1 prf=2 mac=2",
"dhgroup=3 encr=1 prf=1 mac=1" ]:
dev[0].set_network_quoted(id, "phase1", phase1)
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
if ev is None:
raise Exception("EAP success timed out")
dev[0].wait_connected(timeout=10)
logger.info("Test failed algorithm negotiation")
dev[0].set_network_quoted(id, "phase1", "dhgroup=9 encr=9 prf=9 mac=9")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=10)
if ev is None:
raise Exception("EAP failure timed out")
logger.info("Negative test with incorrect password")
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "EKE", "eke user", password="hello1",
expect_failure=True)
def test_ap_wpa2_eap_eke_many(dev, apdev, params):
"""WPA2-Enterprise connection using EAP-EKE (many connections) [long]"""
if not params['long']:
raise HwsimSkip("Skip test case with long duration due to --long not specified")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
success = 0
fail = 0
for i in range(100):
for j in range(3):
dev[j].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="EKE",
identity="eke user", password="hello",
phase1="dhgroup=3 encr=1 prf=1 mac=1",
scan_freq="2412", wait_connect=False)
for j in range(3):
ev = dev[j].wait_event(["CTRL-EVENT-CONNECTED",
"CTRL-EVENT-DISCONNECTED"], timeout=15)
if ev is None:
raise Exception("No connected/disconnected event")
if "CTRL-EVENT-DISCONNECTED" in ev:
fail += 1
# The RADIUS server's limit on active sessions can be hit when
# going through this test case, so try to give some more time
# for the server to remove sessions.
logger.info("Failed to connect i=%d j=%d" % (i, j))
dev[j].request("REMOVE_NETWORK all")
time.sleep(1)
else:
success += 1
dev[j].request("REMOVE_NETWORK all")
dev[j].wait_disconnected()
dev[j].dump_monitor()
logger.info("Total success=%d failure=%d" % (success, fail))
def test_ap_wpa2_eap_eke_serverid_nai(dev, apdev):
"""WPA2-Enterprise connection using EAP-EKE with serverid NAI"""
params = int_eap_server_params()
params['server_id'] = 'example.server@w1.fi'
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "EKE", "eke user", password="hello")
def test_ap_wpa2_eap_eke_server_oom(dev, apdev):
"""WPA2-Enterprise connection using EAP-EKE with server OOM"""
params = int_eap_server_params()
hapd = hostapd.add_ap(apdev[0], params)
dev[0].scan_for_bss(apdev[0]['bssid'], freq=2412)
for count,func in [ (1, "eap_eke_build_commit"),
(2, "eap_eke_build_commit"),
(3, "eap_eke_build_commit"),
(1, "eap_eke_build_confirm"),
(2, "eap_eke_build_confirm"),
(1, "eap_eke_process_commit"),
(2, "eap_eke_process_commit"),
(1, "eap_eke_process_confirm"),
(1, "eap_eke_process_identity"),
(2, "eap_eke_process_identity"),
(3, "eap_eke_process_identity"),
(4, "eap_eke_process_identity") ]:
with alloc_fail(hapd, count, func):
eap_connect(dev[0], hapd, "EKE", "eke user", password="hello",
expect_failure=True)
dev[0].request("REMOVE_NETWORK all")
for count,func,pw in [ (1, "eap_eke_init", "hello"),
(1, "eap_eke_get_session_id", "hello"),
(1, "eap_eke_getKey", "hello"),
(1, "eap_eke_build_msg", "hello"),
(1, "eap_eke_build_failure", "wrong"),
(1, "eap_eke_build_identity", "hello"),
(2, "eap_eke_build_identity", "hello") ]:
with alloc_fail(hapd, count, func):
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP WPA-EAP-SHA256",
eap="EKE", identity="eke user", password=pw,
wait_connect=False, scan_freq="2412")
# This would eventually time out, but we can stop after having
# reached the allocation failure.
for i in range(20):
time.sleep(0.1)
if hapd.request("GET_ALLOC_FAIL").startswith('0'):
break
dev[0].request("REMOVE_NETWORK all")
for count in range(1, 1000):
try:
with alloc_fail(hapd, count, "eap_server_sm_step"):
dev[0].connect("test-wpa2-eap",
key_mgmt="WPA-EAP WPA-EAP-SHA256",
eap="EKE", identity="eke user", password=pw,
wait_connect=False, scan_freq="2412")
# This would eventually time out, but we can stop after having
# reached the allocation failure.
for i in range(10):
time.sleep(0.1)
if hapd.request("GET_ALLOC_FAIL").startswith('0'):
break
dev[0].request("REMOVE_NETWORK all")
except Exception as e:
if str(e) == "Allocation failure did not trigger":
if count < 30:
raise Exception("Too few allocation failures")
logger.info("%d allocation failures tested" % (count - 1))
break
raise e
def test_ap_wpa2_eap_ikev2(dev, apdev):
"""WPA2-Enterprise connection using EAP-IKEv2"""
check_eap_capa(dev[0], "IKEV2")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "IKEV2", "ikev2 user",
password="ike password")
eap_reauth(dev[0], "IKEV2")
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "IKEV2", "ikev2 user",
password="ike password", fragment_size="50")
logger.info("Negative test with incorrect password")
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "IKEV2", "ikev2 user",
password="ike-password", expect_failure=True)
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "IKEV2", "ikev2 user",
password="ike password", fragment_size="0")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
def test_ap_wpa2_eap_ikev2_as_frag(dev, apdev):
"""WPA2-Enterprise connection using EAP-IKEv2 with server fragmentation"""
check_eap_capa(dev[0], "IKEV2")
params = { "ssid": "test-wpa2-eap", "wpa": "2", "wpa_key_mgmt": "WPA-EAP",
"rsn_pairwise": "CCMP", "ieee8021x": "1",
"eap_server": "1", "eap_user_file": "auth_serv/eap_user.conf",
"fragment_size": "50" }
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "IKEV2", "ikev2 user",
password="ike password")
eap_reauth(dev[0], "IKEV2")
def test_ap_wpa2_eap_ikev2_oom(dev, apdev):
"""WPA2-Enterprise connection using EAP-IKEv2 and OOM"""
check_eap_capa(dev[0], "IKEV2")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
tests = [ (1, "dh_init"),
(2, "dh_init"),
(1, "dh_derive_shared") ]
for count, func in tests:
with alloc_fail(dev[0], count, func):
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="IKEV2",
identity="ikev2 user", password="ike password",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-METHOD"], timeout=5)
if ev is None:
raise Exception("EAP method not selected")
for i in range(10):
if "0:" in dev[0].request("GET_ALLOC_FAIL"):
break
time.sleep(0.02)
dev[0].request("REMOVE_NETWORK all")
tests = [ (1, "os_get_random;dh_init") ]
for count, func in tests:
with fail_test(dev[0], count, func):
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="IKEV2",
identity="ikev2 user", password="ike password",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-METHOD"], timeout=5)
if ev is None:
raise Exception("EAP method not selected")
for i in range(10):
if "0:" in dev[0].request("GET_FAIL"):
break
time.sleep(0.02)
dev[0].request("REMOVE_NETWORK all")
def test_ap_wpa2_eap_pax(dev, apdev):
"""WPA2-Enterprise connection using EAP-PAX"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "PAX", "pax.user@example.com",
password_hex="0123456789abcdef0123456789abcdef")
eap_reauth(dev[0], "PAX")
logger.info("Negative test with incorrect password")
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "PAX", "pax.user@example.com",
password_hex="ff23456789abcdef0123456789abcdef",
expect_failure=True)
def test_ap_wpa2_eap_psk(dev, apdev):
"""WPA2-Enterprise connection using EAP-PSK"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
params["wpa_key_mgmt"] = "WPA-EAP-SHA256"
params["ieee80211w"] = "2"
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "PSK", "psk.user@example.com",
password_hex="0123456789abcdef0123456789abcdef", sha256=True)
eap_reauth(dev[0], "PSK", sha256=True)
check_mib(dev[0], [ ("dot11RSNAAuthenticationSuiteRequested", "00-0f-ac-5"),
("dot11RSNAAuthenticationSuiteSelected", "00-0f-ac-5") ])
bss = dev[0].get_bss(apdev[0]['bssid'])
if 'flags' not in bss:
raise Exception("Could not get BSS flags from BSS table")
if "[WPA2-EAP-SHA256-CCMP]" not in bss['flags']:
raise Exception("Unexpected BSS flags: " + bss['flags'])
logger.info("Negative test with incorrect password")
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "PSK", "psk.user@example.com",
password_hex="ff23456789abcdef0123456789abcdef", sha256=True,
expect_failure=True)
def test_ap_wpa2_eap_psk_oom(dev, apdev):
"""WPA2-Enterprise connection using EAP-PSK and OOM"""
skip_with_fips(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
tests = [ (1, "=aes_128_eax_encrypt"),
(1, "=aes_128_eax_decrypt") ]
for count, func in tests:
with alloc_fail(dev[0], count, func):
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="PSK",
identity="psk.user@example.com",
password_hex="0123456789abcdef0123456789abcdef",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-METHOD"], timeout=5)
if ev is None:
raise Exception("EAP method not selected")
wait_fail_trigger(dev[0], "GET_ALLOC_FAIL",
note="Failure not triggered: %d:%s" % (count, func))
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
tests = [ (1, "aes_ctr_encrypt;aes_128_eax_encrypt"),
(1, "omac1_aes_128;aes_128_eax_encrypt"),
(2, "omac1_aes_128;aes_128_eax_encrypt"),
(3, "omac1_aes_128;aes_128_eax_encrypt"),
(1, "omac1_aes_vector"),
(1, "omac1_aes_128;aes_128_eax_decrypt"),
(2, "omac1_aes_128;aes_128_eax_decrypt"),
(3, "omac1_aes_128;aes_128_eax_decrypt"),
(1, "aes_ctr_encrypt;aes_128_eax_decrypt") ]
for count, func in tests:
with fail_test(dev[0], count, func):
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="PSK",
identity="psk.user@example.com",
password_hex="0123456789abcdef0123456789abcdef",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-METHOD"], timeout=5)
if ev is None:
raise Exception("EAP method not selected")
wait_fail_trigger(dev[0], "GET_FAIL",
note="Failure not triggered: %d:%s" % (count, func))
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
with fail_test(dev[0], 1, "aes_128_encrypt_block"):
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="PSK",
identity="psk.user@example.com",
password_hex="0123456789abcdef0123456789abcdef",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=10)
if ev is None:
raise Exception("EAP method failure not reported")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
def test_ap_wpa_eap_peap_eap_mschapv2(dev, apdev):
"""WPA-Enterprise connection using EAP-PEAP/EAP-MSCHAPv2"""
check_eap_capa(dev[0], "MSCHAPV2")
params = hostapd.wpa_eap_params(ssid="test-wpa-eap")
hapd = hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa-eap", key_mgmt="WPA-EAP", eap="PEAP",
identity="user", password="password", phase2="auth=MSCHAPV2",
ca_cert="auth_serv/ca.pem", wait_connect=False,
scan_freq="2412")
eap_check_auth(dev[0], "PEAP", True, rsn=False)
hwsim_utils.test_connectivity(dev[0], hapd)
eap_reauth(dev[0], "PEAP", rsn=False)
check_mib(dev[0], [ ("dot11RSNAAuthenticationSuiteRequested", "00-50-f2-1"),
("dot11RSNAAuthenticationSuiteSelected", "00-50-f2-1") ])
status = dev[0].get_status(extra="VERBOSE")
if 'portControl' not in status:
raise Exception("portControl missing from STATUS-VERBOSE")
if status['portControl'] != 'Auto':
raise Exception("Unexpected portControl value: " + status['portControl'])
if 'eap_session_id' not in status:
raise Exception("eap_session_id missing from STATUS-VERBOSE")
if not status['eap_session_id'].startswith("19"):
raise Exception("Unexpected eap_session_id value: " + status['eap_session_id'])
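# The startswith("19") check above works because the EAP Session-Id
# (RFC 5247) begins with the EAP Type octet of the method that derived it,
# and PEAP is EAP type 25 (0x19). A tiny illustration (helper name is
# hypothetical):

```python
EAP_TYPE_PEAP = 25  # PEAP's EAP method type code

def session_id_prefix(eap_type):
    """First hex octet of an RFC 5247 Session-Id for the given EAP type."""
    return "%02x" % eap_type
```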
def test_ap_wpa2_eap_interactive(dev, apdev):
"""WPA2-Enterprise connection using interactive identity/password entry"""
check_eap_capa(dev[0], "MSCHAPV2")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
tests = [ ("Connection with dynamic TTLS/MSCHAPv2 password entry",
"TTLS", "ttls", "DOMAIN\\mschapv2 user", "auth=MSCHAPV2",
None, "password"),
("Connection with dynamic TTLS/MSCHAPv2 identity and password entry",
"TTLS", "ttls", None, "auth=MSCHAPV2",
"DOMAIN\\mschapv2 user", "password"),
("Connection with dynamic TTLS/EAP-MSCHAPv2 password entry",
"TTLS", "ttls", "user", "autheap=MSCHAPV2", None, "password"),
("Connection with dynamic TTLS/EAP-MD5 password entry",
"TTLS", "ttls", "user", "autheap=MD5", None, "password"),
("Connection with dynamic PEAP/EAP-MSCHAPv2 password entry",
"PEAP", None, "user", "auth=MSCHAPV2", None, "password"),
("Connection with dynamic PEAP/EAP-GTC password entry",
"PEAP", None, "user", "auth=GTC", None, "password") ]
for [desc,eap,anon,identity,phase2,req_id,req_pw] in tests:
logger.info(desc)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap=eap,
anonymous_identity=anon, identity=identity,
ca_cert="auth_serv/ca.pem", phase2=phase2,
wait_connect=False, scan_freq="2412")
if req_id:
ev = dev[0].wait_event(["CTRL-REQ-IDENTITY"])
if ev is None:
raise Exception("Request for identity timed out")
id = ev.split(':')[0].split('-')[-1]
dev[0].request("CTRL-RSP-IDENTITY-" + id + ":" + req_id)
ev = dev[0].wait_event(["CTRL-REQ-PASSWORD","CTRL-REQ-OTP"])
if ev is None:
raise Exception("Request for password timed out")
id = ev.split(':')[0].split('-')[-1]
req_type = "OTP" if "CTRL-REQ-OTP" in ev else "PASSWORD"
dev[0].request("CTRL-RSP-" + req_type + "-" + id + ":" + req_pw)
dev[0].wait_connected(timeout=10)
dev[0].request("REMOVE_NETWORK all")
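# The id extraction above relies on the control interface request format
# "CTRL-REQ-<field>-<id>:<text>", and the reply echoes the same id back as
# "CTRL-RSP-<field>-<id>:<value>". A standalone sketch of that parsing
# (hypothetical helper, mirroring the split logic used in the test):

```python
def parse_ctrl_req(ev):
    """Extract (field, id) from an event such as
    "CTRL-REQ-PASSWORD-0:Password needed for SSID test-wpa2-eap"."""
    head = ev.split(':')[0]                  # "CTRL-REQ-PASSWORD-0"
    req_id = head.split('-')[-1]             # trailing request id, "0"
    field = '-'.join(head.split('-')[2:-1])  # "PASSWORD", "IDENTITY", "OTP"
    return field, req_id
```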
def test_ap_wpa2_eap_ext_enable_network_while_connected(dev, apdev):
"""WPA2-Enterprise interactive identity entry and ENABLE_NETWORK"""
check_eap_capa(dev[0], "MSCHAPV2")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
id_other = dev[0].connect("other", key_mgmt="NONE", scan_freq="2412",
only_add_network=True)
req_id = "DOMAIN\mschapv2 user"
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
anonymous_identity="ttls", identity=None,
password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-REQ-IDENTITY"])
if ev is None:
raise Exception("Request for identity timed out")
id = ev.split(':')[0].split('-')[-1]
dev[0].request("CTRL-RSP-IDENTITY-" + id + ":" + req_id)
dev[0].wait_connected(timeout=10)
if "OK" not in dev[0].request("ENABLE_NETWORK " + str(id_other)):
raise Exception("Failed to enable network")
ev = dev[0].wait_event(["SME: Trying to authenticate"], timeout=1)
if ev is not None:
raise Exception("Unexpected reconnection attempt on ENABLE_NETWORK")
dev[0].request("REMOVE_NETWORK all")
def test_ap_wpa2_eap_vendor_test(dev, apdev):
"""WPA2-Enterprise connection using EAP vendor test"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "VENDOR-TEST", "vendor-test")
eap_reauth(dev[0], "VENDOR-TEST")
eap_connect(dev[1], hapd, "VENDOR-TEST", "vendor-test",
password="pending")
def test_ap_wpa2_eap_vendor_test_oom(dev, apdev):
"""WPA2-Enterprise connection using EAP vendor test (OOM)"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
tests = [ "eap_vendor_test_init",
"eap_msg_alloc;eap_vendor_test_process",
"eap_vendor_test_getKey" ]
for func in tests:
with alloc_fail(dev[0], 1, func):
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP",
scan_freq="2412",
eap="VENDOR-TEST", identity="vendor-test",
wait_connect=False)
wait_fail_trigger(dev[0], "GET_ALLOC_FAIL")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
def test_ap_wpa2_eap_fast_mschapv2_unauth_prov(dev, apdev):
"""WPA2-Enterprise connection using EAP-FAST/MSCHAPv2 and unauthenticated provisioning"""
check_eap_capa(dev[0], "FAST")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "FAST", "user",
anonymous_identity="FAST", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=1", pac_file="blob://fast_pac")
hwsim_utils.test_connectivity(dev[0], hapd)
res = eap_reauth(dev[0], "FAST")
if res['tls_session_reused'] != '1':
raise Exception("EAP-FAST could not use PAC session ticket")
def test_ap_wpa2_eap_fast_pac_file(dev, apdev, params):
"""WPA2-Enterprise connection using EAP-FAST/MSCHAPv2 and PAC file"""
check_eap_capa(dev[0], "FAST")
pac_file = os.path.join(params['logdir'], "fast.pac")
pac_file2 = os.path.join(params['logdir'], "fast-bin.pac")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
try:
eap_connect(dev[0], hapd, "FAST", "user",
anonymous_identity="FAST", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=1", pac_file=pac_file)
with open(pac_file, "r") as f:
data = f.read()
if "wpa_supplicant EAP-FAST PAC file - version 1" not in data:
raise Exception("PAC file header missing")
if "PAC-Key=" not in data:
raise Exception("PAC-Key missing from PAC file")
dev[0].request("REMOVE_NETWORK all")
eap_connect(dev[0], hapd, "FAST", "user",
anonymous_identity="FAST", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
pac_file=pac_file)
eap_connect(dev[1], hapd, "FAST", "user",
anonymous_identity="FAST", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=1 fast_pac_format=binary",
pac_file=pac_file2)
dev[1].request("REMOVE_NETWORK all")
eap_connect(dev[1], hapd, "FAST", "user",
anonymous_identity="FAST", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_pac_format=binary",
pac_file=pac_file2)
finally:
try:
os.remove(pac_file)
except:
pass
try:
os.remove(pac_file2)
except:
pass
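# The assertions above only grep the PAC file for its version header and a
# PAC-Key= line. The text format is a version line followed by START/END
# blocks of key=value pairs; a simplified parser sketch (real files may
# carry multi-line values, which this ignores):

```python
def parse_pac_entries(text):
    """Parse text-format PAC entries: START/END blocks of key=value lines."""
    entries, cur = [], None
    for line in text.splitlines():
        if line == "START":
            cur = {}
        elif line == "END":
            entries.append(cur)
            cur = None
        elif cur is not None and "=" in line:
            key, value = line.split("=", 1)
            cur[key] = value
    return entries
```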
def test_ap_wpa2_eap_fast_binary_pac(dev, apdev):
"""WPA2-Enterprise connection using EAP-FAST and binary PAC format"""
check_eap_capa(dev[0], "FAST")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "FAST", "user",
anonymous_identity="FAST", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=1 fast_max_pac_list_len=1 fast_pac_format=binary",
pac_file="blob://fast_pac_bin")
res = eap_reauth(dev[0], "FAST")
if res['tls_session_reused'] != '1':
raise Exception("EAP-FAST could not use PAC session ticket")
# Verify fast_max_pac_list_len=0 special case
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
eap_connect(dev[0], hapd, "FAST", "user",
anonymous_identity="FAST", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=1 fast_max_pac_list_len=0 fast_pac_format=binary",
pac_file="blob://fast_pac_bin")
def test_ap_wpa2_eap_fast_missing_pac_config(dev, apdev):
"""WPA2-Enterprise connection using EAP-FAST and missing PAC config"""
check_eap_capa(dev[0], "FAST")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="FAST",
identity="user", anonymous_identity="FAST",
password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
pac_file="blob://fast_pac_not_in_use",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
if ev is None:
raise Exception("Timeout on EAP failure report")
dev[0].request("REMOVE_NETWORK all")
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="FAST",
identity="user", anonymous_identity="FAST",
password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
if ev is None:
raise Exception("Timeout on EAP failure report")
def test_ap_wpa2_eap_fast_binary_pac_errors(dev, apdev):
"""EAP-FAST and binary PAC errors"""
check_eap_capa(dev[0], "FAST")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
tests = [ (1, "=eap_fast_save_pac_bin"),
(1, "eap_fast_write_pac"),
(2, "eap_fast_write_pac"), ]
for count, func in tests:
if "OK" not in dev[0].request("SET blob fast_pac_bin_errors "):
raise Exception("Could not set blob")
with alloc_fail(dev[0], count, func):
eap_connect(dev[0], hapd, "FAST", "user",
anonymous_identity="FAST", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=1 fast_pac_format=binary",
pac_file="blob://fast_pac_bin_errors")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
tests = [ "00", "000000000000", "6ae4920c0001",
"6ae4920c000000",
"6ae4920c0000" + "0000" + 32*"00" + "ffff" + "0000",
"6ae4920c0000" + "0000" + 32*"00" + "0001" + "0000",
"6ae4920c0000" + "0000" + 32*"00" + "0000" + "0001",
"6ae4920c0000" + "0000" + 32*"00" + "0000" + "0008" + "00040000" + "0007000100"]
for t in tests:
if "OK" not in dev[0].request("SET blob fast_pac_bin_errors " + t):
raise Exception("Could not set blob")
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="FAST",
identity="user", anonymous_identity="FAST",
password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=1 fast_pac_format=binary",
pac_file="blob://fast_pac_bin_errors",
scan_freq="2412", wait_connect=False)
ev = dev[0].wait_event(["EAP: Failed to initialize EAP method"],
timeout=5)
if ev is None:
raise Exception("Failure not reported")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
pac = "6ae4920c0000" + "0000" + 32*"00" + "0000" + "0000"
tests = [ (1, "eap_fast_load_pac_bin"),
(2, "eap_fast_load_pac_bin"),
(3, "eap_fast_load_pac_bin") ]
for count, func in tests:
if "OK" not in dev[0].request("SET blob fast_pac_bin_errors " + pac):
raise Exception("Could not set blob")
with alloc_fail(dev[0], count, func):
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="FAST",
identity="user", anonymous_identity="FAST",
password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=1 fast_pac_format=binary",
pac_file="blob://fast_pac_bin_errors",
scan_freq="2412", wait_connect=False)
ev = dev[0].wait_event(["EAP: Failed to initialize EAP method"],
timeout=5)
if ev is None:
raise Exception("Failure not reported")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
pac = "6ae4920c0000" + "0000" + 32*"00" + "0000" + "0005" + "0011223344"
if "OK" not in dev[0].request("SET blob fast_pac_bin_errors " + pac):
raise Exception("Could not set blob")
eap_connect(dev[0], hapd, "FAST", "user",
anonymous_identity="FAST", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=1 fast_pac_format=binary",
pac_file="blob://fast_pac_bin_errors")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
pac = "6ae4920c0000" + "0000" + 32*"00" + "0000" + "0009" + "00040000" + "0007000100"
tests = [ (1, "eap_fast_pac_get_a_id"),
(2, "eap_fast_pac_get_a_id") ]
for count, func in tests:
if "OK" not in dev[0].request("SET blob fast_pac_bin_errors " + pac):
raise Exception("Could not set blob")
with alloc_fail(dev[0], count, func):
eap_connect(dev[0], hapd, "FAST", "user",
anonymous_identity="FAST", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=1 fast_pac_format=binary",
pac_file="blob://fast_pac_bin_errors")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
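# The hex blobs above hand-assemble binary PAC data: the 0x6ae4920c magic,
# a version word, then one entry with a PAC-Type word, a 32-byte PAC-Key,
# and two length-prefixed fields. The field interpretations below are
# inferred from these test vectors (see eap_fast_pac.c for the
# authoritative layout); this sketch rebuilds the minimal empty-entry blob:

```python
import binascii, struct

def minimal_pac_blob():
    """Hex string for a minimal binary PAC blob with one empty entry."""
    blob = struct.pack(">I", 0x6ae4920c)  # binary PAC magic
    blob += struct.pack(">H", 0)          # format version (inferred)
    blob += struct.pack(">H", 0)          # PAC-Type (inferred)
    blob += b"\x00" * 32                  # 32-byte PAC-Key
    blob += struct.pack(">H", 0)          # first length-prefixed field: empty
    blob += struct.pack(">H", 0)          # second length-prefixed field: empty
    return binascii.hexlify(blob).decode()
```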
def test_ap_wpa2_eap_fast_text_pac_errors(dev, apdev):
"""EAP-FAST and text PAC errors"""
check_eap_capa(dev[0], "FAST")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
tests = [ (1, "eap_fast_parse_hex;eap_fast_parse_pac_key"),
(1, "eap_fast_parse_hex;eap_fast_parse_pac_opaque"),
(1, "eap_fast_parse_hex;eap_fast_parse_a_id"),
(1, "eap_fast_parse_start"),
(1, "eap_fast_save_pac") ]
for count, func in tests:
dev[0].request("FLUSH")
if "OK" not in dev[0].request("SET blob fast_pac_text_errors "):
raise Exception("Could not set blob")
with alloc_fail(dev[0], count, func):
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="FAST",
identity="user", anonymous_identity="FAST",
password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=1",
pac_file="blob://fast_pac_text_errors",
scan_freq="2412", wait_connect=False)
wait_fail_trigger(dev[0], "GET_ALLOC_FAIL")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
pac = "wpa_supplicant EAP-FAST PAC file - version 1\n"
pac += "START\n"
pac += "PAC-Type\n"
pac += "END\n"
if "OK" not in dev[0].request("SET blob fast_pac_text_errors " + pac.encode("hex")):
raise Exception("Could not set blob")
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="FAST",
identity="user", anonymous_identity="FAST",
password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=1",
pac_file="blob://fast_pac_text_errors",
scan_freq="2412", wait_connect=False)
ev = dev[0].wait_event(["EAP: Failed to initialize EAP method"], timeout=5)
if ev is None:
raise Exception("Failure not reported")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
dev[0].request("FLUSH")
if "OK" not in dev[0].request("SET blob fast_pac_text_errors "):
raise Exception("Could not set blob")
with alloc_fail(dev[0], 1, "eap_fast_add_pac_data"):
for i in range(3):
params = int_eap_server_params()
params['ssid'] = "test-wpa2-eap-2"
params['pac_opaque_encr_key'] = "000102030405060708090a0b0c0dff%02x" % i
params['eap_fast_a_id'] = "101112131415161718191a1b1c1dff%02x" % i
params['eap_fast_a_id_info'] = "test server %d" % i
hapd2 = hostapd.add_ap(apdev[1], params)
dev[0].connect("test-wpa2-eap-2", key_mgmt="WPA-EAP", eap="FAST",
identity="user", anonymous_identity="FAST",
password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=1",
pac_file="blob://fast_pac_text_errors",
scan_freq="2412", wait_connect=False)
dev[0].wait_connected()
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
hapd2.disable()
def test_ap_wpa2_eap_fast_pac_truncate(dev, apdev):
"""EAP-FAST and PAC list truncation"""
check_eap_capa(dev[0], "FAST")
if "OK" not in dev[0].request("SET blob fast_pac_truncate "):
raise Exception("Could not set blob")
for i in range(5):
params = int_eap_server_params()
params['pac_opaque_encr_key'] = "000102030405060708090a0b0c0dff%02x" % i
params['eap_fast_a_id'] = "101112131415161718191a1b1c1dff%02x" % i
params['eap_fast_a_id_info'] = "test server %d" % i
hapd = hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="FAST",
identity="user", anonymous_identity="FAST",
password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=1 fast_max_pac_list_len=2",
pac_file="blob://fast_pac_truncate",
scan_freq="2412", wait_connect=False)
dev[0].wait_connected()
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
hapd.disable()
def test_ap_wpa2_eap_fast_pac_refresh(dev, apdev):
"""EAP-FAST and PAC refresh"""
check_eap_capa(dev[0], "FAST")
if "OK" not in dev[0].request("SET blob fast_pac_refresh "):
raise Exception("Could not set blob")
for i in range(2):
params = int_eap_server_params()
params['pac_opaque_encr_key'] = "000102030405060708090a0b0c0dff%02x" % i
params['eap_fast_a_id'] = "101112131415161718191a1b1c1dff%02x" % i
params['eap_fast_a_id_info'] = "test server %d" % i
params['pac_key_refresh_time'] = "1"
params['pac_key_lifetime'] = "10"
hapd = hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="FAST",
identity="user", anonymous_identity="FAST",
password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=1",
pac_file="blob://fast_pac_refresh",
scan_freq="2412", wait_connect=False)
dev[0].wait_connected()
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
hapd.disable()
for i in range(2):
params = int_eap_server_params()
params['pac_opaque_encr_key'] = "000102030405060708090a0b0c0dff%02x" % i
params['eap_fast_a_id'] = "101112131415161718191a1b1c1dff%02x" % i
params['eap_fast_a_id_info'] = "test server %d" % i
params['pac_key_refresh_time'] = "10"
params['pac_key_lifetime'] = "10"
hapd = hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="FAST",
identity="user", anonymous_identity="FAST",
password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=1",
pac_file="blob://fast_pac_refresh",
scan_freq="2412", wait_connect=False)
dev[0].wait_connected()
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
hapd.disable()
def test_ap_wpa2_eap_fast_pac_lifetime(dev, apdev):
"""EAP-FAST and PAC lifetime"""
check_eap_capa(dev[0], "FAST")
if "OK" not in dev[0].request("SET blob fast_pac_refresh "):
raise Exception("Could not set blob")
i = 0
params = int_eap_server_params()
params['pac_opaque_encr_key'] = "000102030405060708090a0b0c0dff%02x" % i
params['eap_fast_a_id'] = "101112131415161718191a1b1c1dff%02x" % i
params['eap_fast_a_id_info'] = "test server %d" % i
params['pac_key_refresh_time'] = "0"
params['pac_key_lifetime'] = "2"
hapd = hostapd.add_ap(apdev[0], params)
id = dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="FAST",
identity="user", anonymous_identity="FAST",
password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=2",
pac_file="blob://fast_pac_refresh",
scan_freq="2412", wait_connect=False)
dev[0].wait_connected()
dev[0].request("DISCONNECT")
dev[0].wait_disconnected()
time.sleep(3)
dev[0].request("PMKSA_FLUSH")
dev[0].request("RECONNECT")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=10)
if ev is None:
raise Exception("No EAP-Failure seen after expired PAC")
dev[0].request("DISCONNECT")
dev[0].wait_disconnected()
dev[0].select_network(id)
dev[0].wait_connected()
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
def test_ap_wpa2_eap_fast_gtc_auth_prov(dev, apdev):
"""WPA2-Enterprise connection using EAP-FAST/GTC and authenticated provisioning"""
check_eap_capa(dev[0], "FAST")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "FAST", "user",
anonymous_identity="FAST", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=GTC",
phase1="fast_provisioning=2", pac_file="blob://fast_pac_auth")
hwsim_utils.test_connectivity(dev[0], hapd)
res = eap_reauth(dev[0], "FAST")
if res['tls_session_reused'] != '1':
raise Exception("EAP-FAST could not use PAC session ticket")
def test_ap_wpa2_eap_fast_gtc_identity_change(dev, apdev):
"""WPA2-Enterprise connection using EAP-FAST/GTC and identity changing"""
check_eap_capa(dev[0], "FAST")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
id = eap_connect(dev[0], hapd, "FAST", "user",
anonymous_identity="FAST", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=GTC",
phase1="fast_provisioning=2",
pac_file="blob://fast_pac_auth")
dev[0].set_network_quoted(id, "identity", "user2")
dev[0].wait_disconnected()
ev = dev[0].wait_event(["CTRL-EVENT-EAP-METHOD"], timeout=15)
if ev is None:
raise Exception("EAP-FAST not started")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=5)
if ev is None:
raise Exception("EAP failure not reported")
dev[0].wait_disconnected()
def test_ap_wpa2_eap_fast_prf_oom(dev, apdev):
"""WPA2-Enterprise connection using EAP-FAST and OOM in PRF"""
check_eap_capa(dev[0], "FAST")
tls = dev[0].request("GET tls_library")
if tls.startswith("OpenSSL"):
func = "tls_connection_get_eap_fast_key"
count = 2
elif tls.startswith("internal"):
func = "tls_connection_prf"
count = 1
else:
raise HwsimSkip("Unsupported TLS library")
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
with alloc_fail(dev[0], count, func):
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="FAST",
identity="user", anonymous_identity="FAST",
password="password", ca_cert="auth_serv/ca.pem",
phase2="auth=GTC",
phase1="fast_provisioning=2",
pac_file="blob://fast_pac_auth",
wait_connect=False, scan_freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=15)
if ev is None:
raise Exception("EAP failure not reported")
dev[0].request("DISCONNECT")
def test_ap_wpa2_eap_fast_server_oom(dev, apdev):
"""EAP-FAST/MSCHAPv2 and server OOM"""
check_eap_capa(dev[0], "FAST")
params = int_eap_server_params()
params['dh_file'] = 'auth_serv/dh.conf'
params['pac_opaque_encr_key'] = '000102030405060708090a0b0c0d0e0f'
params['eap_fast_a_id'] = '1011'
params['eap_fast_a_id_info'] = 'another test server'
hapd = hostapd.add_ap(apdev[0], params)
with alloc_fail(hapd, 1, "tls_session_ticket_ext_cb"):
id = eap_connect(dev[0], hapd, "FAST", "user",
anonymous_identity="FAST", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=1",
pac_file="blob://fast_pac",
expect_failure=True)
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=10)
if ev is None:
raise Exception("No EAP failure reported")
dev[0].wait_disconnected()
dev[0].request("DISCONNECT")
dev[0].select_network(id, freq="2412")
def test_ap_wpa2_eap_fast_cipher_suites(dev, apdev):
"""EAP-FAST and different TLS cipher suites"""
check_eap_capa(dev[0], "FAST")
tls = dev[0].request("GET tls_library")
if not tls.startswith("OpenSSL"):
raise HwsimSkip("TLS library is not OpenSSL: " + tls)
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
    if "OK" not in dev[0].request("SET blob fast_pac_ciphers "):
        raise Exception("Could not set blob")
eap_connect(dev[0], hapd, "FAST", "user",
anonymous_identity="FAST", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=GTC",
phase1="fast_provisioning=2",
pac_file="blob://fast_pac_ciphers")
res = dev[0].get_status_field('EAP TLS cipher')
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
if res != "DHE-RSA-AES256-SHA":
raise Exception("Unexpected cipher suite for provisioning: " + res)
tests = [ "DHE-RSA-AES128-SHA",
"RC4-SHA",
"AES128-SHA",
"AES256-SHA",
"DHE-RSA-AES256-SHA" ]
for cipher in tests:
dev[0].dump_monitor()
logger.info("Testing " + cipher)
try:
eap_connect(dev[0], hapd, "FAST", "user",
openssl_ciphers=cipher,
anonymous_identity="FAST", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=GTC",
pac_file="blob://fast_pac_ciphers")
        except Exception as e:
if "Could not select EAP method" in str(e) and cipher == "RC4-SHA":
tls = dev[0].request("GET tls_library")
if "run=OpenSSL 1.1" in tls:
logger.info("Allow failure due to missing TLS library support")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
continue
raise
res = dev[0].get_status_field('EAP TLS cipher')
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
if res != cipher:
raise Exception("Unexpected TLS cipher info (configured %s): %s" % (cipher, res))
def test_ap_wpa2_eap_fast_prov(dev, apdev):
"""EAP-FAST and provisioning options"""
check_eap_capa(dev[0], "FAST")
if "OK" not in dev[0].request("SET blob fast_pac_prov "):
raise Exception("Could not set blob")
i = 100
params = int_eap_server_params()
params['disable_pmksa_caching'] = '1'
params['pac_opaque_encr_key'] = "000102030405060708090a0b0c0dff%02x" % i
params['eap_fast_a_id'] = "101112131415161718191a1b1c1dff%02x" % i
params['eap_fast_a_id_info'] = "test server %d" % i
params['eap_fast_prov'] = "0"
hapd = hostapd.add_ap(apdev[0], params)
logger.info("Provisioning attempt while server has provisioning disabled")
id = dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="FAST",
identity="user", anonymous_identity="FAST",
password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=2",
pac_file="blob://fast_pac_prov",
scan_freq="2412", wait_connect=False)
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS status='completion'"],
timeout=15)
if ev is None:
raise Exception("EAP result not reported")
if "parameter='failure'" not in ev:
raise Exception("Unexpected EAP result: " + ev)
dev[0].wait_disconnected()
dev[0].request("DISCONNECT")
dev[0].dump_monitor()
hapd.disable()
logger.info("Authenticated provisioning")
hapd.set("eap_fast_prov", "2")
hapd.enable()
dev[0].select_network(id, freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS status='completion'"],
timeout=15)
if ev is None:
raise Exception("EAP result not reported")
if "parameter='success'" not in ev:
raise Exception("Unexpected EAP result: " + ev)
dev[0].wait_connected()
dev[0].request("DISCONNECT")
dev[0].wait_disconnected()
dev[0].dump_monitor()
hapd.disable()
logger.info("Provisioning disabled - using previously provisioned PAC")
hapd.set("eap_fast_prov", "0")
hapd.enable()
dev[0].select_network(id, freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS status='completion'"],
timeout=15)
if ev is None:
raise Exception("EAP result not reported")
if "parameter='success'" not in ev:
raise Exception("Unexpected EAP result: " + ev)
dev[0].wait_connected()
dev[0].request("DISCONNECT")
dev[0].wait_disconnected()
dev[0].dump_monitor()
logger.info("Drop PAC and verify connection failure")
if "OK" not in dev[0].request("SET blob fast_pac_prov "):
raise Exception("Could not set blob")
dev[0].select_network(id, freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS status='completion'"],
timeout=15)
if ev is None:
raise Exception("EAP result not reported")
if "parameter='failure'" not in ev:
raise Exception("Unexpected EAP result: " + ev)
dev[0].wait_disconnected()
dev[0].request("DISCONNECT")
dev[0].dump_monitor()
hapd.disable()
logger.info("Anonymous provisioning")
hapd.set("eap_fast_prov", "1")
hapd.enable()
dev[0].set_network_quoted(id, "phase1", "fast_provisioning=1")
dev[0].select_network(id, freq="2412")
# Anonymous provisioning results in EAP-Failure first
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS status='completion'"],
timeout=15)
if ev is None:
raise Exception("EAP result not reported")
if "parameter='failure'" not in ev:
raise Exception("Unexpected EAP result: " + ev)
dev[0].wait_disconnected()
# And then the actual data connection
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS status='completion'"],
timeout=15)
if ev is None:
raise Exception("EAP result not reported")
if "parameter='success'" not in ev:
raise Exception("Unexpected EAP result: " + ev)
dev[0].wait_connected()
dev[0].request("DISCONNECT")
dev[0].wait_disconnected()
dev[0].dump_monitor()
hapd.disable()
logger.info("Provisioning disabled - using previously provisioned PAC")
hapd.set("eap_fast_prov", "0")
hapd.enable()
dev[0].select_network(id, freq="2412")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS status='completion'"],
timeout=15)
if ev is None:
raise Exception("EAP result not reported")
if "parameter='success'" not in ev:
raise Exception("Unexpected EAP result: " + ev)
dev[0].wait_connected()
dev[0].request("DISCONNECT")
dev[0].wait_disconnected()
dev[0].dump_monitor()
def test_ap_wpa2_eap_tls_ocsp(dev, apdev):
"""WPA2-Enterprise connection using EAP-TLS and verifying OCSP"""
check_ocsp_support(dev[0])
check_pkcs12_support(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TLS", "tls user", ca_cert="auth_serv/ca.pem",
private_key="auth_serv/user.pkcs12",
private_key_passwd="whatever", ocsp=2)
def test_ap_wpa2_eap_tls_ocsp_multi(dev, apdev):
"""WPA2-Enterprise connection using EAP-TLS and verifying OCSP-multi"""
check_ocsp_multi_support(dev[0])
check_pkcs12_support(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TLS", "tls user", ca_cert="auth_serv/ca.pem",
private_key="auth_serv/user.pkcs12",
private_key_passwd="whatever", ocsp=2)
def int_eap_server_params():
params = { "ssid": "test-wpa2-eap", "wpa": "2", "wpa_key_mgmt": "WPA-EAP",
"rsn_pairwise": "CCMP", "ieee8021x": "1",
"eap_server": "1", "eap_user_file": "auth_serv/eap_user.conf",
"ca_cert": "auth_serv/ca.pem",
"server_cert": "auth_serv/server.pem",
"private_key": "auth_serv/server.key",
"dh_file": "auth_serv/dh.conf" }
return params
def test_ap_wpa2_eap_tls_ocsp_key_id(dev, apdev, params):
"""EAP-TLS and OCSP certificate signed OCSP response using key ID"""
check_ocsp_support(dev[0])
ocsp = os.path.join(params['logdir'], "ocsp-server-cache-key-id.der")
if not os.path.exists(ocsp):
raise HwsimSkip("No OCSP response available")
params = int_eap_server_params()
params["ocsp_stapling_response"] = ocsp
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user", ca_cert="auth_serv/ca.pem",
private_key="auth_serv/user.pkcs12",
private_key_passwd="whatever", ocsp=2,
scan_freq="2412")
def test_ap_wpa2_eap_tls_ocsp_ca_signed_good(dev, apdev, params):
"""EAP-TLS and CA signed OCSP response (good)"""
check_ocsp_support(dev[0])
ocsp = os.path.join(params['logdir'], "ocsp-resp-ca-signed.der")
if not os.path.exists(ocsp):
raise HwsimSkip("No OCSP response available")
params = int_eap_server_params()
params["ocsp_stapling_response"] = ocsp
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user", ca_cert="auth_serv/ca.pem",
private_key="auth_serv/user.pkcs12",
private_key_passwd="whatever", ocsp=2,
scan_freq="2412")
def test_ap_wpa2_eap_tls_ocsp_ca_signed_revoked(dev, apdev, params):
"""EAP-TLS and CA signed OCSP response (revoked)"""
check_ocsp_support(dev[0])
ocsp = os.path.join(params['logdir'], "ocsp-resp-ca-signed-revoked.der")
if not os.path.exists(ocsp):
raise HwsimSkip("No OCSP response available")
params = int_eap_server_params()
params["ocsp_stapling_response"] = ocsp
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user", ca_cert="auth_serv/ca.pem",
private_key="auth_serv/user.pkcs12",
private_key_passwd="whatever", ocsp=2,
wait_connect=False, scan_freq="2412")
count = 0
while True:
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS"])
if ev is None:
raise Exception("Timeout on EAP status")
if 'bad certificate status response' in ev:
break
if 'certificate revoked' in ev:
break
count = count + 1
if count > 10:
raise Exception("Unexpected number of EAP status messages")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
if ev is None:
raise Exception("Timeout on EAP failure report")
def test_ap_wpa2_eap_tls_ocsp_ca_signed_unknown(dev, apdev, params):
"""EAP-TLS and CA signed OCSP response (unknown)"""
check_ocsp_support(dev[0])
ocsp = os.path.join(params['logdir'], "ocsp-resp-ca-signed-unknown.der")
if not os.path.exists(ocsp):
raise HwsimSkip("No OCSP response available")
params = int_eap_server_params()
params["ocsp_stapling_response"] = ocsp
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user", ca_cert="auth_serv/ca.pem",
private_key="auth_serv/user.pkcs12",
private_key_passwd="whatever", ocsp=2,
wait_connect=False, scan_freq="2412")
count = 0
while True:
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS"])
if ev is None:
raise Exception("Timeout on EAP status")
if 'bad certificate status response' in ev:
break
count = count + 1
if count > 10:
raise Exception("Unexpected number of EAP status messages")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
if ev is None:
raise Exception("Timeout on EAP failure report")
def test_ap_wpa2_eap_tls_ocsp_server_signed(dev, apdev, params):
"""EAP-TLS and server signed OCSP response"""
check_ocsp_support(dev[0])
ocsp = os.path.join(params['logdir'], "ocsp-resp-server-signed.der")
if not os.path.exists(ocsp):
raise HwsimSkip("No OCSP response available")
params = int_eap_server_params()
params["ocsp_stapling_response"] = ocsp
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user", ca_cert="auth_serv/ca.pem",
private_key="auth_serv/user.pkcs12",
private_key_passwd="whatever", ocsp=2,
wait_connect=False, scan_freq="2412")
count = 0
while True:
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS"])
if ev is None:
raise Exception("Timeout on EAP status")
if 'bad certificate status response' in ev:
break
count = count + 1
if count > 10:
raise Exception("Unexpected number of EAP status messages")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
if ev is None:
raise Exception("Timeout on EAP failure report")
def test_ap_wpa2_eap_tls_ocsp_invalid_data(dev, apdev):
"""WPA2-Enterprise connection using EAP-TLS and invalid OCSP data"""
check_ocsp_support(dev[0])
params = int_eap_server_params()
params["ocsp_stapling_response"] = "auth_serv/ocsp-req.der"
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user", ca_cert="auth_serv/ca.pem",
private_key="auth_serv/user.pkcs12",
private_key_passwd="whatever", ocsp=2,
wait_connect=False, scan_freq="2412")
count = 0
while True:
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS"])
if ev is None:
raise Exception("Timeout on EAP status")
if 'bad certificate status response' in ev:
break
count = count + 1
if count > 10:
raise Exception("Unexpected number of EAP status messages")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
if ev is None:
raise Exception("Timeout on EAP failure report")
def test_ap_wpa2_eap_tls_ocsp_invalid(dev, apdev):
"""WPA2-Enterprise connection using EAP-TLS and invalid OCSP response"""
check_ocsp_support(dev[0])
params = int_eap_server_params()
params["ocsp_stapling_response"] = "auth_serv/ocsp-server-cache.der-invalid"
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user", ca_cert="auth_serv/ca.pem",
private_key="auth_serv/user.pkcs12",
private_key_passwd="whatever", ocsp=2,
wait_connect=False, scan_freq="2412")
count = 0
while True:
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS"])
if ev is None:
raise Exception("Timeout on EAP status")
if 'bad certificate status response' in ev:
break
count = count + 1
if count > 10:
raise Exception("Unexpected number of EAP status messages")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
if ev is None:
raise Exception("Timeout on EAP failure report")
def test_ap_wpa2_eap_tls_ocsp_unknown_sign(dev, apdev):
"""WPA2-Enterprise connection using EAP-TLS and unknown OCSP signer"""
check_ocsp_support(dev[0])
params = int_eap_server_params()
params["ocsp_stapling_response"] = "auth_serv/ocsp-server-cache.der-unknown-sign"
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user", ca_cert="auth_serv/ca.pem",
private_key="auth_serv/user.pkcs12",
private_key_passwd="whatever", ocsp=2,
wait_connect=False, scan_freq="2412")
count = 0
while True:
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS"])
if ev is None:
raise Exception("Timeout on EAP status")
if 'bad certificate status response' in ev:
break
count = count + 1
if count > 10:
raise Exception("Unexpected number of EAP status messages")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
if ev is None:
raise Exception("Timeout on EAP failure report")
def test_ap_wpa2_eap_ttls_ocsp_revoked(dev, apdev, params):
"""WPA2-Enterprise connection using EAP-TTLS and OCSP status revoked"""
check_ocsp_support(dev[0])
ocsp = os.path.join(params['logdir'], "ocsp-server-cache-revoked.der")
if not os.path.exists(ocsp):
raise HwsimSkip("No OCSP response available")
params = int_eap_server_params()
params["ocsp_stapling_response"] = ocsp
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="pap user", ca_cert="auth_serv/ca.pem",
anonymous_identity="ttls", password="password",
phase2="auth=PAP", ocsp=2,
wait_connect=False, scan_freq="2412")
count = 0
while True:
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS"])
if ev is None:
raise Exception("Timeout on EAP status")
if 'bad certificate status response' in ev:
break
if 'certificate revoked' in ev:
break
count = count + 1
if count > 10:
raise Exception("Unexpected number of EAP status messages")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
if ev is None:
raise Exception("Timeout on EAP failure report")
def test_ap_wpa2_eap_ttls_ocsp_unknown(dev, apdev, params):
"""WPA2-Enterprise connection using EAP-TTLS and OCSP status revoked"""
check_ocsp_support(dev[0])
ocsp = os.path.join(params['logdir'], "ocsp-server-cache-unknown.der")
if not os.path.exists(ocsp):
raise HwsimSkip("No OCSP response available")
params = int_eap_server_params()
params["ocsp_stapling_response"] = ocsp
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="pap user", ca_cert="auth_serv/ca.pem",
anonymous_identity="ttls", password="password",
phase2="auth=PAP", ocsp=2,
wait_connect=False, scan_freq="2412")
count = 0
while True:
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS"])
if ev is None:
raise Exception("Timeout on EAP status")
if 'bad certificate status response' in ev:
break
count = count + 1
if count > 10:
raise Exception("Unexpected number of EAP status messages")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
if ev is None:
raise Exception("Timeout on EAP failure report")
def test_ap_wpa2_eap_ttls_optional_ocsp_unknown(dev, apdev, params):
"""WPA2-Enterprise connection using EAP-TTLS and OCSP status revoked"""
ocsp = os.path.join(params['logdir'], "ocsp-server-cache-unknown.der")
if not os.path.exists(ocsp):
raise HwsimSkip("No OCSP response available")
params = int_eap_server_params()
params["ocsp_stapling_response"] = ocsp
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
identity="pap user", ca_cert="auth_serv/ca.pem",
anonymous_identity="ttls", password="password",
phase2="auth=PAP", ocsp=1, scan_freq="2412")
def test_ap_wpa2_eap_tls_intermediate_ca(dev, apdev, params):
"""EAP-TLS with intermediate server/user CA"""
params = int_eap_server_params()
params["ca_cert"] = "auth_serv/iCA-server/ca-and-root.pem"
params["server_cert"] = "auth_serv/iCA-server/server.pem"
params["private_key"] = "auth_serv/iCA-server/server.key"
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user",
ca_cert="auth_serv/iCA-user/ca-and-root.pem",
client_cert="auth_serv/iCA-user/user.pem",
private_key="auth_serv/iCA-user/user.key",
scan_freq="2412")
def root_ocsp(cert):
ca = "auth_serv/ca.pem"
fd2, fn2 = tempfile.mkstemp()
os.close(fd2)
arg = [ "openssl", "ocsp", "-reqout", fn2, "-issuer", ca, "-sha256",
"-cert", cert, "-no_nonce", "-text" ]
logger.info(' '.join(arg))
cmd = subprocess.Popen(arg, stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
res = cmd.stdout.read() + "\n" + cmd.stderr.read()
cmd.stdout.close()
cmd.stderr.close()
cmd.wait()
if cmd.returncode != 0:
raise Exception("bad return code from openssl ocsp\n\n" + res)
logger.info("OCSP request:\n" + res)
fd, fn = tempfile.mkstemp()
os.close(fd)
arg = [ "openssl", "ocsp", "-index", "auth_serv/rootCA/index.txt",
"-rsigner", ca, "-rkey", "auth_serv/ca-key.pem",
"-CA", ca, "-issuer", ca, "-verify_other", ca, "-trust_other",
"-ndays", "7", "-reqin", fn2, "-resp_no_certs", "-respout", fn,
"-text" ]
cmd = subprocess.Popen(arg, stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
res = cmd.stdout.read() + "\n" + cmd.stderr.read()
cmd.stdout.close()
cmd.stderr.close()
cmd.wait()
if cmd.returncode != 0:
raise Exception("bad return code from openssl ocsp\n\n" + res)
logger.info("OCSP response:\n" + res)
os.unlink(fn2)
return fn
def ica_ocsp(cert, md="-sha256"):
prefix = "auth_serv/iCA-server/"
ca = prefix + "cacert.pem"
cert = prefix + cert
fd2, fn2 = tempfile.mkstemp()
os.close(fd2)
arg = [ "openssl", "ocsp", "-reqout", fn2, "-issuer", ca, md,
"-cert", cert, "-no_nonce", "-text" ]
cmd = subprocess.Popen(arg, stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
res = cmd.stdout.read() + "\n" + cmd.stderr.read()
cmd.stdout.close()
cmd.stderr.close()
cmd.wait()
if cmd.returncode != 0:
raise Exception("bad return code from openssl ocsp\n\n" + res)
logger.info("OCSP request:\n" + res)
fd, fn = tempfile.mkstemp()
os.close(fd)
arg = [ "openssl", "ocsp", "-index", prefix + "index.txt",
"-rsigner", ca, "-rkey", prefix + "private/cakey.pem",
"-CA", ca, "-issuer", ca, "-verify_other", ca, "-trust_other",
"-ndays", "7", "-reqin", fn2, "-resp_no_certs", "-respout", fn,
"-text" ]
cmd = subprocess.Popen(arg, stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
res = cmd.stdout.read() + "\n" + cmd.stderr.read()
cmd.stdout.close()
cmd.stderr.close()
cmd.wait()
if cmd.returncode != 0:
raise Exception("bad return code from openssl ocsp\n\n" + res)
logger.info("OCSP response:\n" + res)
os.unlink(fn2)
return fn
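The helpers above read `cmd.stdout` in full before touching `cmd.stderr`; with `subprocess.PIPE` that can deadlock if the unread pipe's buffer fills while the first one is drained. `communicate()` drains both pipes concurrently. A sketch of the safer pattern (the helper name and command are illustrative):

```python
import subprocess
import sys

def run_cmd(arg):
    # communicate() reads stdout and stderr concurrently, avoiding
    # the deadlock possible when draining the pipes one at a time.
    cmd = subprocess.Popen(arg, stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE)
    out, err = cmd.communicate()
    if cmd.returncode != 0:
        raise Exception("bad return code from " + arg[0])
    return out

out = run_cmd([sys.executable, "-c", "print('ok')"])
```

The openssl invocations are small enough that the sequential reads rarely bite in practice, but `communicate()` removes the hazard entirely.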
def test_ap_wpa2_eap_tls_intermediate_ca_ocsp(dev, apdev, params):
"""EAP-TLS with intermediate server/user CA and OCSP on server certificate"""
run_ap_wpa2_eap_tls_intermediate_ca_ocsp(dev, apdev, params, "-sha256")
def test_ap_wpa2_eap_tls_intermediate_ca_ocsp_sha1(dev, apdev, params):
"""EAP-TLS with intermediate server/user CA and OCSP on server certificate )SHA1)"""
run_ap_wpa2_eap_tls_intermediate_ca_ocsp(dev, apdev, params, "-sha1")
def run_ap_wpa2_eap_tls_intermediate_ca_ocsp(dev, apdev, params, md):
params = int_eap_server_params()
params["ca_cert"] = "auth_serv/iCA-server/ca-and-root.pem"
params["server_cert"] = "auth_serv/iCA-server/server.pem"
params["private_key"] = "auth_serv/iCA-server/server.key"
fn = ica_ocsp("server.pem", md)
params["ocsp_stapling_response"] = fn
try:
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user",
ca_cert="auth_serv/iCA-user/ca-and-root.pem",
client_cert="auth_serv/iCA-user/user.pem",
private_key="auth_serv/iCA-user/user.key",
scan_freq="2412", ocsp=2)
finally:
os.unlink(fn)
def test_ap_wpa2_eap_tls_intermediate_ca_ocsp_revoked(dev, apdev, params):
"""EAP-TLS with intermediate server/user CA and OCSP on revoked server certificate"""
run_ap_wpa2_eap_tls_intermediate_ca_ocsp_revoked(dev, apdev, params,
"-sha256")
def test_ap_wpa2_eap_tls_intermediate_ca_ocsp_revoked_sha1(dev, apdev, params):
"""EAP-TLS with intermediate server/user CA and OCSP on revoked server certificate (SHA1)"""
run_ap_wpa2_eap_tls_intermediate_ca_ocsp_revoked(dev, apdev, params,
"-sha1")
def run_ap_wpa2_eap_tls_intermediate_ca_ocsp_revoked(dev, apdev, params, md):
params = int_eap_server_params()
params["ca_cert"] = "auth_serv/iCA-server/ca-and-root.pem"
params["server_cert"] = "auth_serv/iCA-server/server-revoked.pem"
params["private_key"] = "auth_serv/iCA-server/server-revoked.key"
fn = ica_ocsp("server-revoked.pem", md)
params["ocsp_stapling_response"] = fn
try:
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user",
ca_cert="auth_serv/iCA-user/ca-and-root.pem",
client_cert="auth_serv/iCA-user/user.pem",
private_key="auth_serv/iCA-user/user.key",
scan_freq="2412", ocsp=1, wait_connect=False)
count = 0
while True:
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS",
"CTRL-EVENT-EAP-SUCCESS"])
if ev is None:
raise Exception("Timeout on EAP status")
if "CTRL-EVENT-EAP-SUCCESS" in ev:
raise Exception("Unexpected EAP-Success")
if 'bad certificate status response' in ev:
break
if 'certificate revoked' in ev:
break
count = count + 1
if count > 10:
raise Exception("Unexpected number of EAP status messages")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
if ev is None:
raise Exception("Timeout on EAP failure report")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
finally:
os.unlink(fn)
def test_ap_wpa2_eap_tls_intermediate_ca_ocsp_multi_missing_resp(dev, apdev, params):
"""EAP-TLS with intermediate server/user CA and OCSP multi missing response"""
check_ocsp_support(dev[0])
check_ocsp_multi_support(dev[0])
params = int_eap_server_params()
params["ca_cert"] = "auth_serv/iCA-server/ca-and-root.pem"
params["server_cert"] = "auth_serv/iCA-server/server.pem"
params["private_key"] = "auth_serv/iCA-server/server.key"
fn = ica_ocsp("server.pem")
params["ocsp_stapling_response"] = fn
try:
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user",
ca_cert="auth_serv/iCA-user/ca-and-root.pem",
client_cert="auth_serv/iCA-user/user.pem",
private_key="auth_serv/iCA-user/user.key",
scan_freq="2412", ocsp=3, wait_connect=False)
count = 0
while True:
ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS",
"CTRL-EVENT-EAP-SUCCESS"])
if ev is None:
raise Exception("Timeout on EAP status")
if "CTRL-EVENT-EAP-SUCCESS" in ev:
raise Exception("Unexpected EAP-Success")
if 'bad certificate status response' in ev:
break
if 'certificate revoked' in ev:
break
count = count + 1
if count > 10:
raise Exception("Unexpected number of EAP status messages")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
if ev is None:
raise Exception("Timeout on EAP failure report")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
finally:
os.unlink(fn)
def test_ap_wpa2_eap_tls_intermediate_ca_ocsp_multi(dev, apdev, params):
"""EAP-TLS with intermediate server/user CA and OCSP multi OK"""
check_ocsp_support(dev[0])
check_ocsp_multi_support(dev[0])
params = int_eap_server_params()
params["ca_cert"] = "auth_serv/iCA-server/ca-and-root.pem"
params["server_cert"] = "auth_serv/iCA-server/server.pem"
params["private_key"] = "auth_serv/iCA-server/server.key"
fn = ica_ocsp("server.pem")
fn2 = root_ocsp("auth_serv/iCA-server/cacert.pem")
params["ocsp_stapling_response"] = fn
with open(fn, "r") as f:
resp_server = f.read()
with open(fn2, "r") as f:
resp_ica = f.read()
fd3, fn3 = tempfile.mkstemp()
try:
f = os.fdopen(fd3, 'w')
f.write(struct.pack(">L", len(resp_server))[1:4])
f.write(resp_server)
f.write(struct.pack(">L", len(resp_ica))[1:4])
f.write(resp_ica)
f.close()
params["ocsp_stapling_response_multi"] = fn3
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user",
ca_cert="auth_serv/iCA-user/ca-and-root.pem",
client_cert="auth_serv/iCA-user/user.pem",
private_key="auth_serv/iCA-user/user.key",
scan_freq="2412", ocsp=3)
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
finally:
os.unlink(fn)
os.unlink(fn2)
os.unlink(fn3)
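The blob written to fn3 above prefixes each DER-encoded OCSP response with a 24-bit big-endian length, the `ocsp_response_list` framing used by the TLS `status_request_v2` extension (RFC 6961); `struct.pack(">L", n)[1:4]` keeps the low three bytes of a 32-bit pack. A minimal sketch of that framing (the helper name is illustrative, not part of the test suite):

```python
import struct

def pack_ocsp_response_list(responses):
    # Each entry: 3-byte big-endian length (uint24) followed by the
    # DER-encoded OCSPResponse; a zero length marks an absent entry.
    blob = b""
    for resp in responses:
        blob += struct.pack(">L", len(resp))[1:4] + resp
    return blob
```

test_ap_wpa2_eap_tls_ocsp_multi_revoked below builds the same framing by hand, including a deliberately empty (zero-length) entry.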

def test_ap_wpa2_eap_tls_ocsp_multi_revoked(dev, apdev, params):
    """EAP-TLS and CA signed OCSP multi response (revoked)"""
    check_ocsp_support(dev[0])
    check_ocsp_multi_support(dev[0])

    ocsp_revoked = os.path.join(params['logdir'],
                                "ocsp-resp-ca-signed-revoked.der")
    if not os.path.exists(ocsp_revoked):
        raise HwsimSkip("No OCSP response (revoked) available")
    ocsp_unknown = os.path.join(params['logdir'],
                                "ocsp-resp-ca-signed-unknown.der")
    if not os.path.exists(ocsp_unknown):
        raise HwsimSkip("No OCSP response (unknown) available")

    with open(ocsp_revoked, "r") as f:
        resp_revoked = f.read()
    with open(ocsp_unknown, "r") as f:
        resp_unknown = f.read()

    fd, fn = tempfile.mkstemp()
    try:
        # This is not really a valid order of the OCSPResponse items in the
        # list, but this works for now to verify parsing and processing of
        # multiple responses.
        f = os.fdopen(fd, 'w')
        f.write(struct.pack(">L", len(resp_unknown))[1:4])
        f.write(resp_unknown)
        f.write(struct.pack(">L", len(resp_revoked))[1:4])
        f.write(resp_revoked)
        f.write(struct.pack(">L", 0)[1:4])
        f.write(struct.pack(">L", len(resp_unknown))[1:4])
        f.write(resp_unknown)
        f.close()

        params = int_eap_server_params()
        params["ocsp_stapling_response_multi"] = fn
        hostapd.add_ap(apdev[0], params)

        dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
                       identity="tls user", ca_cert="auth_serv/ca.pem",
                       private_key="auth_serv/user.pkcs12",
                       private_key_passwd="whatever", ocsp=1,
                       wait_connect=False, scan_freq="2412")
        count = 0
        while True:
            ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS",
                                    "CTRL-EVENT-EAP-SUCCESS"])
            if ev is None:
                raise Exception("Timeout on EAP status")
            if "CTRL-EVENT-EAP-SUCCESS" in ev:
                raise Exception("Unexpected EAP-Success")
            if 'bad certificate status response' in ev:
                break
            if 'certificate revoked' in ev:
                break
            count = count + 1
            if count > 10:
                raise Exception("Unexpected number of EAP status messages")
    finally:
        os.unlink(fn)
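The revoked-case blob above deliberately includes a zero-length entry between responses. A sketch of the reading side of this framing (`parse_ocsp_multi` is a hypothetical helper, not part of the test suite), which hands zero-length entries back as empty byte strings:

```python
import struct

def parse_ocsp_multi(buf):
    # Split a buffer of 3-byte-length-prefixed OCSP responses back
    # into a list, walking the buffer the same way the server-side
    # parser exercised by the test above has to.
    out = []
    off = 0
    while off + 3 <= len(buf):
        # Re-pad the 24-bit length to 32 bits before unpacking.
        (n,) = struct.unpack(">L", b"\x00" + buf[off:off + 3])
        off += 3
        out.append(buf[off:off + n])
        off += n
    return out
```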

def test_ap_wpa2_eap_tls_domain_suffix_match_cn_full(dev, apdev):
    """WPA2-Enterprise using EAP-TLS and domain suffix match (CN)"""
    check_domain_match_full(dev[0])
    params = int_eap_server_params()
    params["server_cert"] = "auth_serv/server-no-dnsname.pem"
    params["private_key"] = "auth_serv/server-no-dnsname.key"
    hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
                   identity="tls user", ca_cert="auth_serv/ca.pem",
                   private_key="auth_serv/user.pkcs12",
                   private_key_passwd="whatever",
                   domain_suffix_match="server3.w1.fi",
                   scan_freq="2412")

def test_ap_wpa2_eap_tls_domain_match_cn(dev, apdev):
    """WPA2-Enterprise using EAP-TLS and domain match (CN)"""
    check_domain_match(dev[0])
    params = int_eap_server_params()
    params["server_cert"] = "auth_serv/server-no-dnsname.pem"
    params["private_key"] = "auth_serv/server-no-dnsname.key"
    hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
                   identity="tls user", ca_cert="auth_serv/ca.pem",
                   private_key="auth_serv/user.pkcs12",
                   private_key_passwd="whatever",
                   domain_match="server3.w1.fi",
                   scan_freq="2412")

def test_ap_wpa2_eap_tls_domain_suffix_match_cn(dev, apdev):
    """WPA2-Enterprise using EAP-TLS and domain suffix match (CN)"""
    check_domain_match_full(dev[0])
    params = int_eap_server_params()
    params["server_cert"] = "auth_serv/server-no-dnsname.pem"
    params["private_key"] = "auth_serv/server-no-dnsname.key"
    hostapd.add_ap(apdev[0], params)
    dev[1].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
                   identity="tls user", ca_cert="auth_serv/ca.pem",
                   private_key="auth_serv/user.pkcs12",
                   private_key_passwd="whatever",
                   domain_suffix_match="w1.fi",
                   scan_freq="2412")

def test_ap_wpa2_eap_tls_domain_suffix_mismatch_cn(dev, apdev):
    """WPA2-Enterprise using EAP-TLS and domain suffix mismatch (CN)"""
    check_domain_suffix_match(dev[0])
    params = int_eap_server_params()
    params["server_cert"] = "auth_serv/server-no-dnsname.pem"
    params["private_key"] = "auth_serv/server-no-dnsname.key"
    hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
                   identity="tls user", ca_cert="auth_serv/ca.pem",
                   private_key="auth_serv/user.pkcs12",
                   private_key_passwd="whatever",
                   domain_suffix_match="example.com",
                   wait_connect=False,
                   scan_freq="2412")
    # Incomplete label ("erver3.w1.fi") must not match as a domain suffix
    dev[1].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
                   identity="tls user", ca_cert="auth_serv/ca.pem",
                   private_key="auth_serv/user.pkcs12",
                   private_key_passwd="whatever",
                   domain_suffix_match="erver3.w1.fi",
                   wait_connect=False,
                   scan_freq="2412")
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
    if ev is None:
        raise Exception("Timeout on EAP failure report")
    ev = dev[1].wait_event(["CTRL-EVENT-EAP-FAILURE"])
    if ev is None:
        raise Exception("Timeout on EAP failure report (2)")

def test_ap_wpa2_eap_tls_domain_mismatch_cn(dev, apdev):
    """WPA2-Enterprise using EAP-TLS and domain mismatch (CN)"""
    check_domain_match(dev[0])
    params = int_eap_server_params()
    params["server_cert"] = "auth_serv/server-no-dnsname.pem"
    params["private_key"] = "auth_serv/server-no-dnsname.key"
    hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
                   identity="tls user", ca_cert="auth_serv/ca.pem",
                   private_key="auth_serv/user.pkcs12",
                   private_key_passwd="whatever",
                   domain_match="example.com",
                   wait_connect=False,
                   scan_freq="2412")
    dev[1].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
                   identity="tls user", ca_cert="auth_serv/ca.pem",
                   private_key="auth_serv/user.pkcs12",
                   private_key_passwd="whatever",
                   domain_match="w1.fi",
                   wait_connect=False,
                   scan_freq="2412")
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
    if ev is None:
        raise Exception("Timeout on EAP failure report")
    ev = dev[1].wait_event(["CTRL-EVENT-EAP-FAILURE"])
    if ev is None:
        raise Exception("Timeout on EAP failure report (2)")

def test_ap_wpa2_eap_ttls_expired_cert(dev, apdev):
    """WPA2-Enterprise using EAP-TTLS and expired certificate"""
    skip_with_fips(dev[0])
    params = int_eap_server_params()
    params["server_cert"] = "auth_serv/server-expired.pem"
    params["private_key"] = "auth_serv/server-expired.key"
    hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
                   identity="mschap user", password="password",
                   ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAP",
                   wait_connect=False,
                   scan_freq="2412")
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-TLS-CERT-ERROR"])
    if ev is None:
        raise Exception("Timeout on EAP certificate error report")
    if "reason=4" not in ev or "certificate has expired" not in ev:
        raise Exception("Unexpected failure reason: " + ev)
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
    if ev is None:
        raise Exception("Timeout on EAP failure report")

def test_ap_wpa2_eap_ttls_ignore_expired_cert(dev, apdev):
    """WPA2-Enterprise using EAP-TTLS and ignore certificate expiration"""
    skip_with_fips(dev[0])
    params = int_eap_server_params()
    params["server_cert"] = "auth_serv/server-expired.pem"
    params["private_key"] = "auth_serv/server-expired.key"
    hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
                   identity="mschap user", password="password",
                   ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAP",
                   phase1="tls_disable_time_checks=1",
                   scan_freq="2412")

def test_ap_wpa2_eap_ttls_long_duration(dev, apdev):
    """WPA2-Enterprise using EAP-TTLS and long certificate duration"""
    skip_with_fips(dev[0])
    params = int_eap_server_params()
    params["server_cert"] = "auth_serv/server-long-duration.pem"
    params["private_key"] = "auth_serv/server-long-duration.key"
    hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
                   identity="mschap user", password="password",
                   ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAP",
                   scan_freq="2412")

def test_ap_wpa2_eap_ttls_server_cert_eku_client(dev, apdev):
    """WPA2-Enterprise using EAP-TTLS and server cert with client EKU"""
    skip_with_fips(dev[0])
    params = int_eap_server_params()
    params["server_cert"] = "auth_serv/server-eku-client.pem"
    params["private_key"] = "auth_serv/server-eku-client.key"
    hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
                   identity="mschap user", password="password",
                   ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAP",
                   wait_connect=False,
                   scan_freq="2412")
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
    if ev is None:
        raise Exception("Timeout on EAP failure report")

def test_ap_wpa2_eap_ttls_server_cert_eku_client_server(dev, apdev):
    """WPA2-Enterprise using EAP-TTLS and server cert with client and server EKU"""
    skip_with_fips(dev[0])
    params = int_eap_server_params()
    params["server_cert"] = "auth_serv/server-eku-client-server.pem"
    params["private_key"] = "auth_serv/server-eku-client-server.key"
    hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
                   identity="mschap user", password="password",
                   ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAP",
                   scan_freq="2412")

def test_ap_wpa2_eap_ttls_server_pkcs12(dev, apdev):
    """WPA2-Enterprise using EAP-TTLS and server PKCS#12 file"""
    skip_with_fips(dev[0])
    params = int_eap_server_params()
    del params["server_cert"]
    params["private_key"] = "auth_serv/server.pkcs12"
    hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
                   identity="mschap user", password="password",
                   ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAP",
                   scan_freq="2412")

def test_ap_wpa2_eap_ttls_server_pkcs12_extra(dev, apdev):
    """EAP-TTLS and server PKCS#12 file with extra certs"""
    skip_with_fips(dev[0])
    params = int_eap_server_params()
    del params["server_cert"]
    params["private_key"] = "auth_serv/server-extra.pkcs12"
    params["private_key_passwd"] = "whatever"
    hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
                   identity="mschap user", password="password",
                   ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAP",
                   scan_freq="2412")

def test_ap_wpa2_eap_ttls_dh_params(dev, apdev):
    """WPA2-Enterprise connection using EAP-TTLS/CHAP and setting DH params"""
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hapd = hostapd.add_ap(apdev[0], params)
    eap_connect(dev[0], hapd, "TTLS", "pap user",
                anonymous_identity="ttls", password="password",
                ca_cert="auth_serv/ca.der", phase2="auth=PAP",
                dh_file="auth_serv/dh.conf")

def test_ap_wpa2_eap_ttls_dh_params_dsa(dev, apdev):
    """WPA2-Enterprise connection using EAP-TTLS and setting DH params (DSA)"""
    check_dh_dsa_support(dev[0])
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hapd = hostapd.add_ap(apdev[0], params)
    eap_connect(dev[0], hapd, "TTLS", "pap user",
                anonymous_identity="ttls", password="password",
                ca_cert="auth_serv/ca.der", phase2="auth=PAP",
                dh_file="auth_serv/dsaparam.pem")

def test_ap_wpa2_eap_ttls_dh_params_not_found(dev, apdev):
    """EAP-TTLS and DH params file not found"""
    skip_with_fips(dev[0])
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
                   identity="mschap user", password="password",
                   ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAP",
                   dh_file="auth_serv/dh-no-such-file.conf",
                   scan_freq="2412", wait_connect=False)
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
    if ev is None:
        raise Exception("EAP failure timed out")
    dev[0].request("REMOVE_NETWORK all")
    dev[0].wait_disconnected()

def test_ap_wpa2_eap_ttls_dh_params_invalid(dev, apdev):
    """EAP-TTLS and invalid DH params file"""
    skip_with_fips(dev[0])
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
                   identity="mschap user", password="password",
                   ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAP",
                   dh_file="auth_serv/ca.pem",
                   scan_freq="2412", wait_connect=False)
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
    if ev is None:
        raise Exception("EAP failure timed out")
    dev[0].request("REMOVE_NETWORK all")
    dev[0].wait_disconnected()

def test_ap_wpa2_eap_ttls_dh_params_blob(dev, apdev):
    """WPA2-Enterprise connection using EAP-TTLS/CHAP and setting DH params from blob"""
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hapd = hostapd.add_ap(apdev[0], params)
    dh = read_pem("auth_serv/dh2.conf")
    if "OK" not in dev[0].request("SET blob dhparams " + dh.encode("hex")):
        raise Exception("Could not set dhparams blob")
    eap_connect(dev[0], hapd, "TTLS", "pap user",
                anonymous_identity="ttls", password="password",
                ca_cert="auth_serv/ca.der", phase2="auth=PAP",
                dh_file="blob://dhparams")
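The `SET blob` control command above takes the binary blob data as a hex string; `dh.encode("hex")` is the Python 2 spelling, equivalent to `binascii.hexlify`. A small sketch (`blob_hex` is a name used only here):

```python
import binascii

def blob_hex(data):
    # Encode raw blob bytes as the lowercase hex string expected by
    # the wpa_supplicant "SET blob <name> <hexdump>" command.
    return binascii.hexlify(data).decode()
```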

def test_ap_wpa2_eap_ttls_dh_params_server(dev, apdev):
    """WPA2-Enterprise using EAP-TTLS and alternative server dhparams"""
    params = int_eap_server_params()
    params["dh_file"] = "auth_serv/dh2.conf"
    hapd = hostapd.add_ap(apdev[0], params)
    eap_connect(dev[0], hapd, "TTLS", "pap user",
                anonymous_identity="ttls", password="password",
                ca_cert="auth_serv/ca.der", phase2="auth=PAP")

def test_ap_wpa2_eap_ttls_dh_params_dsa_server(dev, apdev):
    """WPA2-Enterprise using EAP-TTLS and alternative server dhparams (DSA)"""
    params = int_eap_server_params()
    params["dh_file"] = "auth_serv/dsaparam.pem"
    hapd = hostapd.add_ap(apdev[0], params)
    eap_connect(dev[0], hapd, "TTLS", "pap user",
                anonymous_identity="ttls", password="password",
                ca_cert="auth_serv/ca.der", phase2="auth=PAP")

def test_ap_wpa2_eap_ttls_dh_params_server_not_found(dev, apdev):
    """EAP-TLS server and dhparams file not found"""
    params = int_eap_server_params()
    params["dh_file"] = "auth_serv/dh-no-such-file.conf"
    hapd = hostapd.add_ap(apdev[0], params, no_enable=True)
    if "FAIL" not in hapd.request("ENABLE"):
        raise Exception("Invalid configuration accepted")

def test_ap_wpa2_eap_ttls_dh_params_server_invalid(dev, apdev):
    """EAP-TLS server and invalid dhparams file"""
    params = int_eap_server_params()
    params["dh_file"] = "auth_serv/ca.pem"
    hapd = hostapd.add_ap(apdev[0], params, no_enable=True)
    if "FAIL" not in hapd.request("ENABLE"):
        raise Exception("Invalid configuration accepted")

def test_ap_wpa2_eap_reauth(dev, apdev):
    """WPA2-Enterprise and Authenticator forcing reauthentication"""
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    params['eap_reauth_period'] = '2'
    hapd = hostapd.add_ap(apdev[0], params)
    eap_connect(dev[0], hapd, "PAX", "pax.user@example.com",
                password_hex="0123456789abcdef0123456789abcdef")

    logger.info("Wait for reauthentication")
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-STARTED"], timeout=10)
    if ev is None:
        raise Exception("Timeout on reauthentication")
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
    if ev is None:
        raise Exception("Timeout on reauthentication")
    for i in range(0, 20):
        state = dev[0].get_status_field("wpa_state")
        if state == "COMPLETED":
            break
        time.sleep(0.1)
    if state != "COMPLETED":
        raise Exception("Reauthentication did not complete")

def test_ap_wpa2_eap_request_identity_message(dev, apdev):
    """Optional displayable message in EAP Request-Identity"""
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    params['eap_message'] = 'hello\\0networkid=netw,nasid=foo,portid=0,NAIRealms=example.com'
    hapd = hostapd.add_ap(apdev[0], params)
    eap_connect(dev[0], hapd, "PAX", "pax.user@example.com",
                password_hex="0123456789abcdef0123456789abcdef")

def test_ap_wpa2_eap_sim_aka_result_ind(dev, apdev):
    """WPA2-Enterprise using EAP-SIM/AKA and protected result indication"""
    check_hlr_auc_gw_support()
    params = int_eap_server_params()
    params['eap_sim_db'] = "unix:/tmp/hlr_auc_gw.sock"
    params['eap_sim_aka_result_ind'] = "1"
    hapd = hostapd.add_ap(apdev[0], params)

    eap_connect(dev[0], hapd, "SIM", "1232010000000000",
                password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581",
                phase1="result_ind=1")
    eap_reauth(dev[0], "SIM")
    eap_connect(dev[1], hapd, "SIM", "1232010000000000",
                password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581")

    dev[0].request("REMOVE_NETWORK all")
    dev[1].request("REMOVE_NETWORK all")

    eap_connect(dev[0], hapd, "AKA", "0232010000000000",
                password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581:000000000123",
                phase1="result_ind=1")
    eap_reauth(dev[0], "AKA")
    eap_connect(dev[1], hapd, "AKA", "0232010000000000",
                password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581:000000000123")

    dev[0].request("REMOVE_NETWORK all")
    dev[1].request("REMOVE_NETWORK all")

    eap_connect(dev[0], hapd, "AKA'", "6555444333222111",
                password="5122250214c33e723a5dd523fc145fc0:981d464c7c52eb6e5036234984ad0bcf:000000000123",
                phase1="result_ind=1")
    eap_reauth(dev[0], "AKA'")
    eap_connect(dev[1], hapd, "AKA'", "6555444333222111",
                password="5122250214c33e723a5dd523fc145fc0:981d464c7c52eb6e5036234984ad0bcf:000000000123")

def test_ap_wpa2_eap_sim_zero_db_timeout(dev, apdev):
    """WPA2-Enterprise using EAP-SIM with zero database timeout"""
    check_hlr_auc_gw_support()
    params = int_eap_server_params()
    params['eap_sim_db'] = "unix:/tmp/hlr_auc_gw.sock"
    params['eap_sim_db_timeout'] = "0"
    params['disable_pmksa_caching'] = '1'
    hapd = hostapd.add_ap(apdev[0], params)

    # Run multiple iterations to make it more likely to hit the case where the
    # DB request times out and response is lost.
    for i in range(20):
        print i
        dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="SIM",
                       identity="1232010000000000",
                       password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581",
                       wait_connect=False, scan_freq="2412")

        ev = dev[0].wait_event([ "CTRL-EVENT-CONNECTED",
                                 "CTRL-EVENT-DISCONNECTED" ],
                               timeout=15)
        if ev is None:
            raise Exception("No connection result")
        dev[0].request("REMOVE_NETWORK all")
        if "CTRL-EVENT-DISCONNECTED" in ev:
            break
        dev[0].wait_disconnected()
        hapd.ping()

def test_ap_wpa2_eap_too_many_roundtrips(dev, apdev):
    """WPA2-Enterprise connection resulting in too many EAP roundtrips"""
    skip_with_fips(dev[0])
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP WPA-EAP-SHA256",
                   eap="TTLS", identity="mschap user",
                   wait_connect=False, scan_freq="2412", ieee80211w="1",
                   anonymous_identity="ttls", password="password",
                   ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAP",
                   fragment_size="8")
    ev = dev[0].wait_event(["EAP: more than",
                            "CTRL-EVENT-EAP-SUCCESS"], timeout=20)
    if ev is None or "EAP: more than" not in ev:
        raise Exception("EAP roundtrip limit not reached")

def test_ap_wpa2_eap_expanded_nak(dev, apdev):
    """WPA2-Enterprise connection with EAP resulting in expanded NAK"""
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP WPA-EAP-SHA256",
                   eap="PSK", identity="vendor-test",
                   password_hex="ff23456789abcdef0123456789abcdef",
                   wait_connect=False)
    found = False
    for i in range(0, 5):
        ev = dev[0].wait_event(["CTRL-EVENT-EAP-STATUS"], timeout=16)
        if ev is None:
            raise Exception("Association and EAP start timed out")
        if "refuse proposed method" in ev:
            found = True
            break
    if not found:
        raise Exception("Unexpected EAP status: " + ev)
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"])
    if ev is None:
        raise Exception("EAP failure timed out")

def test_ap_wpa2_eap_sql(dev, apdev, params):
    """WPA2-Enterprise connection using SQLite for user DB"""
    skip_with_fips(dev[0])
    try:
        import sqlite3
    except ImportError:
        raise HwsimSkip("No sqlite3 module available")
    dbfile = os.path.join(params['logdir'], "eap-user.db")
    try:
        os.remove(dbfile)
    except:
        pass
    con = sqlite3.connect(dbfile)
    with con:
        cur = con.cursor()
        cur.execute("CREATE TABLE users(identity TEXT PRIMARY KEY, methods TEXT, password TEXT, remediation TEXT, phase2 INTEGER)")
        cur.execute("CREATE TABLE wildcards(identity TEXT PRIMARY KEY, methods TEXT)")
        cur.execute("INSERT INTO users(identity,methods,password,phase2) VALUES ('user-pap','TTLS-PAP','password',1)")
        cur.execute("INSERT INTO users(identity,methods,password,phase2) VALUES ('user-chap','TTLS-CHAP','password',1)")
        cur.execute("INSERT INTO users(identity,methods,password,phase2) VALUES ('user-mschap','TTLS-MSCHAP','password',1)")
        cur.execute("INSERT INTO users(identity,methods,password,phase2) VALUES ('user-mschapv2','TTLS-MSCHAPV2','password',1)")
        cur.execute("INSERT INTO wildcards(identity,methods) VALUES ('','TTLS,TLS')")
        cur.execute("CREATE TABLE authlog(timestamp TEXT, session TEXT, nas_ip TEXT, username TEXT, note TEXT)")

    try:
        params = int_eap_server_params()
        params["eap_user_file"] = "sqlite:" + dbfile
        hapd = hostapd.add_ap(apdev[0], params)
        eap_connect(dev[0], hapd, "TTLS", "user-mschapv2",
                    anonymous_identity="ttls", password="password",
                    ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2")
        dev[0].request("REMOVE_NETWORK all")
        eap_connect(dev[1], hapd, "TTLS", "user-mschap",
                    anonymous_identity="ttls", password="password",
                    ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAP")
        dev[1].request("REMOVE_NETWORK all")
        eap_connect(dev[0], hapd, "TTLS", "user-chap",
                    anonymous_identity="ttls", password="password",
                    ca_cert="auth_serv/ca.pem", phase2="auth=CHAP")
        eap_connect(dev[1], hapd, "TTLS", "user-pap",
                    anonymous_identity="ttls", password="password",
                    ca_cert="auth_serv/ca.pem", phase2="auth=PAP")
    finally:
        os.remove(dbfile)
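The database consulted via `eap_user_file = "sqlite:..."` uses the `users`/`wildcards` schema created in the test above. A standalone sketch of building and querying the same schema (in-memory here, so no cleanup is needed):

```python
import sqlite3

con = sqlite3.connect(":memory:")
with con:
    cur = con.cursor()
    # Same tables as the test above: per-user phase 2 methods plus a
    # wildcard row matched against the anonymous phase 1 identity.
    cur.execute("CREATE TABLE users(identity TEXT PRIMARY KEY, "
                "methods TEXT, password TEXT, remediation TEXT, "
                "phase2 INTEGER)")
    cur.execute("CREATE TABLE wildcards(identity TEXT PRIMARY KEY, "
                "methods TEXT)")
    cur.execute("INSERT INTO users(identity,methods,password,phase2) "
                "VALUES ('user-pap','TTLS-PAP','password',1)")
    cur.execute("INSERT INTO wildcards(identity,methods) "
                "VALUES ('','TTLS,TLS')")
    cur.execute("SELECT methods FROM users WHERE identity='user-pap'")
    row = cur.fetchone()
```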

def test_ap_wpa2_eap_non_ascii_identity(dev, apdev):
    """WPA2-Enterprise connection attempt using non-ASCII identity"""
    params = int_eap_server_params()
    hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
                   identity="\x80", password="password", wait_connect=False)
    dev[1].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
                   identity="a\x80", password="password", wait_connect=False)
    for i in range(0, 2):
        ev = dev[i].wait_event(["CTRL-EVENT-EAP-STARTED"], timeout=16)
        if ev is None:
            raise Exception("Association and EAP start timed out")
        ev = dev[i].wait_event(["CTRL-EVENT-EAP-METHOD"], timeout=10)
        if ev is None:
            raise Exception("EAP method selection timed out")

def test_ap_wpa2_eap_non_ascii_identity2(dev, apdev):
    """WPA2-Enterprise connection attempt using non-ASCII identity"""
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
                   identity="\x80", password="password", wait_connect=False)
    dev[1].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
                   identity="a\x80", password="password", wait_connect=False)
    for i in range(0, 2):
        ev = dev[i].wait_event(["CTRL-EVENT-EAP-STARTED"], timeout=16)
        if ev is None:
            raise Exception("Association and EAP start timed out")
        ev = dev[i].wait_event(["CTRL-EVENT-EAP-METHOD"], timeout=10)
        if ev is None:
            raise Exception("EAP method selection timed out")

def test_openssl_cipher_suite_config_wpas(dev, apdev):
    """OpenSSL cipher suite configuration on wpa_supplicant"""
    tls = dev[0].request("GET tls_library")
    if not tls.startswith("OpenSSL"):
        raise HwsimSkip("TLS library is not OpenSSL: " + tls)
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hapd = hostapd.add_ap(apdev[0], params)
    eap_connect(dev[0], hapd, "TTLS", "pap user",
                anonymous_identity="ttls", password="password",
                openssl_ciphers="AES128",
                ca_cert="auth_serv/ca.pem", phase2="auth=PAP")
    eap_connect(dev[1], hapd, "TTLS", "pap user",
                anonymous_identity="ttls", password="password",
                openssl_ciphers="EXPORT",
                ca_cert="auth_serv/ca.pem", phase2="auth=PAP",
                expect_failure=True, maybe_local_error=True)
    dev[2].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
                   identity="pap user", anonymous_identity="ttls",
                   password="password",
                   openssl_ciphers="FOO",
                   ca_cert="auth_serv/ca.pem", phase2="auth=PAP",
                   wait_connect=False)
    ev = dev[2].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=10)
    if ev is None:
        raise Exception("EAP failure after invalid openssl_ciphers not reported")
    dev[2].request("DISCONNECT")

def test_openssl_cipher_suite_config_hapd(dev, apdev):
    """OpenSSL cipher suite configuration on hostapd"""
    tls = dev[0].request("GET tls_library")
    if not tls.startswith("OpenSSL"):
        raise HwsimSkip("wpa_supplicant TLS library is not OpenSSL: " + tls)
    params = int_eap_server_params()
    params['openssl_ciphers'] = "AES256"
    hapd = hostapd.add_ap(apdev[0], params)
    tls = hapd.request("GET tls_library")
    if not tls.startswith("OpenSSL"):
        raise HwsimSkip("hostapd TLS library is not OpenSSL: " + tls)
    eap_connect(dev[0], hapd, "TTLS", "pap user",
                anonymous_identity="ttls", password="password",
                ca_cert="auth_serv/ca.pem", phase2="auth=PAP")
    eap_connect(dev[1], hapd, "TTLS", "pap user",
                anonymous_identity="ttls", password="password",
                openssl_ciphers="AES128",
                ca_cert="auth_serv/ca.pem", phase2="auth=PAP",
                expect_failure=True)
    eap_connect(dev[2], hapd, "TTLS", "pap user",
                anonymous_identity="ttls", password="password",
                openssl_ciphers="HIGH:!ADH",
                ca_cert="auth_serv/ca.pem", phase2="auth=PAP")

    params['openssl_ciphers'] = "FOO"
    hapd2 = hostapd.add_ap(apdev[1], params, no_enable=True)
    if "FAIL" not in hapd2.request("ENABLE"):
        raise Exception("Invalid openssl_ciphers value accepted")

def test_wpa2_eap_ttls_pap_key_lifetime_in_memory(dev, apdev, params):
    """Key lifetime in memory with WPA2-Enterprise using EAP-TTLS/PAP"""
    p = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hapd = hostapd.add_ap(apdev[0], p)
    password = "63d2d21ac3c09ed567ee004a34490f1d16e7fa5835edf17ddba70a63f1a90a25"
    pid = find_wpas_process(dev[0])
    id = eap_connect(dev[0], hapd, "TTLS", "pap-secret",
                     anonymous_identity="ttls", password=password,
                     ca_cert="auth_serv/ca.pem", phase2="auth=PAP")
    # The decrypted copy of GTK is freed only after the CTRL-EVENT-CONNECTED
    # event has been delivered, so verify that wpa_supplicant has returned to
    # eloop before reading process memory.
    time.sleep(1)
    dev[0].ping()
    buf = read_process_memory(pid, password)

    dev[0].request("DISCONNECT")
    dev[0].wait_disconnected()

    dev[0].relog()
    msk = None
    emsk = None
    pmk = None
    ptk = None
    gtk = None
    with open(os.path.join(params['logdir'], 'log0'), 'r') as f:
        for l in f.readlines():
            if "EAP-TTLS: Derived key - hexdump" in l:
                val = l.strip().split(':')[3].replace(' ', '')
                msk = binascii.unhexlify(val)
            if "EAP-TTLS: Derived EMSK - hexdump" in l:
                val = l.strip().split(':')[3].replace(' ', '')
                emsk = binascii.unhexlify(val)
            if "WPA: PMK - hexdump" in l:
                val = l.strip().split(':')[3].replace(' ', '')
                pmk = binascii.unhexlify(val)
            if "WPA: PTK - hexdump" in l:
                val = l.strip().split(':')[3].replace(' ', '')
                ptk = binascii.unhexlify(val)
            if "WPA: Group Key - hexdump" in l:
                val = l.strip().split(':')[3].replace(' ', '')
                gtk = binascii.unhexlify(val)
    if not msk or not emsk or not pmk or not ptk or not gtk:
        raise Exception("Could not find keys from debug log")
    if len(gtk) != 16:
        raise Exception("Unexpected GTK length")

    kck = ptk[0:16]
    kek = ptk[16:32]
    tk = ptk[32:48]

    fname = os.path.join(params['logdir'],
                         'wpa2_eap_ttls_pap_key_lifetime_in_memory.memctx-')

    logger.info("Checking keys in memory while associated")
    get_key_locations(buf, password, "Password")
    get_key_locations(buf, pmk, "PMK")
    get_key_locations(buf, msk, "MSK")
    get_key_locations(buf, emsk, "EMSK")
    if password not in buf:
        raise HwsimSkip("Password not found while associated")
    if pmk not in buf:
        raise HwsimSkip("PMK not found while associated")
    if kck not in buf:
        raise Exception("KCK not found while associated")
    if kek not in buf:
        raise Exception("KEK not found while associated")
    if tk in buf:
        raise Exception("TK found from memory")
    if gtk in buf:
        get_key_locations(buf, gtk, "GTK")
        raise Exception("GTK found from memory")

    logger.info("Checking keys in memory after disassociation")
    buf = read_process_memory(pid, password)

    # Note: Password is still present in network configuration
    # Note: PMK is in PMKSA cache and EAP fast re-auth data

    get_key_locations(buf, password, "Password")
    get_key_locations(buf, pmk, "PMK")
    get_key_locations(buf, msk, "MSK")
    get_key_locations(buf, emsk, "EMSK")
    verify_not_present(buf, kck, fname, "KCK")
    verify_not_present(buf, kek, fname, "KEK")
    verify_not_present(buf, tk, fname, "TK")
    verify_not_present(buf, gtk, fname, "GTK")

    dev[0].request("PMKSA_FLUSH")
    dev[0].set_network_quoted(id, "identity", "foo")
    logger.info("Checking keys in memory after PMKSA cache and EAP fast reauth flush")
    buf = read_process_memory(pid, password)
    get_key_locations(buf, password, "Password")
    get_key_locations(buf, pmk, "PMK")
    get_key_locations(buf, msk, "MSK")
    get_key_locations(buf, emsk, "EMSK")
    verify_not_present(buf, pmk, fname, "PMK")

    dev[0].request("REMOVE_NETWORK all")

    logger.info("Checking keys in memory after network profile removal")
    buf = read_process_memory(pid, password)

    get_key_locations(buf, password, "Password")
    get_key_locations(buf, pmk, "PMK")
    get_key_locations(buf, msk, "MSK")
    get_key_locations(buf, emsk, "EMSK")
    verify_not_present(buf, password, fname, "password")
    verify_not_present(buf, pmk, fname, "PMK")
    verify_not_present(buf, kck, fname, "KCK")
    verify_not_present(buf, kek, fname, "KEK")
    verify_not_present(buf, tk, fname, "TK")
    verify_not_present(buf, gtk, fname, "GTK")
    verify_not_present(buf, msk, fname, "MSK")
    verify_not_present(buf, emsk, fname, "EMSK")
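The key-lifetime test slices the logged PTK at fixed offsets: for WPA2/CCMP the 48-octet PTK is laid out as KCK(16) | KEK(16) | TK(16). A helper sketching that layout (`split_ptk` is a name used only here):

```python
def split_ptk(ptk):
    # WPA2/CCMP PTK layout: KCK | KEK | TK, 16 octets each,
    # matching the ptk[0:16] / ptk[16:32] / ptk[32:48] slices
    # used in the test above.
    if len(ptk) != 48:
        raise ValueError("unexpected PTK length")
    return ptk[0:16], ptk[16:32], ptk[32:48]
```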

def test_ap_wpa2_eap_unexpected_wep_eapol_key(dev, apdev):
    """WPA2-Enterprise connection and unexpected WEP EAPOL-Key"""
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hapd = hostapd.add_ap(apdev[0], params)
    bssid = apdev[0]['bssid']
    eap_connect(dev[0], hapd, "TTLS", "pap user",
                anonymous_identity="ttls", password="password",
                ca_cert="auth_serv/ca.pem", phase2="auth=PAP")

    # Send unexpected WEP EAPOL-Key; this gets dropped
    res = dev[0].request("EAPOL_RX " + bssid + " 0203002c0100000000000000000000000000000000000000000000000000000000000000000000000000000000000000")
    if "OK" not in res:
        raise Exception("EAPOL_RX to wpa_supplicant failed")
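The injected frame above starts with the IEEE 802.1X header: version 02, packet type 03 (EAPOL-Key), body length 0x002c, followed by key descriptor type 01 (the legacy RC4 descriptor used with WEP, hence "WEP EAPOL-Key"). A decoding sketch (`parse_eapol_hdr` is a hypothetical helper, not part of the test suite):

```python
import binascii
import struct

def parse_eapol_hdr(hexstr):
    # Decode the 4-octet IEEE 802.1X header plus the key descriptor
    # type octet from an EAPOL-Key frame given as a hex string.
    frame = binascii.unhexlify(hexstr)
    version, ptype, length = struct.unpack(">BBH", frame[0:4])
    return {"version": version, "type": ptype, "length": length,
            "descriptor": frame[4]}
```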

def test_ap_wpa2_eap_in_bridge(dev, apdev):
    """WPA2-EAP and wpas interface in a bridge"""
    br_ifname = 'sta-br0'
    ifname = 'wlan5'
    try:
        _test_ap_wpa2_eap_in_bridge(dev, apdev)
    finally:
        subprocess.call(['ip', 'link', 'set', 'dev', br_ifname, 'down'])
        subprocess.call(['brctl', 'delif', br_ifname, ifname])
        subprocess.call(['brctl', 'delbr', br_ifname])
        subprocess.call(['iw', ifname, 'set', '4addr', 'off'])

def _test_ap_wpa2_eap_in_bridge(dev, apdev):
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hapd = hostapd.add_ap(apdev[0], params)

    br_ifname = 'sta-br0'
    ifname = 'wlan5'
    wpas = WpaSupplicant(global_iface='/tmp/wpas-wlan5')
    subprocess.call(['brctl', 'addbr', br_ifname])
    subprocess.call(['brctl', 'setfd', br_ifname, '0'])
    subprocess.call(['ip', 'link', 'set', 'dev', br_ifname, 'up'])
    subprocess.call(['iw', ifname, 'set', '4addr', 'on'])
    subprocess.check_call(['brctl', 'addif', br_ifname, ifname])
    wpas.interface_add(ifname, br_ifname=br_ifname)
    wpas.dump_monitor()

    id = eap_connect(wpas, hapd, "PAX", "pax.user@example.com",
                     password_hex="0123456789abcdef0123456789abcdef")
    wpas.dump_monitor()
    eap_reauth(wpas, "PAX")
    wpas.dump_monitor()
    # Try again as a regression test for packet socket workaround
    eap_reauth(wpas, "PAX")
    wpas.dump_monitor()
    wpas.request("DISCONNECT")
    wpas.wait_disconnected()
    wpas.dump_monitor()
    wpas.request("RECONNECT")
    wpas.wait_connected()
    wpas.dump_monitor()

def test_ap_wpa2_eap_session_ticket(dev, apdev):
    """WPA2-Enterprise connection using EAP-TTLS and TLS session ticket enabled"""
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hapd = hostapd.add_ap(apdev[0], params)
    key_mgmt = hapd.get_config()['key_mgmt']
    if key_mgmt.split(' ')[0] != "WPA-EAP":
        raise Exception("Unexpected GET_CONFIG(key_mgmt): " + key_mgmt)
    eap_connect(dev[0], hapd, "TTLS", "pap user",
                anonymous_identity="ttls", password="password",
                ca_cert="auth_serv/ca.pem",
                phase1="tls_disable_session_ticket=0", phase2="auth=PAP")
    eap_reauth(dev[0], "TTLS")

def test_ap_wpa2_eap_no_workaround(dev, apdev):
    """WPA2-Enterprise connection using EAP-TTLS and eap_workaround=0"""
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hapd = hostapd.add_ap(apdev[0], params)
    key_mgmt = hapd.get_config()['key_mgmt']
    if key_mgmt.split(' ')[0] != "WPA-EAP":
        raise Exception("Unexpected GET_CONFIG(key_mgmt): " + key_mgmt)
    eap_connect(dev[0], hapd, "TTLS", "pap user",
                anonymous_identity="ttls", password="password",
                ca_cert="auth_serv/ca.pem", eap_workaround='0',
                phase2="auth=PAP")
    eap_reauth(dev[0], "TTLS")

def test_ap_wpa2_eap_tls_check_crl(dev, apdev):
    """EAP-TLS and server checking CRL"""
    params = int_eap_server_params()
    params['check_crl'] = '1'
    hapd = hostapd.add_ap(apdev[0], params)

    # check_crl=1 and no CRL available --> reject connection
    eap_connect(dev[0], hapd, "TLS", "tls user", ca_cert="auth_serv/ca.pem",
                client_cert="auth_serv/user.pem",
                private_key="auth_serv/user.key", expect_failure=True)
    dev[0].request("REMOVE_NETWORK all")

    hapd.disable()
    hapd.set("ca_cert", "auth_serv/ca-and-crl.pem")
    hapd.enable()

    # check_crl=1 and valid CRL --> accept
    eap_connect(dev[0], hapd, "TLS", "tls user", ca_cert="auth_serv/ca.pem",
                client_cert="auth_serv/user.pem",
                private_key="auth_serv/user.key")
    dev[0].request("REMOVE_NETWORK all")

    hapd.disable()
    hapd.set("check_crl", "2")
    hapd.enable()

    # check_crl=2 and valid CRL --> accept
    eap_connect(dev[0], hapd, "TLS", "tls user", ca_cert="auth_serv/ca.pem",
                client_cert="auth_serv/user.pem",
                private_key="auth_serv/user.key")
    dev[0].request("REMOVE_NETWORK all")
def test_ap_wpa2_eap_tls_oom(dev, apdev):
"""EAP-TLS and OOM"""
check_subject_match_support(dev[0])
check_altsubject_match_support(dev[0])
check_domain_match(dev[0])
check_domain_match_full(dev[0])
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
tests = [ (1, "tls_connection_set_subject_match"),
(2, "tls_connection_set_subject_match"),
(3, "tls_connection_set_subject_match"),
(4, "tls_connection_set_subject_match") ]
for count, func in tests:
with alloc_fail(dev[0], count, func):
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user", ca_cert="auth_serv/ca.pem",
client_cert="auth_serv/user.pem",
private_key="auth_serv/user.key",
subject_match="/C=FI/O=w1.fi/CN=server.w1.fi",
altsubject_match="EMAIL:noone@example.com;DNS:server.w1.fi;URI:http://example.com/",
domain_suffix_match="server.w1.fi",
domain_match="server.w1.fi",
wait_connect=False, scan_freq="2412")
# TLS parameter configuration error results in CTRL-REQ-PASSPHRASE
ev = dev[0].wait_event(["CTRL-REQ-PASSPHRASE"], timeout=5)
if ev is None:
raise Exception("No passphrase request")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
def test_ap_wpa2_eap_tls_macacl(dev, apdev):
"""WPA2-Enterprise connection using MAC ACL"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
params["macaddr_acl"] = "2"
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[1], hapd, "TLS", "tls user", ca_cert="auth_serv/ca.pem",
client_cert="auth_serv/user.pem",
private_key="auth_serv/user.key")
def test_ap_wpa2_eap_oom(dev, apdev):
"""EAP server and OOM"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
dev[0].scan_for_bss(apdev[0]['bssid'], freq=2412)
with alloc_fail(hapd, 1, "eapol_auth_alloc"):
# The first attempt fails, but STA will send EAPOL-Start to retry and
# that succeeds.
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user", ca_cert="auth_serv/ca.pem",
client_cert="auth_serv/user.pem",
private_key="auth_serv/user.key",
scan_freq="2412")
def check_tls_ver(dev, hapd, phase1, expected):
eap_connect(dev, hapd, "TLS", "tls user", ca_cert="auth_serv/ca.pem",
client_cert="auth_serv/user.pem",
private_key="auth_serv/user.key",
phase1=phase1)
ver = dev.get_status_field("eap_tls_version")
if ver != expected:
raise Exception("Unexpected TLS version (expected %s): %s" % (expected, ver))
def test_ap_wpa2_eap_tls_versions(dev, apdev):
"""EAP-TLS and TLS version configuration"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hapd = hostapd.add_ap(apdev[0], params)
tls = dev[0].request("GET tls_library")
if tls.startswith("OpenSSL"):
if "build=OpenSSL 1.0.2" in tls and "run=OpenSSL 1.0.2" in tls:
check_tls_ver(dev[0], hapd,
"tls_disable_tlsv1_0=1 tls_disable_tlsv1_1=1",
"TLSv1.2")
elif tls.startswith("internal"):
check_tls_ver(dev[0], hapd,
"tls_disable_tlsv1_0=1 tls_disable_tlsv1_1=1", "TLSv1.2")
check_tls_ver(dev[1], hapd,
"tls_disable_tlsv1_0=1 tls_disable_tlsv1_2=1", "TLSv1.1")
check_tls_ver(dev[2], hapd,
"tls_disable_tlsv1_1=1 tls_disable_tlsv1_2=1", "TLSv1")
def test_rsn_ie_proto_eap_sta(dev, apdev):
"""RSN element protocol testing for EAP cases on STA side"""
bssid = apdev[0]['bssid']
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
# This is the RSN element used normally by hostapd
params['own_ie_override'] = '30140100000fac040100000fac040100000fac010c00'
hapd = hostapd.add_ap(apdev[0], params)
id = dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="GPSK",
identity="gpsk user",
password="abcdefghijklmnop0123456789abcdef",
scan_freq="2412")
tests = [ ('No RSN Capabilities field',
'30120100000fac040100000fac040100000fac01'),
('No AKM Suite fields',
'300c0100000fac040100000fac04'),
('No Pairwise Cipher Suite fields',
'30060100000fac04'),
('No Group Data Cipher Suite field',
'30020100') ]
    for txt, ie in tests:
dev[0].request("DISCONNECT")
dev[0].wait_disconnected()
logger.info(txt)
hapd.disable()
hapd.set('own_ie_override', ie)
hapd.enable()
dev[0].request("BSS_FLUSH 0")
dev[0].scan_for_bss(bssid, 2412, force_scan=True, only_new=True)
dev[0].select_network(id, freq=2412)
dev[0].wait_connected()
dev[0].request("DISCONNECT")
dev[0].wait_disconnected()
dev[0].flush_scan_cache()
def check_tls_session_resumption_capa(dev, hapd):
tls = hapd.request("GET tls_library")
if not tls.startswith("OpenSSL"):
raise HwsimSkip("hostapd TLS library is not OpenSSL: " + tls)
tls = dev.request("GET tls_library")
if not tls.startswith("OpenSSL"):
raise HwsimSkip("Session resumption not supported with this TLS library: " + tls)
def test_eap_ttls_pap_session_resumption(dev, apdev):
"""EAP-TTLS/PAP session resumption"""
params = int_eap_server_params()
params['tls_session_lifetime'] = '60'
hapd = hostapd.add_ap(apdev[0], params)
check_tls_session_resumption_capa(dev[0], hapd)
eap_connect(dev[0], hapd, "TTLS", "pap user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", eap_workaround='0',
phase2="auth=PAP")
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Unexpected session resumption on the first connection")
dev[0].request("REAUTHENTICATE")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
if ev is None:
raise Exception("EAP success timed out")
ev = dev[0].wait_event(["WPA: Key negotiation completed"], timeout=10)
if ev is None:
raise Exception("Key handshake with the AP timed out")
if dev[0].get_status_field("tls_session_reused") != '1':
raise Exception("Session resumption not used on the second connection")
def test_eap_ttls_chap_session_resumption(dev, apdev):
"""EAP-TTLS/CHAP session resumption"""
params = int_eap_server_params()
params['tls_session_lifetime'] = '60'
hapd = hostapd.add_ap(apdev[0], params)
check_tls_session_resumption_capa(dev[0], hapd)
eap_connect(dev[0], hapd, "TTLS", "chap user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.der", phase2="auth=CHAP")
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Unexpected session resumption on the first connection")
dev[0].request("REAUTHENTICATE")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
if ev is None:
raise Exception("EAP success timed out")
ev = dev[0].wait_event(["WPA: Key negotiation completed"], timeout=10)
if ev is None:
raise Exception("Key handshake with the AP timed out")
if dev[0].get_status_field("tls_session_reused") != '1':
raise Exception("Session resumption not used on the second connection")
def test_eap_ttls_mschap_session_resumption(dev, apdev):
"""EAP-TTLS/MSCHAP session resumption"""
check_domain_suffix_match(dev[0])
params = int_eap_server_params()
params['tls_session_lifetime'] = '60'
hapd = hostapd.add_ap(apdev[0], params)
check_tls_session_resumption_capa(dev[0], hapd)
eap_connect(dev[0], hapd, "TTLS", "mschap user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAP",
domain_suffix_match="server.w1.fi")
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Unexpected session resumption on the first connection")
dev[0].request("REAUTHENTICATE")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
if ev is None:
raise Exception("EAP success timed out")
ev = dev[0].wait_event(["WPA: Key negotiation completed"], timeout=10)
if ev is None:
raise Exception("Key handshake with the AP timed out")
if dev[0].get_status_field("tls_session_reused") != '1':
raise Exception("Session resumption not used on the second connection")
def test_eap_ttls_mschapv2_session_resumption(dev, apdev):
"""EAP-TTLS/MSCHAPv2 session resumption"""
check_domain_suffix_match(dev[0])
check_eap_capa(dev[0], "MSCHAPV2")
params = int_eap_server_params()
params['tls_session_lifetime'] = '60'
hapd = hostapd.add_ap(apdev[0], params)
check_tls_session_resumption_capa(dev[0], hapd)
eap_connect(dev[0], hapd, "TTLS", "DOMAIN\mschapv2 user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
domain_suffix_match="server.w1.fi")
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Unexpected session resumption on the first connection")
dev[0].request("REAUTHENTICATE")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
if ev is None:
raise Exception("EAP success timed out")
ev = dev[0].wait_event(["WPA: Key negotiation completed"], timeout=10)
if ev is None:
raise Exception("Key handshake with the AP timed out")
if dev[0].get_status_field("tls_session_reused") != '1':
raise Exception("Session resumption not used on the second connection")
def test_eap_ttls_eap_gtc_session_resumption(dev, apdev):
"""EAP-TTLS/EAP-GTC session resumption"""
params = int_eap_server_params()
params['tls_session_lifetime'] = '60'
hapd = hostapd.add_ap(apdev[0], params)
check_tls_session_resumption_capa(dev[0], hapd)
eap_connect(dev[0], hapd, "TTLS", "user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", phase2="autheap=GTC")
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Unexpected session resumption on the first connection")
dev[0].request("REAUTHENTICATE")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
if ev is None:
raise Exception("EAP success timed out")
ev = dev[0].wait_event(["WPA: Key negotiation completed"], timeout=10)
if ev is None:
raise Exception("Key handshake with the AP timed out")
if dev[0].get_status_field("tls_session_reused") != '1':
raise Exception("Session resumption not used on the second connection")
def test_eap_ttls_no_session_resumption(dev, apdev):
"""EAP-TTLS session resumption disabled on server"""
params = int_eap_server_params()
params['tls_session_lifetime'] = '0'
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TTLS", "pap user",
anonymous_identity="ttls", password="password",
ca_cert="auth_serv/ca.pem", eap_workaround='0',
phase2="auth=PAP")
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Unexpected session resumption on the first connection")
dev[0].request("REAUTHENTICATE")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
if ev is None:
raise Exception("EAP success timed out")
ev = dev[0].wait_event(["WPA: Key negotiation completed"], timeout=10)
if ev is None:
raise Exception("Key handshake with the AP timed out")
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Unexpected session resumption on the second connection")
def test_eap_peap_session_resumption(dev, apdev):
"""EAP-PEAP session resumption"""
params = int_eap_server_params()
params['tls_session_lifetime'] = '60'
hapd = hostapd.add_ap(apdev[0], params)
check_tls_session_resumption_capa(dev[0], hapd)
eap_connect(dev[0], hapd, "PEAP", "user",
anonymous_identity="peap", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2")
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Unexpected session resumption on the first connection")
dev[0].request("REAUTHENTICATE")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
if ev is None:
raise Exception("EAP success timed out")
ev = dev[0].wait_event(["WPA: Key negotiation completed"], timeout=10)
if ev is None:
raise Exception("Key handshake with the AP timed out")
if dev[0].get_status_field("tls_session_reused") != '1':
raise Exception("Session resumption not used on the second connection")
def test_eap_peap_session_resumption_crypto_binding(dev, apdev):
"""EAP-PEAP session resumption with crypto binding"""
params = int_eap_server_params()
params['tls_session_lifetime'] = '60'
hapd = hostapd.add_ap(apdev[0], params)
check_tls_session_resumption_capa(dev[0], hapd)
eap_connect(dev[0], hapd, "PEAP", "user",
anonymous_identity="peap", password="password",
phase1="peapver=0 crypto_binding=2",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2")
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Unexpected session resumption on the first connection")
dev[0].request("REAUTHENTICATE")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
if ev is None:
raise Exception("EAP success timed out")
ev = dev[0].wait_event(["WPA: Key negotiation completed"], timeout=10)
if ev is None:
raise Exception("Key handshake with the AP timed out")
if dev[0].get_status_field("tls_session_reused") != '1':
raise Exception("Session resumption not used on the second connection")
def test_eap_peap_no_session_resumption(dev, apdev):
"""EAP-PEAP session resumption disabled on server"""
params = int_eap_server_params()
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "PEAP", "user",
anonymous_identity="peap", password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2")
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Unexpected session resumption on the first connection")
dev[0].request("REAUTHENTICATE")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
if ev is None:
raise Exception("EAP success timed out")
ev = dev[0].wait_event(["WPA: Key negotiation completed"], timeout=10)
if ev is None:
raise Exception("Key handshake with the AP timed out")
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Unexpected session resumption on the second connection")
def test_eap_tls_session_resumption(dev, apdev):
"""EAP-TLS session resumption"""
params = int_eap_server_params()
params['tls_session_lifetime'] = '60'
hapd = hostapd.add_ap(apdev[0], params)
check_tls_session_resumption_capa(dev[0], hapd)
eap_connect(dev[0], hapd, "TLS", "tls user", ca_cert="auth_serv/ca.pem",
client_cert="auth_serv/user.pem",
private_key="auth_serv/user.key")
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Unexpected session resumption on the first connection")
dev[0].request("REAUTHENTICATE")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
if ev is None:
raise Exception("EAP success timed out")
ev = dev[0].wait_event(["WPA: Key negotiation completed"], timeout=10)
if ev is None:
raise Exception("Key handshake with the AP timed out")
if dev[0].get_status_field("tls_session_reused") != '1':
raise Exception("Session resumption not used on the second connection")
dev[0].request("REAUTHENTICATE")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
if ev is None:
raise Exception("EAP success timed out")
ev = dev[0].wait_event(["WPA: Key negotiation completed"], timeout=10)
if ev is None:
raise Exception("Key handshake with the AP timed out")
if dev[0].get_status_field("tls_session_reused") != '1':
raise Exception("Session resumption not used on the third connection")
def test_eap_tls_session_resumption_expiration(dev, apdev):
"""EAP-TLS session resumption"""
params = int_eap_server_params()
params['tls_session_lifetime'] = '1'
hapd = hostapd.add_ap(apdev[0], params)
check_tls_session_resumption_capa(dev[0], hapd)
eap_connect(dev[0], hapd, "TLS", "tls user", ca_cert="auth_serv/ca.pem",
client_cert="auth_serv/user.pem",
private_key="auth_serv/user.key")
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Unexpected session resumption on the first connection")
# Allow multiple attempts since OpenSSL may not expire the cached entry
# immediately.
for i in range(10):
time.sleep(1.2)
dev[0].request("REAUTHENTICATE")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
if ev is None:
raise Exception("EAP success timed out")
ev = dev[0].wait_event(["WPA: Key negotiation completed"], timeout=10)
if ev is None:
raise Exception("Key handshake with the AP timed out")
if dev[0].get_status_field("tls_session_reused") == '0':
break
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Session resumption used after lifetime expiration")
def test_eap_tls_no_session_resumption(dev, apdev):
"""EAP-TLS session resumption disabled on server"""
params = int_eap_server_params()
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TLS", "tls user", ca_cert="auth_serv/ca.pem",
client_cert="auth_serv/user.pem",
private_key="auth_serv/user.key")
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Unexpected session resumption on the first connection")
dev[0].request("REAUTHENTICATE")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
if ev is None:
raise Exception("EAP success timed out")
ev = dev[0].wait_event(["WPA: Key negotiation completed"], timeout=10)
if ev is None:
raise Exception("Key handshake with the AP timed out")
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Unexpected session resumption on the second connection")
def test_eap_tls_session_resumption_radius(dev, apdev):
"""EAP-TLS session resumption (RADIUS)"""
params = { "ssid": "as", "beacon_int": "2000",
"radius_server_clients": "auth_serv/radius_clients.conf",
"radius_server_auth_port": '18128',
"eap_server": "1",
"eap_user_file": "auth_serv/eap_user.conf",
"ca_cert": "auth_serv/ca.pem",
"server_cert": "auth_serv/server.pem",
"private_key": "auth_serv/server.key",
"tls_session_lifetime": "60" }
authsrv = hostapd.add_ap(apdev[1], params)
check_tls_session_resumption_capa(dev[0], authsrv)
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
params['auth_server_port'] = "18128"
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TLS", "tls user", ca_cert="auth_serv/ca.pem",
client_cert="auth_serv/user.pem",
private_key="auth_serv/user.key")
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Unexpected session resumption on the first connection")
dev[0].request("REAUTHENTICATE")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
if ev is None:
raise Exception("EAP success timed out")
ev = dev[0].wait_event(["WPA: Key negotiation completed"], timeout=10)
if ev is None:
raise Exception("Key handshake with the AP timed out")
if dev[0].get_status_field("tls_session_reused") != '1':
raise Exception("Session resumption not used on the second connection")
def test_eap_tls_no_session_resumption_radius(dev, apdev):
"""EAP-TLS session resumption disabled (RADIUS)"""
params = { "ssid": "as", "beacon_int": "2000",
"radius_server_clients": "auth_serv/radius_clients.conf",
"radius_server_auth_port": '18128',
"eap_server": "1",
"eap_user_file": "auth_serv/eap_user.conf",
"ca_cert": "auth_serv/ca.pem",
"server_cert": "auth_serv/server.pem",
"private_key": "auth_serv/server.key",
"tls_session_lifetime": "0" }
hostapd.add_ap(apdev[1], params)
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
params['auth_server_port'] = "18128"
hapd = hostapd.add_ap(apdev[0], params)
eap_connect(dev[0], hapd, "TLS", "tls user", ca_cert="auth_serv/ca.pem",
client_cert="auth_serv/user.pem",
private_key="auth_serv/user.key")
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Unexpected session resumption on the first connection")
dev[0].request("REAUTHENTICATE")
ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=10)
if ev is None:
raise Exception("EAP success timed out")
ev = dev[0].wait_event(["WPA: Key negotiation completed"], timeout=10)
if ev is None:
raise Exception("Key handshake with the AP timed out")
if dev[0].get_status_field("tls_session_reused") != '0':
raise Exception("Unexpected session resumption on the second connection")
def test_eap_mschapv2_errors(dev, apdev):
"""EAP-MSCHAPv2 error cases"""
check_eap_capa(dev[0], "MSCHAPV2")
check_eap_capa(dev[0], "FAST")
params = hostapd.wpa2_eap_params(ssid="test-wpa-eap")
hapd = hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa-eap", key_mgmt="WPA-EAP", eap="MSCHAPV2",
identity="phase1-user", password="password",
scan_freq="2412")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
tests = [ (1, "hash_nt_password_hash;mschapv2_derive_response"),
(1, "nt_password_hash;mschapv2_derive_response"),
(1, "nt_password_hash;=mschapv2_derive_response"),
(1, "generate_nt_response;mschapv2_derive_response"),
(1, "generate_authenticator_response;mschapv2_derive_response"),
(1, "nt_password_hash;=mschapv2_derive_response"),
(1, "get_master_key;mschapv2_derive_response"),
(1, "os_get_random;eap_mschapv2_challenge_reply") ]
for count, func in tests:
with fail_test(dev[0], count, func):
dev[0].connect("test-wpa-eap", key_mgmt="WPA-EAP", eap="MSCHAPV2",
identity="phase1-user", password="password",
wait_connect=False, scan_freq="2412")
wait_fail_trigger(dev[0], "GET_FAIL")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
tests = [ (1, "hash_nt_password_hash;mschapv2_derive_response"),
(1, "hash_nt_password_hash;=mschapv2_derive_response"),
(1, "generate_nt_response_pwhash;mschapv2_derive_response"),
(1, "generate_authenticator_response_pwhash;mschapv2_derive_response") ]
for count, func in tests:
with fail_test(dev[0], count, func):
dev[0].connect("test-wpa-eap", key_mgmt="WPA-EAP", eap="MSCHAPV2",
identity="phase1-user",
password_hex="hash:8846f7eaee8fb117ad06bdd830b7586c",
wait_connect=False, scan_freq="2412")
wait_fail_trigger(dev[0], "GET_FAIL")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
tests = [ (1, "eap_mschapv2_init"),
(1, "eap_msg_alloc;eap_mschapv2_challenge_reply"),
(1, "eap_msg_alloc;eap_mschapv2_success"),
(1, "eap_mschapv2_getKey") ]
for count, func in tests:
with alloc_fail(dev[0], count, func):
dev[0].connect("test-wpa-eap", key_mgmt="WPA-EAP", eap="MSCHAPV2",
identity="phase1-user", password="password",
wait_connect=False, scan_freq="2412")
wait_fail_trigger(dev[0], "GET_ALLOC_FAIL")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
tests = [ (1, "eap_msg_alloc;eap_mschapv2_failure") ]
for count, func in tests:
with alloc_fail(dev[0], count, func):
dev[0].connect("test-wpa-eap", key_mgmt="WPA-EAP", eap="MSCHAPV2",
identity="phase1-user", password="wrong password",
wait_connect=False, scan_freq="2412")
wait_fail_trigger(dev[0], "GET_ALLOC_FAIL")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
tests = [ (2, "eap_mschapv2_init"),
(3, "eap_mschapv2_init") ]
for count, func in tests:
with alloc_fail(dev[0], count, func):
dev[0].connect("test-wpa-eap", key_mgmt="WPA-EAP", eap="FAST",
anonymous_identity="FAST", identity="user",
password="password",
ca_cert="auth_serv/ca.pem", phase2="auth=MSCHAPV2",
phase1="fast_provisioning=1",
pac_file="blob://fast_pac",
wait_connect=False, scan_freq="2412")
wait_fail_trigger(dev[0], "GET_ALLOC_FAIL")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
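# The fail_test()/alloc_fail() helpers used above arm wpa_supplicant's
# built-in fault injection: fail the Nth invocation of a named internal
# function, with "callee;caller" strings restricting the trigger to a
# specific call path. A pure-Python analog of the counted-failure idea
# (entirely illustrative, not part of the suite):

```python
class CountedFailure:
    """Raise an injected error on the Nth call to maybe_fail()."""

    def __init__(self, count):
        self.remaining = count

    def maybe_fail(self):
        self.remaining -= 1
        if self.remaining == 0:
            raise MemoryError("injected allocation failure")

# e.g. CountedFailure(2) lets the first call through and fails the second,
# mirroring the count argument in alloc_fail(dev[0], 2, "eap_mschapv2_init").
```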
def test_eap_gpsk_errors(dev, apdev):
"""EAP-GPSK error cases"""
params = hostapd.wpa2_eap_params(ssid="test-wpa-eap")
hapd = hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa-eap", key_mgmt="WPA-EAP", eap="GPSK",
identity="gpsk user",
password="abcdefghijklmnop0123456789abcdef",
scan_freq="2412")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
tests = [ (1, "os_get_random;eap_gpsk_send_gpsk_2", None),
(1, "eap_gpsk_derive_session_id;eap_gpsk_send_gpsk_2",
"cipher=1"),
(1, "eap_gpsk_derive_session_id;eap_gpsk_send_gpsk_2",
"cipher=2"),
(1, "eap_gpsk_derive_keys_helper", None),
(2, "eap_gpsk_derive_keys_helper", None),
(1, "eap_gpsk_compute_mic_aes;eap_gpsk_compute_mic;eap_gpsk_send_gpsk_2",
"cipher=1"),
(1, "hmac_sha256;eap_gpsk_compute_mic;eap_gpsk_send_gpsk_2",
"cipher=2"),
(1, "eap_gpsk_compute_mic;eap_gpsk_validate_gpsk_3_mic", None),
(1, "eap_gpsk_compute_mic;eap_gpsk_send_gpsk_4", None),
(1, "eap_gpsk_derive_mid_helper", None) ]
for count, func, phase1 in tests:
with fail_test(dev[0], count, func):
dev[0].connect("test-wpa-eap", key_mgmt="WPA-EAP", eap="GPSK",
identity="gpsk user",
password="abcdefghijklmnop0123456789abcdef",
phase1=phase1,
wait_connect=False, scan_freq="2412")
wait_fail_trigger(dev[0], "GET_FAIL")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
tests = [ (1, "eap_gpsk_init"),
(2, "eap_gpsk_init"),
(3, "eap_gpsk_init"),
(1, "eap_gpsk_process_id_server"),
(1, "eap_msg_alloc;eap_gpsk_send_gpsk_2"),
(1, "eap_gpsk_derive_session_id;eap_gpsk_send_gpsk_2"),
(1, "eap_gpsk_derive_mid_helper;eap_gpsk_derive_session_id;eap_gpsk_send_gpsk_2"),
(1, "eap_gpsk_derive_keys"),
(1, "eap_gpsk_derive_keys_helper"),
(1, "eap_msg_alloc;eap_gpsk_send_gpsk_4"),
(1, "eap_gpsk_getKey"),
(1, "eap_gpsk_get_emsk"),
(1, "eap_gpsk_get_session_id") ]
for count, func in tests:
with alloc_fail(dev[0], count, func):
dev[0].request("ERP_FLUSH")
dev[0].connect("test-wpa-eap", key_mgmt="WPA-EAP", eap="GPSK",
identity="gpsk user@domain", erp="1",
password="abcdefghijklmnop0123456789abcdef",
wait_connect=False, scan_freq="2412")
wait_fail_trigger(dev[0], "GET_ALLOC_FAIL")
dev[0].request("REMOVE_NETWORK all")
dev[0].wait_disconnected()
def test_ap_wpa2_eap_sim_db(dev, apdev, params):
"""EAP-SIM DB error cases"""
sockpath = '/tmp/hlr_auc_gw.sock-test'
try:
os.remove(sockpath)
    except OSError:
pass
hparams = int_eap_server_params()
hparams['eap_sim_db'] = 'unix:' + sockpath
hapd = hostapd.add_ap(apdev[0], hparams)
# Initial test with hlr_auc_gw socket not available
id = dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP WPA-EAP-SHA256",
eap="SIM", identity="1232010000000000",
password="90dca4eda45b53cf0f12d7c9c3bc6a89:cb9cccc4b9258e6dca4760379fb82581",
scan_freq="2412", wait_connect=False)
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=10)
if ev is None:
raise Exception("EAP-Failure not reported")
dev[0].wait_disconnected()
dev[0].request("DISCONNECT")
# Test with invalid responses and response timeout
class test_handler(SocketServer.DatagramRequestHandler):
def handle(self):
data = self.request[0].strip()
socket = self.request[1]
logger.debug("Received hlr_auc_gw request: " + data)
# EAP-SIM DB: Failed to parse response string
socket.sendto("FOO", self.client_address)
# EAP-SIM DB: Failed to parse response string
socket.sendto("FOO 1", self.client_address)
# EAP-SIM DB: Unknown external response
socket.sendto("FOO 1 2", self.client_address)
logger.info("No proper response - wait for pending eap_sim_db request timeout")
server = SocketServer.UnixDatagramServer(sockpath, test_handler)
server.timeout = 1
dev[0].select_network(id)
server.handle_request()
ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=10)
if ev is None:
raise Exception("EAP-Failure not reported")
dev[0].wait_disconnected()
dev[0].request("DISCONNECT")
# Test with a valid response
class test_handler2(SocketServer.DatagramRequestHandler):
def handle(self):
data = self.request[0].strip()
socket = self.request[1]
logger.debug("Received hlr_auc_gw request: " + data)
fname = os.path.join(params['logdir'],
'hlr_auc_gw.milenage_db')
cmd = subprocess.Popen(['../../hostapd/hlr_auc_gw',
'-m', fname, data],
stdout=subprocess.PIPE)
res = cmd.stdout.read().strip()
cmd.stdout.close()
logger.debug("hlr_auc_gw response: " + res)
socket.sendto(res, self.client_address)
server.RequestHandlerClass = test_handler2
dev[0].select_network(id)
server.handle_request()
dev[0].wait_connected()
dev[0].request("DISCONNECT")
dev[0].wait_disconnected()
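# A self-contained sketch of the UNIX datagram request/response pattern the
# hlr_auc_gw stub above relies on (paths and handler are illustrative): with
# SOCK_DGRAM there is no connection, so the client must bind its own socket
# path to give the server's sendto() an address to reply to.

```python
import os
import socket
import socketserver  # "SocketServer" on Python 2, as used above
import tempfile

class UpperHandler(socketserver.DatagramRequestHandler):
    def handle(self):
        data, sock = self.request  # raw datagram bytes and the server socket
        sock.sendto(data.upper(), self.client_address)

tmpdir = tempfile.mkdtemp()
srv_path = os.path.join(tmpdir, 'srv.sock')
cli_path = os.path.join(tmpdir, 'cli.sock')
server = socketserver.UnixDatagramServer(srv_path, UpperHandler)
client = socket.socket(socket.AF_UNIX, socket.SOCK_DGRAM)
client.bind(cli_path)  # needed so the server can address the reply
client.sendto(b'hlr-req', srv_path)
server.handle_request()  # serve the single queued datagram
reply = client.recv(256)
server.server_close()
client.close()
```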
def test_eap_tls_sha512(dev, apdev, params):
"""EAP-TLS with SHA512 signature"""
params = int_eap_server_params()
params["ca_cert"] = "auth_serv/sha512-ca.pem"
params["server_cert"] = "auth_serv/sha512-server.pem"
params["private_key"] = "auth_serv/sha512-server.key"
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user sha512",
ca_cert="auth_serv/sha512-ca.pem",
client_cert="auth_serv/sha512-user.pem",
private_key="auth_serv/sha512-user.key",
scan_freq="2412")
dev[1].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user sha512",
ca_cert="auth_serv/sha512-ca.pem",
client_cert="auth_serv/sha384-user.pem",
private_key="auth_serv/sha384-user.key",
scan_freq="2412")
def test_eap_tls_sha384(dev, apdev, params):
"""EAP-TLS with SHA384 signature"""
params = int_eap_server_params()
params["ca_cert"] = "auth_serv/sha512-ca.pem"
params["server_cert"] = "auth_serv/sha384-server.pem"
params["private_key"] = "auth_serv/sha384-server.key"
hostapd.add_ap(apdev[0], params)
dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user sha512",
ca_cert="auth_serv/sha512-ca.pem",
client_cert="auth_serv/sha512-user.pem",
private_key="auth_serv/sha512-user.key",
scan_freq="2412")
dev[1].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
identity="tls user sha512",
ca_cert="auth_serv/sha512-ca.pem",
client_cert="auth_serv/sha384-user.pem",
private_key="auth_serv/sha384-user.key",
scan_freq="2412")
def test_ap_wpa2_eap_assoc_rsn(dev, apdev):
"""WPA2-Enterprise AP and association request RSN IE differences"""
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
hostapd.add_ap(apdev[0], params)
params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap-11w")
params["ieee80211w"] = "2"
hostapd.add_ap(apdev[1], params)
# Success cases with optional RSN IE fields removed one by one
tests = [ ("Normal wpa_supplicant assoc req RSN IE",
"30140100000fac040100000fac040100000fac010000"),
("Extra PMKIDCount field in RSN IE",
"30160100000fac040100000fac040100000fac0100000000"),
("Extra Group Management Cipher Suite in RSN IE",
"301a0100000fac040100000fac040100000fac0100000000000fac06"),
("Extra undefined extension field in RSN IE",
"301c0100000fac040100000fac040100000fac0100000000000fac061122"),
("RSN IE without RSN Capabilities",
"30120100000fac040100000fac040100000fac01"),
("RSN IE without AKM", "300c0100000fac040100000fac04"),
("RSN IE without pairwise", "30060100000fac04"),
              ("RSN IE without group", "30020100") ]
    for title, ie in tests:
        logger.info(title)
        set_test_assoc_ie(dev[0], ie)
        dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="GPSK",
                       identity="gpsk user",
                       password="abcdefghijklmnop0123456789abcdef",
                       scan_freq="2412")
        dev[0].request("REMOVE_NETWORK all")
        dev[0].wait_disconnected()

    tests = [ ("Normal wpa_supplicant assoc req RSN IE",
               "30140100000fac040100000fac040100000fac01cc00"),
              ("Group management cipher included in assoc req RSN IE",
               "301a0100000fac040100000fac040100000fac01cc000000000fac06") ]
    for title, ie in tests:
        logger.info(title)
        set_test_assoc_ie(dev[0], ie)
        dev[0].connect("test-wpa2-eap-11w", key_mgmt="WPA-EAP", ieee80211w="1",
                       eap="GPSK", identity="gpsk user",
                       password="abcdefghijklmnop0123456789abcdef",
                       scan_freq="2412")
        dev[0].request("REMOVE_NETWORK all")
        dev[0].wait_disconnected()

    tests = [ ("Invalid group cipher", "30060100000fac02", 41),
              ("Invalid pairwise cipher", "300c0100000fac040100000fac02", 42) ]
    for title, ie, status in tests:
        logger.info(title)
        set_test_assoc_ie(dev[0], ie)
        dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="GPSK",
                       identity="gpsk user",
                       password="abcdefghijklmnop0123456789abcdef",
                       scan_freq="2412", wait_connect=False)
        ev = dev[0].wait_event(["CTRL-EVENT-ASSOC-REJECT"])
        if ev is None:
            raise Exception("Association rejection not reported")
        if "status_code=" + str(status) not in ev:
            raise Exception("Unexpected status code: " + ev)
        dev[0].request("REMOVE_NETWORK all")
        dev[0].dump_monitor()

    tests = [ ("Management frame protection not enabled",
               "30140100000fac040100000fac040100000fac010000", 31),
              ("Unsupported management group cipher",
               "301a0100000fac040100000fac040100000fac01cc000000000fac0b", 31) ]
    for title, ie, status in tests:
        logger.info(title)
        set_test_assoc_ie(dev[0], ie)
        dev[0].connect("test-wpa2-eap-11w", key_mgmt="WPA-EAP", ieee80211w="1",
                       eap="GPSK", identity="gpsk user",
                       password="abcdefghijklmnop0123456789abcdef",
                       scan_freq="2412", wait_connect=False)
        ev = dev[0].wait_event(["CTRL-EVENT-ASSOC-REJECT"])
        if ev is None:
            raise Exception("Association rejection not reported")
        if "status_code=" + str(status) not in ev:
            raise Exception("Unexpected status code: " + ev)
        dev[0].request("REMOVE_NETWORK all")
        dev[0].dump_monitor()

def test_eap_tls_ext_cert_check(dev, apdev):
    """EAP-TLS and external server certification validation"""
    # With internal server certificate chain validation
    id = dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
                        identity="tls user",
                        ca_cert="auth_serv/ca.pem",
                        client_cert="auth_serv/user.pem",
                        private_key="auth_serv/user.key",
                        phase1="tls_ext_cert_check=1", scan_freq="2412",
                        only_add_network=True)
    run_ext_cert_check(dev, apdev, id)

def test_eap_ttls_ext_cert_check(dev, apdev):
    """EAP-TTLS and external server certification validation"""
    # Without internal server certificate chain validation
    id = dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TTLS",
                        identity="pap user", anonymous_identity="ttls",
                        password="password", phase2="auth=PAP",
                        phase1="tls_ext_cert_check=1", scan_freq="2412",
                        only_add_network=True)
    run_ext_cert_check(dev, apdev, id)

def test_eap_peap_ext_cert_check(dev, apdev):
    """EAP-PEAP and external server certification validation"""
    # With internal server certificate chain validation
    id = dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="PEAP",
                        identity="user", anonymous_identity="peap",
                        ca_cert="auth_serv/ca.pem",
                        password="password", phase2="auth=MSCHAPV2",
                        phase1="tls_ext_cert_check=1", scan_freq="2412",
                        only_add_network=True)
    run_ext_cert_check(dev, apdev, id)

def test_eap_fast_ext_cert_check(dev, apdev):
    """EAP-FAST and external server certification validation"""
    check_eap_capa(dev[0], "FAST")
    # With internal server certificate chain validation
    dev[0].request("SET blob fast_pac_auth_ext ")
    id = dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="FAST",
                        identity="user", anonymous_identity="FAST",
                        ca_cert="auth_serv/ca.pem",
                        password="password", phase2="auth=GTC",
                        phase1="tls_ext_cert_check=1 fast_provisioning=2",
                        pac_file="blob://fast_pac_auth_ext",
                        scan_freq="2412",
                        only_add_network=True)
    run_ext_cert_check(dev, apdev, id)

def run_ext_cert_check(dev, apdev, net_id):
    check_ext_cert_check_support(dev[0])
    if not openssl_imported:
        raise HwsimSkip("OpenSSL python method not available")

    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hapd = hostapd.add_ap(apdev[0], params)

    dev[0].select_network(net_id)
    certs = {}
    while True:
        ev = dev[0].wait_event(["CTRL-EVENT-EAP-PEER-CERT",
                                "CTRL-REQ-EXT_CERT_CHECK",
                                "CTRL-EVENT-EAP-SUCCESS"], timeout=10)
        if ev is None:
            raise Exception("No peer server certificate event seen")
        if "CTRL-EVENT-EAP-PEER-CERT" in ev:
            depth = None
            cert = None
            vals = ev.split(' ')
            for v in vals:
                if v.startswith("depth="):
                    depth = int(v.split('=')[1])
                elif v.startswith("cert="):
                    cert = v.split('=')[1]
            if depth is not None and cert:
                certs[depth] = binascii.unhexlify(cert)
        elif "CTRL-EVENT-EAP-SUCCESS" in ev:
            raise Exception("Unexpected EAP-Success")
        elif "CTRL-REQ-EXT_CERT_CHECK" in ev:
            id = ev.split(':')[0].split('-')[-1]
            break
    if 0 not in certs:
        raise Exception("Server certificate not received")
    if 1 not in certs:
        raise Exception("Server certificate issuer not received")

    cert = OpenSSL.crypto.load_certificate(OpenSSL.crypto.FILETYPE_ASN1,
                                           certs[0])
    cn = cert.get_subject().commonName
    logger.info("Server certificate CN=" + cn)

    issuer = OpenSSL.crypto.load_certificate(OpenSSL.crypto.FILETYPE_ASN1,
                                             certs[1])
    icn = issuer.get_subject().commonName
    logger.info("Issuer certificate CN=" + icn)

    if cn != "server.w1.fi":
        raise Exception("Unexpected server certificate CN: " + cn)
    if icn != "Root CA":
        raise Exception("Unexpected server certificate issuer CN: " + icn)

    ev = dev[0].wait_event(["CTRL-EVENT-EAP-SUCCESS"], timeout=0.1)
    if ev:
        raise Exception("Unexpected EAP-Success before external check result indication")

    dev[0].request("CTRL-RSP-EXT_CERT_CHECK-" + id + ":good")
    dev[0].wait_connected()
    dev[0].request("DISCONNECT")
    dev[0].wait_disconnected()
    if "FAIL" in dev[0].request("PMKSA_FLUSH"):
        raise Exception("PMKSA_FLUSH failed")
    dev[0].request("SET blob fast_pac_auth_ext ")
    dev[0].request("RECONNECT")

    ev = dev[0].wait_event(["CTRL-REQ-EXT_CERT_CHECK"], timeout=10)
    if ev is None:
        raise Exception("No peer server certificate event seen (2)")
    id = ev.split(':')[0].split('-')[-1]
    dev[0].request("CTRL-RSP-EXT_CERT_CHECK-" + id + ":bad")
    ev = dev[0].wait_event(["CTRL-EVENT-EAP-FAILURE"], timeout=5)
    if ev is None:
        raise Exception("EAP-Failure not reported")
    dev[0].request("REMOVE_NETWORK all")
    dev[0].wait_disconnected()

def test_eap_tls_errors(dev, apdev):
    """EAP-TLS error cases"""
    params = int_eap_server_params()
    params['fragment_size'] = '100'
    hostapd.add_ap(apdev[0], params)

    with alloc_fail(dev[0], 1,
                    "eap_peer_tls_reassemble_fragment"):
        dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
                       identity="tls user", ca_cert="auth_serv/ca.pem",
                       client_cert="auth_serv/user.pem",
                       private_key="auth_serv/user.key",
                       wait_connect=False, scan_freq="2412")
        wait_fail_trigger(dev[0], "GET_ALLOC_FAIL")
        dev[0].request("REMOVE_NETWORK all")
        dev[0].wait_disconnected()

    with alloc_fail(dev[0], 1, "eap_tls_init"):
        dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
                       identity="tls user", ca_cert="auth_serv/ca.pem",
                       client_cert="auth_serv/user.pem",
                       private_key="auth_serv/user.key",
                       wait_connect=False, scan_freq="2412")
        wait_fail_trigger(dev[0], "GET_ALLOC_FAIL")
        dev[0].request("REMOVE_NETWORK all")
        dev[0].wait_disconnected()

    with alloc_fail(dev[0], 1, "eap_peer_tls_ssl_init"):
        dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
                       identity="tls user", ca_cert="auth_serv/ca.pem",
                       client_cert="auth_serv/user.pem",
                       private_key="auth_serv/user.key",
                       engine="1",
                       wait_connect=False, scan_freq="2412")
        wait_fail_trigger(dev[0], "GET_ALLOC_FAIL")
        ev = dev[0].wait_event(["CTRL-REQ-PIN"], timeout=5)
        if ev is None:
            raise Exception("No CTRL-REQ-PIN seen")
        dev[0].request("REMOVE_NETWORK all")
        dev[0].wait_disconnected()

    tests = [ "eap_peer_tls_derive_key;eap_tls_success",
              "eap_peer_tls_derive_session_id;eap_tls_success",
              "eap_tls_getKey",
              "eap_tls_get_emsk",
              "eap_tls_get_session_id" ]
    for func in tests:
        with alloc_fail(dev[0], 1, func):
            dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="TLS",
                           identity="tls user@domain",
                           ca_cert="auth_serv/ca.pem",
                           client_cert="auth_serv/user.pem",
                           private_key="auth_serv/user.key",
                           erp="1",
                           wait_connect=False, scan_freq="2412")
            wait_fail_trigger(dev[0], "GET_ALLOC_FAIL")
            dev[0].request("REMOVE_NETWORK all")
            dev[0].wait_disconnected()

    with alloc_fail(dev[0], 1, "eap_unauth_tls_init"):
        dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="UNAUTH-TLS",
                       identity="unauth-tls", ca_cert="auth_serv/ca.pem",
                       wait_connect=False, scan_freq="2412")
        wait_fail_trigger(dev[0], "GET_ALLOC_FAIL")
        dev[0].request("REMOVE_NETWORK all")
        dev[0].wait_disconnected()

    with alloc_fail(dev[0], 1, "eap_peer_tls_ssl_init;eap_unauth_tls_init"):
        dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="UNAUTH-TLS",
                       identity="unauth-tls", ca_cert="auth_serv/ca.pem",
                       wait_connect=False, scan_freq="2412")
        wait_fail_trigger(dev[0], "GET_ALLOC_FAIL")
        dev[0].request("REMOVE_NETWORK all")
        dev[0].wait_disconnected()

    with alloc_fail(dev[0], 1, "eap_wfa_unauth_tls_init"):
        dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP",
                       eap="WFA-UNAUTH-TLS",
                       identity="osen@example.com", ca_cert="auth_serv/ca.pem",
                       wait_connect=False, scan_freq="2412")
        wait_fail_trigger(dev[0], "GET_ALLOC_FAIL")
        dev[0].request("REMOVE_NETWORK all")
        dev[0].wait_disconnected()

    with alloc_fail(dev[0], 1, "eap_peer_tls_ssl_init;eap_wfa_unauth_tls_init"):
        dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP",
                       eap="WFA-UNAUTH-TLS",
                       identity="osen@example.com", ca_cert="auth_serv/ca.pem",
                       wait_connect=False, scan_freq="2412")
        wait_fail_trigger(dev[0], "GET_ALLOC_FAIL")
        dev[0].request("REMOVE_NETWORK all")
        dev[0].wait_disconnected()

def test_ap_wpa2_eap_status(dev, apdev):
    """EAP state machine status information"""
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hostapd.add_ap(apdev[0], params)
    dev[0].connect("test-wpa2-eap", key_mgmt="WPA-EAP", eap="PEAP",
                   identity="cert user",
                   ca_cert="auth_serv/ca.pem", phase2="auth=TLS",
                   ca_cert2="auth_serv/ca.pem",
                   client_cert2="auth_serv/user.pem",
                   private_key2="auth_serv/user.key",
                   scan_freq="2412", wait_connect=False)
    success = False
    states = []
    method_states = []
    decisions = []
    req_methods = []
    selected_methods = []
    for i in range(100000):
        s = dev[0].get_status(extra="VERBOSE")
        if 'EAP state' in s:
            state = s['EAP state']
            if state:
                if state not in states:
                    states.append(state)
                if state == "SUCCESS":
                    success = True
                    break
        if 'methodState' in s:
            val = s['methodState']
            if val not in method_states:
                method_states.append(val)
        if 'decision' in s:
            val = s['decision']
            if val not in decisions:
                decisions.append(val)
        if 'reqMethod' in s:
            val = s['reqMethod']
            if val not in req_methods:
                req_methods.append(val)
        if 'selectedMethod' in s:
            val = s['selectedMethod']
            if val not in selected_methods:
                selected_methods.append(val)
    logger.info("Iterations: %d" % i)
    logger.info("EAP states: " + str(states))
    logger.info("methodStates: " + str(method_states))
    logger.info("decisions: " + str(decisions))
    logger.info("reqMethods: " + str(req_methods))
    logger.info("selectedMethods: " + str(selected_methods))
    if not success:
        raise Exception("EAP did not succeed")
    dev[0].wait_connected()
    dev[0].request("REMOVE_NETWORK all")
    dev[0].wait_disconnected()

def test_ap_wpa2_eap_gpsk_ptk_rekey_ap(dev, apdev):
    """WPA2-Enterprise with EAP-GPSK and PTK rekey enforced by AP"""
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    params['wpa_ptk_rekey'] = '2'
    hapd = hostapd.add_ap(apdev[0], params)
    id = eap_connect(dev[0], hapd, "GPSK", "gpsk user",
                     password="abcdefghijklmnop0123456789abcdef")
    ev = dev[0].wait_event(["WPA: Key negotiation completed"])
    if ev is None:
        raise Exception("PTK rekey timed out")
    hwsim_utils.test_connectivity(dev[0], hapd)

def test_ap_wpa2_eap_wildcard_ssid(dev, apdev):
    """WPA2-Enterprise connection using EAP-GPSK and wildcard SSID"""
    params = hostapd.wpa2_eap_params(ssid="test-wpa2-eap")
    hapd = hostapd.add_ap(apdev[0], params)
    dev[0].connect(bssid=apdev[0]['bssid'], key_mgmt="WPA-EAP", eap="GPSK",
                   identity="gpsk user",
                   password="abcdefghijklmnop0123456789abcdef",
                   scan_freq="2412")
| 45.709708 | 157 | 0.619697 | 37,399 | 286,737 | 4.564507 | 0.028905 | 0.029946 | 0.015043 | 0.018289 | 0.877862 | 0.855115 | 0.829926 | 0.804432 | 0.78202 | 0.756063 | 0 | 0.056482 | 0.244914 | 286,737 | 6,272 | 158 | 45.716996 | 0.731963 | 0.015035 | 0 | 0.733296 | 0 | 0.001489 | 0.295842 | 0.085672 | 0.000558 | 0 | 0 | 0 | 0 | 0 | null | null | 0.066443 | 0.005211 | null | null | 0.000186 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 9 |
# edge/model/inference/symmetric7.py
# from repo Data-Science-in-Mechanical-Engineering/edge (MIT license)
import gpytorch
import math

from edge.utils import atleast_2d, constraint_from_tuple
from .inference import GP
from edge.model.inference.kernels.custom_kernels import ConjugateKernel, ProductDecompositionKernel
from .tensorwrap import tensorwrap


def get_prior_and_constraint(prior, constraint):
    if prior is not None:
        prior = gpytorch.priors.NormalPrior(*prior)
    constraint = constraint_from_tuple(constraint)
    return prior, constraint


def get_initialization(name, initialization, prior):
    init_val = initialization.get(name)
    if init_val is not None:
        return init_val
    elif prior is not None:
        return prior.mean
    else:
        return None


class SymmetricMaternCosGP(GP):
    @tensorwrap('train_x', 'train_y')
    def __init__(self, train_x, train_y,
                 noise_prior=None, noise_constraint=(1e-3, 1e4),
                 lengthscale_prior=None, lengthscale_constraint=None,
                 outputscale_prior=None, outputscale_constraint=None,
                 hyperparameters_initialization=None,
                 **kwargs):
        train_x = atleast_2d(train_x)
        if train_x.shape[1] != 7:
            raise ValueError('SymmetricMaternCosGP can only be used on '
                             '7-dimensional data.')
        self.__structure_dict = {
            'noise_prior': noise_prior,
            'noise_constraint': noise_constraint,
            'lengthscale_prior': lengthscale_prior,
            'lengthscale_constraint': lengthscale_constraint,
            'outputscale_prior': outputscale_prior,
            'outputscale_constraint': outputscale_constraint,
        }
        self.__structure_dict.update(kwargs)
        if hyperparameters_initialization is None:
            hyperparameters_initialization = {}

        mean_module = gpytorch.means.ZeroMean()

        lp, lc = get_prior_and_constraint(lengthscale_prior,
                                          lengthscale_constraint)
        matern_0 = gpytorch.kernels.MaternKernel(
            nu=2.5,
            ard_num_dims=4,
            lengthscale_prior=lp,
            lengthscale_constraint=lc,
        )
        l_init = get_initialization(
            'matern_0.lengthscale', hyperparameters_initialization, lp
        )
        if l_init is not None:
            matern_0.lengthscale = l_init
        sym_matern_0 = matern_0 + ConjugateKernel(
            matern_0, conjugation=[1, -1, 1, -1]
        )

        cos = gpytorch.kernels.CosineKernel()
        cos.period_length = math.pi

        lp, lc = get_prior_and_constraint(lengthscale_prior,
                                          lengthscale_constraint)
        matern_1 = gpytorch.kernels.MaternKernel(
            nu=2.5,
            ard_num_dims=1,
            lengthscale_prior=lp,
            lengthscale_constraint=lc
        )
        l_init = get_initialization(
            'matern_1.lengthscale', hyperparameters_initialization, lp
        )
        if l_init is not None:
            matern_1.lengthscale = l_init
        sym_matern_1 = matern_1 + ConjugateKernel(matern_1, conjugation=[-1])

        lp, lc = get_prior_and_constraint(lengthscale_prior,
                                          lengthscale_constraint)
        action = gpytorch.kernels.MaternKernel(
            nu=2.5,
            ard_num_dims=1,
            lengthscale_prior=lp,
            lengthscale_constraint=lc
        )
        # Look up the action kernel's own initialization entry
        l_init = get_initialization(
            'action.lengthscale', hyperparameters_initialization, lp
        )
        if l_init is not None:
            action.lengthscale = l_init
        # action = gpytorch.kernels.IndexKernel(
        #     num_tasks=4,
        #     rank=2,
        # )
        # sq2o2 = math.sqrt(2) / 2
        # action.covar_factor = torch.nn.Parameter(torch.tensor([
        #     [0, 0],
        #     [sq2o2, -sq2o2],
        #     [0, 0],
        #     [-sq2o2, sq2o2],
        # ]).float())
        # action.var = torch.tensor([1., 0., 1., 0.]).float()
        # # The covariance matrix of action is now
        # # [[ 1, 0, 0, 0]
        # #  [ 0, 1, 0, -1]
        # #  [ 0, 0, 1, 0]
        # #  [-1, 0, 0, 1]]

        prod_decomp = ProductDecompositionKernel(
            (sym_matern_0, 4),
            (cos, 1),
            (sym_matern_1, 1),
            (action, 1)
        )

        op, oc = get_prior_and_constraint(outputscale_prior,
                                          outputscale_constraint)
        covar_module = gpytorch.kernels.ScaleKernel(
            base_kernel=prod_decomp,
            outputscale_prior=op,
            outputscale_constraint=oc
        )
        o_init = get_initialization(
            'outputscale', hyperparameters_initialization, op
        )
        if o_init is not None:
            covar_module.outputscale = o_init

        noise_p, nc = get_prior_and_constraint(noise_prior,
                                               noise_constraint)
        likelihood = gpytorch.likelihoods.GaussianLikelihood(
            noise_prior=noise_p,
            noise_constraint=nc
        )
        # Fall back to the noise prior (not the outputscale prior) by default
        n_init = get_initialization(
            'noise_covar.noise', hyperparameters_initialization, noise_p
        )
        if n_init is not None:
            likelihood.noise_covar.noise = n_init

        super(SymmetricMaternCosGP, self).__init__(train_x, train_y,
                                                   mean_module, covar_module,
                                                   likelihood, **kwargs)

    @property
    def structure_dict(self):
        return self.__structure_dict


class GloballySymmetricMaternCosGP(GP):
    @tensorwrap('train_x', 'train_y')
    def __init__(self, train_x, train_y,
                 noise_prior=None, noise_constraint=(1e-3, 1e4),
                 lengthscale_prior=None, lengthscale_constraint=None,
                 outputscale_prior=None, outputscale_constraint=None,
                 hyperparameters_initialization=None,
                 **kwargs):
        train_x = atleast_2d(train_x)
        if train_x.shape[1] != 7:
            raise ValueError('GloballySymmetricMaternCosGP can only be used '
                             'on 7-dimensional data.')
        self.__structure_dict = {
            'noise_prior': noise_prior,
            'noise_constraint': noise_constraint,
            'lengthscale_prior': lengthscale_prior,
            'lengthscale_constraint': lengthscale_constraint,
            'outputscale_prior': outputscale_prior,
            'outputscale_constraint': outputscale_constraint,
        }
        self.__structure_dict.update(kwargs)
        if hyperparameters_initialization is None:
            hyperparameters_initialization = {}

        mean_module = gpytorch.means.ZeroMean()

        lp, lc = get_prior_and_constraint(lengthscale_prior,
                                          lengthscale_constraint)
        matern_0 = gpytorch.kernels.MaternKernel(
            nu=2.5,
            ard_num_dims=4,
            lengthscale_prior=lp,
            lengthscale_constraint=lc,
        )
        l_init = get_initialization(
            'matern_0.lengthscale', hyperparameters_initialization, lp
        )
        if l_init is not None:
            matern_0.lengthscale = l_init

        cos = gpytorch.kernels.CosineKernel()
        cos.period_length = math.pi

        lp, lc = get_prior_and_constraint(lengthscale_prior,
                                          lengthscale_constraint)
        matern_1 = gpytorch.kernels.MaternKernel(
            nu=2.5,
            ard_num_dims=1,
            lengthscale_prior=lp,
            lengthscale_constraint=lc
        )
        l_init = get_initialization(
            'matern_1.lengthscale', hyperparameters_initialization, lp
        )
        if l_init is not None:
            matern_1.lengthscale = l_init

        lp, lc = get_prior_and_constraint(lengthscale_prior,
                                          lengthscale_constraint)
        action = gpytorch.kernels.MaternKernel(
            nu=2.5,
            ard_num_dims=1,
            lengthscale_prior=lp,
            lengthscale_constraint=lc
        )
        # Look up the action kernel's own initialization entry
        l_init = get_initialization(
            'action.lengthscale', hyperparameters_initialization, lp
        )
        if l_init is not None:
            action.lengthscale = l_init
        # action = gpytorch.kernels.IndexKernel(
        #     num_tasks=4,
        #     rank=2,
        # )
        # sq2o2 = math.sqrt(2) / 2
        # action.covar_factor = torch.nn.Parameter(torch.tensor([
        #     [0, 0],
        #     [sq2o2, -sq2o2],
        #     [0, 0],
        #     [-sq2o2, sq2o2],
        # ]).float())
        # action.var = torch.tensor([1., 0., 1., 0.]).float()
        # # The covariance matrix of action is now
        # # [[ 1, 0, 0, 0]
        # #  [ 0, 1, 0, -1]
        # #  [ 0, 0, 1, 0]
        # #  [-1, 0, 0, 1]]

        prod_decomp = ProductDecompositionKernel(
            (matern_0, 4),
            (cos, 1),
            (matern_1, 1),
            (action, 1)
        )
        sym = prod_decomp + ConjugateKernel(
            prod_decomp, conjugation=[1, -1, 1, -1, -1, -1, -1]
        )

        op, oc = get_prior_and_constraint(outputscale_prior,
                                          outputscale_constraint)
        covar_module = gpytorch.kernels.ScaleKernel(
            base_kernel=sym,
            outputscale_prior=op,
            outputscale_constraint=oc
        )
        o_init = get_initialization(
            'outputscale', hyperparameters_initialization, op
        )
        if o_init is not None:
            covar_module.outputscale = o_init

        noise_p, nc = get_prior_and_constraint(noise_prior,
                                               noise_constraint)
        likelihood = gpytorch.likelihoods.GaussianLikelihood(
            noise_prior=noise_p,
            noise_constraint=nc
        )
        # Fall back to the noise prior (not the outputscale prior) by default
        n_init = get_initialization(
            'noise_covar.noise', hyperparameters_initialization, noise_p
        )
        if n_init is not None:
            likelihood.noise_covar.noise = n_init

        super(GloballySymmetricMaternCosGP, self).__init__(
            train_x, train_y, mean_module, covar_module, likelihood, **kwargs)

    @property
    def structure_dict(self):
        return self.__structure_dict

# telemetry/third_party/modulegraph/modulegraph_tests/test_edge_data.py
# from repo Scopetta197/catapult (BSD-3-Clause license)
from __future__ import absolute_import
import os
import sys

if sys.version_info[:2] <= (2, 6):
    import unittest2 as unittest
else:
    import unittest

from modulegraph import modulegraph

# XXX: Todo: similar tests with bytecompiled modules

class TestEdgeData (unittest.TestCase):
    if not hasattr(unittest.TestCase, 'assertIsInstance'):
        def assertIsInstance(self, value, types):
            if not isinstance(value, types):
                self.fail("%r is not an instance of %r"%(value, types))

    def test_regular_import(self):
        root = os.path.join(
                os.path.dirname(os.path.abspath(__file__)),
                'testpkg-edgedata')
        mf = modulegraph.ModuleGraph(path=[ root ] + sys.path)
        script_name = os.path.join(root, 'script.py')
        mf.run_script(script_name)
        script_node = mf.findNode(script_name)
        self.assertIsInstance(script_node, modulegraph.Script)

        node = mf.findNode('toplevel_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=False, fromlist=False))

        node = mf.findNode('toplevel_nonexisting')
        self.assertIsInstance(node, modulegraph.MissingModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=False, fromlist=False))

        node = mf.findNode('toplevel_class_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=False, fromlist=False))

        node = mf.findNode('toplevel_class_nonexisting')
        self.assertIsInstance(node, modulegraph.MissingModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=False, fromlist=False))

        node = mf.findNode('toplevel_conditional_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=False, tryexcept=False, fromlist=False))

        node = mf.findNode('toplevel_conditional_nonexisting')
        self.assertIsInstance(node, modulegraph.MissingModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=False, tryexcept=False, fromlist=False))

        node = mf.findNode('toplevel_conditional_import_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=False, tryexcept=True, fromlist=False))

        node = mf.findNode('toplevel_conditional_import_nonexisting')
        self.assertIsInstance(node, modulegraph.MissingModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=False, tryexcept=True, fromlist=False))

        node = mf.findNode('toplevel_conditional_import2_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=False, tryexcept=True, fromlist=False))

        node = mf.findNode('toplevel_conditional_import2_nonexisting')
        self.assertIsInstance(node, modulegraph.MissingModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=False, tryexcept=True, fromlist=False))

        node = mf.findNode('toplevel_import_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=True, fromlist=False))

        node = mf.findNode('toplevel_import_nonexisting')
        self.assertIsInstance(node, modulegraph.MissingModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=True, fromlist=False))

        node = mf.findNode('toplevel_import2_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=True, fromlist=False))

        node = mf.findNode('toplevel_import2_nonexisting')
        self.assertIsInstance(node, modulegraph.MissingModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=True, fromlist=False))

        node = mf.findNode('function_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=True, tryexcept=False, fromlist=False))

        node = mf.findNode('function_nonexisting')
        self.assertIsInstance(node, modulegraph.MissingModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=True, tryexcept=False, fromlist=False))

        node = mf.findNode('function_class_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=True, tryexcept=False, fromlist=False))

        node = mf.findNode('function_class_nonexisting')
        self.assertIsInstance(node, modulegraph.MissingModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=True, tryexcept=False, fromlist=False))

        node = mf.findNode('function_conditional_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=True, tryexcept=False, fromlist=False))

        node = mf.findNode('function_conditional_nonexisting')
        self.assertIsInstance(node, modulegraph.MissingModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=True, tryexcept=False, fromlist=False))

        node = mf.findNode('function_conditional_import_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=True, tryexcept=True, fromlist=False))

        node = mf.findNode('function_conditional_import_nonexisting')
        self.assertIsInstance(node, modulegraph.MissingModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=True, tryexcept=True, fromlist=False))

        node = mf.findNode('function_conditional_import2_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=True, tryexcept=True, fromlist=False))

        node = mf.findNode('function_conditional_import2_nonexisting')
        self.assertIsInstance(node, modulegraph.MissingModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=True, tryexcept=True, fromlist=False))

        node = mf.findNode('function_import_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=True, tryexcept=True, fromlist=False))

        node = mf.findNode('function_import_nonexisting')
        self.assertIsInstance(node, modulegraph.MissingModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=True, tryexcept=True, fromlist=False))

        node = mf.findNode('function_import2_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=True, tryexcept=True, fromlist=False))

        node = mf.findNode('function_import2_nonexisting')
        self.assertIsInstance(node, modulegraph.MissingModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=True, tryexcept=True, fromlist=False))

    def test_multi_import(self):
        root = os.path.join(
                os.path.dirname(os.path.abspath(__file__)),
                'testpkg-edgedata')
        mf = modulegraph.ModuleGraph(path=[ root ] + sys.path)
        script_name = os.path.join(root, 'script_multi_import.py')
        mf.run_script(script_name)
        script_node = mf.findNode(script_name)
        self.assertIsInstance(script_node, modulegraph.Script)

        node = mf.findNode('os.path')
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=True, fromlist=False))

        node = mf.findNode('os')
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=False, fromlist=False))

        node = mf.findNode('sys')
        ed = mf.edgeData(script_node, node)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=True, tryexcept=False, fromlist=False))

        node = mf.findNode('platform')
        ed = mf.edgeData(script_node, node)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=True, tryexcept=False, fromlist=False))

        node = mf.findNode('email')
        ed = mf.edgeData(script_node, node)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=False, fromlist=False))

    def test_from_imports(self):
        root = os.path.join(
                os.path.dirname(os.path.abspath(__file__)),
                'testpkg-edgedata')
        mf = modulegraph.ModuleGraph(path=[ root ] + sys.path)
        script_name = os.path.join(root, 'script_from_import.py')
        mf.run_script(script_name)
        script_node = mf.findNode(script_name)
        self.assertIsInstance(script_node, modulegraph.Script)

        node = mf.findNode('pkg.toplevel_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=False, fromlist=True))

        node = mf.findNode('pkg.toplevel_nonexisting')
        self.assertIsInstance(node, modulegraph.MissingModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=False, fromlist=True))

        node = mf.findNode('pkg.toplevel_class_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=False, fromlist=True))

        node = mf.findNode('pkg.toplevel_class_nonexisting')
        self.assertIsInstance(node, modulegraph.MissingModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=False, fromlist=True))

        node = mf.findNode('pkg.toplevel_conditional_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=False, tryexcept=False, fromlist=True))

        node = mf.findNode('pkg.toplevel_conditional_nonexisting')
        self.assertIsInstance(node, modulegraph.MissingModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=False, tryexcept=False, fromlist=True))

        node = mf.findNode('pkg.toplevel_conditional_import_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=False, tryexcept=True, fromlist=True))

        node = mf.findNode('pkg.toplevel_conditional_import_nonexisting')
        self.assertIsInstance(node, modulegraph.MissingModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=False, tryexcept=True, fromlist=True))

        node = mf.findNode('pkg.toplevel_conditional_import2_existing')
        self.assertIsInstance(node, modulegraph.SourceModule)
        ed = mf.edgeData(script_node, node)
        self.assertIsInstance(ed, modulegraph.DependencyInfo)
        self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=False, tryexcept=True, fromlist=True))
node = mf.findNode('pkg.toplevel_conditional_import2_nonexisting')
self.assertIsInstance(node, modulegraph.MissingModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=False, tryexcept=True, fromlist=True))
node = mf.findNode('pkg.toplevel_import_existing')
self.assertIsInstance(node, modulegraph.SourceModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=True, fromlist=True))
node = mf.findNode('pkg.toplevel_import_nonexisting')
self.assertIsInstance(node, modulegraph.MissingModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=True, fromlist=True))
node = mf.findNode('pkg.toplevel_import2_existing')
self.assertIsInstance(node, modulegraph.SourceModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=True, fromlist=True))
node = mf.findNode('pkg.toplevel_import2_nonexisting')
self.assertIsInstance(node, modulegraph.MissingModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=False, tryexcept=True, fromlist=True))
node = mf.findNode('pkg.function_existing')
self.assertIsInstance(node, modulegraph.SourceModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=True, tryexcept=False, fromlist=True))
node = mf.findNode('pkg.function_nonexisting')
self.assertIsInstance(node, modulegraph.MissingModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=True, tryexcept=False, fromlist=True))
node = mf.findNode('pkg.function_class_existing')
self.assertIsInstance(node, modulegraph.SourceModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=True, tryexcept=False, fromlist=True))
node = mf.findNode('pkg.function_class_nonexisting')
self.assertIsInstance(node, modulegraph.MissingModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=True, tryexcept=False, fromlist=True))
node = mf.findNode('pkg.function_conditional_existing')
self.assertIsInstance(node, modulegraph.SourceModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=True, tryexcept=False, fromlist=True))
node = mf.findNode('pkg.function_conditional_nonexisting')
self.assertIsInstance(node, modulegraph.MissingModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=True, tryexcept=False, fromlist=True))
node = mf.findNode('pkg.function_conditional_import_existing')
self.assertIsInstance(node, modulegraph.SourceModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=True, tryexcept=True, fromlist=True))
node = mf.findNode('pkg.function_conditional_import_nonexisting')
self.assertIsInstance(node, modulegraph.MissingModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=True, tryexcept=True, fromlist=True))
node = mf.findNode('pkg.function_conditional_import2_existing')
self.assertIsInstance(node, modulegraph.SourceModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=True, tryexcept=True, fromlist=True))
node = mf.findNode('pkg.function_conditional_import2_nonexisting')
self.assertIsInstance(node, modulegraph.MissingModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=True, function=True, tryexcept=True, fromlist=True))
node = mf.findNode('pkg.function_import_existing')
self.assertIsInstance(node, modulegraph.SourceModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=True, tryexcept=True, fromlist=True))
node = mf.findNode('pkg.function_import_nonexisting')
self.assertIsInstance(node, modulegraph.MissingModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=True, tryexcept=True, fromlist=True))
node = mf.findNode('pkg.function_import2_existing')
self.assertIsInstance(node, modulegraph.SourceModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=True, tryexcept=True, fromlist=True))
node = mf.findNode('pkg.function_import2_nonexisting')
self.assertIsInstance(node, modulegraph.MissingModule)
ed = mf.edgeData(script_node, node)
self.assertIsInstance(ed, modulegraph.DependencyInfo)
self.assertEqual(ed, modulegraph.DependencyInfo(conditional=False, function=True, tryexcept=True, fromlist=True))
if __name__ == "__main__":
unittest.main()
| 54.53222 | 124 | 0.72966 | 2,346 | 22,849 | 7.006394 | 0.033248 | 0.094117 | 0.195474 | 0.066801 | 0.972379 | 0.972197 | 0.972197 | 0.972197 | 0.972197 | 0.972197 | 0 | 0.001053 | 0.168498 | 22,849 | 418 | 125 | 54.662679 | 0.864098 | 0.002188 | 0 | 0.758112 | 0 | 0 | 0.083344 | 0.074264 | 0 | 0 | 0 | 0.002392 | 0.530973 | 1 | 0.011799 | false | 0 | 0.126844 | 0 | 0.141593 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
6aec992b5e8facc1ebcd72c35a0a8fe766c5ca50 | 2,328 | py | Python | elsec/test/fixture.py | mkocikowski/elsec | 1568d594a61ccdc210276cf071a83cec381574c2 | [
"MIT",
"Unlicense"
] | 5 | 2015-07-02T02:54:26.000Z | 2021-05-03T14:16:45.000Z | elsec/test/fixture.py | mkocikowski/elsec | 1568d594a61ccdc210276cf071a83cec381574c2 | [
"MIT",
"Unlicense"
] | null | null | null | elsec/test/fixture.py | mkocikowski/elsec | 1568d594a61ccdc210276cf071a83cec381574c2 | [
"MIT",
"Unlicense"
] | 1 | 2021-05-14T09:38:11.000Z | 2021-05-14T09:38:11.000Z | # -*- coding: utf-8 -*-
import json
import elsec.http
def delete():
# delete test indexes
elsec.http.delete("http://localhost:9200/elsec_test_index1")
elsec.http.delete("http://localhost:9200/elsec_test_index2")
elsec.http.delete("http://localhost:9200/elsec_test_index_1")
elsec.http.delete("http://localhost:9200/elsec_test_index_2")
def create():
# create test indexes
elsec.http.put("http://localhost:9200/elsec_test_index_1/", """{"settings": {"index": {"number_of_replicas": 0, "number_of_shards": 1}}}""")
elsec.http.put("http://localhost:9200/elsec_test_index_2/", """{"settings": {"index": {"number_of_replicas": 0, "number_of_shards": 1}}}""")
# create an alias which joins the two indices
elsec.http.post("http://localhost:9200/_aliases/", """{"actions": [{ "add" : {"index": "elsec_test_index_1", "alias": "elsec_test_alias_1" } }, {"add": {"index": "elsec_test_index_2", "alias": "elsec_test_alias_1" }}, {"add": {"index": "elsec_test_index_1", "alias": "elsec_test_alias_2" }}]}""")
# populate with test documents
elsec.http.put("http://localhost:9200/elsec_test_index_1/doctype_1/1", json.dumps({'field_1': 'in1, dt1', 'field_2': 'value 1', }))
elsec.http.put("http://localhost:9200/elsec_test_index_1/doctype_1/2", json.dumps({'field_1': 'in1, dt1', 'field_2': 'value 2', }))
elsec.http.put("http://localhost:9200/elsec_test_index_1/doctype_1/3", json.dumps({'field_1': 'in1, dt1', 'field_2': 'value 3', }))
elsec.http.put("http://localhost:9200/elsec_test_index_1/doctype_1/4", json.dumps({'field_1': 'in1, dt1', 'field_2': None, }))
elsec.http.put("http://localhost:9200/elsec_test_index_1/doctype_2/1", json.dumps({'field_1': 'in1, dt2', 'field_2': 'value 1', }))
elsec.http.put("http://localhost:9200/elsec_test_index_2/doctype_1/1", json.dumps({'field_1': 'in2, dt1', 'field_2': 'value 1', }))
elsec.http.put("http://localhost:9200/elsec_test_index_2/doctype_2/2", json.dumps({'field_1': 'in2, dt2', 'field_2': 'value 1', }))
elsec.http.put("http://localhost:9200/elsec_test_index_2/doctype_3/2", json.dumps({'field_A': 'in2, dt3', 'field_B': 'value 1', }))
elsec.http.post("http://localhost:9200/elsec_test_index_1,elsec_test_index_2/_flush", None)
if __name__ == '__main__':
delete()
create() | 56.780488 | 300 | 0.674399 | 353 | 2,328 | 4.147309 | 0.164306 | 0.135246 | 0.162568 | 0.22541 | 0.821038 | 0.805328 | 0.762978 | 0.75 | 0.64959 | 0.530738 | 0 | 0.069599 | 0.111254 | 2,328 | 41 | 301 | 56.780488 | 0.637989 | 0.05756 | 0 | 0 | 0 | 0.043478 | 0.625857 | 0.048881 | 0 | 0 | 0 | 0 | 0 | 1 | 0.086957 | true | 0 | 0.086957 | 0 | 0.173913 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0aa90adb179738d518f3968cdcb20fbc1e9a4c25 | 12,071 | py | Python | tests/test_sections.py | tangq/zppy | f86fbb41a71c25ef14ceb23c153311b7a3b4d735 | [
"BSD-3-Clause"
] | null | null | null | tests/test_sections.py | tangq/zppy | f86fbb41a71c25ef14ceb23c153311b7a3b4d735 | [
"BSD-3-Clause"
] | null | null | null | tests/test_sections.py | tangq/zppy | f86fbb41a71c25ef14ceb23c153311b7a3b4d735 | [
"BSD-3-Clause"
] | null | null | null | import os
import unittest
from configobj import ConfigObj
from validate import Validator
from zppy.utils import getTasks
def get_config(test_case, config_file):
# Subdirectory where templates are located
templateDir = os.path.join("zppy", "templates")
# Read configuration file and validate it
config = ConfigObj(config_file, configspec=os.path.join(templateDir, "default.ini"))
validator = Validator()
test_case.assertTrue(config.validate(validator))
# Add templateDir to config
config["default"]["templateDir"] = templateDir
return config
class TestAllSets(unittest.TestCase):
def test_sections(self):
config = get_config(self, "tests/test_sections.cfg")
# default
actual_default = config["default"]
expected_default = {
"input": "INPUT",
"input_subdir": "INPUT_SUBDIR",
"output": "OUTPUT",
"case": "CASE",
"www": "WWWW",
"partition": "SHORT",
"debug": False,
"e3sm_unified": "latest",
"dry_run": False,
"templateDir": "zppy/templates",
}
self.assertEqual(actual_default, expected_default)
# ts
section_name = "ts"
actual_section = config[section_name]
expected_section = {
"active": True,
"vars": "FSNTOA,FLUT,FSNT,FLNT,FSNS,FLNS,SHFLX,QFLX,PRECC,PRECL,PRECSC,PRECSL,TS,TREFHT",
"mapping_file": "MAPPING_FILE_TS",
"years": ["0001:0020:5"],
"qos": "regular",
"nodes": 1,
"walltime": "02:00:00",
"input_files": "eam.h0",
"frequency": "monthly",
"grid": "",
"area_nm": "area",
"dpf": 30,
"tpd": 1,
}
self.assertEqual(actual_section, expected_section)
actual_tasks = getTasks(config, section_name)
self.assertEqual(len(actual_tasks), 1)
actual_task = actual_tasks[0]
expected_task = {
"active": True,
"area_nm": "area",
"case": "CASE",
"debug": False,
"dpf": 30,
"dry_run": False,
"e3sm_unified": "latest",
"frequency": "monthly",
"grid": "",
"input": "INPUT",
"input_files": "eam.h0",
"input_subdir": "INPUT_SUBDIR",
"mapping_file": "MAPPING_FILE_TS",
"nodes": 1,
"output": "OUTPUT",
"partition": "SHORT",
"qos": "regular",
"subsection": None,
"templateDir": "zppy/templates",
"tpd": 1,
"vars": "FSNTOA,FLUT,FSNT,FLNT,FSNS,FLNS,SHFLX,QFLX,PRECC,PRECL,PRECSC,PRECSL,TS,TREFHT",
"walltime": "02:00:00",
"www": "WWWW",
"years": ["0001:0020:5"],
}
self.assertEqual(actual_task, expected_task)
# climo
section_name = "climo"
actual_section = config[section_name]
expected_section = {
"active": True,
"years": ["0001:0050:50"],
"mapping_file": "MAPPING_FILE_CLIMO",
"qos": "regular",
"nodes": 4,
"walltime": "02:00:00",
"input_files": "eam.h0",
"frequency": "monthly",
"grid": "",
"exclude": False,
"vars": "",
}
self.assertEqual(actual_section, expected_section)
actual_tasks = getTasks(config, section_name)
self.assertEqual(len(actual_tasks), 1)
actual_task = actual_tasks[0]
expected_task = {
"active": True,
"case": "CASE",
"debug": False,
"dry_run": False,
"e3sm_unified": "latest",
"exclude": False,
"frequency": "monthly",
"grid": "",
"input": "INPUT",
"input_files": "eam.h0",
"input_subdir": "INPUT_SUBDIR",
"mapping_file": "MAPPING_FILE_CLIMO",
"nodes": 4,
"output": "OUTPUT",
"partition": "SHORT",
"qos": "regular",
"subsection": None,
"templateDir": "zppy/templates",
"vars": "",
"walltime": "02:00:00",
"www": "WWWW",
"years": ["0001:0050:50"],
}
self.assertEqual(actual_task, expected_task)
def test_subsections(self):
config = get_config(self, "tests/test_subsections.cfg")
# default
actual_default = config["default"]
expected_default = {
"input": "INPUT",
"input_subdir": "INPUT_SUBDIR",
"output": "OUTPUT",
"case": "CASE",
"www": "WWWW",
"partition": "SHORT",
"debug": False,
"e3sm_unified": "latest",
"dry_run": False,
"templateDir": "zppy/templates",
}
self.assertEqual(actual_default, expected_default)
# ts
section_name = "ts"
actual_section = config[section_name]
expected_section = {
"active": True,
"vars": "FSNTOA,FLUT,FSNT,FLNT,FSNS,FLNS,SHFLX,QFLX,PRECC,PRECL,PRECSC,PRECSL,TS,TREFHT",
"qos": "regular",
"nodes": 1,
"walltime": "02:00:00",
"input_files": "eam.h0",
"frequency": "monthly",
"mapping_file": "",
"grid": "",
"area_nm": "area",
"years": [""],
"dpf": 30,
"tpd": 1,
"ts_grid1": {
"mapping_file": "MAPPING_FILE_TS_GRID1",
"years": ["0001:0020:5"],
"active": None,
"qos": None,
"nodes": None,
"walltime": None,
"input_files": None,
"frequency": None,
"grid": None,
"area_nm": None,
"vars": None,
"dpf": None,
"tpd": None,
},
"ts_grid2": {
"mapping_file": "MAPPING_FILE_TS_GRID2",
"years": ["0001:0020:10"],
"active": None,
"qos": None,
"nodes": None,
"walltime": None,
"input_files": None,
"frequency": None,
"grid": None,
"area_nm": None,
"vars": None,
"dpf": None,
"tpd": None,
},
}
self.assertEqual(actual_section, expected_section)
actual_tasks = getTasks(config, section_name)
self.assertEqual(len(actual_tasks), 2)
actual_task = actual_tasks[0]
expected_task = {
"active": True,
"area_nm": "area",
"case": "CASE",
"debug": False,
"dpf": 30,
"dry_run": False,
"e3sm_unified": "latest",
"frequency": "monthly",
"grid": "",
"input": "INPUT",
"input_files": "eam.h0",
"input_subdir": "INPUT_SUBDIR",
"mapping_file": "MAPPING_FILE_TS_GRID1",
"nodes": 1,
"output": "OUTPUT",
"partition": "SHORT",
"qos": "regular",
"subsection": "ts_grid1",
"templateDir": "zppy/templates",
"tpd": 1,
"vars": "FSNTOA,FLUT,FSNT,FLNT,FSNS,FLNS,SHFLX,QFLX,PRECC,PRECL,PRECSC,PRECSL,TS,TREFHT",
"walltime": "02:00:00",
"www": "WWWW",
"years": ["0001:0020:5"],
}
self.assertEqual(actual_task, expected_task)
actual_task = actual_tasks[1]
expected_task = {
"active": True,
"area_nm": "area",
"case": "CASE",
"debug": False,
"dpf": 30,
"dry_run": False,
"e3sm_unified": "latest",
"frequency": "monthly",
"grid": "",
"input": "INPUT",
"input_files": "eam.h0",
"input_subdir": "INPUT_SUBDIR",
"mapping_file": "MAPPING_FILE_TS_GRID2",
"nodes": 1,
"output": "OUTPUT",
"partition": "SHORT",
"qos": "regular",
"subsection": "ts_grid2",
"templateDir": "zppy/templates",
"tpd": 1,
"vars": "FSNTOA,FLUT,FSNT,FLNT,FSNS,FLNS,SHFLX,QFLX,PRECC,PRECL,PRECSC,PRECSL,TS,TREFHT",
"walltime": "02:00:00",
"www": "WWWW",
"years": ["0001:0020:10"],
}
self.assertEqual(actual_task, expected_task)
# climo
section_name = "climo"
actual_section = config[section_name]
expected_section = {
"active": True,
"years": ["0001:0050:50"],
"mapping_file": "MAPPING_FILE_CLIMO",
"qos": "regular",
"nodes": 4,
"walltime": "02:00:00",
"input_files": "eam.h0",
"frequency": "monthly",
"grid": "",
"exclude": False,
"vars": "",
"climo_grid1": {
"mapping_file": "MAPPING_FILE_CLIMO_GRID1",
"active": None,
"qos": None,
"nodes": None,
"walltime": None,
"input_files": None,
"frequency": None,
"grid": None,
"years": None,
"exclude": None,
"vars": None,
},
"climo_grid2": {
"mapping_file": "MAPPING_FILE_CLIMO_GRID2",
"years": ["0001:0100:50"],
"partition": "LONG",
"active": None,
"qos": None,
"nodes": None,
"walltime": None,
"input_files": None,
"frequency": None,
"grid": None,
"exclude": None,
"vars": None,
},
}
self.assertEqual(actual_section, expected_section)
actual_tasks = getTasks(config, section_name)
self.assertEqual(len(actual_tasks), 2)
actual_task = actual_tasks[0]
expected_task = {
"active": True,
"case": "CASE",
"debug": False,
"dry_run": False,
"e3sm_unified": "latest",
"exclude": False,
"frequency": "monthly",
"grid": "",
"input": "INPUT",
"input_files": "eam.h0",
"input_subdir": "INPUT_SUBDIR",
"mapping_file": "MAPPING_FILE_CLIMO_GRID1",
"nodes": 4,
"output": "OUTPUT",
"partition": "SHORT",
"qos": "regular",
"subsection": "climo_grid1",
"templateDir": "zppy/templates",
"vars": "",
"walltime": "02:00:00",
"www": "WWWW",
"years": ["0001:0050:50"],
}
self.assertEqual(actual_task, expected_task)
actual_task = actual_tasks[1]
expected_task = {
"active": True,
"case": "CASE",
"debug": False,
"dry_run": False,
"e3sm_unified": "latest",
"exclude": False,
"frequency": "monthly",
"grid": "",
"input": "INPUT",
"input_files": "eam.h0",
"input_subdir": "INPUT_SUBDIR",
"mapping_file": "MAPPING_FILE_CLIMO_GRID2",
"nodes": 4,
"output": "OUTPUT",
"partition": "LONG",
"qos": "regular",
"subsection": "climo_grid2",
"templateDir": "zppy/templates",
"vars": "",
"walltime": "02:00:00",
"www": "WWWW",
"years": ["0001:0100:50"],
}
self.assertEqual(actual_task, expected_task)
if __name__ == "__main__":
unittest.main()
| 32.536388 | 101 | 0.461602 | 1,056 | 12,071 | 5.085227 | 0.114583 | 0.055307 | 0.043575 | 0.053259 | 0.86108 | 0.840782 | 0.814339 | 0.795158 | 0.795158 | 0.78324 | 0 | 0.033306 | 0.390606 | 12,071 | 370 | 102 | 32.624324 | 0.69671 | 0.011598 | 0 | 0.865497 | 0 | 0.01462 | 0.292509 | 0.051925 | 0 | 0 | 0 | 0 | 0.049708 | 1 | 0.008772 | false | 0 | 0.01462 | 0 | 0.02924 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0aae1578070a472f3bd15163b4d2866761c533a9 | 8,590 | py | Python | test/test_verifier_db.py | tiandao-qsr/keylime | e742f8fe921fbd630a48a3be4d77e60f8dc4bfe8 | [
"Apache-2.0"
] | null | null | null | test/test_verifier_db.py | tiandao-qsr/keylime | e742f8fe921fbd630a48a3be4d77e60f8dc4bfe8 | [
"Apache-2.0"
] | null | null | null | test/test_verifier_db.py | tiandao-qsr/keylime | e742f8fe921fbd630a48a3be4d77e60f8dc4bfe8 | [
"Apache-2.0"
] | null | null | null | '''
SPDX-License-Identifier: Apache-2.0
Copyright 2020 Luke Hinds (lhinds@redhat.com), Red Hat, Inc.
'''
import unittest
from keylime.db.verifier_db import VerfierMain
from keylime.db.keylime_db import SessionManager
from sqlalchemy import create_engine
# BEGIN TEST DATA
test_data = {
'v': 'cf0B779EA1dkHVWfTxQuSLHNFeutYeSmVWe7JOFWzXg=',
'ip': '127.0.0.1',
'port': 9002,
'operational_state': 1,
'public_key': '',
'tpm_policy': '{"22": ["0000000000000000000000000000000000000001", "0000000000000000000000000000000000000000000000000000000000000001", "000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001", "ffffffffffffffffffffffffffffffffffffffff", "ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff", "ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff"], "15": ["0000000000000000000000000000000000000000", "0000000000000000000000000000000000000000000000000000000000000000", "000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000"], "mask": "0x408400"}',
'vtpm_policy': '{"23": ["ffffffffffffffffffffffffffffffffffffffff", "0000000000000000000000000000000000000000"], "15": ["0000000000000000000000000000000000000000"], "mask": "0x808000"}',
'meta_data': '{"cert_serial": 2, "subject": "/C=US/CN=D432FBB3-D2F1-4A97-9EF7-75BD81C00000/ST=MA/L=Lexington/O=MITLL/OU=53"}',
'allowlist': '{"allowlist": {"/boot/System.map-5.1.17-300.fc30.x86_64": ["bdc084cc61c67dada53ff92c3235fbc774eace36aceb11967718399837e36485"], "/boot/vmlinuz-5.0.9-301.fc30.x86_64": ["187e65c35f449df145b57940cb73606623ab1eccc352f5b0d9b64c4d2ad3be58"], "/boot/initramfs-5.1.15-300.fc30.x86_64.img": ["7fb94b644d95de6ed2f70c247cf9a572027815b8f6a00b8c5f7b9fd2feef0ff1"], "/boot/config-5.0.9-301.fc30.x86_64": ["540f7b2732b8018be45dcfdf737fa6e51d9f5924d85b6c1987ddb4215260b49f"], "boot_aggregate": ["0000000000000000000000000000000000000000"]}, "exclude": ["/*"]}',
'revocation_key': '-----BEGIN PRIVATE KEY-----\nMIIEvQIBADANBgkqhkiG9w0BAQEFAASCBKcwggSjAgEAAoIBAQDs1onKjLZHDnqu\nnrsCb5aZohK2FU+jjU4NT23x1UzYzpFU9wBZ0avj+HeFYbQiAKanSbS7PvhjJMdE\naWMgRMgigr2K1xx+ZhBu4zTPFMy11msxMIL/HPROSYx/9wUZrhf4z/rBsuppFVs3\nKfKmQHuptjZX+D+m+nANO8WOILyW2+5YO5FNw1XJ3gVO6elJJ/CzQcYWIioONuTM\nQ+g8OQc+yTZruwedcSpOX56GpBImWUKXzTz3zoX7AlYxjEBjT86rxoBVXo3ZIwYx\n+mJk7NADN2qvLSXxTnLBxISHdsUDBP5DfurFQPhZC5oARN6/Y4zPESnhm8iwlG5o\n2ZjxVzphAgMBAAECggEBAKVKudImUJTY8yBp4aS6koXYymxQBUvlM8MwW1A7iK2L\nxXxiAtms7uVlJK1vWhOdFrKMS1mfgiVXpscFMkx0FKWZT4XVyaohu3hYlCOupYyH\nADrNW6+G2q7EwA0TLnkUuuBI7v4+y0DZydZ/LT2ApY31gIn21R3JjWh+/crK6DP0\nJO51hLO+z4GAMbWimRzA3lnYltUSJEvam3EHnj/pW+hlczjdI6AfJTWRWx6+gqP3\nRBvLcjBA9ZIx4JzYab5tnvwnd8ZzVItYBQJ8UhxzNsrSzEGguUEO4G/jYQTtYi6T\nufksmewcIClp48AfDThKSCMQXgFwpVI4EPxwmfd6Mt0CgYEA+i+2jjeFREMNam4p\nEBf5tmY2xvg3HXGgCjBfllepZQZHQatfv/kEqhFW497W+okyjTXflMR1TkjMKAqO\nahA+D1lItycPxsvTTiZ85KgrybbQT7Y+s2ET2f68wZh2XyiJIYE/MNi3ZclIBFaY\npyXicj0RIB6IY9PIHNgdEHI4casCgYEA8ldrcbWof8YpwJ6KFVuMvkYKniVF0aXH\nsQUWL/dyjBYIq/jg3Z4J+b0360DhZVpp1SaO4jFISxVMRzkDf3/gbKxH9F4a9Id8\nDmGH15v1ooKBYfkk7GwEB3AOY4gN3RMnWb1hxxhjsM9pmeTffqgqYzHYzv1ArjHe\ntYkjWOqPECMCgYBT//kXPuTrymeSuHHpCWO6Lg9uNqCqrh/BzAQMAlrJpJYAIn3/\ngqhiQXgfAg7EB5SFfPUYie2o3yBMwV6XleSAWsXjWKYfZQgJUTrVuvEYxNykJthe\nedWkd7cAeSQlRwLj0PVafSj2b+JSMpEGbd3d5Ur+scGxYsXpiVYY04DICQKBgBPZ\nhTtzHbIZkSHt2nGVZhnPst7xPp7FbW3adM7I/eDrjRpI8GI2p6qFDSd/0PZ0SWbk\nGZ/9WWaNAAp1aQvwdXlxQxOJAbw1vLuQ0Yefhqcg+WgE+DlFP688RnFwm3IYN4jq\nMjAUl1XMJ2IrlQLS02X8lz2dEMcz3oIQEY0e6UjxAoGAFeiOjFF2i4wRRUKx8kpb\nnBKRmFaMXdkeMV2IQALJ4skNNflf0YdDFVniFUyq9vfbq2drJSnMiy8Dvju0j5PC\n+MALz22fsNoIV2h6gz0i1lXiyVgpoAhYCbbPv0wO6iHKPBzH3Onv6BKrVMy1pnzh\n6QsfbhjzBfFg1Zxp/h1tBqA=\n-----END PRIVATE KEY-----\n',
'tpm_version': 0,
'accept_tpm_hash_algs': ['sha512',
'sha384',
'sha256',
'sha1'],
'accept_tpm_encryption_algs': ['ecc', 'rsa'],
'accept_tpm_signing_algs': ['ecschnorr', 'rsassa'],
'hash_alg': '',
'enc_alg': '',
'sign_alg': '',
'agent_id': 'D432FBB3-D2F1-4A97-9EF7-75BD81C00000'
}
agent_id = 'D432FBB3-D2F1-4A97-9EF7-75BD81C00000'
TENANT_FAILED = 10
# END TEST DATA
class TestVerfierDB(unittest.TestCase):
def setUp(self):
self.engine = create_engine('sqlite://')
VerfierMain.metadata.create_all(self.engine, checkfirst=True)
self.session = SessionManager().make_session(self.engine)
self.populate_agent()
def populate_agent(self):
self.session.add(VerfierMain(**test_data))
self.session.commit()
def test_add_agent(self):
agent = self.session.query(VerfierMain).filter_by(
agent_id=agent_id).first()
self.assertEqual(
agent.v, 'cf0B779EA1dkHVWfTxQuSLHNFeutYeSmVWe7JOFWzXg=')
self.assertEqual(
agent.port, 9002)
self.assertEqual(
agent.tpm_policy, '{"22": ["0000000000000000000000000000000000000001", "0000000000000000000000000000000000000000000000000000000000000001", "000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001", "ffffffffffffffffffffffffffffffffffffffff", "ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff", "ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff"], "15": ["0000000000000000000000000000000000000000", "0000000000000000000000000000000000000000000000000000000000000000", "000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000"], "mask": "0x408400"}')
self.assertEqual(
agent.revocation_key, '-----BEGIN PRIVATE KEY-----\nMIIEvQIBADANBgkqhkiG9w0BAQEFAASCBKcwggSjAgEAAoIBAQDs1onKjLZHDnqu\nnrsCb5aZohK2FU+jjU4NT23x1UzYzpFU9wBZ0avj+HeFYbQiAKanSbS7PvhjJMdE\naWMgRMgigr2K1xx+ZhBu4zTPFMy11msxMIL/HPROSYx/9wUZrhf4z/rBsuppFVs3\nKfKmQHuptjZX+D+m+nANO8WOILyW2+5YO5FNw1XJ3gVO6elJJ/CzQcYWIioONuTM\nQ+g8OQc+yTZruwedcSpOX56GpBImWUKXzTz3zoX7AlYxjEBjT86rxoBVXo3ZIwYx\n+mJk7NADN2qvLSXxTnLBxISHdsUDBP5DfurFQPhZC5oARN6/Y4zPESnhm8iwlG5o\n2ZjxVzphAgMBAAECggEBAKVKudImUJTY8yBp4aS6koXYymxQBUvlM8MwW1A7iK2L\nxXxiAtms7uVlJK1vWhOdFrKMS1mfgiVXpscFMkx0FKWZT4XVyaohu3hYlCOupYyH\nADrNW6+G2q7EwA0TLnkUuuBI7v4+y0DZydZ/LT2ApY31gIn21R3JjWh+/crK6DP0\nJO51hLO+z4GAMbWimRzA3lnYltUSJEvam3EHnj/pW+hlczjdI6AfJTWRWx6+gqP3\nRBvLcjBA9ZIx4JzYab5tnvwnd8ZzVItYBQJ8UhxzNsrSzEGguUEO4G/jYQTtYi6T\nufksmewcIClp48AfDThKSCMQXgFwpVI4EPxwmfd6Mt0CgYEA+i+2jjeFREMNam4p\nEBf5tmY2xvg3HXGgCjBfllepZQZHQatfv/kEqhFW497W+okyjTXflMR1TkjMKAqO\nahA+D1lItycPxsvTTiZ85KgrybbQT7Y+s2ET2f68wZh2XyiJIYE/MNi3ZclIBFaY\npyXicj0RIB6IY9PIHNgdEHI4casCgYEA8ldrcbWof8YpwJ6KFVuMvkYKniVF0aXH\nsQUWL/dyjBYIq/jg3Z4J+b0360DhZVpp1SaO4jFISxVMRzkDf3/gbKxH9F4a9Id8\nDmGH15v1ooKBYfkk7GwEB3AOY4gN3RMnWb1hxxhjsM9pmeTffqgqYzHYzv1ArjHe\ntYkjWOqPECMCgYBT//kXPuTrymeSuHHpCWO6Lg9uNqCqrh/BzAQMAlrJpJYAIn3/\ngqhiQXgfAg7EB5SFfPUYie2o3yBMwV6XleSAWsXjWKYfZQgJUTrVuvEYxNykJthe\nedWkd7cAeSQlRwLj0PVafSj2b+JSMpEGbd3d5Ur+scGxYsXpiVYY04DICQKBgBPZ\nhTtzHbIZkSHt2nGVZhnPst7xPp7FbW3adM7I/eDrjRpI8GI2p6qFDSd/0PZ0SWbk\nGZ/9WWaNAAp1aQvwdXlxQxOJAbw1vLuQ0Yefhqcg+WgE+DlFP688RnFwm3IYN4jq\nMjAUl1XMJ2IrlQLS02X8lz2dEMcz3oIQEY0e6UjxAoGAFeiOjFF2i4wRRUKx8kpb\nnBKRmFaMXdkeMV2IQALJ4skNNflf0YdDFVniFUyq9vfbq2drJSnMiy8Dvju0j5PC\n+MALz22fsNoIV2h6gz0i1lXiyVgpoAhYCbbPv0wO6iHKPBzH3Onv6BKrVMy1pnzh\n6QsfbhjzBfFg1Zxp/h1tBqA=\n-----END PRIVATE KEY-----\n')
self.assertEqual(agent.accept_tpm_hash_algs, [
'sha512',
'sha384',
'sha256',
'sha1'])
def test_count_agents(self):
agent = self.session.query(
VerfierMain.agent_id).count()
self.assertEqual(agent, 1)
def test_set_operation_state(self):
# Filter on the model column; comparing the bare module-level variable
# (agent_id == agent_id) is always true and would update every row.
self.session.query(VerfierMain).filter(VerfierMain.agent_id == agent_id).update(
{'operational_state': TENANT_FAILED})
self.session.commit()
agent = self.session.query(VerfierMain).filter_by(
agent_id=agent_id).first()
self.assertEqual(agent.operational_state, TENANT_FAILED)
def test_delete_agent(self):
agent = self.session.query(VerfierMain).filter_by(
agent_id=agent_id).first()
self.assertIsNotNone(agent)
self.session.query(VerfierMain).filter_by(
agent_id=agent_id).delete()
self.session.commit()
agent = self.session.query(VerfierMain).filter_by(
agent_id=agent_id).first()
self.assertIsNone(agent)
def tearDown(self):
self.session.close()
| 88.556701 | 1,769 | 0.807567 | 575 | 8,590 | 11.937391 | 0.394783 | 0.015297 | 0.016317 | 0.027535 | 0.758159 | 0.748689 | 0.743444 | 0.72771 | 0.716346 | 0.716346 | 0 | 0.219569 | 0.098137 | 8,590 | 96 | 1,770 | 89.479167 | 0.666452 | 0.014785 | 0 | 0.25 | 0 | 0.092105 | 0.720132 | 0.657322 | 0 | 1 | 0.002839 | 0 | 0.105263 | 1 | 0.092105 | false | 0 | 0.052632 | 0 | 0.157895 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
7c55c156d9c9bc1652c5c54d1d0a09de68baef4e | 133 | py | Python | models/__init__.py | byeongjokim/LateTemporalModeling3DCNN_for_sign | e3a802fcf91dc3930aea782464ee34d9b747d3ab | [
"MIT"
] | null | null | null | models/__init__.py | byeongjokim/LateTemporalModeling3DCNN_for_sign | e3a802fcf91dc3930aea782464ee34d9b747d3ab | [
"MIT"
] | null | null | null | models/__init__.py | byeongjokim/LateTemporalModeling3DCNN_for_sign | e3a802fcf91dc3930aea782464ee34d9b747d3ab | [
"MIT"
] | null | null | null | from .rgb_resneXt3D import *
from .rgb_I3D import *
from .rgb_r2plus1d import *
from .rgb_slowfast import *
from .rgb_depth import *
| 22.166667 | 28 | 0.774436 | 20 | 133 | 4.9 | 0.4 | 0.357143 | 0.530612 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035398 | 0.150376 | 133 | 5 | 29 | 26.6 | 0.831858 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
7c560450323b860754cfbad46d147f6b2f94d4dd | 1,757 | py | Python | swagger_server/controllers/default_controller.py | lazartomi/3d-beacons-client | bf911c8e4821a57f8c146841d7664546d3f0f2ad | [
"Apache-2.0"
] | null | null | null | swagger_server/controllers/default_controller.py | lazartomi/3d-beacons-client | bf911c8e4821a57f8c146841d7664546d3f0f2ad | [
"Apache-2.0"
] | null | null | null | swagger_server/controllers/default_controller.py | lazartomi/3d-beacons-client | bf911c8e4821a57f8c146841d7664546d3f0f2ad | [
"Apache-2.0"
] | null | null | null | import connexion
import six
from swagger_server.models.result import Result # noqa: E501
from swagger_server import util
def sequence_sequence_json_get(sequence, provider=None, template=None): # noqa: E501
"""sequence_sequence_json_get
# noqa: E501
:param sequence: Amino acid sequence
:type sequence: str
:param provider:
:type provider: str
:param template: Template is a 4-letter PDB code, or a 4-letter code with assembly ID and chain for SMTL entries
:type template: str
:rtype: Result
"""
return 'do some magic!'
def uniprot_qualifier_json_get(qualifier, provider=None, template=None, range=None): # noqa: E501
"""uniprot_qualifier_json_get
# noqa: E501
:param qualifier: UniProtKB accession number (AC) or entry name (ID)
:type qualifier: str
:param provider:
:type provider: str
:param template: Template is a 4-letter PDB code, or a 4-letter code with assembly ID and chain for SMTL entries
:type template: str
:param range: Specify a UniProt sequence residue range
:type range: str
:rtype: Result
"""
return 'do some magic!'
def uniprot_qualifier_pdb_get(qualifier, sort=None, provider=None, template=None, range=None): # noqa: E501
"""uniprot_qualifier_pdb_get
# noqa: E501
:param qualifier: UniProtKB accession number (AC) or entry name (ID)
:type qualifier: str
:param sort:
:type sort: str
:param provider:
:type provider: str
:param template: Template is a 4-letter PDB code, or a 4-letter code with assembly ID and chain for SMTL entries
:type template: str
:param range: Specify a UniProt sequence residue range
:type range: str
:rtype: str
"""
return 'do some magic!'
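A minimal sketch of how one of these generated `return 'do some magic!'` stubs could be filled in. The amino acid alphabet check and the response shape below are illustrative assumptions, not part of the real 3d-beacons API:

```python
# Hypothetical implementation of one generated stub. A real handler
# would query the 3D-Beacons provider services; here we only validate
# the input and return a placeholder payload with an HTTP status code.
VALID_RESIDUES = set('ACDEFGHIKLMNPQRSTVWY')

def sequence_sequence_json_get(sequence, provider=None, template=None):
    # Reject anything that is not a plain amino acid sequence.
    if not sequence or not set(sequence.upper()) <= VALID_RESIDUES:
        return {'error': 'invalid amino acid sequence'}, 400
    # Placeholder result instead of 'do some magic!'.
    return {'sequence': sequence, 'provider': provider, 'structures': []}, 200

body, status = sequence_sequence_json_get('MKT', provider='swissmodel')
bad_body, bad_status = sequence_sequence_json_get('not-a-sequence')
```

Connexion routes each OpenAPI operation to a function of this name, so replacing the stub body is all that is needed to serve real responses.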
| 27.888889 | 112 | 0.695504 | 241 | 1,757 | 4.987552 | 0.228216 | 0.0599 | 0.049917 | 0.0599 | 0.75624 | 0.736273 | 0.736273 | 0.736273 | 0.736273 | 0.736273 | 0 | 0.019897 | 0.227661 | 1,757 | 62 | 113 | 28.33871 | 0.865881 | 0.642573 | 0 | 0.3 | 0 | 0 | 0.088608 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3 | false | 0 | 0.4 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
7c56cac3f71a5eaef78502008997e80bf0962478 | 27,616 | py | Python | huaweicloud-sdk-lts/huaweicloudsdklts/v2/lts_client.py | huaweicloud/huaweicloud-sdk-python-v3 | 7a6270390fcbf192b3882bf763e7016e6026ef78 | [
"Apache-2.0"
] | 64 | 2020-06-12T07:05:07.000Z | 2022-03-30T03:32:50.000Z | huaweicloud-sdk-lts/huaweicloudsdklts/v2/lts_client.py | huaweicloud/huaweicloud-sdk-python-v3 | 7a6270390fcbf192b3882bf763e7016e6026ef78 | [
"Apache-2.0"
] | 11 | 2020-07-06T07:56:54.000Z | 2022-01-11T11:14:40.000Z | huaweicloud-sdk-lts/huaweicloudsdklts/v2/lts_client.py | huaweicloud/huaweicloud-sdk-python-v3 | 7a6270390fcbf192b3882bf763e7016e6026ef78 | [
"Apache-2.0"
] | 24 | 2020-06-08T11:42:13.000Z | 2022-03-04T06:44:08.000Z | # coding: utf-8
from __future__ import absolute_import
import datetime
import re
import importlib
import six
from huaweicloudsdkcore.client import Client, ClientBuilder
from huaweicloudsdkcore.exceptions import exceptions
from huaweicloudsdkcore.utils import http_utils
from huaweicloudsdkcore.sdk_stream_request import SdkStreamRequest
class LtsClient(Client):
"""
:param configuration: .Configuration object for this client
:param pool_threads: The number of threads to use for async requests
to the API. More threads means more concurrent API requests.
"""
PRIMITIVE_TYPES = (float, bool, bytes, six.text_type) + six.integer_types
NATIVE_TYPES_MAPPING = {
'int': int,
'long': int if six.PY3 else long,
'float': float,
'str': str,
'bool': bool,
'date': datetime.date,
'datetime': datetime.datetime,
'object': object,
}
def __init__(self):
super(LtsClient, self).__init__()
self.model_package = importlib.import_module("huaweicloudsdklts.v2.model")
self.preset_headers = {'User-Agent': 'HuaweiCloud-SDK-Python'}
@classmethod
def new_builder(cls, clazz=None):
if clazz is None:
return ClientBuilder(cls)
if clazz.__name__ != "LtsClient":
raise TypeError("client type error, support client type is LtsClient")
return ClientBuilder(clazz)
def create_log_dump_obs(self, request):
"""日志转储
该接口用于将指定的一个或多个日志流的日志转储到OBS服务。
:param CreateLogDumpObsRequest request
:return: CreateLogDumpObsResponse
"""
return self.create_log_dump_obs_with_http_info(request)
def create_log_dump_obs_with_http_info(self, request):
"""日志转储
该接口用于将指定的一个或多个日志流的日志转储到OBS服务。
:param CreateLogDumpObsRequest request
:return: CreateLogDumpObsResponse
"""
all_params = ['create_log_dump_obs_request_body']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v2/{project_id}/log-dump/obs',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='CreateLogDumpObsResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def create_log_group(self, request):
"""创建日志组
该接口用于创建一个日志组
:param CreateLogGroupRequest request
:return: CreateLogGroupResponse
"""
return self.create_log_group_with_http_info(request)
def create_log_group_with_http_info(self, request):
"""创建日志组
该接口用于创建一个日志组
:param CreateLogGroupRequest request
:return: CreateLogGroupResponse
"""
all_params = ['create_log_group_request_body']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v2/{project_id}/groups',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='CreateLogGroupResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def create_log_stream(self, request):
"""创建日志流
该接口用于创建某个指定日志组下的日志流
:param CreateLogStreamRequest request
:return: CreateLogStreamResponse
"""
return self.create_log_stream_with_http_info(request)
def create_log_stream_with_http_info(self, request):
"""创建日志流
该接口用于创建某个指定日志组下的日志流
:param CreateLogStreamRequest request
:return: CreateLogStreamResponse
"""
all_params = ['log_group_id', 'create_log_stream_request_body']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
if 'log_group_id' in local_var_params:
path_params['log_group_id'] = local_var_params['log_group_id']
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v2/{project_id}/groups/{log_group_id}/streams',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='CreateLogStreamResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def delete_log_group(self, request):
"""删除日志组
该接口用于删除指定日志组。当日志组中的日志流配置了日志转储,需要取消日志转储后才可删除。
:param DeleteLogGroupRequest request
:return: DeleteLogGroupResponse
"""
return self.delete_log_group_with_http_info(request)
def delete_log_group_with_http_info(self, request):
"""删除日志组
该接口用于删除指定日志组。当日志组中的日志流配置了日志转储,需要取消日志转储后才可删除。
:param DeleteLogGroupRequest request
:return: DeleteLogGroupResponse
"""
all_params = ['log_group_id']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
if 'log_group_id' in local_var_params:
path_params['log_group_id'] = local_var_params['log_group_id']
query_params = []
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v2/{project_id}/groups/{log_group_id}',
method='DELETE',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='DeleteLogGroupResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def delete_log_stream(self, request):
"""删除日志流
该接口用于删除指定日志组下的指定日志流。当该日志流配置了日志转储,需要取消日志转储后才可删除。
:param DeleteLogStreamRequest request
:return: DeleteLogStreamResponse
"""
return self.delete_log_stream_with_http_info(request)
def delete_log_stream_with_http_info(self, request):
"""删除日志流
该接口用于删除指定日志组下的指定日志流。当该日志流配置了日志转储,需要取消日志转储后才可删除。
:param DeleteLogStreamRequest request
:return: DeleteLogStreamResponse
"""
all_params = ['log_group_id', 'log_stream_id']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
if 'log_group_id' in local_var_params:
path_params['log_group_id'] = local_var_params['log_group_id']
if 'log_stream_id' in local_var_params:
path_params['log_stream_id'] = local_var_params['log_stream_id']
query_params = []
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v2/{project_id}/groups/{log_group_id}/streams/{log_stream_id}',
method='DELETE',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='DeleteLogStreamResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def disable_log_collection(self, request):
"""关闭超额采集开关
该接口用于将超额采集日志功能关闭。
:param DisableLogCollectionRequest request
:return: DisableLogCollectionResponse
"""
return self.disable_log_collection_with_http_info(request)
def disable_log_collection_with_http_info(self, request):
"""关闭超额采集开关
该接口用于将超额采集日志功能关闭。
:param DisableLogCollectionRequest request
:return: DisableLogCollectionResponse
"""
all_params = []
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v2/{project_id}/collection/disable',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='DisableLogCollectionResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def enable_log_collection(self, request):
"""打开超额采集开关
该接口用于将超额采集日志功能打开。
:param EnableLogCollectionRequest request
:return: EnableLogCollectionResponse
"""
return self.enable_log_collection_with_http_info(request)
def enable_log_collection_with_http_info(self, request):
"""打开超额采集开关
该接口用于将超额采集日志功能打开。
:param EnableLogCollectionRequest request
:return: EnableLogCollectionResponse
"""
all_params = []
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v2/{project_id}/collection/enable',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='EnableLogCollectionResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def list_log_groups(self, request):
"""查询账号下所有日志组
该接口用于查询账号下所有日志组。
:param ListLogGroupsRequest request
:return: ListLogGroupsResponse
"""
return self.list_log_groups_with_http_info(request)
def list_log_groups_with_http_info(self, request):
"""查询账号下所有日志组
该接口用于查询账号下所有日志组。
:param ListLogGroupsRequest request
:return: ListLogGroupsResponse
"""
all_params = []
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v2/{project_id}/groups',
method='GET',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ListLogGroupsResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def list_log_stream(self, request):
"""查询指定日志组下的所有日志流
该接口用于查询指定日志组下的所有日志流信息。
:param ListLogStreamRequest request
:return: ListLogStreamResponse
"""
return self.list_log_stream_with_http_info(request)
def list_log_stream_with_http_info(self, request):
"""查询指定日志组下的所有日志流
该接口用于查询指定日志组下的所有日志流信息。
:param ListLogStreamRequest request
:return: ListLogStreamResponse
"""
all_params = ['log_group_id', 'tag']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
if 'log_group_id' in local_var_params:
path_params['log_group_id'] = local_var_params['log_group_id']
query_params = []
if 'tag' in local_var_params:
query_params.append(('tag', local_var_params['tag']))
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v2/{project_id}/groups/{log_group_id}/streams',
method='GET',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ListLogStreamResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def list_logs(self, request):
"""查询日志
该接口用于查询指定日志流下的日志内容。
:param ListLogsRequest request
:return: ListLogsResponse
"""
return self.list_logs_with_http_info(request)
def list_logs_with_http_info(self, request):
"""查询日志
该接口用于查询指定日志流下的日志内容。
:param ListLogsRequest request
:return: ListLogsResponse
"""
all_params = ['log_group_id', 'log_stream_id', 'list_logs_request_body']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
if 'log_group_id' in local_var_params:
path_params['log_group_id'] = local_var_params['log_group_id']
if 'log_stream_id' in local_var_params:
path_params['log_stream_id'] = local_var_params['log_stream_id']
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v2/{project_id}/groups/{log_group_id}/streams/{log_stream_id}/content/query',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ListLogsResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def list_query_structured_logs(self, request):
"""查询结构化日志
该接口用于查询指定日志流下的结构化日志内容。
:param ListQueryStructuredLogsRequest request
:return: ListQueryStructuredLogsResponse
"""
return self.list_query_structured_logs_with_http_info(request)
def list_query_structured_logs_with_http_info(self, request):
"""查询结构化日志
该接口用于查询指定日志流下的结构化日志内容。
:param ListQueryStructuredLogsRequest request
:return: ListQueryStructuredLogsResponse
"""
all_params = ['log_group_id', 'log_stream_id', 'list_query_structured_logs_request_body']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
if 'log_group_id' in local_var_params:
path_params['log_group_id'] = local_var_params['log_group_id']
if 'log_stream_id' in local_var_params:
path_params['log_stream_id'] = local_var_params['log_stream_id']
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v2/{project_id}/groups/{log_group_id}/streams/{log_stream_id}/struct-content/query',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ListQueryStructuredLogsResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def list_structured_logs_with_time_range(self, request):
"""查询结构化日志(新版)
该接口用于查询指定日志流下的结构化日志内容(新版)。
:param ListStructuredLogsWithTimeRangeRequest request
:return: ListStructuredLogsWithTimeRangeResponse
"""
return self.list_structured_logs_with_time_range_with_http_info(request)
def list_structured_logs_with_time_range_with_http_info(self, request):
"""查询结构化日志(新版)
该接口用于查询指定日志流下的结构化日志内容(新版)。
:param ListStructuredLogsWithTimeRangeRequest request
:return: ListStructuredLogsWithTimeRangeResponse
"""
all_params = ['log_stream_id', 'list_structured_logs_with_time_range_request_body']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
if 'log_stream_id' in local_var_params:
path_params['log_stream_id'] = local_var_params['log_stream_id']
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v2/{project_id}/streams/{log_stream_id}/struct-content/query',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ListStructuredLogsWithTimeRangeResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def update_log_group(self, request):
"""修改日志组
该接口用于修改指定日志组下的日志存储时长。
:param UpdateLogGroupRequest request
:return: UpdateLogGroupResponse
"""
return self.update_log_group_with_http_info(request)
def update_log_group_with_http_info(self, request):
"""修改日志组
该接口用于修改指定日志组下的日志存储时长。
:param UpdateLogGroupRequest request
:return: UpdateLogGroupResponse
"""
all_params = ['log_group_id', 'update_log_group_request_body']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
if 'log_group_id' in local_var_params:
path_params['log_group_id'] = local_var_params['log_group_id']
query_params = []
header_params = {}
form_params = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v2/{project_id}/groups/{log_group_id}',
method='POST',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='UpdateLogGroupResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
def call_api(self, resource_path, method, path_params=None, query_params=None, header_params=None, body=None,
post_params=None, response_type=None, response_headers=None, auth_settings=None,
collection_formats=None, request_type=None):
"""Makes the HTTP request and returns deserialized data.
:param resource_path: Path to method endpoint.
:param method: Method to call.
:param path_params: Path parameters in the url.
:param query_params: Query parameters in the url.
:param header_params: Header parameters to be placed in the request header.
:param body: Request body.
:param dict post_params: Request post form parameters,
for `application/x-www-form-urlencoded`, `multipart/form-data`.
:param list auth_settings: Auth Settings names for the request.
:param response_type: Response data type.
:param response_headers: Header should be added to response data.
:param collection_formats: dict of collection formats for path, query,
header, and post parameters.
:param request_type: Request data type.
:return:
Return the response directly.
"""
return self.do_http_request(
method=method,
resource_path=resource_path,
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body,
post_params=post_params,
response_type=response_type,
response_headers=response_headers,
collection_formats=collection_formats,
request_type=request_type)
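Every `*_with_http_info` method above repeats the same request-to-dict extraction loop over `attribute_map`. A self-contained sketch of that pattern, where `FakeRequest` is a stand-in for the SDK's generated request models:

```python
class FakeRequest:
    # Stand-in for a generated request model: attribute_map declares the
    # serializable attributes, mirroring the SDK's request classes.
    attribute_map = {'log_group_id': 'log_group_id', 'body': 'body'}

    def __init__(self, log_group_id, body):
        self.log_group_id = log_group_id
        self.body = body

def extract_params(request):
    # Same loop as in each *_with_http_info method: copy only the
    # attributes declared in attribute_map that are actually set.
    local_var_params = {}
    for attr in request.attribute_map:
        if hasattr(request, attr):
            local_var_params[attr] = getattr(request, attr)
    return local_var_params

params = extract_params(FakeRequest('group-123', {'ttl_in_days': 7}))
```

The resulting dict is then split into path, query, and body parameters before `call_api` is invoked.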
| 30.148472 | 113 | 0.629925 | 2,769 | 27,616 | 5.875767 | 0.077284 | 0.031469 | 0.055071 | 0.036755 | 0.841426 | 0.832329 | 0.823049 | 0.776644 | 0.774923 | 0.641795 | 0 | 0.000814 | 0.288202 | 27,616 | 915 | 114 | 30.181421 | 0.826881 | 0.139702 | 0 | 0.780156 | 0 | 0 | 0.102991 | 0.051982 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05642 | false | 0 | 0.019455 | 0 | 0.138132 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7cacbe6b6e27ab1aae0ce777fb72567ac7a9a00b | 45,503 | py | Python | sdk/containerregistry/azure-containerregistry/azure/containerregistry/_generated/aio/operations/_container_registry_operations.py | praveenkuttappan/azure-sdk-for-python | 4b79413667b7539750a6c7dde15737013a3d4bd5 | [
"MIT"
] | null | null | null | sdk/containerregistry/azure-containerregistry/azure/containerregistry/_generated/aio/operations/_container_registry_operations.py | praveenkuttappan/azure-sdk-for-python | 4b79413667b7539750a6c7dde15737013a3d4bd5 | [
"MIT"
] | 1 | 2021-06-07T06:37:28.000Z | 2021-06-07T06:37:28.000Z | sdk/containerregistry/azure-containerregistry/azure/containerregistry/_generated/aio/operations/_container_registry_operations.py | praveenkuttappan/azure-sdk-for-python | 4b79413667b7539750a6c7dde15737013a3d4bd5 | [
"MIT"
] | null | null | null | # coding=utf-8
# --------------------------------------------------------------------------
# Code generated by Microsoft (R) AutoRest Code Generator (autorest: 3.6.6, generator: @autorest/python@5.6.4)
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
from typing import Any, AsyncIterable, Callable, Dict, Generic, Optional, TypeVar
import warnings
from azure.core.async_paging import AsyncItemPaged, AsyncList
from azure.core.exceptions import ClientAuthenticationError, HttpResponseError, ResourceExistsError, ResourceNotFoundError, map_error
from azure.core.pipeline import PipelineResponse
from azure.core.pipeline.transport import AsyncHttpResponse, HttpRequest
from ... import models as _models
T = TypeVar('T')
ClsType = Optional[Callable[[PipelineResponse[HttpRequest, AsyncHttpResponse], T, Dict[str, Any]], Any]]
class ContainerRegistryOperations:
"""ContainerRegistryOperations async operations.
You should not instantiate this class directly. Instead, you should create a Client instance that
instantiates it for you and attaches it as an attribute.
:ivar models: Alias to model classes used in this operation group.
:type models: ~container_registry.models
:param client: Client for service requests.
:param config: Configuration of service client.
:param serializer: An object model serializer.
:param deserializer: An object model deserializer.
"""
models = _models
def __init__(self, client, config, serializer, deserializer) -> None:
self._client = client
self._serialize = serializer
self._deserialize = deserializer
self._config = config
async def check_docker_v2_support(
self,
**kwargs
) -> None:
"""Tells whether this Docker Registry instance supports Docker Registry HTTP API v2.
:keyword callable cls: A custom type or function that will be passed the direct response
:return: None, or the result of cls(response)
:rtype: None
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
accept = "application/json"
# Construct URL
url = self.check_docker_v2_support.metadata['url'] # type: ignore
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.get(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.AcrErrors, response)
raise HttpResponseError(response=response, model=error)
if cls:
return cls(pipeline_response, None, {})
check_docker_v2_support.metadata = {'url': '/v2/'} # type: ignore
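The `path_format_arguments` substitution used above can be sketched with a minimal stand-in for `self._client.format_url` (the real serializer additionally validates values and honors `skip_quote`, which this sketch omits):

```python
from urllib.parse import quote

def format_url(url_template, **path_args):
    # Minimal stand-in for the client's format_url: substitute each
    # {placeholder} in the template with its percent-encoded value.
    for key, value in path_args.items():
        url_template = url_template.replace('{%s}' % key, quote(str(value), safe=''))
    return url_template

url = format_url('/v2/{name}/manifests/{reference}',
                 name='library/hello-world', reference='latest')
```

Quoting with `safe=''` percent-encodes the `/` inside a namespaced repository name, so the placeholder boundaries of the route stay intact.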
async def get_manifest(
self,
name: str,
reference: str,
accept: Optional[str] = None,
**kwargs
) -> "_models.Manifest":
"""Get the manifest identified by ``name`` and ``reference`` where ``reference`` can be a tag or
digest.
:param name: Name of the image (including the namespace).
:type name: str
:param reference: A tag or a digest, pointing to a specific image.
:type reference: str
:param accept: Accept header string delimited by comma. For example,
application/vnd.docker.distribution.manifest.v2+json.
:type accept: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: Manifest, or the result of cls(response)
:rtype: ~container_registry.models.Manifest
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.Manifest"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
accept = "application/json"
# Construct URL
url = self.get_manifest.metadata['url'] # type: ignore
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True),
'name': self._serialize.url("name", name, 'str'),
'reference': self._serialize.url("reference", reference, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
# Construct headers
header_parameters = {} # type: Dict[str, Any]
if accept is not None:
header_parameters['accept'] = self._serialize.header("accept", accept, 'str')
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.get(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.AcrErrors, response)
raise HttpResponseError(response=response, model=error)
deserialized = self._deserialize('Manifest', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
get_manifest.metadata = {'url': '/v2/{name}/manifests/{reference}'} # type: ignore
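The `error_map`/`map_error` pairing built in each operation amounts to a status-code-to-exception dispatch. A sketch of that behavior, with local exception classes standing in for the ones imported from `azure.core.exceptions`:

```python
class ClientAuthenticationError(Exception):
    pass

class ResourceNotFoundError(Exception):
    pass

# Same shape as the error_map built in each operation above.
error_map = {401: ClientAuthenticationError, 404: ResourceNotFoundError}

def map_error(status_code, error_map):
    # Raise the registered exception for this status code, if any;
    # unmapped codes fall through to the caller's generic handling
    # (in the real client, an HttpResponseError with the ACR payload).
    exc_type = error_map.get(status_code)
    if exc_type is not None:
        raise exc_type('operation returned status %d' % status_code)

try:
    map_error(404, error_map)
    raised = None
except ResourceNotFoundError as exc:
    raised = type(exc).__name__
```

Merging `kwargs.pop('error_map', {})` on top, as the operations do, lets callers override or extend this mapping per request.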
async def create_manifest(
self,
name: str,
reference: str,
payload: "_models.Manifest",
**kwargs
) -> object:
"""Put the manifest identified by ``name`` and ``reference`` where ``reference`` can be a tag or
digest.
:param name: Name of the image (including the namespace).
:type name: str
:param reference: A tag or a digest, pointing to a specific image.
:type reference: str
:param payload: Manifest body, can take v1 or v2 values depending on accept header.
:type payload: ~container_registry.models.Manifest
:keyword callable cls: A custom type or function that will be passed the direct response
:return: object, or the result of cls(response)
:rtype: object
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[object]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
content_type = kwargs.pop("content_type", "application/vnd.docker.distribution.manifest.v2+json")
accept = "application/json"
# Construct URL
url = self.create_manifest.metadata['url'] # type: ignore
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True),
'name': self._serialize.url("name", name, 'str'),
'reference': self._serialize.url("reference", reference, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
body_content_kwargs = {} # type: Dict[str, Any]
body_content = self._serialize.body(payload, 'Manifest')
body_content_kwargs['content'] = body_content
request = self._client.put(url, query_parameters, header_parameters, **body_content_kwargs)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [201]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.AcrErrors, response)
raise HttpResponseError(response=response, model=error)
response_headers = {}
response_headers['Docker-Content-Digest'] = self._deserialize('str', response.headers.get('Docker-Content-Digest'))
response_headers['Location'] = self._deserialize('str', response.headers.get('Location'))
response_headers['Content-Length'] = self._deserialize('long', response.headers.get('Content-Length'))
deserialized = self._deserialize('object', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, response_headers)
return deserialized
create_manifest.metadata = {'url': '/v2/{name}/manifests/{reference}'} # type: ignore
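# Sketch of the kwargs.pop pattern used in create_manifest above: callers may
# override the manifest media type via a content_type keyword, otherwise the
# Docker v2 manifest type is assumed. build_headers is a hypothetical name.

```python
DEFAULT_MANIFEST_TYPE = "application/vnd.docker.distribution.manifest.v2+json"

def build_headers(**kwargs):
    # Pop so the remaining kwargs can be forwarded to the pipeline unchanged.
    content_type = kwargs.pop("content_type", DEFAULT_MANIFEST_TYPE)
    return {"Content-Type": content_type, "Accept": "application/json"}
```
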
async def delete_manifest(
self,
name: str,
reference: str,
**kwargs
) -> None:
"""Delete the manifest identified by ``name`` and ``reference``. Note that a manifest can *only*
be deleted by ``digest``.
:param name: Name of the image (including the namespace).
:type name: str
:param reference: Digest of the manifest; tags are not accepted for deletion.
:type reference: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: None, or the result of cls(response)
:rtype: None
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
accept = "application/json"
# Construct URL
url = self.delete_manifest.metadata['url'] # type: ignore
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True),
'name': self._serialize.url("name", name, 'str'),
'reference': self._serialize.url("reference", reference, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.delete(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [202, 404]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.AcrErrors, response)
raise HttpResponseError(response=response, model=error)
if cls:
return cls(pipeline_response, None, {})
delete_manifest.metadata = {'url': '/v2/{name}/manifests/{reference}'} # type: ignore
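# Sketch of the status check used by delete_manifest above: both 202
# (accepted) and 404 (already gone) are treated as success, so deleting a
# missing manifest does not raise. The exception class is a stand-in, not
# azure-core's HttpResponseError.

```python
class HttpResponseErrorSketch(Exception):
    """Stand-in for azure.core.exceptions.HttpResponseError."""

def check_delete_status(status_code: int) -> None:
    # Mirror `if response.status_code not in [202, 404]` from the method body.
    if status_code not in (202, 404):
        raise HttpResponseErrorSketch(f"unexpected status {status_code}")
```
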
def get_repositories(
self,
last: Optional[str] = None,
n: Optional[int] = None,
**kwargs
) -> AsyncIterable["_models.Repositories"]:
"""List repositories.
:param last: Query parameter for the last item in previous query. Result set will include
values lexically after last.
:type last: str
:param n: Query parameter for the maximum number of items.
:type n: int
:keyword callable cls: A custom type or function that will be passed the direct response
:return: An iterator like instance of either Repositories or the result of cls(response)
:rtype: ~azure.core.async_paging.AsyncItemPaged[~container_registry.models.Repositories]
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.Repositories"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
accept = "application/json"
def prepare_request(next_link=None):
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
if not next_link:
# Construct URL
url = self.get_repositories.metadata['url'] # type: ignore
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
if last is not None:
query_parameters['last'] = self._serialize.query("last", last, 'str')
if n is not None:
query_parameters['n'] = self._serialize.query("n", n, 'int')
request = self._client.get(url, query_parameters, header_parameters)
else:
url = next_link
query_parameters = {} # type: Dict[str, Any]
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True),
}
url = self._client.format_url(url, **path_format_arguments)
request = self._client.get(url, query_parameters, header_parameters)
return request
async def extract_data(pipeline_response):
deserialized = self._deserialize('Repositories', pipeline_response)
list_of_elem = deserialized.repositories
if cls:
list_of_elem = cls(list_of_elem)
return deserialized.link or None, AsyncList(list_of_elem)
async def get_next(next_link=None):
request = prepare_request(next_link)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
error = self._deserialize.failsafe_deserialize(_models.AcrErrors, response)
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, model=error)
return pipeline_response
return AsyncItemPaged(
get_next, extract_data
)
get_repositories.metadata = {'url': '/acr/v1/_catalog'} # type: ignore
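# A minimal, self-contained sketch of the paging contract that
# prepare_request/extract_data/get_next implement above: each page carries a
# `link` to the next page, and iteration stops when the link is None.
# fetch_page and the PAGES fixture are hypothetical stand-ins for the
# pipeline GET, not the SDK.

```python
import asyncio

PAGES = {
    None: {"repositories": ["app/a", "app/b"], "link": "/acr/v1/_catalog?last=app%2Fb"},
    "/acr/v1/_catalog?last=app%2Fb": {"repositories": ["app/c"], "link": None},
}

async def fetch_page(next_link):
    # Stands in for `await self._client._pipeline.run(request, ...)`.
    return PAGES[next_link]

async def iter_repositories():
    names, next_link = [], None
    while True:
        page = await fetch_page(next_link)      # get_next
        names.extend(page["repositories"])      # extract_data
        next_link = page["link"]
        if next_link is None:                   # `deserialized.link or None`
            return names

repos = asyncio.run(iter_repositories())
```
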
async def get_properties(
self,
name: str,
**kwargs
) -> "_models.ContainerRepositoryProperties":
"""Get repository attributes.
:param name: Name of the image (including the namespace).
:type name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: ContainerRepositoryProperties, or the result of cls(response)
:rtype: ~container_registry.models.ContainerRepositoryProperties
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.ContainerRepositoryProperties"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
accept = "application/json"
# Construct URL
url = self.get_properties.metadata['url'] # type: ignore
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True),
'name': self._serialize.url("name", name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.get(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.AcrErrors, response)
raise HttpResponseError(response=response, model=error)
deserialized = self._deserialize('ContainerRepositoryProperties', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
get_properties.metadata = {'url': '/acr/v1/{name}'} # type: ignore
async def delete_repository(
self,
name: str,
**kwargs
) -> None:
"""Delete the repository identified by ``name``.
:param name: Name of the image (including the namespace).
:type name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: None, or the result of cls(response)
:rtype: None
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
accept = "application/json"
# Construct URL
url = self.delete_repository.metadata['url'] # type: ignore
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True),
'name': self._serialize.url("name", name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.delete(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [202, 404]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.AcrErrors, response)
raise HttpResponseError(response=response, model=error)
if cls:
return cls(pipeline_response, None, {})
delete_repository.metadata = {'url': '/acr/v1/{name}'} # type: ignore
async def update_properties(
self,
name: str,
value: Optional["_models.RepositoryWriteableProperties"] = None,
**kwargs
) -> "_models.ContainerRepositoryProperties":
"""Update the attribute identified by ``name`` where ``reference`` is the name of the repository.
:param name: Name of the image (including the namespace).
:type name: str
:param value: Repository attribute value.
:type value: ~container_registry.models.RepositoryWriteableProperties
:keyword callable cls: A custom type or function that will be passed the direct response
:return: ContainerRepositoryProperties, or the result of cls(response)
:rtype: ~container_registry.models.ContainerRepositoryProperties
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.ContainerRepositoryProperties"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
content_type = kwargs.pop("content_type", "application/json")
accept = "application/json"
# Construct URL
url = self.update_properties.metadata['url'] # type: ignore
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True),
'name': self._serialize.url("name", name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
body_content_kwargs = {} # type: Dict[str, Any]
if value is not None:
body_content = self._serialize.body(value, 'RepositoryWriteableProperties')
else:
body_content = None
body_content_kwargs['content'] = body_content
request = self._client.patch(url, query_parameters, header_parameters, **body_content_kwargs)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.AcrErrors, response)
raise HttpResponseError(response=response, model=error)
deserialized = self._deserialize('ContainerRepositoryProperties', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
update_properties.metadata = {'url': '/acr/v1/{name}'} # type: ignore
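# Sketch of the optional-body branch in update_properties above: when `value`
# is None the PATCH is sent without content, otherwise the writeable
# properties are serialized. json.dumps stands in for the SDK serializer.

```python
import json

def build_patch_body(value):
    # Mirrors: `body_content = self._serialize.body(value, ...) if value else None`.
    return None if value is None else json.dumps(value)
```
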
def get_tags(
self,
name: str,
last: Optional[str] = None,
n: Optional[int] = None,
orderby: Optional[str] = None,
digest: Optional[str] = None,
**kwargs
) -> AsyncIterable["_models.TagList"]:
"""List tags of a repository.
:param name: Name of the image (including the namespace).
:type name: str
:param last: Query parameter for the last item in previous query. Result set will include
values lexically after last.
:type last: str
:param n: Query parameter for the maximum number of items.
:type n: int
:param orderby: Query parameter for ordering the results.
:type orderby: str
:param digest: Filter tags by the digest of the manifest they point to.
:type digest: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: An iterator like instance of either TagList or the result of cls(response)
:rtype: ~azure.core.async_paging.AsyncItemPaged[~container_registry.models.TagList]
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.TagList"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
accept = "application/json"
def prepare_request(next_link=None):
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
if not next_link:
# Construct URL
url = self.get_tags.metadata['url'] # type: ignore
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True),
'name': self._serialize.url("name", name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
if last is not None:
query_parameters['last'] = self._serialize.query("last", last, 'str')
if n is not None:
query_parameters['n'] = self._serialize.query("n", n, 'int')
if orderby is not None:
query_parameters['orderby'] = self._serialize.query("orderby", orderby, 'str')
if digest is not None:
query_parameters['digest'] = self._serialize.query("digest", digest, 'str')
request = self._client.get(url, query_parameters, header_parameters)
else:
url = next_link
query_parameters = {} # type: Dict[str, Any]
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True),
'name': self._serialize.url("name", name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
request = self._client.get(url, query_parameters, header_parameters)
return request
async def extract_data(pipeline_response):
deserialized = self._deserialize('TagList', pipeline_response)
list_of_elem = deserialized.tag_attribute_bases
if cls:
list_of_elem = cls(list_of_elem)
return deserialized.link or None, AsyncList(list_of_elem)
async def get_next(next_link=None):
request = prepare_request(next_link)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
error = self._deserialize.failsafe_deserialize(_models.AcrErrors, response)
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, model=error)
return pipeline_response
return AsyncItemPaged(
get_next, extract_data
)
get_tags.metadata = {'url': '/acr/v1/{name}/_tags'} # type: ignore
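# Sketch of the query construction in get_tags above: only parameters the
# caller actually supplied are serialized, so None values never reach the
# query string. build_tag_query is a hypothetical name.

```python
from urllib.parse import urlencode

def build_tag_query(last=None, n=None, orderby=None, digest=None):
    # Mirrors the chain of `if last is not None: ...` checks in prepare_request.
    params = {k: v for k, v in
              (("last", last), ("n", n), ("orderby", orderby), ("digest", digest))
              if v is not None}
    return urlencode(params)
```
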
async def get_tag_properties(
self,
name: str,
reference: str,
**kwargs
) -> "_models.ArtifactTagProperties":
"""Get tag attributes by tag.
:param name: Name of the image (including the namespace).
:type name: str
:param reference: Tag name.
:type reference: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: ArtifactTagProperties, or the result of cls(response)
:rtype: ~container_registry.models.ArtifactTagProperties
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.ArtifactTagProperties"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
accept = "application/json"
# Construct URL
url = self.get_tag_properties.metadata['url'] # type: ignore
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True),
'name': self._serialize.url("name", name, 'str'),
'reference': self._serialize.url("reference", reference, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.get(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.AcrErrors, response)
raise HttpResponseError(response=response, model=error)
deserialized = self._deserialize('ArtifactTagProperties', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
get_tag_properties.metadata = {'url': '/acr/v1/{name}/_tags/{reference}'} # type: ignore
async def update_tag_attributes(
self,
name: str,
reference: str,
value: Optional["_models.TagWriteableProperties"] = None,
**kwargs
) -> "_models.ArtifactTagProperties":
"""Update tag attributes.
:param name: Name of the image (including the namespace).
:type name: str
:param reference: Tag name.
:type reference: str
:param value: Tag attribute value.
:type value: ~container_registry.models.TagWriteableProperties
:keyword callable cls: A custom type or function that will be passed the direct response
:return: ArtifactTagProperties, or the result of cls(response)
:rtype: ~container_registry.models.ArtifactTagProperties
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.ArtifactTagProperties"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
content_type = kwargs.pop("content_type", "application/json")
accept = "application/json"
# Construct URL
url = self.update_tag_attributes.metadata['url'] # type: ignore
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True),
'name': self._serialize.url("name", name, 'str'),
'reference': self._serialize.url("reference", reference, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
body_content_kwargs = {} # type: Dict[str, Any]
if value is not None:
body_content = self._serialize.body(value, 'TagWriteableProperties')
else:
body_content = None
body_content_kwargs['content'] = body_content
request = self._client.patch(url, query_parameters, header_parameters, **body_content_kwargs)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.AcrErrors, response)
raise HttpResponseError(response=response, model=error)
deserialized = self._deserialize('ArtifactTagProperties', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
update_tag_attributes.metadata = {'url': '/acr/v1/{name}/_tags/{reference}'} # type: ignore
async def delete_tag(
self,
name: str,
reference: str,
**kwargs
) -> None:
"""Delete tag.
:param name: Name of the image (including the namespace).
:type name: str
:param reference: Tag name.
:type reference: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: None, or the result of cls(response)
:rtype: None
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
accept = "application/json"
# Construct URL
url = self.delete_tag.metadata['url'] # type: ignore
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True),
'name': self._serialize.url("name", name, 'str'),
'reference': self._serialize.url("reference", reference, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.delete(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [202, 404]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.AcrErrors, response)
raise HttpResponseError(response=response, model=error)
if cls:
return cls(pipeline_response, None, {})
delete_tag.metadata = {'url': '/acr/v1/{name}/_tags/{reference}'} # type: ignore
def get_manifests(
self,
name: str,
last: Optional[str] = None,
n: Optional[int] = None,
orderby: Optional[str] = None,
**kwargs
) -> AsyncIterable["_models.AcrManifests"]:
"""List manifests of a repository.
:param name: Name of the image (including the namespace).
:type name: str
:param last: Query parameter for the last item in previous query. Result set will include
values lexically after last.
:type last: str
:param n: Query parameter for the maximum number of items.
:type n: int
:param orderby: Query parameter for ordering the results.
:type orderby: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: An iterator like instance of either AcrManifests or the result of cls(response)
:rtype: ~azure.core.async_paging.AsyncItemPaged[~container_registry.models.AcrManifests]
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.AcrManifests"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
accept = "application/json"
def prepare_request(next_link=None):
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
if not next_link:
# Construct URL
url = self.get_manifests.metadata['url'] # type: ignore
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True),
'name': self._serialize.url("name", name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
if last is not None:
query_parameters['last'] = self._serialize.query("last", last, 'str')
if n is not None:
query_parameters['n'] = self._serialize.query("n", n, 'int')
if orderby is not None:
query_parameters['orderby'] = self._serialize.query("orderby", orderby, 'str')
request = self._client.get(url, query_parameters, header_parameters)
else:
url = next_link
query_parameters = {} # type: Dict[str, Any]
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True),
'name': self._serialize.url("name", name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
request = self._client.get(url, query_parameters, header_parameters)
return request
async def extract_data(pipeline_response):
deserialized = self._deserialize('AcrManifests', pipeline_response)
list_of_elem = deserialized.manifests
if cls:
list_of_elem = cls(list_of_elem)
return deserialized.link or None, AsyncList(list_of_elem)
async def get_next(next_link=None):
request = prepare_request(next_link)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
error = self._deserialize.failsafe_deserialize(_models.AcrErrors, response)
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, model=error)
return pipeline_response
return AsyncItemPaged(
get_next, extract_data
)
get_manifests.metadata = {'url': '/acr/v1/{name}/_manifests'} # type: ignore
async def get_manifest_properties(
self,
name: str,
digest: str,
**kwargs
) -> "_models.ArtifactManifestProperties":
"""Get manifest attributes.
:param name: Name of the image (including the namespace).
:type name: str
:param digest: Digest of the manifest.
:type digest: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: ArtifactManifestProperties, or the result of cls(response)
:rtype: ~container_registry.models.ArtifactManifestProperties
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.ArtifactManifestProperties"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
accept = "application/json"
# Construct URL
url = self.get_manifest_properties.metadata['url'] # type: ignore
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True),
'name': self._serialize.url("name", name, 'str'),
'digest': self._serialize.url("digest", digest, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.get(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.AcrErrors, response)
raise HttpResponseError(response=response, model=error)
deserialized = self._deserialize('ArtifactManifestProperties', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
get_manifest_properties.metadata = {'url': '/acr/v1/{name}/_manifests/{digest}'} # type: ignore
async def update_manifest_properties(
self,
name: str,
digest: str,
value: Optional["_models.ManifestWriteableProperties"] = None,
**kwargs
) -> "_models.ArtifactManifestProperties":
"""Update properties of a manifest.
:param name: Name of the image (including the namespace).
:type name: str
:param digest: Digest of the manifest.
:type digest: str
:param value: Manifest attribute value.
:type value: ~container_registry.models.ManifestWriteableProperties
:keyword callable cls: A custom type or function that will be passed the direct response
:return: ArtifactManifestProperties, or the result of cls(response)
:rtype: ~container_registry.models.ArtifactManifestProperties
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.ArtifactManifestProperties"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
content_type = kwargs.pop("content_type", "application/json")
accept = "application/json"
# Construct URL
url = self.update_manifest_properties.metadata['url'] # type: ignore
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True),
'name': self._serialize.url("name", name, 'str'),
'digest': self._serialize.url("digest", digest, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
body_content_kwargs = {} # type: Dict[str, Any]
if value is not None:
body_content = self._serialize.body(value, 'ManifestWriteableProperties')
else:
body_content = None
body_content_kwargs['content'] = body_content
request = self._client.patch(url, query_parameters, header_parameters, **body_content_kwargs)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.AcrErrors, response)
raise HttpResponseError(response=response, model=error)
deserialized = self._deserialize('ArtifactManifestProperties', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
update_manifest_properties.metadata = {'url': '/acr/v1/{name}/_manifests/{digest}'} # type: ignore
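# Sketch of the error_map pattern repeated throughout these operations: a
# default mapping of status codes to exception types, optionally extended by
# the caller's own error_map. The exception classes are stand-ins for the
# azure-core types, not the real ones.

```python
class ClientAuthErrorSketch(Exception): pass     # stand-in for ClientAuthenticationError
class NotFoundErrorSketch(Exception): pass       # stand-in for ResourceNotFoundError
class ExistsErrorSketch(Exception): pass         # stand-in for ResourceExistsError

def resolve_error(status_code, **kwargs):
    # Mirrors: error_map = {...}; error_map.update(kwargs.pop('error_map', {})).
    error_map = {401: ClientAuthErrorSketch,
                 404: NotFoundErrorSketch,
                 409: ExistsErrorSketch}
    error_map.update(kwargs.pop('error_map', {}))
    return error_map.get(status_code)
```
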
] | null | null | null | # coding: utf-8
"""
VPlex REST API
A definition for the next-gen VPlex API # noqa: E501
OpenAPI spec version: 0.1
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from vplexapi.api_client import ApiClient
class MetaVolumeApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def create_meta_volume(self, cluster_name, meta_volume_payload, **kwargs): # noqa: E501
"""Create a new MetaVolume # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.create_meta_volume(cluster_name, meta_volume_payload, async=True)
>>> result = thread.get()
:param async bool
:param str cluster_name: The name of the cluster (required)
:param MetaVolumePayload meta_volume_payload: (required)
:param str x_include_object: When passed as part of a POST request, controls whether the representation of the newly created object is included in the response. Defaults to 'true' which will include the object in the response. This header is useful because refreshing the newly created object is usually the slowest part of a POST operation.
:return: MetaVolume
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.create_meta_volume_with_http_info(cluster_name, meta_volume_payload, **kwargs) # noqa: E501
else:
(data) = self.create_meta_volume_with_http_info(cluster_name, meta_volume_payload, **kwargs) # noqa: E501
return data
def create_meta_volume_with_http_info(self, cluster_name, meta_volume_payload, **kwargs): # noqa: E501
"""Create a new MetaVolume # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.create_meta_volume_with_http_info(cluster_name, meta_volume_payload, async=True)
>>> result = thread.get()
:param async bool
:param str cluster_name: The name of the cluster (required)
:param MetaVolumePayload meta_volume_payload: (required)
:param str x_include_object: When passed as part of a POST request, controls whether the representation of the newly created object is included in the response. Defaults to 'true' which will include the object in the response. This header is useful because refreshing the newly created object is usually the slowest part of a POST operation.
:return: MetaVolume
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['cluster_name', 'meta_volume_payload', 'x_include_object'] # noqa: E501
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_meta_volume" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'cluster_name' is set
if ('cluster_name' not in params or
params['cluster_name'] is None):
raise ValueError("Missing the required parameter `cluster_name` when calling `create_meta_volume`") # noqa: E501
# verify the required parameter 'meta_volume_payload' is set
if ('meta_volume_payload' not in params or
params['meta_volume_payload'] is None):
raise ValueError("Missing the required parameter `meta_volume_payload` when calling `create_meta_volume`") # noqa: E501
collection_formats = {}
path_params = {}
if 'cluster_name' in params:
path_params['cluster_name'] = params['cluster_name'] # noqa: E501
query_params = []
header_params = {}
if 'x_include_object' in params:
header_params['X-Include-Object'] = params['x_include_object'] # noqa: E501
form_params = []
local_var_files = {}
body_params = None
if 'meta_volume_payload' in params:
body_params = params['meta_volume_payload']
# Authentication setting
auth_settings = ['basicAuth', 'jwtAuth'] # noqa: E501
return self.api_client.call_api(
'/clusters/{cluster_name}/meta_volumes', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='MetaVolume', # noqa: E501
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_meta_volume(self, cluster_name, name, **kwargs): # noqa: E501
"""Deletes a single MetaVolume # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.delete_meta_volume(cluster_name, name, async=True)
>>> result = thread.get()
:param async bool
:param str cluster_name: The name of the cluster (required)
:param str name: The name of a specific instance of the resource (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.delete_meta_volume_with_http_info(cluster_name, name, **kwargs) # noqa: E501
else:
(data) = self.delete_meta_volume_with_http_info(cluster_name, name, **kwargs) # noqa: E501
return data
def delete_meta_volume_with_http_info(self, cluster_name, name, **kwargs): # noqa: E501
"""Deletes a single MetaVolume # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.delete_meta_volume_with_http_info(cluster_name, name, async=True)
>>> result = thread.get()
:param async bool
:param str cluster_name: The name of the cluster (required)
:param str name: The name of a specific instance of the resource (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['cluster_name', 'name'] # noqa: E501
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_meta_volume" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'cluster_name' is set
if ('cluster_name' not in params or
params['cluster_name'] is None):
raise ValueError("Missing the required parameter `cluster_name` when calling `delete_meta_volume`") # noqa: E501
# verify the required parameter 'name' is set
if ('name' not in params or
params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `delete_meta_volume`") # noqa: E501
collection_formats = {}
path_params = {}
if 'cluster_name' in params:
path_params['cluster_name'] = params['cluster_name'] # noqa: E501
if 'name' in params:
path_params['name'] = params['name'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = ['basicAuth', 'jwtAuth'] # noqa: E501
return self.api_client.call_api(
'/clusters/{cluster_name}/meta_volumes/{name}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_meta_volume(self, cluster_name, name, **kwargs): # noqa: E501
"""Returns a single MetaVolume by name # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_meta_volume(cluster_name, name, async=True)
>>> result = thread.get()
:param async bool
:param str cluster_name: The name of the cluster (required)
:param str name: The name of a specific instance of the resource (required)
:param str fields: Select which fields are included in the response. 'name' is always included. See FieldSelectionExpression for details.
:return: MetaVolume
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.get_meta_volume_with_http_info(cluster_name, name, **kwargs) # noqa: E501
else:
(data) = self.get_meta_volume_with_http_info(cluster_name, name, **kwargs) # noqa: E501
return data
def get_meta_volume_with_http_info(self, cluster_name, name, **kwargs): # noqa: E501
"""Returns a single MetaVolume by name # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_meta_volume_with_http_info(cluster_name, name, async=True)
>>> result = thread.get()
:param async bool
:param str cluster_name: The name of the cluster (required)
:param str name: The name of a specific instance of the resource (required)
:param str fields: Select which fields are included in the response. 'name' is always included. See FieldSelectionExpression for details.
:return: MetaVolume
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['cluster_name', 'name', 'fields'] # noqa: E501
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_meta_volume" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'cluster_name' is set
if ('cluster_name' not in params or
params['cluster_name'] is None):
raise ValueError("Missing the required parameter `cluster_name` when calling `get_meta_volume`") # noqa: E501
# verify the required parameter 'name' is set
if ('name' not in params or
params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `get_meta_volume`") # noqa: E501
collection_formats = {}
path_params = {}
if 'cluster_name' in params:
path_params['cluster_name'] = params['cluster_name'] # noqa: E501
if 'name' in params:
path_params['name'] = params['name'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = ['basicAuth', 'jwtAuth'] # noqa: E501
return self.api_client.call_api(
'/clusters/{cluster_name}/meta_volumes/{name}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='MetaVolume', # noqa: E501
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_meta_volumes(self, cluster_name, **kwargs): # noqa: E501
"""Returns a list of MetaVolume objects. Supports paging # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_meta_volumes(cluster_name, async=True)
>>> result = thread.get()
:param async bool
:param str cluster_name: The name of the cluster (required)
:param str name: Filter results by name. See LexicalQueryExpression for details.
:param str health_state: Filter results by health_state. See LexicalQueryExpression for details.
:param str operational_status: Filter results by operational_status. See LexicalQueryExpression for details.
:param int offset: Index of the first element to include in paginated results.<br> <b>'limit' must also be specified.</b>
:param int limit: Maximum number of elements to include in paginated results.<br> <b>'offset' must also be specified.</b>
:param str sort_by: Specify the field priority order and direction for sorting. See SortingOrderExpression for details.
:param str fields: Select which fields are included in the response. 'name' is always included. See FieldSelectionExpression for details.
:return: list[MetaVolume]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.get_meta_volumes_with_http_info(cluster_name, **kwargs) # noqa: E501
else:
(data) = self.get_meta_volumes_with_http_info(cluster_name, **kwargs) # noqa: E501
return data
def get_meta_volumes_with_http_info(self, cluster_name, **kwargs): # noqa: E501
"""Returns a list of MetaVolume objects. Supports paging # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_meta_volumes_with_http_info(cluster_name, async=True)
>>> result = thread.get()
:param async bool
:param str cluster_name: The name of the cluster (required)
:param str name: Filter results by name. See LexicalQueryExpression for details.
:param str health_state: Filter results by health_state. See LexicalQueryExpression for details.
:param str operational_status: Filter results by operational_status. See LexicalQueryExpression for details.
:param int offset: Index of the first element to include in paginated results.<br> <b>'limit' must also be specified.</b>
:param int limit: Maximum number of elements to include in paginated results.<br> <b>'offset' must also be specified.</b>
:param str sort_by: Specify the field priority order and direction for sorting. See SortingOrderExpression for details.
:param str fields: Select which fields are included in the response. 'name' is always included. See FieldSelectionExpression for details.
:return: list[MetaVolume]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['cluster_name', 'name', 'health_state', 'operational_status', 'offset', 'limit', 'sort_by', 'fields'] # noqa: E501
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_meta_volumes" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'cluster_name' is set
if ('cluster_name' not in params or
params['cluster_name'] is None):
raise ValueError("Missing the required parameter `cluster_name` when calling `get_meta_volumes`") # noqa: E501
if 'offset' in params and params['offset'] < 0: # noqa: E501
raise ValueError("Invalid value for parameter `offset` when calling `get_meta_volumes`, must be a value greater than or equal to `0`") # noqa: E501
if 'limit' in params and params['limit'] > 100: # noqa: E501
raise ValueError("Invalid value for parameter `limit` when calling `get_meta_volumes`, must be a value less than or equal to `100`") # noqa: E501
if 'limit' in params and params['limit'] < 1: # noqa: E501
raise ValueError("Invalid value for parameter `limit` when calling `get_meta_volumes`, must be a value greater than or equal to `1`") # noqa: E501
collection_formats = {}
path_params = {}
if 'cluster_name' in params:
path_params['cluster_name'] = params['cluster_name'] # noqa: E501
query_params = []
if 'name' in params:
query_params.append(('name', params['name'])) # noqa: E501
if 'health_state' in params:
query_params.append(('health_state', params['health_state'])) # noqa: E501
if 'operational_status' in params:
query_params.append(('operational_status', params['operational_status'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'limit' in params:
query_params.append(('limit', params['limit'])) # noqa: E501
if 'sort_by' in params:
query_params.append(('sort_by', params['sort_by'])) # noqa: E501
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = ['basicAuth', 'jwtAuth'] # noqa: E501
return self.api_client.call_api(
'/clusters/{cluster_name}/meta_volumes', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[MetaVolume]', # noqa: E501
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_meta_volume(self, cluster_name, name, meta_volume_patch_payload, **kwargs): # noqa: E501
"""Update attributes on a MetaVolume # noqa: E501
Settable attributes: 'active'. NOTE: only the value true is allowed  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.patch_meta_volume(cluster_name, name, meta_volume_patch_payload, async=True)
>>> result = thread.get()
:param async bool
:param str cluster_name: The name of the cluster (required)
:param str name: The name of a specific instance of the resource (required)
:param list[JsonPatchOp] meta_volume_patch_payload: (required)
:return: MetaVolume
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.patch_meta_volume_with_http_info(cluster_name, name, meta_volume_patch_payload, **kwargs) # noqa: E501
else:
(data) = self.patch_meta_volume_with_http_info(cluster_name, name, meta_volume_patch_payload, **kwargs) # noqa: E501
return data
def patch_meta_volume_with_http_info(self, cluster_name, name, meta_volume_patch_payload, **kwargs): # noqa: E501
"""Update attributes on a MetaVolume # noqa: E501
Settable attributes: 'active'. NOTE: only the value true is allowed  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.patch_meta_volume_with_http_info(cluster_name, name, meta_volume_patch_payload, async=True)
>>> result = thread.get()
:param async bool
:param str cluster_name: The name of the cluster (required)
:param str name: The name of a specific instance of the resource (required)
:param list[JsonPatchOp] meta_volume_patch_payload: (required)
:return: MetaVolume
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['cluster_name', 'name', 'meta_volume_patch_payload'] # noqa: E501
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_meta_volume" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'cluster_name' is set
if ('cluster_name' not in params or
params['cluster_name'] is None):
raise ValueError("Missing the required parameter `cluster_name` when calling `patch_meta_volume`") # noqa: E501
# verify the required parameter 'name' is set
if ('name' not in params or
params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `patch_meta_volume`") # noqa: E501
# verify the required parameter 'meta_volume_patch_payload' is set
if ('meta_volume_patch_payload' not in params or
params['meta_volume_patch_payload'] is None):
raise ValueError("Missing the required parameter `meta_volume_patch_payload` when calling `patch_meta_volume`") # noqa: E501
collection_formats = {}
path_params = {}
if 'cluster_name' in params:
path_params['cluster_name'] = params['cluster_name'] # noqa: E501
if 'name' in params:
path_params['name'] = params['name'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'meta_volume_patch_payload' in params:
body_params = params['meta_volume_patch_payload']
# Authentication setting
auth_settings = ['basicAuth', 'jwtAuth'] # noqa: E501
return self.api_client.call_api(
'/clusters/{cluster_name}/meta_volumes/{name}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='MetaVolume', # noqa: E501
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
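Every generated method pair above dispatches on an `async` kwarg: the public method either returns the data synchronously or hands back a thread handle whose `.get()` yields the result. A standalone sketch of that pattern using `multiprocessing.pool.ThreadPool` (which is what swagger-codegen's `ApiClient` uses internally). `fetch_meta_volume` is a placeholder for the real `call_api()` round trip, and the kwarg is spelled `async_req` here because `async` became a reserved word in Python 3.7, as later codegen releases also renamed it:

```python
from multiprocessing.pool import ThreadPool

pool = ThreadPool(processes=1)

def fetch_meta_volume(cluster_name, name):
    # Placeholder for the real call_api() HTTP round trip.
    return {"cluster": cluster_name, "name": name}

def get_meta_volume(cluster_name, name, **kwargs):
    """Synchronous by default; pass async_req=True for a thread handle."""
    if kwargs.get("async_req"):
        # apply_async returns an AsyncResult; .get() blocks for the value,
        # mirroring `thread = api.get_meta_volume(..., async=True)` above.
        return pool.apply_async(fetch_meta_volume, (cluster_name, name))
    return fetch_meta_volume(cluster_name, name)

result = get_meta_volume("cluster-1", "meta_vol_1")
thread = get_meta_volume("cluster-1", "meta_vol_1", async_req=True)
print(result == thread.get())  # True
```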
from django.core.urlresolvers import reverse
from rest_framework import status
from rest_framework.settings import api_settings
from rest_framework.test import APIRequestFactory
from rest_framework_friendly_errors.settings import (
FRIENDLY_FIELD_ERRORS, FRIENDLY_NON_FIELD_ERRORS,
FRIENDLY_VALIDATOR_ERRORS
)
from . import BaseTestCase
from .models import Snippet
from .views import Snippet2List, SnippetDetail, SnippetList
class ListViewTestCase(BaseTestCase):
def setUp(self):
super(ListViewTestCase, self).setUp()
self.factory = APIRequestFactory()
def test_empty_list_view(self):
request = self.factory.get(reverse('api:snippet-list'))
response = SnippetList.as_view()(request)
self.assertEqual(response.data, [])
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_create_a_valid_snippet(self):
request = self.factory.post(reverse('api:snippet-list'),
data=self.data_set)
response = SnippetList.as_view()(request)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
def test_invalid_boolean(self):
self.data_set['linenos'] = 'A text instead of a bool'
request = self.factory.post(reverse('api:snippet-list'),
data=self.data_set)
response = SnippetList.as_view()(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
code = FRIENDLY_FIELD_ERRORS['BooleanField']['invalid']
errors = response.data.get('errors')
self.assertIsNotNone(errors)
self.assertIsNotNone(errors.get('linenos'))
self.assertEqual(type(errors['linenos']), list)
self.assertEqual(errors['linenos'][0]['code'], code)
def test_invalid_char_field(self):
# Too long string
self.data_set['title'] = 'Too Long Title For Defined Serializer'
request = self.factory.post(reverse('api:snippet-list'),
data=self.data_set)
response = SnippetList.as_view()(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
code = FRIENDLY_FIELD_ERRORS['CharField']['max_length']
errors = response.data.get('errors')
self.assertIsNotNone(errors)
self.assertIsNotNone(errors.get('title'))
self.assertEqual(type(errors['title']), list)
self.assertEqual(errors['title'][0]['code'], code)
# Empty string
self.data_set['title'] = ''
request = self.factory.post(reverse('api:snippet-list'),
data=self.data_set)
response = SnippetList.as_view()(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
code = FRIENDLY_FIELD_ERRORS['CharField']['blank']
errors = response.data.get('errors')
self.assertIsNotNone(errors)
self.assertIsNotNone(errors.get('title'))
self.assertEqual(type(errors['title']), list)
self.assertEqual(errors['title'][0]['code'], code)
# No data provided
self.data_set.pop('title')
request = self.factory.post(reverse('api:snippet-list'),
data=self.data_set)
response = SnippetList.as_view()(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
code = FRIENDLY_FIELD_ERRORS['CharField']['required']
errors = response.data.get('errors')
self.assertIsNotNone(errors)
self.assertIsNotNone(errors.get('title'))
self.assertEqual(type(errors['title']), list)
self.assertEqual(errors['title'][0]['code'], code)
def test_invalid_choice_field(self):
# invalid choice
self.data_set['language'] = 'brainfuck'
request = self.factory.post(reverse('api:snippet-list'),
data=self.data_set)
response = SnippetList.as_view()(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
code = FRIENDLY_FIELD_ERRORS['ChoiceField']['invalid_choice']
errors = response.data.get('errors')
self.assertIsNotNone(errors)
self.assertIsNotNone(errors.get('language'))
self.assertEqual(type(errors['language']), list)
self.assertEqual(errors['language'][0]['code'], code)
# empty string
self.data_set['language'] = ''
request = self.factory.post(reverse('api:snippet-list'),
data=self.data_set)
response = SnippetList.as_view()(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
code = FRIENDLY_FIELD_ERRORS['ChoiceField']['invalid_choice']
errors = response.data.get('errors')
self.assertIsNotNone(errors)
self.assertIsNotNone(errors.get('language'))
self.assertEqual(type(errors['language']), list)
self.assertEqual(errors['language'][0]['code'], code)
# no data provided
self.data_set.pop('language')
request = self.factory.post(reverse('api:snippet-list'),
data=self.data_set)
response = SnippetList.as_view()(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
code = FRIENDLY_FIELD_ERRORS['ChoiceField']['required']
errors = response.data.get('errors')
self.assertIsNotNone(errors)
self.assertIsNotNone(errors.get('language'))
self.assertEqual(type(errors['language']), list)
self.assertEqual(errors['language'][0]['code'], code)
def test_invalid_decimal_field(self):
# invalid
self.data_set['rating'] = 'text instead of float'
request = self.factory.post(reverse('api:snippet-list'),
data=self.data_set)
response = SnippetList.as_view()(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
code = FRIENDLY_FIELD_ERRORS['DecimalField']['invalid']
errors = response.data.get('errors')
self.assertIsNotNone(errors)
self.assertIsNotNone(errors.get('rating'))
self.assertEqual(type(errors['rating']), list)
self.assertEqual(errors['rating'][0]['code'], code)
# decimal places
self.data_set['rating'] = 2.99
request = self.factory.post(reverse('api:snippet-list'),
data=self.data_set)
response = SnippetList.as_view()(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
code = FRIENDLY_FIELD_ERRORS['DecimalField']['max_decimal_places']
errors = response.data.get('errors')
self.assertIsNotNone(errors)
self.assertIsNotNone(errors.get('rating'))
self.assertEqual(type(errors['rating']), list)
self.assertEqual(errors['rating'][0]['code'], code)
# decimal max digits
self.data_set['rating'] = 222.9
request = self.factory.post(reverse('api:snippet-list'),
data=self.data_set)
response = SnippetList.as_view()(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
code = FRIENDLY_FIELD_ERRORS['DecimalField']['max_digits']
errors = response.data.get('errors')
self.assertIsNotNone(errors)
self.assertIsNotNone(errors.get('rating'))
self.assertEqual(type(errors['rating']), list)
self.assertEqual(errors['rating'][0]['code'], code)
def test_datetime_field_error_content(self):
# invalid
self.data_set['posted_date'] = 'text instead of date'
request = self.factory.post(reverse('api:snippet-list'),
data=self.data_set)
response = SnippetList.as_view()(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
code = FRIENDLY_FIELD_ERRORS['DateTimeField']['invalid']
errors = response.data.get('errors')
self.assertIsNotNone(errors)
self.assertIsNotNone(errors.get('posted_date'))
self.assertEqual(type(errors['posted_date']), list)
self.assertEqual(errors['posted_date'][0]['code'], code)
def test_custom_field_validation_method(self):
self.data_set['comment'] = 'comment'
request = self.factory.post(reverse('api:snippet-list'),
data=self.data_set)
response = SnippetList.as_view()(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
errors = response.data.get('errors')
self.assertIsNotNone(errors)
self.assertIsNotNone(errors.get('comment'))
self.assertEqual(type(errors['comment']), list)
self.assertEqual(errors['comment'][0]['code'], 'validate_comment')
def test_custom_field_validation_using_validators(self):
self.data_set['title'] = 'A title'
request = self.factory.post(reverse('api:snippet-list'),
data=self.data_set)
response = SnippetList.as_view()(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
errors = response.data.get('errors')
self.assertIsNotNone(errors)
self.assertIsNotNone(errors.get('title'))
self.assertEqual(type(errors['title']), list)
self.assertEqual(errors['title'][0]['code'], 'incorrect_title')
def test_field_dependency_validation(self):
self.data_set['title'] = 'A Python'
self.data_set['language'] = 'c++'
request = self.factory.post(reverse('api:snippet-list'),
data=self.data_set)
response = SnippetList.as_view()(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
code = FRIENDLY_NON_FIELD_ERRORS['invalid']
errors = response.data.get('errors')
self.assertIsNotNone(errors)
self.assertIsNotNone(errors.get(api_settings.NON_FIELD_ERRORS_KEY))
self.assertEqual(type(errors[api_settings.NON_FIELD_ERRORS_KEY]), list)
c = errors[api_settings.NON_FIELD_ERRORS_KEY][0]['code']
self.assertEqual(c, code)
def test_error_registration(self):
self.data_set['title'] = 'A Python'
self.data_set['language'] = 'c++'
request = self.factory.post(reverse('api:snippet2-list'),
data=self.data_set)
response = Snippet2List.as_view()(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
code = FRIENDLY_FIELD_ERRORS['ChoiceField']['invalid_choice']
errors = response.data.get('errors')
self.assertIsNotNone(errors)
self.assertIsNotNone(errors.get('language'))
self.assertEqual(type(errors['language']), list)
self.assertEqual(errors['language'][0]['code'], code)
def test_couple_errors(self):
self.data_set['comment'] = 'comment'
self.data_set['rating'] = 'Not a number at all'
request = self.factory.post(reverse('api:snippet-list'),
data=self.data_set)
response = SnippetList.as_view()(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
errors = response.data.get('errors')
self.assertIsNotNone(errors)
self.assertEqual(len(errors), 2)
def test_unique_constraint(self):
request = self.factory.post(reverse('api:snippet-list'),
data=self.data_set)
SnippetList.as_view()(request)
request = self.factory.post(reverse('api:snippet-list'),
data=self.data_set)
response = SnippetList.as_view()(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
code = FRIENDLY_VALIDATOR_ERRORS['UniqueValidator']
errors = response.data.get('errors')
self.assertIsNotNone(errors)
self.assertIsNotNone(errors.get('watermark'))
self.assertEqual(type(errors['watermark']), list)
self.assertEqual(errors['watermark'][0]['code'], code)
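The assertions above repeatedly probe the same response shape: an `errors` dict keyed by field name, each value a list of error dicts carrying a `code`. A small sketch of a helper that checks that shape (a hypothetical convenience, not part of drf-friendly-errors; the sample `code` values are made up):

```python
def assert_field_error(payload, field, expected_code):
    """Check the drf-friendly-errors response shape exercised by the
    tests above: {'errors': {field: [{'code': ..., 'message': ...}]}}."""
    errors = payload.get("errors")
    assert errors is not None, "no 'errors' key in payload"
    field_errors = errors.get(field)
    assert isinstance(field_errors, list) and field_errors, (
        "no error list for field %r" % field)
    assert field_errors[0]["code"] == expected_code

# Hypothetical payload mirroring what the tests read off response.data.
payload = {"errors": {"linenos": [{"code": 2011, "message": "Invalid value."}]}}
assert_field_error(payload, "linenos", 2011)
```

Folding the four repeated `assertIsNotNone`/`assertEqual` lines into one helper like this would shrink each test case above to two or three lines.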
class DetailViewTestCase(BaseTestCase):
def setUp(self):
super(DetailViewTestCase, self).setUp()
self.factory = APIRequestFactory()
self.snippet = Snippet.objects.create(**self.data_set)
def test_retrieve_object(self):
request = self.factory.get(reverse('api:snippet-detail',
kwargs={'pk': self.snippet.pk}))
response = SnippetDetail.as_view()(request, pk=self.snippet.pk)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_update_snippet(self):
self.data_set['code'] = 'def foo(bar):\n\treturn bar'
request = self.factory.put(reverse('api:snippet-detail',
kwargs={'pk': self.snippet.pk}),
data=self.data_set)
response = SnippetDetail.as_view()(request, pk=self.snippet.pk)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.data['code'], 'def foo(bar):\n\treturn bar')
    def test_update_invalid_boolean(self):
        self.data_set['linenos'] = 'A text instead of a bool'
        request = self.factory.put(reverse('api:snippet-detail',
                                           kwargs={'pk': self.snippet.pk}),
                                   data=self.data_set)
        response = SnippetDetail.as_view()(request, pk=self.snippet.pk)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        code = FRIENDLY_FIELD_ERRORS['BooleanField']['invalid']
        errors = response.data.get('errors')
        self.assertIsNotNone(errors)
        self.assertIsNotNone(errors.get('linenos'))
        self.assertEqual(type(errors['linenos']), list)
        self.assertEqual(errors['linenos'][0]['code'], code)
    def test_upload_invalid_char_field(self):
        # Too long string
        self.data_set['title'] = 'Too Long Title For Defined Serializer'
        request = self.factory.put(reverse('api:snippet-detail',
                                           kwargs={'pk': self.snippet.pk}),
                                   data=self.data_set)
        response = SnippetDetail.as_view()(request, pk=self.snippet.pk)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        code = FRIENDLY_FIELD_ERRORS['CharField']['max_length']
        errors = response.data.get('errors')
        self.assertIsNotNone(errors)
        self.assertIsNotNone(errors.get('title'))
        self.assertEqual(type(errors['title']), list)
        self.assertEqual(errors['title'][0]['code'], code)
        # Empty string
        self.data_set['title'] = ''
        request = self.factory.put(reverse('api:snippet-detail',
                                           kwargs={'pk': self.snippet.pk}),
                                   data=self.data_set)
        response = SnippetDetail.as_view()(request, pk=self.snippet.pk)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        code = FRIENDLY_FIELD_ERRORS['CharField']['blank']
        errors = response.data.get('errors')
        self.assertIsNotNone(errors)
        self.assertIsNotNone(errors.get('title'))
        self.assertEqual(type(errors['title']), list)
        self.assertEqual(errors['title'][0]['code'], code)
        # No data provided
        self.data_set.pop('title')
        request = self.factory.put(reverse('api:snippet-detail',
                                           kwargs={'pk': self.snippet.pk}),
                                   data=self.data_set)
        response = SnippetDetail.as_view()(request, pk=self.snippet.pk)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        code = FRIENDLY_FIELD_ERRORS['CharField']['required']
        errors = response.data.get('errors')
        self.assertIsNotNone(errors)
        self.assertIsNotNone(errors.get('title'))
        self.assertEqual(type(errors['title']), list)
        self.assertEqual(errors['title'][0]['code'], code)

    def test_upload_invalid_choice_field(self):
        # invalid choice
        self.data_set['language'] = 'brainfuck'
        request = self.factory.put(reverse('api:snippet-detail',
                                           kwargs={'pk': self.snippet.pk}),
                                   data=self.data_set)
        response = SnippetDetail.as_view()(request, pk=self.snippet.pk)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        code = FRIENDLY_FIELD_ERRORS['ChoiceField']['invalid_choice']
        errors = response.data.get('errors')
        self.assertIsNotNone(errors)
        self.assertIsNotNone(errors.get('language'))
        self.assertEqual(type(errors['language']), list)
        self.assertEqual(errors['language'][0]['code'], code)
        # empty string
        self.data_set['language'] = ''
        request = self.factory.put(reverse('api:snippet-detail',
                                           kwargs={'pk': self.snippet.pk}),
                                   data=self.data_set)
        response = SnippetDetail.as_view()(request, pk=self.snippet.pk)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        code = FRIENDLY_FIELD_ERRORS['ChoiceField']['invalid_choice']
        errors = response.data.get('errors')
        self.assertIsNotNone(errors)
        self.assertIsNotNone(errors.get('language'))
        self.assertEqual(type(errors['language']), list)
        self.assertEqual(errors['language'][0]['code'], code)
        # no data provided
        self.data_set.pop('language')
        request = self.factory.put(reverse('api:snippet-detail',
                                           kwargs={'pk': self.snippet.pk}),
                                   data=self.data_set)
        response = SnippetDetail.as_view()(request, pk=self.snippet.pk)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        code = FRIENDLY_FIELD_ERRORS['ChoiceField']['required']
        errors = response.data.get('errors')
        self.assertIsNotNone(errors)
        self.assertIsNotNone(errors.get('language'))
        self.assertEqual(type(errors['language']), list)
        self.assertEqual(errors['language'][0]['code'], code)

    def test_upload_invalid_decimal_field(self):
        # invalid
        self.data_set['rating'] = 'text instead of float'
        request = self.factory.put(reverse('api:snippet-detail',
                                           kwargs={'pk': self.snippet.pk}),
                                   data=self.data_set)
        response = SnippetDetail.as_view()(request, pk=self.snippet.pk)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        code = FRIENDLY_FIELD_ERRORS['DecimalField']['invalid']
        errors = response.data.get('errors')
        self.assertIsNotNone(errors)
        self.assertIsNotNone(errors.get('rating'))
        self.assertEqual(type(errors['rating']), list)
        self.assertEqual(errors['rating'][0]['code'], code)
        # decimal places
        self.data_set['rating'] = 2.99
        request = self.factory.put(reverse('api:snippet-detail',
                                           kwargs={'pk': self.snippet.pk}),
                                   data=self.data_set)
        response = SnippetDetail.as_view()(request, pk=self.snippet.pk)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        code = FRIENDLY_FIELD_ERRORS['DecimalField']['max_decimal_places']
        errors = response.data.get('errors')
        self.assertIsNotNone(errors)
        self.assertIsNotNone(errors.get('rating'))
        self.assertEqual(type(errors['rating']), list)
        self.assertEqual(errors['rating'][0]['code'], code)
        # decimal max digits
        self.data_set['rating'] = 222.9
        request = self.factory.put(reverse('api:snippet-detail',
                                           kwargs={'pk': self.snippet.pk}),
                                   data=self.data_set)
        response = SnippetDetail.as_view()(request, pk=self.snippet.pk)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        code = FRIENDLY_FIELD_ERRORS['DecimalField']['max_digits']
        errors = response.data.get('errors')
        self.assertIsNotNone(errors)
        self.assertIsNotNone(errors.get('rating'))
        self.assertEqual(type(errors['rating']), list)
        self.assertEqual(errors['rating'][0]['code'], code)

    def test_datetime_field_error_content(self):
        # invalid
        self.data_set['posted_date'] = 'text instead of date'
        request = self.factory.put(reverse('api:snippet-detail',
                                           kwargs={'pk': self.snippet.pk}),
                                   data=self.data_set)
        response = SnippetDetail.as_view()(request, pk=self.snippet.pk)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        code = FRIENDLY_FIELD_ERRORS['DateTimeField']['invalid']
        errors = response.data.get('errors')
        self.assertIsNotNone(errors)
        self.assertIsNotNone(errors.get('posted_date'))
        self.assertEqual(type(errors['posted_date']), list)
        self.assertEqual(errors['posted_date'][0]['code'], code)

    def test_cannot_update_to_not_unique_watermark(self):
        self.data_set['watermark'] = 'TEST2'
        Snippet.objects.create(**self.data_set)
        request = self.factory.put(reverse('api:snippet-detail',
                                           kwargs={'pk': self.snippet.pk}),
                                   data=self.data_set)
        response = SnippetDetail.as_view()(request, pk=self.snippet.pk)
        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        code = FRIENDLY_VALIDATOR_ERRORS['UniqueValidator']
        errors = response.data.get('errors')
        self.assertIsNotNone(errors)
        self.assertIsNotNone(errors.get('watermark'))
        self.assertEqual(type(errors['watermark']), list)
        self.assertEqual(errors['watermark'][0]['code'], code)
| 49.259341 | 79 | 0.625128 | 2,444 | 22,413 | 5.569967 | 0.058101 | 0.101374 | 0.053331 | 0.129802 | 0.92397 | 0.910674 | 0.896276 | 0.886432 | 0.87747 | 0.874238 | 0 | 0.008639 | 0.245929 | 22,413 | 454 | 80 | 49.367841 | 0.796817 | 0.012537 | 0 | 0.849873 | 0 | 0 | 0.115915 | 0 | 0 | 0 | 0 | 0 | 0.379135 | 1 | 0.058524 | false | 0 | 0.020356 | 0 | 0.083969 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8610a73be1e3c277f622ad5cc35bd3b64b88bff6 | 135 | py | Python | src/trainer/losses.py | epou/msc_workbench | 4b7729270284ede388f9edc769881f97cae51d0d | [
"MIT"
] | null | null | null | src/trainer/losses.py | epou/msc_workbench | 4b7729270284ede388f9edc769881f97cae51d0d | [
"MIT"
] | null | null | null | src/trainer/losses.py | epou/msc_workbench | 4b7729270284ede388f9edc769881f97cae51d0d | [
"MIT"
] | null | null | null | import tensorflow as tf
def ssim(y_true, y_pred, max_val=1.0):
    return 1 - tf.reduce_mean(tf.image.ssim(y_true, y_pred, max_val))
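The loss above is the common DSSIM formulation: one minus the mean SSIM between prediction and target. As a framework-free sketch of the underlying quantity — computed over a single global window rather than `tf.image.ssim`'s sliding window, so the numbers will differ from TensorFlow's; constants use the conventional K1=0.01, K2=0.03, which is an assumption here:

```python
import numpy as np


def global_ssim(x, y, max_val=1.0, k1=0.01, k2=0.03):
    """SSIM over one global window (tf.image.ssim slides a local window instead)."""
    c1 = (k1 * max_val) ** 2
    c2 = (k2 * max_val) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / (
        (mx ** 2 + my ** 2 + c1) * (vx + vy + c2))


def dssim_loss(y_true, y_pred, max_val=1.0):
    """1 - SSIM: zero for identical images, larger as structure diverges."""
    return 1.0 - global_ssim(y_true, y_pred, max_val)
```

Identical inputs give SSIM of exactly 1, hence a loss of 0, which is the property that makes the quantity usable as a training objective.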
| 22.5 | 69 | 0.725926 | 28 | 135 | 3.25 | 0.607143 | 0.10989 | 0.197802 | 0.21978 | 0.43956 | 0.43956 | 0.43956 | 0 | 0 | 0 | 0 | 0.026087 | 0.148148 | 135 | 5 | 70 | 27 | 0.765217 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 8 |
86383714c1da2af9a333ff0f4a31cd8718cd3af7 | 102 | py | Python | larq/version_test.py | sfalkena/larq | 5a66108c695d5df2dd7e06f8b25c73f26a1df35e | [
"Apache-2.0"
] | 496 | 2019-07-12T14:31:39.000Z | 2022-03-30T22:10:56.000Z | larq/version_test.py | sfalkena/larq | 5a66108c695d5df2dd7e06f8b25c73f26a1df35e | [
"Apache-2.0"
] | 201 | 2019-07-12T22:29:51.000Z | 2022-02-10T01:49:02.000Z | larq/version_test.py | sfalkena/larq | 5a66108c695d5df2dd7e06f8b25c73f26a1df35e | [
"Apache-2.0"
] | 69 | 2019-07-29T03:07:26.000Z | 2022-03-31T21:21:08.000Z | import larq
def test_version():
    assert hasattr(larq, "__version__") and "." in larq.__version__
| 17 | 67 | 0.715686 | 13 | 102 | 4.923077 | 0.692308 | 0.34375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 102 | 5 | 68 | 20.4 | 0.752941 | 0 | 0 | 0 | 0 | 0 | 0.117647 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
864a949bff043735dfa536558b56a1c782b67ef3 | 3,399 | py | Python | app/web/pages/forms.py | rashidlasker/rebu | 190793af565d9c2dba386210a3fd2ac501fc8dd2 | [
"MIT"
] | null | null | null | app/web/pages/forms.py | rashidlasker/rebu | 190793af565d9c2dba386210a3fd2ac501fc8dd2 | [
"MIT"
] | null | null | null | app/web/pages/forms.py | rashidlasker/rebu | 190793af565d9c2dba386210a3fd2ac501fc8dd2 | [
"MIT"
] | 1 | 2019-07-26T19:49:20.000Z | 2019-07-26T19:49:20.000Z | from django import forms
class LoginForm(forms.Form):
    username = forms.CharField(widget=forms.TextInput(attrs={'class':'form-control', 'placeholder':'Username'}))
    password = forms.CharField(min_length=8, widget=forms.PasswordInput(attrs={'class':'form-control', 'placeholder':'Password'}))


class RegisterForm(forms.Form):
    username = forms.CharField(widget=forms.TextInput(attrs={'class':'form-control', 'placeholder':'Username'}))
    password = forms.CharField(min_length=8, widget=forms.PasswordInput(attrs={'class':'form-control', 'placeholder':'Password'}))
    confirm_password = forms.CharField(min_length=8, widget=forms.PasswordInput(attrs={'class':'form-control', 'placeholder':'Confirm Password'}))
    first_name = forms.CharField(widget=forms.TextInput(attrs={'class':'form-control', 'placeholder':'First Name'}))
    last_name = forms.CharField(widget=forms.TextInput(attrs={'class':'form-control', 'placeholder':'Last Name'}))
    street = forms.CharField(widget=forms.TextInput(attrs={'class':'form-control', 'placeholder':'Street'}))
    zip_code = forms.CharField(widget=forms.TextInput(attrs={'class':'form-control', 'placeholder':'Zip Code'}))
    state = forms.CharField(widget=forms.TextInput(attrs={'class':'form-control', 'placeholder':'State'}))
    country = forms.CharField(widget=forms.TextInput(attrs={'class':'form-control', 'placeholder':'Country'}))
    bio = forms.CharField(widget=forms.TextInput(attrs={'class':'form-control', 'placeholder':'Bio'}))
    links = forms.CharField(widget=forms.TextInput(attrs={'class':'form-control', 'placeholder':'Links'}))
    language = forms.CharField(widget=forms.TextInput(attrs={'class':'form-control', 'placeholder':'Language'}))
    gender = forms.CharField(widget=forms.TextInput(attrs={'class':'form-control', 'placeholder':'Gender'}))

    def clean(self):
        cleaned_data = super(RegisterForm, self).clean()
        password = cleaned_data.get("password")
        confirm_password = cleaned_data.get("confirm_password")
        if password != confirm_password:
            raise forms.ValidationError(
                "password and confirm_password do not match"
            )


class MealForm(forms.Form):
    name = forms.CharField(max_length=30, widget=forms.TextInput(attrs={'class':'form-control', 'placeholder':'Name'}))
    calories = forms.IntegerField(widget=forms.NumberInput(attrs={'class':'form-control', 'placeholder':'Calories'}))
    description = forms.CharField(widget=forms.TextInput(attrs={'class':'form-control', 'placeholder':'Description'}))
    spice = forms.IntegerField(widget=forms.NumberInput(attrs={'class':'form-control', 'placeholder':'Spice'}))
    price = forms.FloatField(widget=forms.NumberInput(attrs={'class':'form-control', 'placeholder':'Price'}))
    tags = forms.CharField(max_length=150, widget=forms.TextInput(attrs={'class':'form-control', 'placeholder':'Tags'}))
    takeout_available = forms.BooleanField(initial=True, widget=forms.CheckboxInput(attrs={'class':'custom-control-input', 'id':'takeoutCheck'}))
    num_plates = forms.IntegerField(widget=forms.NumberInput(attrs={'class':'form-control', 'placeholder':'Num Plates'}))
    start = forms.DateTimeField(widget=forms.DateTimeInput(attrs={'class':'form-control', 'placeholder':'Start'}))
    end = forms.DateTimeField(widget=forms.DateTimeInput(attrs={'class':'form-control', 'placeholder':'End'}))
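`RegisterForm.clean()` above uses Django's hook for cross-field validation: per-field cleaning runs first, and `clean()` then sees the whole `cleaned_data` dict, which is why both password values are available for comparison. A minimal framework-free sketch of the same idea (the function name and the `ValidationError` stand-in are illustrative, not Django API):

```python
class ValidationError(Exception):
    """Stand-in for forms.ValidationError in this sketch."""


def clean_registration(data):
    """Cross-field check run after per-field cleaning: passwords must match."""
    if data.get("password") != data.get("confirm_password"):
        raise ValidationError("password and confirm_password do not match")
    return data
```

The design point mirrors the form above: single-field rules (like `min_length=8`) live on the field, while any rule involving two or more fields belongs in `clean()`.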
864ccab5d3ef25f549424ac473190fe76a3b841f | 7,729 | py | Python | tests/integration/Test_Pronouns.py | Drewlark/whitakers_words | c0bf18d06215eb1e585413e5d426c9426b30c85a | [
"MIT"
] | null | null | null | tests/integration/Test_Pronouns.py | Drewlark/whitakers_words | c0bf18d06215eb1e585413e5d426c9426b30c85a | [
"MIT"
] | null | null | null | tests/integration/Test_Pronouns.py | Drewlark/whitakers_words | c0bf18d06215eb1e585413e5d426c9426b30c85a | [
"MIT"
] | null | null | null | import unittest
from whitakers_words.enums import Case, Gender, Number, WordType
from whitakers_words.parser import Parser
class PronounTest(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        cls.par = Parser()

    def test_pronoun(self):
        result = self.par.parse("se")
        self.assertEqual(len(result.forms), 1)
        self.assertEqual(len(result.forms[0].analyses), 1)
        for analysis in result.forms[0].analyses.values():
            self.assertEqual(analysis.lexeme.roots[0], "-")
            self.assertEqual(analysis.lexeme.wordType, WordType.PRON)
            self.assertEqual(len(analysis.inflections), 2)
            # common properties and features
            for inflection in analysis.inflections:
                self.assertEqual(inflection.stem, "s")
                self.assertEqual(inflection.affix, "e")
                self.assertEqual(inflection.wordType, WordType.PRON)
                self.assertTrue(inflection.has_feature(Gender.C))
                self.assertTrue(inflection.has_feature(Number.X))
            other_features = [[x.features["Case"]] for x in analysis.inflections]
            self.assertTrue([Case.ACC] in other_features)
            self.assertTrue([Case.ABL] in other_features)

    def test_personal_pronoun(self):
        result = self.par.parse("tu")
        self.assertEqual(len(result.forms), 1)
        self.assertEqual(len(result.forms[0].analyses), 1)
        for analysis in result.forms[0].analyses.values():
            self.assertEqual(analysis.lexeme.roots[0], "tu")
            self.assertEqual(analysis.lexeme.wordType, WordType.PRON)
            self.assertEqual(len(analysis.inflections), 2)
            # common properties and features
            for inflection in analysis.inflections:
                self.assertEqual(inflection.stem, "tu")
                self.assertEqual(inflection.affix, "")
                self.assertEqual(inflection.wordType, WordType.PRON)
                self.assertTrue(inflection.has_feature(Gender.C))
                self.assertTrue(inflection.has_feature(Number.S))
            other_features = [[x.features["Case"]] for x in analysis.inflections]
            self.assertTrue([Case.NOM] in other_features)
            self.assertTrue([Case.VOC] in other_features)

    def test_quos(self):
        result = self.par.parse("quos")
        self.assertEqual(len(result.forms), 1)
        self.assertEqual(len(result.forms[0].analyses), 31)
        for analysis in result.forms[0].analyses.values():
            self.assertEqual(analysis.lexeme.roots[0], "qu")
            self.assertEqual(analysis.lexeme.wordType, WordType.PRON)
            self.assertEqual(len(analysis.inflections), 1)
            # common properties and features
            for inflection in analysis.inflections:
                self.assertEqual(inflection.stem, "qu")
                self.assertEqual(inflection.affix, "os")
                self.assertEqual(inflection.wordType, WordType.PRON)
                self.assertTrue(inflection.has_feature(Gender.M))
                self.assertTrue(inflection.has_feature(Number.P))
                self.assertTrue(inflection.has_feature(Case.ACC))

    def test_tuas(self):
        result = self.par.parse("tuas")
        self.assertEqual(len(result.forms), 1)
        self.assertEqual(len(result.forms[0].analyses), 1)
        for analysis in result.forms[0].analyses.values():
            self.assertEqual(analysis.lexeme.roots[0], "tu")
            # adjectival pronoun
            self.assertEqual(analysis.lexeme.wordType, WordType.ADJ)
            self.assertEqual(len(analysis.inflections), 1)
            # common properties and features
            for inflection in analysis.inflections:
                self.assertEqual(inflection.stem, "tu")
                self.assertEqual(inflection.affix, "as")
                self.assertEqual(inflection.wordType, WordType.ADJ)
                self.assertTrue(inflection.has_feature(Gender.F))
                self.assertTrue(inflection.has_feature(Number.P))
                self.assertTrue(inflection.has_feature(Case.ACC))

    def test_ea(self):
        result = self.par.parse("ea")
        self.assertEqual(len(result.forms), 1)
        # TODO fix medieval hit for 'eare', and -dem
        self.assertEqual(len(result.forms[0].analyses), 3)
        for analysis in result.forms[0].analyses.values():
            if analysis.lexeme.wordType == WordType.V or len(analysis.inflections) > 4:
                continue
            self.assertEqual(analysis.lexeme.roots[0], "i")
            self.assertEqual(analysis.lexeme.wordType, WordType.PRON)
            self.assertEqual(len(analysis.inflections), 4)
            # common properties
            for inflection in analysis.inflections:
                self.assertEqual(inflection.stem, "e")
                self.assertEqual(inflection.affix, "a")
                self.assertEqual(inflection.wordType, WordType.PRON)
            features = [
                [x.features["Case"], x.features["Number"], x.features["Gender"]]
                for x in analysis.inflections
            ]
            self.assertTrue([Case.NOM, Number.P, Gender.N] in features)
            self.assertTrue([Case.ACC, Number.P, Gender.N] in features)
            self.assertTrue([Case.NOM, Number.S, Gender.F] in features)
            self.assertTrue([Case.ABL, Number.S, Gender.F] in features)

    def test_ipsum(self):
        result = self.par.parse("ipsum")
        self.assertEqual(len(result.forms), 1)
        # TODO fix medieval hit for 'eare', and -dem
        self.assertEqual(len(result.forms[0].analyses), 1)
        for analysis in result.forms[0].analyses.values():
            self.assertEqual(analysis.lexeme.roots[0], "ips")
            self.assertEqual(analysis.lexeme.wordType, WordType.PRON)
            self.assertEqual(len(analysis.inflections), 3)
            # common properties
            for inflection in analysis.inflections:
                self.assertEqual(inflection.stem, "ips")
                self.assertEqual(inflection.affix, "um")
                self.assertEqual(inflection.wordType, WordType.PRON)
                self.assertTrue(inflection.has_feature(Number.S))
            other_features = [
                [x.features["Case"], x.features["Gender"]] for x in analysis.inflections
            ]
            self.assertTrue([Case.NOM, Gender.N] in other_features)
            self.assertTrue([Case.ACC, Gender.N] in other_features)
            self.assertTrue([Case.ACC, Gender.M] in other_features)

    def test_ipsa(self):
        result = self.par.parse("ipsa")
        self.assertEqual(len(result.forms), 1)
        self.assertEqual(len(result.forms[0].analyses), 1)
        for analysis in result.forms[0].analyses.values():
            self.assertEqual(analysis.lexeme.roots[0], "ips")
            self.assertEqual(analysis.lexeme.wordType, WordType.PRON)
            self.assertEqual(len(analysis.inflections), 4)
            # common properties
            for inflection in analysis.inflections:
                self.assertEqual(inflection.stem, "ips")
                self.assertEqual(inflection.affix, "a")
                self.assertEqual(inflection.wordType, WordType.PRON)
            features = [
                [x.features["Case"], x.features["Number"], x.features["Gender"]]
                for x in analysis.inflections
            ]
            self.assertTrue([Case.NOM, Number.S, Gender.F] in features)
            self.assertTrue([Case.NOM, Number.P, Gender.N] in features)
            self.assertTrue([Case.ACC, Number.P, Gender.N] in features)
            self.assertTrue([Case.ABL, Number.S, Gender.F] in features)
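Several tests above end by building a list of feature combinations and asserting each expected combination appears in it. That pattern is a subset test over tuples; a small illustration with plain strings standing in for the real `Case`/`Number`/`Gender` enums from `whitakers_words.enums` (the helper name is an assumption, not part of the suite):

```python
def has_all(expected, observed):
    """True when every expected feature combination occurs among those observed."""
    return set(expected) <= set(observed)


# the four inflections asserted for "ipsa" above, as (case, number, gender) tuples
observed = [("NOM", "S", "F"), ("NOM", "P", "N"), ("ACC", "P", "N"), ("ABL", "S", "F")]
```

Using tuples (hashable) instead of lists is what makes the set-based membership check possible.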
| 46.842424 | 88 | 0.619485 | 858 | 7,729 | 5.544289 | 0.107226 | 0.176582 | 0.079462 | 0.070633 | 0.91276 | 0.866513 | 0.81522 | 0.81522 | 0.807021 | 0.806391 | 0 | 0.007732 | 0.263682 | 7,729 | 164 | 89 | 47.128049 | 0.82815 | 0.036486 | 0 | 0.590909 | 0 | 0 | 0.014793 | 0 | 0 | 0 | 0 | 0.006098 | 0.621212 | 1 | 0.060606 | false | 0 | 0.022727 | 0 | 0.090909 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
86d7f90a3ba00b3fd44fee6bd454c0ef5d5efaf1 | 146,268 | py | Python | pysnmp/SL81-STD-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 11 | 2021-02-02T16:27:16.000Z | 2021-08-31T06:22:49.000Z | pysnmp/SL81-STD-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 75 | 2021-02-24T17:30:31.000Z | 2021-12-08T00:01:18.000Z | pysnmp/SL81-STD-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module SL81-STD-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/SL81-STD-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 20:57:56 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
OctetString, Integer, ObjectIdentifier = mibBuilder.importSymbols("ASN1", "OctetString", "Integer", "ObjectIdentifier")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
SingleValueConstraint, ValueSizeConstraint, ConstraintsIntersection, ValueRangeConstraint, ConstraintsUnion = mibBuilder.importSymbols("ASN1-REFINEMENT", "SingleValueConstraint", "ValueSizeConstraint", "ConstraintsIntersection", "ValueRangeConstraint", "ConstraintsUnion")
NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ModuleCompliance")
ObjectIdentity, IpAddress, NotificationType, iso, ModuleIdentity, enterprises, Gauge32, TimeTicks, Counter32, MibIdentifier, Counter64, MibScalar, MibTable, MibTableRow, MibTableColumn, Unsigned32, Bits, ObjectName, NotificationType, Integer32 = mibBuilder.importSymbols("SNMPv2-SMI", "ObjectIdentity", "IpAddress", "NotificationType", "iso", "ModuleIdentity", "enterprises", "Gauge32", "TimeTicks", "Counter32", "MibIdentifier", "Counter64", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "Unsigned32", "Bits", "ObjectName", "NotificationType", "Integer32")
DisplayString, TextualConvention = mibBuilder.importSymbols("SNMPv2-TC", "DisplayString", "TextualConvention")
omnitronix = MibIdentifier((1, 3, 6, 1, 4, 1, 3052))
sl81 = MibIdentifier((1, 3, 6, 1, 4, 1, 3052, 5))
status = MibIdentifier((1, 3, 6, 1, 4, 1, 3052, 5, 1))
config = MibIdentifier((1, 3, 6, 1, 4, 1, 3052, 5, 2))
productIds = MibIdentifier((1, 3, 6, 1, 4, 1, 3052, 5, 3))
techSupport = MibIdentifier((1, 3, 6, 1, 4, 1, 3052, 5, 99))
eventSensorStatus = MibIdentifier((1, 3, 6, 1, 4, 1, 3052, 5, 1, 1))
dataEventStatus = MibIdentifier((1, 3, 6, 1, 4, 1, 3052, 5, 1, 2))
eventSensorBasics = MibIdentifier((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1))
dataEventConfig = MibIdentifier((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2))
serialPorts = MibIdentifier((1, 3, 6, 1, 4, 1, 3052, 5, 2, 3))
network = MibIdentifier((1, 3, 6, 1, 4, 1, 3052, 5, 2, 4))
modem = MibIdentifier((1, 3, 6, 1, 4, 1, 3052, 5, 2, 5))
snmp = MibIdentifier((1, 3, 6, 1, 4, 1, 3052, 5, 2, 6))
pagers = MibIdentifier((1, 3, 6, 1, 4, 1, 3052, 5, 2, 7))
time = MibIdentifier((1, 3, 6, 1, 4, 1, 3052, 5, 2, 8))
timeouts = MibIdentifier((1, 3, 6, 1, 4, 1, 3052, 5, 2, 9))
esPointTable = MibTable((1, 3, 6, 1, 4, 1, 3052, 5, 1, 1, 1), )
if mibBuilder.loadTexts: esPointTable.setStatus('mandatory')
esPointEntry = MibTableRow((1, 3, 6, 1, 4, 1, 3052, 5, 1, 1, 1, 1), ).setIndexNames((0, "SL81-STD-MIB", "esIndexES"), (0, "SL81-STD-MIB", "esIndexPC"), (0, "SL81-STD-MIB", "esIndexPoint"))
if mibBuilder.loadTexts: esPointEntry.setStatus('mandatory')
esIndexES = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 1, 1, 1, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esIndexES.setStatus('mandatory')
esIndexPC = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 1, 1, 1, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esIndexPC.setStatus('mandatory')
esIndexPoint = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 1, 1, 1, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esIndexPoint.setStatus('mandatory')
esPointName = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 1, 1, 1, 1, 4), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: esPointName.setStatus('mandatory')
esPointInEventState = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 1, 1, 1, 1, 5), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: esPointInEventState.setStatus('mandatory')
esPointValueInt = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 1, 1, 1, 1, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-32768, 32767))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: esPointValueInt.setStatus('mandatory')
esPointValueStr = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 1, 1, 1, 1, 7), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esPointValueStr.setStatus('mandatory')
esPointTimeLastChange = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 1, 1, 1, 1, 8), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esPointTimeLastChange.setStatus('mandatory')
esPointTimetickLastChange = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 1, 1, 1, 1, 9), TimeTicks()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esPointTimetickLastChange.setStatus('mandatory')
deStatusTable = MibTable((1, 3, 6, 1, 4, 1, 3052, 5, 1, 2, 1), )
if mibBuilder.loadTexts: deStatusTable.setStatus('mandatory')
deStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 3052, 5, 1, 2, 1, 1), ).setIndexNames((0, "SL81-STD-MIB", "deStatusIndex"))
if mibBuilder.loadTexts: deStatusEntry.setStatus('mandatory')
deStatusIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 1, 2, 1, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: deStatusIndex.setStatus('mandatory')
deStatusName = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 1, 2, 1, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: deStatusName.setStatus('mandatory')
deStatusCounter = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 1, 2, 1, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: deStatusCounter.setStatus('mandatory')
deStatusThreshold = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 1, 2, 1, 1, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: deStatusThreshold.setStatus('mandatory')
deStatusLastTriggerTime = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 1, 2, 1, 1, 5), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: deStatusLastTriggerTime.setStatus('mandatory')
deStatusLastTriggerData = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 1, 2, 1, 1, 6), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: deStatusLastTriggerData.setStatus('mandatory')
esNumberEventSensors = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esNumberEventSensors.setStatus('mandatory')
esTable = MibTable((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2), )
if mibBuilder.loadTexts: esTable.setStatus('mandatory')
esEntry = MibTableRow((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2, 1), ).setIndexNames((0, "SL81-STD-MIB", "esIndex"))
if mibBuilder.loadTexts: esEntry.setStatus('mandatory')
esIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esIndex.setStatus('mandatory')
esName = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2, 1, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esName.setStatus('mandatory')
esID = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2, 1, 3), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esID.setStatus('mandatory')
esNumberTempSensors = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2, 1, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esNumberTempSensors.setStatus('mandatory')
esTempReportingMode = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2, 1, 5), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esTempReportingMode.setStatus('mandatory')
esNumberCCs = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2, 1, 6), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esNumberCCs.setStatus('mandatory')
esCCReportingMode = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2, 1, 7), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esCCReportingMode.setStatus('mandatory')
esNumberHumidSensors = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2, 1, 8), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esNumberHumidSensors.setStatus('mandatory')
esHumidReportingMode = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2, 1, 9), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esHumidReportingMode.setStatus('mandatory')
esNumberNoiseSensors = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2, 1, 10), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esNumberNoiseSensors.setStatus('mandatory')
esNoiseReportingMode = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2, 1, 11), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esNoiseReportingMode.setStatus('mandatory')
esNumberAirflowSensors = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2, 1, 12), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esNumberAirflowSensors.setStatus('mandatory')
esAirflowReportingMode = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2, 1, 13), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esAirflowReportingMode.setStatus('mandatory')
esNumberAnalog = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2, 1, 14), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esNumberAnalog.setStatus('mandatory')
esAnalogReportingMode = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2, 1, 15), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esAnalogReportingMode.setStatus('mandatory')
esNumberRelayOutputs = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2, 1, 16), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esNumberRelayOutputs.setStatus('mandatory')
esRelayReportingMode = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 1, 2, 1, 17), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: esRelayReportingMode.setStatus('mandatory')
deFieldTable = MibTable((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 1), )
if mibBuilder.loadTexts: deFieldTable.setStatus('mandatory')
deFieldEntry = MibTableRow((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 1, 1), ).setIndexNames((0, "SL81-STD-MIB", "deFieldIndex"))
if mibBuilder.loadTexts: deFieldEntry.setStatus('mandatory')
deFieldIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 1, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: deFieldIndex.setStatus('mandatory')
deFieldStart = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 1, 1, 2), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: deFieldStart.setStatus('mandatory')
deFieldLength = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 1, 1, 3), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: deFieldLength.setStatus('mandatory')
deFieldName = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 1, 1, 4), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: deFieldName.setStatus('mandatory')
deConfigTable = MibTable((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 2), )
if mibBuilder.loadTexts: deConfigTable.setStatus('mandatory')
deConfigEntry = MibTableRow((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 2, 1), ).setIndexNames((0, "SL81-STD-MIB", "deConfigIndex"))
if mibBuilder.loadTexts: deConfigEntry.setStatus('mandatory')
deConfigIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: deConfigIndex.setStatus('mandatory')
deConfigEnabled = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 2, 1, 2), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: deConfigEnabled.setStatus('mandatory')
deConfigName = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 2, 1, 3), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: deConfigName.setStatus('mandatory')
deConfigEquation = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 2, 1, 4), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: deConfigEquation.setStatus('mandatory')
deConfigThreshold = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 2, 1, 5), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: deConfigThreshold.setStatus('mandatory')
deConfigClearMode = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 2, 1, 6), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: deConfigClearMode.setStatus('mandatory')
deConfigClearTime = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 2, 1, 7), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: deConfigClearTime.setStatus('mandatory')
deConfigAutoClear = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 2, 1, 8), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: deConfigAutoClear.setStatus('mandatory')
deConfigActions = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 2, 1, 9), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: deConfigActions.setStatus('mandatory')
deConfigTrapNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 2, 1, 10), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: deConfigTrapNumber.setStatus('mandatory')
deConfigClass = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 2, 2, 1, 11), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: deConfigClass.setStatus('mandatory')
numberPorts = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 2, 3, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: numberPorts.setStatus('mandatory')
portConfigTable = MibTable((1, 3, 6, 1, 4, 1, 3052, 5, 2, 3, 2), )
if mibBuilder.loadTexts: portConfigTable.setStatus('mandatory')
portConfigEntry = MibTableRow((1, 3, 6, 1, 4, 1, 3052, 5, 2, 3, 2, 1), ).setIndexNames((0, "SL81-STD-MIB", "portConfigIndex"))
if mibBuilder.loadTexts: portConfigEntry.setStatus('mandatory')
portConfigIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 3, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: portConfigIndex.setStatus('mandatory')
portConfigBaud = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 3, 2, 1, 2), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: portConfigBaud.setStatus('mandatory')
portConfigDataFormat = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 3, 2, 1, 3), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: portConfigDataFormat.setStatus('mandatory')
portConfigStripPtOutputLfs = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 3, 2, 1, 4), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: portConfigStripPtOutputLfs.setStatus('mandatory')
portConfigStripPtInputLfs = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 3, 2, 1, 5), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: portConfigStripPtInputLfs.setStatus('mandatory')
portConfigDTRLowIdle = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 3, 2, 1, 6), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: portConfigDTRLowIdle.setStatus('mandatory')
portConfigMaskEnable = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 3, 2, 1, 7), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: portConfigMaskEnable.setStatus('mandatory')
portConfigDAEnable = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 3, 2, 1, 8), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: portConfigDAEnable.setStatus('mandatory')
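The portConfig columns above are all registered under the column prefix `(1, 3, 6, 1, 4, 1, 3052, 5, 2, 3, 2, 1, <col>)`, and because the table is indexed by the single integer `portConfigIndex`, a concrete instance OID is just the column OID with the row index appended. A minimal, pysnmp-free sketch of that construction (the helper name `instance_oid` is illustrative, not part of this module):

```python
# Column OID of portConfigBaud exactly as defined in this MIB module.
PORT_CONFIG_BAUD = (1, 3, 6, 1, 4, 1, 3052, 5, 2, 3, 2, 1, 2)

def instance_oid(column_oid, index):
    """Build the instance OID for a table cell.

    SNMP tables indexed by a single Integer32 (here portConfigIndex)
    form instance OIDs by appending the row index to the column OID,
    so portConfigBaud for port 1 is ...3.2.1.2.1.
    """
    return column_oid + (index,)

print(".".join(str(x) for x in instance_oid(PORT_CONFIG_BAUD, 1)))
# -> 1.3.6.1.4.1.3052.5.2.3.2.1.2.1
```

The same pattern applies to every single-Integer32-indexed table in this module (deFieldTable, deConfigTable, smTable, pagerTable, and the techSupport tables).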
ipConfigStatic = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 2, 4, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ipConfigStatic.setStatus('mandatory')
ipConfigAddress = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 2, 4, 2), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: ipConfigAddress.setStatus('mandatory')
ipConfigSubnetMask = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 2, 4, 3), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: ipConfigSubnetMask.setStatus('mandatory')
ipConfigDefaultRouter = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 2, 4, 4), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: ipConfigDefaultRouter.setStatus('mandatory')
ipConfigEngage = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 2, 4, 5), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: ipConfigEngage.setStatus('mandatory')
telnetDuplex = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 2, 4, 6), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: telnetDuplex.setStatus('mandatory')
modemDataFormat = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 2, 5, 1), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: modemDataFormat.setStatus('mandatory')
modemUserSetup = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 2, 5, 2), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: modemUserSetup.setStatus('mandatory')
modemTAPSetup = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 2, 5, 3), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: modemTAPSetup.setStatus('mandatory')
modemTimeBetweenOutbound = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 2, 5, 5), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: modemTimeBetweenOutbound.setStatus('mandatory')
smTable = MibTable((1, 3, 6, 1, 4, 1, 3052, 5, 2, 6, 1), )
if mibBuilder.loadTexts: smTable.setStatus('mandatory')
smEntry = MibTableRow((1, 3, 6, 1, 4, 1, 3052, 5, 2, 6, 1, 1), ).setIndexNames((0, "SL81-STD-MIB", "smIndex"))
if mibBuilder.loadTexts: smEntry.setStatus('mandatory')
smIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 6, 1, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: smIndex.setStatus('mandatory')
smAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 6, 1, 1, 2), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: smAddress.setStatus('mandatory')
pagerRetries = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 2, 7, 1), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: pagerRetries.setStatus('mandatory')
pagerTable = MibTable((1, 3, 6, 1, 4, 1, 3052, 5, 2, 7, 2), )
if mibBuilder.loadTexts: pagerTable.setStatus('mandatory')
pagerEntry = MibTableRow((1, 3, 6, 1, 4, 1, 3052, 5, 2, 7, 2, 1), ).setIndexNames((0, "SL81-STD-MIB", "pagerIndex"))
if mibBuilder.loadTexts: pagerEntry.setStatus('mandatory')
pagerIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 7, 2, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: pagerIndex.setStatus('mandatory')
pagerType = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 7, 2, 1, 2), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: pagerType.setStatus('mandatory')
pagerPhoneNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 7, 2, 1, 3), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: pagerPhoneNumber.setStatus('mandatory')
pagerID = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 7, 2, 1, 4), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: pagerID.setStatus('mandatory')
pagerPostCalloutDelay = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 7, 2, 1, 5), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: pagerPostCalloutDelay.setStatus('mandatory')
pagerIDDelay = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 2, 7, 2, 1, 6), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: pagerIDDelay.setStatus('mandatory')
clock = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 2, 8, 1), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: clock.setStatus('mandatory')
autoDSTAdjust = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 2, 8, 2), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: autoDSTAdjust.setStatus('mandatory')
commandTimeout = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 2, 9, 1), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: commandTimeout.setStatus('mandatory')
passthroughTimeout = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 2, 9, 2), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: passthroughTimeout.setStatus('mandatory')
siteID = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 3, 1), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: siteID.setStatus('mandatory')
thisProduct = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 3, 2), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: thisProduct.setStatus('mandatory')
stockTrapString = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 3, 3), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: stockTrapString.setStatus('mandatory')
trapEventTypeNumber = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 3, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: trapEventTypeNumber.setStatus('mandatory')
trapEventTypeName = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 3, 5), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: trapEventTypeName.setStatus('mandatory')
trapIncludedValue = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 3, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-32768, 32767))).setMaxAccess("readonly")
if mibBuilder.loadTexts: trapIncludedValue.setStatus('mandatory')
trapIncludedString = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 3, 7), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: trapIncludedString.setStatus('mandatory')
trapEventClassNumber = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 3, 9), Integer32())
if mibBuilder.loadTexts: trapEventClassNumber.setStatus('mandatory')
trapEventClassName = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 3, 10), Integer32())
if mibBuilder.loadTexts: trapEventClassName.setStatus('mandatory')
techSupport1 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 1), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport1.setStatus('mandatory')
techSupport2 = MibIdentifier((1, 3, 6, 1, 4, 1, 3052, 5, 99, 2))
techSupport2n1 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 2, 1), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport2n1.setStatus('mandatory')
techSupport2n2 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 2, 2), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport2n2.setStatus('mandatory')
techSupport3 = MibIdentifier((1, 3, 6, 1, 4, 1, 3052, 5, 99, 3))
techSupport3n1 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 3, 1), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport3n1.setStatus('mandatory')
techSupport3n2 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 3, 2), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport3n2.setStatus('mandatory')
techSupport3n3 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 3, 3), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport3n3.setStatus('mandatory')
techSupport3n4 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 3, 4), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport3n4.setStatus('mandatory')
techSupport3n5 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 3, 5), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport3n5.setStatus('mandatory')
techSupport4 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 4), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport4.setStatus('mandatory')
techSupport7 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 7), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport7.setStatus('mandatory')
techSupport9 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 9), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport9.setStatus('mandatory')
techSupport10 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 10), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport10.setStatus('mandatory')
techSupport11 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 11), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport11.setStatus('mandatory')
techSupport16 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 16), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport16.setStatus('mandatory')
techSupport17 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 17), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport17.setStatus('mandatory')
techSupport18 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 18), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport18.setStatus('mandatory')
techSupport19 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 19), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport19.setStatus('mandatory')
techSupport20Table = MibTable((1, 3, 6, 1, 4, 1, 3052, 5, 99, 20), )
if mibBuilder.loadTexts: techSupport20Table.setStatus('mandatory')
techSupport20Entry = MibTableRow((1, 3, 6, 1, 4, 1, 3052, 5, 99, 20, 1), ).setIndexNames((0, "SL81-STD-MIB", "techSupport20Index"))
if mibBuilder.loadTexts: techSupport20Entry.setStatus('mandatory')
techSupport20Index = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 99, 20, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: techSupport20Index.setStatus('mandatory')
techSupport20 = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 99, 20, 1, 2), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport20.setStatus('mandatory')
techSupport21Table = MibTable((1, 3, 6, 1, 4, 1, 3052, 5, 99, 21), )
if mibBuilder.loadTexts: techSupport21Table.setStatus('mandatory')
techSupport21Entry = MibTableRow((1, 3, 6, 1, 4, 1, 3052, 5, 99, 21, 1), ).setIndexNames((0, "SL81-STD-MIB", "techSupport21Index"))
if mibBuilder.loadTexts: techSupport21Entry.setStatus('mandatory')
techSupport21Index = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 99, 21, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: techSupport21Index.setStatus('mandatory')
techSupport21 = MibTableColumn((1, 3, 6, 1, 4, 1, 3052, 5, 99, 21, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: techSupport21.setStatus('mandatory')
techSupport22 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 22), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport22.setStatus('mandatory')
techSupport24 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 24), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport24.setStatus('mandatory')
techSupport25 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 25), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport25.setStatus('mandatory')
techSupport26 = MibScalar((1, 3, 6, 1, 4, 1, 3052, 5, 99, 26), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: techSupport26.setStatus('mandatory')
sl81TestTrap = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "stockTrapString"))
sl81StockESDisconnectTrap = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,50)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "stockTrapString"))
sl81StockDataEventTrap = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,100)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "stockTrapString"))
sl81StockContactClosureTrap = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,110)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "stockTrapString"))
sl81StockTempTrap = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,120)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "stockTrapString"))
sl81StockHumidityTrap = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,130)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "stockTrapString"))
sl81StockAnalogTrap = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,140)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "stockTrapString"))
sl81StockCTSTrap = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,160)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "stockTrapString"))
sl81StockSchedTrap = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,170)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "stockTrapString"))
sl81UserTrap1000 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1000)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1001 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1001)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1002 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1002)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1003 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1003)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1004 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1004)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1005 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1005)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1006 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1006)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1007 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1007)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1008 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1008)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1009 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1009)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1010 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1010)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1011 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1011)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1012 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1012)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1013 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1013)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1014 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1014)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1015 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1015)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1016 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1016)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1017 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1017)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1018 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1018)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1019 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1019)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1020 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1020)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1021 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1021)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1022 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1022)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1023 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1023)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1024 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1024)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1025 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1025)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1026 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1026)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1027 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1027)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1028 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1028)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1029 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1029)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1030 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1030)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1031 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1031)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1032 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1032)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1033 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1033)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1034 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1034)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1035 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1035)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1036 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1036)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1037 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1037)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1038 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1038)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1039 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1039)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1040 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1040)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1041 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1041)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1042 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1042)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1043 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1043)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1044 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1044)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1045 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1045)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1046 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1046)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1047 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1047)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1048 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1048)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1049 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1049)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1050 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1050)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1051 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1051)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1052 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1052)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1053 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1053)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1054 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1054)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1055 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1055)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1056 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1056)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1057 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1057)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1058 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1058)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1059 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1059)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1060 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1060)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1061 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1061)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1062 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1062)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1063 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1063)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1064 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1064)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1065 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1065)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1066 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1066)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1067 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1067)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1068 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1068)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1069 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1069)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1070 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1070)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1071 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1071)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1072 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1072)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1073 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1073)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1074 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1074)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1075 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1075)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1076 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1076)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1077 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1077)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1078 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1078)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1079 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1079)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1080 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1080)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1081 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1081)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1082 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1082)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1083 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1083)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
# sl81UserTrap1084 through sl81UserTrap1152 are identical NotificationType
# definitions that differ only in the trailing sub-identifier of the trap OID
# under enterprises.3052.5, so they are generated in a loop. The module-level
# names (sl81UserTrap1084, ..., sl81UserTrap1152) are bound exactly as before.
_sl81UserTrapObjects = (
    ("SL81-STD-MIB", "siteID"),
    ("SL81-STD-MIB", "esIndex"),
    ("SL81-STD-MIB", "esName"),
    ("SL81-STD-MIB", "trapEventTypeNumber"),
    ("SL81-STD-MIB", "trapEventTypeName"),
    ("SL81-STD-MIB", "esIndexPoint"),
    ("SL81-STD-MIB", "esPointName"),
    ("SL81-STD-MIB", "esID"),
    ("SL81-STD-MIB", "clock"),
    ("SL81-STD-MIB", "trapIncludedValue"),
    ("SL81-STD-MIB", "trapIncludedString"),
    ("SL81-STD-MIB", "trapEventClassNumber"),
    ("SL81-STD-MIB", "trapEventClassName"),
)
for _trapNumber in range(1084, 1153):
    globals()["sl81UserTrap%d" % _trapNumber] = NotificationType(
        (1, 3, 6, 1, 4, 1, 3052, 5) + (0, _trapNumber)
    ).setObjects(*_sl81UserTrapObjects)
sl81UserTrap1153 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1153)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1154 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1154)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1155 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1155)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1156 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1156)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1157 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1157)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1158 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1158)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1159 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1159)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1160 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1160)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1161 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1161)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1162 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1162)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1163 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1163)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1164 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1164)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1165 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1165)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1166 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1166)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1167 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1167)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1168 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1168)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1169 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1169)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1170 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1170)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1171 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1171)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1172 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1172)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1173 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1173)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1174 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1174)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1175 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1175)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1176 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1176)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1177 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1177)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1178 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1178)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1179 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1179)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1180 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1180)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1181 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1181)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1182 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1182)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1183 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1183)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1184 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1184)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1185 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1185)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1186 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1186)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1187 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1187)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1188 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1188)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1189 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1189)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1190 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1190)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1191 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1191)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1192 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1192)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1193 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1193)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1194 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1194)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1195 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1195)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1196 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1196)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1197 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1197)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1198 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1198)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
sl81UserTrap1199 = NotificationType((1, 3, 6, 1, 4, 1, 3052, 5) + (0,1199)).setObjects(("SL81-STD-MIB", "siteID"), ("SL81-STD-MIB", "esIndex"), ("SL81-STD-MIB", "esName"), ("SL81-STD-MIB", "trapEventTypeNumber"), ("SL81-STD-MIB", "trapEventTypeName"), ("SL81-STD-MIB", "esIndexPoint"), ("SL81-STD-MIB", "esPointName"), ("SL81-STD-MIB", "esID"), ("SL81-STD-MIB", "clock"), ("SL81-STD-MIB", "trapIncludedValue"), ("SL81-STD-MIB", "trapIncludedString"), ("SL81-STD-MIB", "trapEventClassNumber"), ("SL81-STD-MIB", "trapEventClassName"))
mibBuilder.exportSymbols("SL81-STD-MIB", sl81UserTrap1121=sl81UserTrap1121, sl81UserTrap1140=sl81UserTrap1140, sl81UserTrap1171=sl81UserTrap1171, sl81UserTrap1192=sl81UserTrap1192, techSupport3=techSupport3, trapIncludedValue=trapIncludedValue, sl81UserTrap1191=sl81UserTrap1191, smTable=smTable, trapEventTypeName=trapEventTypeName, sl81UserTrap1024=sl81UserTrap1024, ipConfigAddress=ipConfigAddress, sl81StockContactClosureTrap=sl81StockContactClosureTrap, clock=clock, sl81UserTrap1103=sl81UserTrap1103, sl81UserTrap1048=sl81UserTrap1048, sl81UserTrap1097=sl81UserTrap1097, sl81UserTrap1059=sl81UserTrap1059, sl81UserTrap1198=sl81UserTrap1198, sl81UserTrap1100=sl81UserTrap1100, smEntry=smEntry, sl81UserTrap1039=sl81UserTrap1039, deConfigTrapNumber=deConfigTrapNumber, sl81UserTrap1108=sl81UserTrap1108, sl81UserTrap1117=sl81UserTrap1117, techSupport11=techSupport11, techSupport4=techSupport4, modemTimeBetweenOutbound=modemTimeBetweenOutbound, sl81UserTrap1138=sl81UserTrap1138, sl81UserTrap1170=sl81UserTrap1170, sl81UserTrap1023=sl81UserTrap1023, sl81UserTrap1007=sl81UserTrap1007, deConfigClearTime=deConfigClearTime, esCCReportingMode=esCCReportingMode, techSupport21Table=techSupport21Table, sl81UserTrap1053=sl81UserTrap1053, esNumberRelayOutputs=esNumberRelayOutputs, portConfigStripPtInputLfs=portConfigStripPtInputLfs, sl81UserTrap1173=sl81UserTrap1173, sl81UserTrap1081=sl81UserTrap1081, sl81UserTrap1144=sl81UserTrap1144, sl81UserTrap1185=sl81UserTrap1185, trapEventClassName=trapEventClassName, deConfigName=deConfigName, techSupport3n4=techSupport3n4, productIds=productIds, portConfigTable=portConfigTable, techSupport26=techSupport26, sl81UserTrap1153=sl81UserTrap1153, sl81UserTrap1042=sl81UserTrap1042, pagerType=pagerType, sl81UserTrap1037=sl81UserTrap1037, sl81UserTrap1118=sl81UserTrap1118, techSupport9=techSupport9, sl81UserTrap1186=sl81UserTrap1186, numberPorts=numberPorts, sl81UserTrap1182=sl81UserTrap1182, modemDataFormat=modemDataFormat, 
esNoiseReportingMode=esNoiseReportingMode, techSupport3n2=techSupport3n2, modemTAPSetup=modemTAPSetup, sl81UserTrap1113=sl81UserTrap1113, sl81UserTrap1165=sl81UserTrap1165, sl81UserTrap1107=sl81UserTrap1107, sl81StockCTSTrap=sl81StockCTSTrap, techSupport=techSupport, sl81UserTrap1096=sl81UserTrap1096, sl81UserTrap1004=sl81UserTrap1004, sl81UserTrap1065=sl81UserTrap1065, deConfigEnabled=deConfigEnabled, sl81UserTrap1000=sl81UserTrap1000, sl81UserTrap1141=sl81UserTrap1141, sl81UserTrap1102=sl81UserTrap1102, stockTrapString=stockTrapString, techSupport20Table=techSupport20Table, sl81UserTrap1074=sl81UserTrap1074, sl81UserTrap1086=sl81UserTrap1086, esTable=esTable, portConfigDAEnable=portConfigDAEnable, deFieldTable=deFieldTable, telnetDuplex=telnetDuplex, techSupport10=techSupport10, sl81UserTrap1050=sl81UserTrap1050, smAddress=smAddress, sl81UserTrap1146=sl81UserTrap1146, sl81UserTrap1106=sl81UserTrap1106, sl81UserTrap1062=sl81UserTrap1062, portConfigMaskEnable=portConfigMaskEnable, techSupport21=techSupport21, deFieldLength=deFieldLength, sl81UserTrap1087=sl81UserTrap1087, sl81UserTrap1003=sl81UserTrap1003, esPointName=esPointName, sl81UserTrap1045=sl81UserTrap1045, deFieldEntry=deFieldEntry, techSupport20Entry=techSupport20Entry, sl81UserTrap1012=sl81UserTrap1012, eventSensorStatus=eventSensorStatus, esIndexPoint=esIndexPoint, sl81StockTempTrap=sl81StockTempTrap, sl81UserTrap1068=sl81UserTrap1068, deConfigAutoClear=deConfigAutoClear, esPointTimetickLastChange=esPointTimetickLastChange, sl81UserTrap1030=sl81UserTrap1030, sl81UserTrap1047=sl81UserTrap1047, deStatusTable=deStatusTable, sl81UserTrap1018=sl81UserTrap1018, sl81UserTrap1126=sl81UserTrap1126, sl81UserTrap1008=sl81UserTrap1008, esNumberAirflowSensors=esNumberAirflowSensors, sl81UserTrap1026=sl81UserTrap1026, deConfigClass=deConfigClass, techSupport20=techSupport20, sl81UserTrap1070=sl81UserTrap1070, sl81UserTrap1163=sl81UserTrap1163, sl81UserTrap1172=sl81UserTrap1172, 
esAnalogReportingMode=esAnalogReportingMode, sl81UserTrap1021=sl81UserTrap1021, esNumberCCs=esNumberCCs, sl81StockSchedTrap=sl81StockSchedTrap, sl81UserTrap1084=sl81UserTrap1084, sl81UserTrap1199=sl81UserTrap1199, sl81UserTrap1079=sl81UserTrap1079, sl81UserTrap1178=sl81UserTrap1178, deConfigThreshold=deConfigThreshold, deConfigActions=deConfigActions, sl81UserTrap1005=sl81UserTrap1005, sl81UserTrap1128=sl81UserTrap1128, sl81UserTrap1032=sl81UserTrap1032, sl81UserTrap1188=sl81UserTrap1188, sl81UserTrap1058=sl81UserTrap1058, sl81UserTrap1089=sl81UserTrap1089, sl81UserTrap1035=sl81UserTrap1035, deStatusName=deStatusName, sl81=sl81, esIndexPC=esIndexPC, sl81UserTrap1054=sl81UserTrap1054, sl81UserTrap1161=sl81UserTrap1161, ipConfigEngage=ipConfigEngage, sl81StockHumidityTrap=sl81StockHumidityTrap, sl81UserTrap1189=sl81UserTrap1189, sl81UserTrap1130=sl81UserTrap1130, sl81UserTrap1181=sl81UserTrap1181, sl81UserTrap1093=sl81UserTrap1093, sl81UserTrap1190=sl81UserTrap1190, portConfigEntry=portConfigEntry, sl81UserTrap1111=sl81UserTrap1111, sl81UserTrap1052=sl81UserTrap1052, timeouts=timeouts, esNumberAnalog=esNumberAnalog, sl81UserTrap1112=sl81UserTrap1112, esPointValueStr=esPointValueStr, sl81UserTrap1043=sl81UserTrap1043, sl81UserTrap1080=sl81UserTrap1080, sl81UserTrap1193=sl81UserTrap1193, esName=esName, sl81UserTrap1049=sl81UserTrap1049, sl81UserTrap1099=sl81UserTrap1099, sl81UserTrap1197=sl81UserTrap1197, sl81UserTrap1028=sl81UserTrap1028, sl81UserTrap1041=sl81UserTrap1041, sl81UserTrap1092=sl81UserTrap1092, techSupport3n5=techSupport3n5, sl81UserTrap1009=sl81UserTrap1009, sl81UserTrap1011=sl81UserTrap1011, sl81UserTrap1044=sl81UserTrap1044, sl81UserTrap1077=sl81UserTrap1077, sl81UserTrap1157=sl81UserTrap1157, deFieldIndex=deFieldIndex, esPointEntry=esPointEntry, sl81UserTrap1110=sl81UserTrap1110, sl81UserTrap1057=sl81UserTrap1057, sl81UserTrap1162=sl81UserTrap1162, sl81UserTrap1147=sl81UserTrap1147, techSupport25=techSupport25, trapEventTypeNumber=trapEventTypeNumber, 
sl81UserTrap1027=sl81UserTrap1027, sl81UserTrap1114=sl81UserTrap1114, sl81UserTrap1169=sl81UserTrap1169, sl81UserTrap1071=sl81UserTrap1071, autoDSTAdjust=autoDSTAdjust, sl81UserTrap1015=sl81UserTrap1015, techSupport19=techSupport19, sl81UserTrap1075=sl81UserTrap1075, sl81UserTrap1179=sl81UserTrap1179, sl81UserTrap1139=sl81UserTrap1139, sl81UserTrap1196=sl81UserTrap1196, esIndex=esIndex, sl81StockDataEventTrap=sl81StockDataEventTrap, pagerIndex=pagerIndex, sl81UserTrap1167=sl81UserTrap1167, pagerID=pagerID, sl81UserTrap1104=sl81UserTrap1104, sl81StockESDisconnectTrap=sl81StockESDisconnectTrap, sl81UserTrap1175=sl81UserTrap1175, esNumberNoiseSensors=esNumberNoiseSensors, time=time, thisProduct=thisProduct, sl81UserTrap1133=sl81UserTrap1133, dataEventStatus=dataEventStatus, sl81UserTrap1014=sl81UserTrap1014, sl81UserTrap1055=sl81UserTrap1055, deStatusLastTriggerTime=deStatusLastTriggerTime, sl81UserTrap1115=sl81UserTrap1115, deConfigTable=deConfigTable, passthroughTimeout=passthroughTimeout, sl81UserTrap1046=sl81UserTrap1046, sl81UserTrap1040=sl81UserTrap1040, sl81UserTrap1051=sl81UserTrap1051, sl81UserTrap1083=sl81UserTrap1083, sl81UserTrap1152=sl81UserTrap1152, sl81UserTrap1116=sl81UserTrap1116, esIndexES=esIndexES, sl81UserTrap1119=sl81UserTrap1119, dataEventConfig=dataEventConfig, portConfigIndex=portConfigIndex, pagerEntry=pagerEntry, sl81UserTrap1149=sl81UserTrap1149, sl81UserTrap1038=sl81UserTrap1038, techSupport17=techSupport17, sl81UserTrap1176=sl81UserTrap1176, techSupport21Entry=techSupport21Entry, sl81UserTrap1131=sl81UserTrap1131, sl81UserTrap1061=sl81UserTrap1061, sl81UserTrap1002=sl81UserTrap1002, sl81UserTrap1124=sl81UserTrap1124, techSupport16=techSupport16, portConfigStripPtOutputLfs=portConfigStripPtOutputLfs, sl81UserTrap1109=sl81UserTrap1109, esPointTable=esPointTable, siteID=siteID, sl81UserTrap1166=sl81UserTrap1166, sl81UserTrap1064=sl81UserTrap1064, techSupport3n1=techSupport3n1, sl81UserTrap1136=sl81UserTrap1136, 
sl81UserTrap1125=sl81UserTrap1125, sl81UserTrap1066=sl81UserTrap1066, esAirflowReportingMode=esAirflowReportingMode, sl81UserTrap1019=sl81UserTrap1019, sl81UserTrap1036=sl81UserTrap1036, techSupport18=techSupport18, sl81UserTrap1127=sl81UserTrap1127, portConfigDataFormat=portConfigDataFormat, trapEventClassNumber=trapEventClassNumber, sl81UserTrap1184=sl81UserTrap1184, esHumidReportingMode=esHumidReportingMode, pagerPostCalloutDelay=pagerPostCalloutDelay, sl81UserTrap1159=sl81UserTrap1159, sl81UserTrap1123=sl81UserTrap1123, smIndex=smIndex, techSupport20Index=techSupport20Index, techSupport7=techSupport7, deStatusThreshold=deStatusThreshold, techSupport22=techSupport22, sl81UserTrap1158=sl81UserTrap1158)
mibBuilder.exportSymbols("SL81-STD-MIB", sl81UserTrap1073=sl81UserTrap1073, portConfigBaud=portConfigBaud, sl81UserTrap1120=sl81UserTrap1120, ipConfigSubnetMask=ipConfigSubnetMask, deFieldName=deFieldName, sl81UserTrap1164=sl81UserTrap1164, sl81UserTrap1006=sl81UserTrap1006, sl81UserTrap1017=sl81UserTrap1017, sl81TestTrap=sl81TestTrap, sl81UserTrap1105=sl81UserTrap1105, sl81UserTrap1088=sl81UserTrap1088, sl81UserTrap1098=sl81UserTrap1098, ipConfigStatic=ipConfigStatic, pagerRetries=pagerRetries, sl81UserTrap1129=sl81UserTrap1129, sl81StockAnalogTrap=sl81StockAnalogTrap, techSupport2n2=techSupport2n2, sl81UserTrap1063=sl81UserTrap1063, deConfigEquation=deConfigEquation, sl81UserTrap1091=sl81UserTrap1091, sl81UserTrap1187=sl81UserTrap1187, sl81UserTrap1025=sl81UserTrap1025, sl81UserTrap1010=sl81UserTrap1010, config=config, sl81UserTrap1067=sl81UserTrap1067, deConfigEntry=deConfigEntry, portConfigDTRLowIdle=portConfigDTRLowIdle, sl81UserTrap1160=sl81UserTrap1160, pagerIDDelay=pagerIDDelay, sl81UserTrap1132=sl81UserTrap1132, modem=modem, sl81UserTrap1033=sl81UserTrap1033, sl81UserTrap1148=sl81UserTrap1148, techSupport3n3=techSupport3n3, sl81UserTrap1029=sl81UserTrap1029, sl81UserTrap1150=sl81UserTrap1150, sl81UserTrap1069=sl81UserTrap1069, sl81UserTrap1194=sl81UserTrap1194, sl81UserTrap1134=sl81UserTrap1134, sl81UserTrap1122=sl81UserTrap1122, techSupport24=techSupport24, sl81UserTrap1056=sl81UserTrap1056, esID=esID, sl81UserTrap1151=sl81UserTrap1151, sl81UserTrap1082=sl81UserTrap1082, techSupport1=techSupport1, snmp=snmp, deConfigClearMode=deConfigClearMode, sl81UserTrap1143=sl81UserTrap1143, sl81UserTrap1078=sl81UserTrap1078, pagers=pagers, network=network, ipConfigDefaultRouter=ipConfigDefaultRouter, sl81UserTrap1177=sl81UserTrap1177, sl81UserTrap1174=sl81UserTrap1174, sl81UserTrap1060=sl81UserTrap1060, deStatusIndex=deStatusIndex, sl81UserTrap1156=sl81UserTrap1156, commandTimeout=commandTimeout, deStatusEntry=deStatusEntry, sl81UserTrap1031=sl81UserTrap1031, 
sl81UserTrap1142=sl81UserTrap1142, serialPorts=serialPorts, sl81UserTrap1101=sl81UserTrap1101, status=status, sl81UserTrap1013=sl81UserTrap1013, deStatusCounter=deStatusCounter, esNumberTempSensors=esNumberTempSensors, pagerPhoneNumber=pagerPhoneNumber, sl81UserTrap1137=sl81UserTrap1137, deFieldStart=deFieldStart, trapIncludedString=trapIncludedString, esNumberEventSensors=esNumberEventSensors, esPointValueInt=esPointValueInt, modemUserSetup=modemUserSetup, sl81UserTrap1145=sl81UserTrap1145, sl81UserTrap1095=sl81UserTrap1095, esPointInEventState=esPointInEventState, techSupport2n1=techSupport2n1, sl81UserTrap1085=sl81UserTrap1085, esRelayReportingMode=esRelayReportingMode, sl81UserTrap1154=sl81UserTrap1154, pagerTable=pagerTable, sl81UserTrap1168=sl81UserTrap1168, sl81UserTrap1090=sl81UserTrap1090, deConfigIndex=deConfigIndex, deStatusLastTriggerData=deStatusLastTriggerData, sl81UserTrap1001=sl81UserTrap1001, eventSensorBasics=eventSensorBasics, esNumberHumidSensors=esNumberHumidSensors, sl81UserTrap1016=sl81UserTrap1016, sl81UserTrap1135=sl81UserTrap1135, sl81UserTrap1094=sl81UserTrap1094, sl81UserTrap1180=sl81UserTrap1180, esEntry=esEntry, sl81UserTrap1155=sl81UserTrap1155, sl81UserTrap1195=sl81UserTrap1195, techSupport21Index=techSupport21Index, sl81UserTrap1072=sl81UserTrap1072, sl81UserTrap1034=sl81UserTrap1034, sl81UserTrap1020=sl81UserTrap1020, esTempReportingMode=esTempReportingMode, sl81UserTrap1076=sl81UserTrap1076, techSupport2=techSupport2, sl81UserTrap1183=sl81UserTrap1183, sl81UserTrap1022=sl81UserTrap1022, omnitronix=omnitronix, esPointTimeLastChange=esPointTimeLastChange)
"""
Tests for buildscripts/update_test_lifecycle.py.
"""
from __future__ import absolute_import
import collections
import copy
import datetime
import unittest
from buildscripts import test_failures
from buildscripts import update_test_lifecycle
from buildscripts.ciconfig import tags as ci_tags
class TestValidateConfig(unittest.TestCase):
"""
Tests for the validate_config() function.
"""
CONFIG = update_test_lifecycle.Config(
test_fail_rates=update_test_lifecycle.Rates(acceptable=0, unacceptable=1),
task_fail_rates=update_test_lifecycle.Rates(acceptable=0, unacceptable=1),
variant_fail_rates=update_test_lifecycle.Rates(acceptable=0, unacceptable=1),
distro_fail_rates=update_test_lifecycle.Rates(acceptable=0, unacceptable=1),
reliable_min_runs=2,
reliable_time_period=datetime.timedelta(days=1),
unreliable_min_runs=2,
unreliable_time_period=datetime.timedelta(days=1))
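The tests below derive variations of this base configuration via `namedtuple._replace`, which returns a modified copy and leaves the original untouched. A minimal sketch of that pattern, using hypothetical stand-in namedtuples (the real `Config` and `Rates` are defined in buildscripts/update_test_lifecycle.py):

```python
import collections

# Hypothetical stand-ins for illustration only; the real Config/Rates
# namedtuples live in buildscripts/update_test_lifecycle.py.
Rates = collections.namedtuple("Rates", ["acceptable", "unacceptable"])
Config = collections.namedtuple("Config", ["test_fail_rates", "task_fail_rates"])

base = Config(test_fail_rates=Rates(acceptable=0, unacceptable=1),
              task_fail_rates=Rates(acceptable=0, unacceptable=1))

# _replace returns a new namedtuple; 'base' itself is left unchanged.
tweaked = base._replace(
    test_fail_rates=base.test_fail_rates._replace(unacceptable=0.1))
```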
def test_acceptable_test_fail_rate(self):
"""
Tests the validation of the 'test_fail_rates.acceptable' attribute.
"""
with self.assertRaises(TypeError):
config = self.CONFIG._replace(
test_fail_rates=self.CONFIG.test_fail_rates._replace(acceptable="not a number"))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
test_fail_rates=self.CONFIG.test_fail_rates._replace(acceptable=-1))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
test_fail_rates=self.CONFIG.test_fail_rates._replace(acceptable=2))
update_test_lifecycle.validate_config(config)
def test_unacceptable_test_fail_rate(self):
"""
Tests the validation of the 'test_fail_rates.unacceptable' attribute.
"""
with self.assertRaises(TypeError):
config = self.CONFIG._replace(
test_fail_rates=self.CONFIG.test_fail_rates._replace(unacceptable="not a number"))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
test_fail_rates=self.CONFIG.test_fail_rates._replace(unacceptable=-1))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
test_fail_rates=self.CONFIG.test_fail_rates._replace(unacceptable=2))
update_test_lifecycle.validate_config(config)
def test_test_fail_rates(self):
"""
Tests the validation of the 'test_fail_rates' attribute.
"""
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
test_fail_rates=self.CONFIG.test_fail_rates._replace(acceptable=0.9,
unacceptable=0.1))
update_test_lifecycle.validate_config(config)
def test_acceptable_task_fail_rate(self):
"""
Tests the validation of the 'task_fail_rates.acceptable' attribute.
"""
with self.assertRaises(TypeError):
config = self.CONFIG._replace(
task_fail_rates=self.CONFIG.task_fail_rates._replace(acceptable="not a number"))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
task_fail_rates=self.CONFIG.task_fail_rates._replace(acceptable=-1))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
task_fail_rates=self.CONFIG.task_fail_rates._replace(acceptable=2))
update_test_lifecycle.validate_config(config)
def test_unacceptable_task_fail_rate(self):
"""
Tests the validation of the 'task_fail_rates.unacceptable' attribute.
"""
with self.assertRaises(TypeError):
config = self.CONFIG._replace(
task_fail_rates=self.CONFIG.task_fail_rates._replace(unacceptable="not a number"))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
task_fail_rates=self.CONFIG.task_fail_rates._replace(unacceptable=-1))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
task_fail_rates=self.CONFIG.task_fail_rates._replace(unacceptable=2))
update_test_lifecycle.validate_config(config)
def test_task_fail_rates(self):
"""
Tests the validation of the 'task_fail_rates' attribute.
"""
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
task_fail_rates=self.CONFIG.task_fail_rates._replace(acceptable=0.9,
unacceptable=0.1))
update_test_lifecycle.validate_config(config)
def test_acceptable_variant_fail_rate(self):
"""
Tests the validation of the 'variant_fail_rates.acceptable' attribute.
"""
with self.assertRaises(TypeError):
config = self.CONFIG._replace(
variant_fail_rates=self.CONFIG.variant_fail_rates._replace(
acceptable="not a number"))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
variant_fail_rates=self.CONFIG.variant_fail_rates._replace(acceptable=-1))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
variant_fail_rates=self.CONFIG.variant_fail_rates._replace(acceptable=2))
update_test_lifecycle.validate_config(config)
def test_unacceptable_variant_fail_rate(self):
"""
Tests the validation of the 'variant_fail_rates.unacceptable' attribute.
"""
with self.assertRaises(TypeError):
config = self.CONFIG._replace(
variant_fail_rates=self.CONFIG.variant_fail_rates._replace(
unacceptable="not a number"))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
variant_fail_rates=self.CONFIG.variant_fail_rates._replace(unacceptable=-1))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
variant_fail_rates=self.CONFIG.variant_fail_rates._replace(unacceptable=2))
update_test_lifecycle.validate_config(config)
def test_variant_fail_rates(self):
"""
Tests the validation of the 'variant_fail_rates' attribute.
"""
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
variant_fail_rates=self.CONFIG.variant_fail_rates._replace(acceptable=0.9,
unacceptable=0.1))
update_test_lifecycle.validate_config(config)
def test_acceptable_distro_fail_rate(self):
"""
Tests the validation of the 'distro_fail_rates.acceptable' attribute.
"""
with self.assertRaises(TypeError):
config = self.CONFIG._replace(
distro_fail_rates=self.CONFIG.distro_fail_rates._replace(acceptable="not a number"))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
distro_fail_rates=self.CONFIG.distro_fail_rates._replace(acceptable=-1))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
distro_fail_rates=self.CONFIG.distro_fail_rates._replace(acceptable=2))
update_test_lifecycle.validate_config(config)
def test_unacceptable_distro_fail_rate(self):
"""
Tests the validation of the 'distro_fail_rates.unacceptable' attribute.
"""
with self.assertRaises(TypeError):
config = self.CONFIG._replace(
distro_fail_rates=self.CONFIG.distro_fail_rates._replace(
unacceptable="not a number"))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
distro_fail_rates=self.CONFIG.distro_fail_rates._replace(unacceptable=-1))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
distro_fail_rates=self.CONFIG.distro_fail_rates._replace(unacceptable=2))
update_test_lifecycle.validate_config(config)
def test_distro_fail_rates(self):
"""
Tests the validation of the 'distro_fail_rates' attribute.
"""
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
distro_fail_rates=self.CONFIG.distro_fail_rates._replace(acceptable=0.9,
unacceptable=0.1))
update_test_lifecycle.validate_config(config)
def test_reliable_min_runs(self):
"""
Tests the validation of the 'reliable_min_runs' attribute.
"""
with self.assertRaises(TypeError):
config = self.CONFIG._replace(reliable_min_runs="not a number")
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(reliable_min_runs=-1)
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(reliable_min_runs=0)
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(reliable_min_runs=1.5)
update_test_lifecycle.validate_config(config)
def test_reliable_time_period(self):
"""
Tests the validation of the 'reliable_time_period' attribute.
"""
with self.assertRaises(TypeError):
config = self.CONFIG._replace(reliable_time_period="not a datetime.timedelta")
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(reliable_time_period=datetime.timedelta(days=-1))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(reliable_time_period=datetime.timedelta(days=0))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(reliable_time_period=datetime.timedelta(days=1, hours=1))
update_test_lifecycle.validate_config(config)
def test_unreliable_min_runs(self):
"""
Tests the validation of the 'unreliable_min_runs' attribute.
"""
with self.assertRaises(TypeError):
config = self.CONFIG._replace(unreliable_min_runs="not a number")
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(unreliable_min_runs=-1)
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(unreliable_min_runs=0)
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(unreliable_min_runs=1.5)
update_test_lifecycle.validate_config(config)
def test_unreliable_time_period(self):
"""
Tests the validation of the 'unreliable_time_period' attribute.
"""
with self.assertRaises(TypeError):
config = self.CONFIG._replace(unreliable_time_period="not a datetime.timedelta")
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(unreliable_time_period=datetime.timedelta(days=-1))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(unreliable_time_period=datetime.timedelta(days=0))
update_test_lifecycle.validate_config(config)
with self.assertRaises(ValueError):
config = self.CONFIG._replace(
unreliable_time_period=datetime.timedelta(days=1, hours=1))
update_test_lifecycle.validate_config(config)
class TestUpdateTags(unittest.TestCase):
"""
Tests for the update_tags() function.
"""
CONFIG = update_test_lifecycle.Config(
test_fail_rates=update_test_lifecycle.Rates(acceptable=0, unacceptable=1),
task_fail_rates=update_test_lifecycle.Rates(acceptable=0, unacceptable=1),
variant_fail_rates=update_test_lifecycle.Rates(acceptable=0, unacceptable=1),
distro_fail_rates=update_test_lifecycle.Rates(acceptable=0, unacceptable=1),
reliable_min_runs=2,
reliable_time_period=datetime.timedelta(days=1),
unreliable_min_runs=2,
unreliable_time_period=datetime.timedelta(days=1))
ENTRY = test_failures.ReportEntry(test="jstests/core/all.js",
task="jsCore_WT",
variant="linux-64",
distro="rhel62",
start_date=datetime.date(2017, 6, 3),
end_date=datetime.date(2017, 6, 3),
num_pass=0,
num_fail=0)
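Each ReportEntry records pass/fail counts for one (test, task, variant, distro) combination over a date range. The failure-rate comparison these tests exercise can be sketched as follows (assumed semantics; the actual computation lives in buildscripts/test_failures.py):

```python
# Assumed semantics for illustration, not the actual implementation in
# buildscripts/test_failures.py: a combination's failure rate is the
# fraction of failed runs, compared against the configured thresholds.
def fail_rate(num_pass, num_fail):
    total = num_pass + num_fail
    return num_fail / float(total) if total else 0.0

# One failure out of two runs gives a rate of 0.5, which exceeds an
# 'unacceptable' threshold of 0.1, so the combination would be tagged.
rate = fail_rate(num_pass=1, num_fail=1)
is_unreliable = rate > 0.1
```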
def assert_has_only_js_tests(self, lifecycle):
"""
Raises an AssertionError exception if 'lifecycle' is not of the following form:
selector:
js_test:
...
"""
self.assertIn("selector", lifecycle.raw)
self.assertEqual(1, len(lifecycle.raw), msg=str(lifecycle.raw))
self.assertIn("js_test", lifecycle.raw["selector"])
self.assertEqual(1, len(lifecycle.raw["selector"]), msg=str(lifecycle.raw))
return lifecycle.raw["selector"]["js_test"]
def transition_from_reliable_to_unreliable(self, config, expected_tags):
"""
Tests that update_tags() tags a formerly reliable combination as being unreliable.
"""
initial_tags = collections.OrderedDict()
lifecycle = ci_tags.TagsConfig.from_dict(
dict(selector=dict(js_test=copy.deepcopy(initial_tags))))
summary_lifecycle = update_test_lifecycle.TagsConfigWithChangelog(lifecycle)
self.assertEqual(collections.OrderedDict(), self.assert_has_only_js_tests(lifecycle))
tests = ["jstests/core/all.js"]
report = test_failures.Report([
self.ENTRY._replace(num_pass=0, num_fail=1),
self.ENTRY._replace(num_pass=0, num_fail=1, task="jsCore"),
self.ENTRY._replace(num_pass=0, num_fail=1, variant="linux-64-debug"),
self.ENTRY._replace(num_pass=1, num_fail=0),
self.ENTRY._replace(num_pass=0, num_fail=1, distro="rhel55"),
])
update_test_lifecycle.validate_config(config)
update_test_lifecycle.update_tags(summary_lifecycle, config, report, tests)
updated_tags = self.assert_has_only_js_tests(lifecycle)
self.assertEqual(updated_tags, expected_tags)
def test_transition_test_from_reliable_to_unreliable(self):
"""
Tests that update_tags() tags a formerly reliable (test,) combination as being unreliable.
"""
config = self.CONFIG._replace(
test_fail_rates=self.CONFIG.test_fail_rates._replace(unacceptable=0.1))
self.transition_from_reliable_to_unreliable(config, collections.OrderedDict([
("jstests/core/all.js", ["unreliable"]),
]))
def test_transition_task_from_reliable_to_unreliable(self):
"""
Tests that update_tags() tags a formerly reliable (test, task) combination as being
unreliable.
"""
config = self.CONFIG._replace(
task_fail_rates=self.CONFIG.task_fail_rates._replace(unacceptable=0.1))
self.transition_from_reliable_to_unreliable(config, collections.OrderedDict([
("jstests/core/all.js", ["unreliable|jsCore_WT"]),
]))
def test_transition_variant_from_reliable_to_unreliable(self):
"""
Tests that update_tags() tags a formerly reliable (test, task, variant) combination as being
unreliable.
"""
config = self.CONFIG._replace(
variant_fail_rates=self.CONFIG.variant_fail_rates._replace(unacceptable=0.1))
self.transition_from_reliable_to_unreliable(config, collections.OrderedDict([
("jstests/core/all.js", ["unreliable|jsCore_WT|linux-64"]),
]))
def test_transition_distro_from_reliable_to_unreliable(self):
"""
Tests that update_tags() tags a formerly reliable (test, task, variant, distro) combination
as being unreliable.
"""
config = self.CONFIG._replace(
distro_fail_rates=self.CONFIG.distro_fail_rates._replace(unacceptable=0.1))
self.transition_from_reliable_to_unreliable(config, collections.OrderedDict([
("jstests/core/all.js", ["unreliable|jsCore_WT|linux-64|rhel62"]),
]))
def test_transition_from_reliable_to_unreliable(self):
"""
Tests that update_tags() tags multiple formerly reliable combinations as being unreliable.
"""
config = self.CONFIG._replace(
test_fail_rates=self.CONFIG.test_fail_rates._replace(unacceptable=0.1),
task_fail_rates=self.CONFIG.task_fail_rates._replace(unacceptable=0.1),
variant_fail_rates=self.CONFIG.variant_fail_rates._replace(unacceptable=0.1),
distro_fail_rates=self.CONFIG.distro_fail_rates._replace(unacceptable=0.1))
self.transition_from_reliable_to_unreliable(config, collections.OrderedDict([
("jstests/core/all.js", [
"unreliable",
"unreliable|jsCore_WT",
"unreliable|jsCore_WT|linux-64",
"unreliable|jsCore_WT|linux-64|rhel62",
]),
]))
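The expected tags in these tests follow a pipe-separated naming scheme: "unreliable" alone scopes to the test, and each appended component narrows the scope to task, then variant, then distro. A hypothetical helper (not part of update_test_lifecycle) showing the convention:

```python
# Hypothetical helper, for illustration only: builds the pipe-separated
# tag names used in the expected results above.
def unreliable_tag(task=None, variant=None, distro=None):
    parts = ["unreliable"]
    for part in (task, variant, distro):
        if part is None:
            break
        parts.append(part)
    return "|".join(parts)
```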
def transition_from_unreliable_to_reliable(self, config, initial_tags):
"""
Tests that update_tags() untags a formerly unreliable combination after it has become
reliable again.
"""
lifecycle = ci_tags.TagsConfig.from_dict(
dict(selector=dict(js_test=copy.deepcopy(initial_tags))))
summary_lifecycle = update_test_lifecycle.TagsConfigWithChangelog(lifecycle)
self.assertEqual(initial_tags, self.assert_has_only_js_tests(lifecycle))
tests = ["jstests/core/all.js"]
report = test_failures.Report([
self.ENTRY._replace(num_pass=1, num_fail=0),
self.ENTRY._replace(num_pass=1, num_fail=0, task="jsCore"),
self.ENTRY._replace(num_pass=1, num_fail=0, variant="linux-64-debug"),
self.ENTRY._replace(num_pass=0, num_fail=1),
self.ENTRY._replace(num_pass=1, num_fail=0, distro="rhel55"),
])
update_test_lifecycle.validate_config(config)
update_test_lifecycle.update_tags(summary_lifecycle, config, report, tests)
updated_tags = self.assert_has_only_js_tests(lifecycle)
self.assertEqual(updated_tags, collections.OrderedDict())
def test_non_running_in_reliable_period_is_reliable(self):
"""
Tests that a test with a failure rate above the unacceptable rate during the unreliable
period, but with no runs during the reliable period, is marked as reliable.
# Unreliable period is 2 days: 2017-06-03 to 2017-06-04.
# Reliable period is 1 day: 2017-06-04.
reliable_period_date = datetime.date(2017, 6, 4)
config = self.CONFIG._replace(
test_fail_rates=self.CONFIG.test_fail_rates._replace(unacceptable=0.1),
task_fail_rates=self.CONFIG.task_fail_rates._replace(unacceptable=0.1),
variant_fail_rates=self.CONFIG.variant_fail_rates._replace(unacceptable=0.1),
distro_fail_rates=self.CONFIG.distro_fail_rates._replace(unacceptable=0.1),
unreliable_time_period=datetime.timedelta(days=2))
tests = ["jstests/core/all.js"]
initial_tags = collections.OrderedDict([
("jstests/core/all.js", [
"unreliable",
"unreliable|jsCore_WT",
"unreliable|jsCore_WT|linux-64",
"unreliable|jsCore_WT|linux-64|rhel62",
]),
])
lifecycle = ci_tags.TagsConfig.from_dict(
dict(selector=dict(js_test=copy.deepcopy(initial_tags))))
summary_lifecycle = update_test_lifecycle.TagsConfigWithChangelog(lifecycle)
self.assertEqual(initial_tags, self.assert_has_only_js_tests(lifecycle))
# The test did not run on the reliable period on linux-64.
report = test_failures.Report([
# Failing.
self.ENTRY._replace(num_pass=0,
num_fail=2),
# Passing on a different variant.
self.ENTRY._replace(start_date=reliable_period_date,
end_date=reliable_period_date,
num_pass=3,
num_fail=0,
variant="linux-alt",
distro="debian8"),
])
update_test_lifecycle.validate_config(config)
update_test_lifecycle.update_tags(summary_lifecycle, config, report, tests)
updated_tags = self.assert_has_only_js_tests(lifecycle)
# The tags for variant and distro have been removed.
self.assertEqual(updated_tags, collections.OrderedDict([
("jstests/core/all.js", ["unreliable", "unreliable|jsCore_WT"])]))
def test_non_running_at_all_is_reliable(self):
"""
Tests that a test tagged as unreliable but no longer running (during either the
reliable or the unreliable period) has all its tags removed.
"""
config = self.CONFIG
tests = ["jstests/core/all.js", "jstests/core/all2.js"]
initial_tags = collections.OrderedDict([
("jstests/core/all2.js", [
"unreliable",
"unreliable|jsCore_WT",
"unreliable|jsCore_WT|linux-64",
"unreliable|jsCore_WT|linux-64|rhel62",
]),
])
lifecycle = ci_tags.TagsConfig.from_dict(
dict(selector=dict(js_test=copy.deepcopy(initial_tags))))
summary_lifecycle = update_test_lifecycle.TagsConfigWithChangelog(lifecycle)
self.assertEqual(initial_tags, self.assert_has_only_js_tests(lifecycle))
# all2.js did not run at all
report = test_failures.Report([self.ENTRY])
update_test_lifecycle.validate_config(config)
update_test_lifecycle.update_tags(summary_lifecycle, config, report, tests)
updated_tags = self.assert_has_only_js_tests(lifecycle)
# All the tags of the test that no longer runs have been removed.
self.assertEqual(updated_tags, collections.OrderedDict([]))
def test_transition_test_from_unreliable_to_reliable(self):
"""
Tests that update_tags() untags a formerly unreliable (test,) combination after it has
become reliable again.
"""
config = self.CONFIG._replace(
test_fail_rates=self.CONFIG.test_fail_rates._replace(acceptable=0.9))
self.transition_from_unreliable_to_reliable(config, collections.OrderedDict([
("jstests/core/all.js", ["unreliable"]),
]))
def test_transition_task_from_unreliable_to_reliable(self):
"""
Tests that update_tags() untags a formerly unreliable (test, task) combination after it has
become reliable again.
"""
config = self.CONFIG._replace(
task_fail_rates=self.CONFIG.task_fail_rates._replace(acceptable=0.9))
self.transition_from_unreliable_to_reliable(config, collections.OrderedDict([
("jstests/core/all.js", ["unreliable|jsCore_WT"]),
]))
def test_transition_variant_from_unreliable_to_reliable(self):
"""
Tests that update_tags() untags a formerly unreliable (test, task, variant) combination
after it has become reliable again.
"""
config = self.CONFIG._replace(
variant_fail_rates=self.CONFIG.variant_fail_rates._replace(acceptable=0.9))
self.transition_from_unreliable_to_reliable(config, collections.OrderedDict([
("jstests/core/all.js", ["unreliable|jsCore_WT|linux-64"]),
]))
def test_transition_distro_from_unreliable_to_reliable(self):
"""
Tests that update_tags() untags a formerly unreliable (test, task, variant, distro)
combination after it has become reliable again.
"""
config = self.CONFIG._replace(
distro_fail_rates=self.CONFIG.distro_fail_rates._replace(acceptable=0.9))
self.transition_from_unreliable_to_reliable(config, collections.OrderedDict([
("jstests/core/all.js", ["unreliable|jsCore_WT|linux-64|rhel62"]),
]))
def test_transition_from_unreliable_to_reliable(self):
"""
Tests that update_tags() untags multiple formerly unreliable combinations after they have
become reliable again.
"""
config = self.CONFIG._replace(
test_fail_rates=self.CONFIG.test_fail_rates._replace(acceptable=0.9),
task_fail_rates=self.CONFIG.task_fail_rates._replace(acceptable=0.9),
variant_fail_rates=self.CONFIG.variant_fail_rates._replace(acceptable=0.9),
distro_fail_rates=self.CONFIG.distro_fail_rates._replace(acceptable=0.9))
self.transition_from_unreliable_to_reliable(config, collections.OrderedDict([
("jstests/core/all.js", [
"unreliable",
"unreliable|jsCore_WT",
"unreliable|jsCore_WT|linux-64",
"unreliable|jsCore_WT|linux-64|rhel62",
]),
]))
def test_remain_reliable(self):
"""
Tests that update_tags() preserves the absence of tags for reliable combinations.
"""
config = self.CONFIG._replace(
test_fail_rates=self.CONFIG.test_fail_rates._replace(acceptable=0.9),
task_fail_rates=self.CONFIG.task_fail_rates._replace(acceptable=0.9),
variant_fail_rates=self.CONFIG.variant_fail_rates._replace(acceptable=0.9),
distro_fail_rates=self.CONFIG.distro_fail_rates._replace(acceptable=0.9))
initial_tags = collections.OrderedDict()
lifecycle = ci_tags.TagsConfig.from_dict(
dict(selector=dict(js_test=copy.deepcopy(initial_tags))))
summary_lifecycle = update_test_lifecycle.TagsConfigWithChangelog(lifecycle)
self.assertEqual(initial_tags, self.assert_has_only_js_tests(lifecycle))
tests = ["jstests/core/all.js"]
report = test_failures.Report([
self.ENTRY._replace(num_pass=1, num_fail=0),
self.ENTRY._replace(num_pass=1, num_fail=0, task="jsCore"),
self.ENTRY._replace(num_pass=1, num_fail=0, variant="linux-64-debug"),
self.ENTRY._replace(num_pass=0, num_fail=1),
self.ENTRY._replace(num_pass=1, num_fail=0, distro="rhel55"),
])
update_test_lifecycle.validate_config(config)
update_test_lifecycle.update_tags(summary_lifecycle, config, report, tests)
updated_tags = self.assert_has_only_js_tests(lifecycle)
self.assertEqual(updated_tags, initial_tags)
def test_remain_unreliable(self):
"""
Tests that update_tags() preserves the tags for unreliable combinations.
"""
config = self.CONFIG._replace(
test_fail_rates=self.CONFIG.test_fail_rates._replace(unacceptable=0.1),
task_fail_rates=self.CONFIG.task_fail_rates._replace(unacceptable=0.1),
variant_fail_rates=self.CONFIG.variant_fail_rates._replace(unacceptable=0.1),
distro_fail_rates=self.CONFIG.distro_fail_rates._replace(unacceptable=0.1))
initial_tags = collections.OrderedDict([
("jstests/core/all.js", [
"unreliable",
"unreliable|jsCore_WT",
"unreliable|jsCore_WT|linux-64",
"unreliable|jsCore_WT|linux-64|rhel62",
]),
])
lifecycle = ci_tags.TagsConfig.from_dict(
dict(selector=dict(js_test=copy.deepcopy(initial_tags))))
summary_lifecycle = update_test_lifecycle.TagsConfigWithChangelog(lifecycle)
self.assertEqual(initial_tags, self.assert_has_only_js_tests(lifecycle))
tests = ["jstests/core/all.js"]
report = test_failures.Report([
self.ENTRY._replace(num_pass=0, num_fail=1),
self.ENTRY._replace(num_pass=0, num_fail=1, task="jsCore"),
self.ENTRY._replace(num_pass=0, num_fail=1, variant="linux-64-debug"),
self.ENTRY._replace(num_pass=1, num_fail=0),
self.ENTRY._replace(num_pass=0, num_fail=1, distro="rhel55"),
])
update_test_lifecycle.validate_config(config)
update_test_lifecycle.update_tags(summary_lifecycle, config, report, tests)
updated_tags = self.assert_has_only_js_tests(lifecycle)
self.assertEqual(updated_tags, initial_tags)
def test_obeys_reliable_min_runs(self):
"""
Tests that update_tags() considers a test reliable if it has fewer than 'reliable_min_runs' runs.
"""
config = self.CONFIG._replace(
test_fail_rates=self.CONFIG.test_fail_rates._replace(acceptable=0.9),
task_fail_rates=self.CONFIG.task_fail_rates._replace(acceptable=0.9),
variant_fail_rates=self.CONFIG.variant_fail_rates._replace(acceptable=0.9),
distro_fail_rates=self.CONFIG.distro_fail_rates._replace(acceptable=0.9),
reliable_min_runs=100)
self.transition_from_unreliable_to_reliable(config, collections.OrderedDict([
("jstests/core/all.js", [
"unreliable",
"unreliable|jsCore_WT",
"unreliable|jsCore_WT|linux-64",
"unreliable|jsCore_WT|linux-64|rhel62",
]),
]))
def test_obeys_reliable_time_period(self):
"""
Tests that update_tags() ignores passes from before 'reliable_time_period'.
"""
config = self.CONFIG._replace(
test_fail_rates=self.CONFIG.test_fail_rates._replace(acceptable=0.9),
task_fail_rates=self.CONFIG.task_fail_rates._replace(acceptable=0.9),
variant_fail_rates=self.CONFIG.variant_fail_rates._replace(acceptable=0.9),
distro_fail_rates=self.CONFIG.distro_fail_rates._replace(acceptable=0.9))
initial_tags = collections.OrderedDict()
lifecycle = ci_tags.TagsConfig.from_dict(
dict(selector=dict(js_test=copy.deepcopy(initial_tags))))
summary_lifecycle = update_test_lifecycle.TagsConfigWithChangelog(lifecycle)
self.assertEqual(initial_tags, self.assert_has_only_js_tests(lifecycle))
tests = ["jstests/core/all.js"]
report = test_failures.Report([
self.ENTRY._replace(start_date=(self.ENTRY.start_date - datetime.timedelta(days=1)),
end_date=(self.ENTRY.end_date - datetime.timedelta(days=1)),
num_pass=1,
num_fail=0),
self.ENTRY._replace(start_date=(self.ENTRY.start_date - datetime.timedelta(days=2)),
end_date=(self.ENTRY.end_date - datetime.timedelta(days=2)),
num_pass=1,
num_fail=0),
self.ENTRY._replace(num_pass=0, num_fail=1),
self.ENTRY._replace(num_pass=0, num_fail=1),
self.ENTRY._replace(num_pass=0, num_fail=1, task="jsCore"),
self.ENTRY._replace(num_pass=0, num_fail=1, variant="linux-64-debug"),
self.ENTRY._replace(num_pass=0, num_fail=1, distro="rhel55"),
])
update_test_lifecycle.validate_config(config)
update_test_lifecycle.update_tags(summary_lifecycle, config, report, tests)
updated_tags = self.assert_has_only_js_tests(lifecycle)
self.assertEqual(updated_tags, collections.OrderedDict([
("jstests/core/all.js", [
"unreliable",
"unreliable|jsCore_WT",
"unreliable|jsCore_WT|linux-64",
"unreliable|jsCore_WT|linux-64|rhel62",
]),
]))
def test_obeys_unreliable_min_runs(self):
"""
        Tests that update_tags() only considers a test unreliable if it has more than
        'unreliable_min_runs' runs.
"""
config = self.CONFIG._replace(
test_fail_rates=self.CONFIG.test_fail_rates._replace(unacceptable=0.1),
task_fail_rates=self.CONFIG.task_fail_rates._replace(unacceptable=0.1),
variant_fail_rates=self.CONFIG.variant_fail_rates._replace(unacceptable=0.1),
distro_fail_rates=self.CONFIG.distro_fail_rates._replace(unacceptable=0.1),
unreliable_min_runs=100)
initial_tags = collections.OrderedDict()
lifecycle = ci_tags.TagsConfig.from_dict(
dict(selector=dict(js_test=copy.deepcopy(initial_tags))))
summary_lifecycle = update_test_lifecycle.TagsConfigWithChangelog(lifecycle)
self.assertEqual(initial_tags, self.assert_has_only_js_tests(lifecycle))
tests = ["jstests/core/all.js"]
report = test_failures.Report([
self.ENTRY._replace(num_pass=0, num_fail=1),
self.ENTRY._replace(num_pass=0, num_fail=1, task="jsCore"),
self.ENTRY._replace(num_pass=0, num_fail=1, variant="linux-64-debug"),
self.ENTRY._replace(num_pass=1, num_fail=0),
self.ENTRY._replace(num_pass=0, num_fail=1, distro="rhel55"),
])
update_test_lifecycle.validate_config(config)
update_test_lifecycle.update_tags(summary_lifecycle, config, report, tests)
updated_tags = self.assert_has_only_js_tests(lifecycle)
self.assertEqual(updated_tags, initial_tags)
def test_obeys_unreliable_time_period(self):
"""
Tests that update_tags() ignores failures from before 'unreliable_time_period'.
"""
config = self.CONFIG._replace(
test_fail_rates=self.CONFIG.test_fail_rates._replace(unacceptable=0.1),
task_fail_rates=self.CONFIG.task_fail_rates._replace(unacceptable=0.1),
variant_fail_rates=self.CONFIG.variant_fail_rates._replace(unacceptable=0.1),
distro_fail_rates=self.CONFIG.distro_fail_rates._replace(unacceptable=0.1))
initial_tags = collections.OrderedDict([
("jstests/core/all.js", [
"unreliable",
"unreliable|jsCore_WT",
"unreliable|jsCore_WT|linux-64",
"unreliable|jsCore_WT|linux-64|rhel62",
]),
])
lifecycle = ci_tags.TagsConfig.from_dict(
dict(selector=dict(js_test=copy.deepcopy(initial_tags))))
summary_lifecycle = update_test_lifecycle.TagsConfigWithChangelog(lifecycle)
self.assertEqual(initial_tags, self.assert_has_only_js_tests(lifecycle))
tests = ["jstests/core/all.js"]
report = test_failures.Report([
self.ENTRY._replace(start_date=(self.ENTRY.start_date - datetime.timedelta(days=1)),
end_date=(self.ENTRY.end_date - datetime.timedelta(days=1)),
num_pass=0,
num_fail=1),
self.ENTRY._replace(start_date=(self.ENTRY.start_date - datetime.timedelta(days=2)),
end_date=(self.ENTRY.end_date - datetime.timedelta(days=2)),
num_pass=0,
num_fail=1),
self.ENTRY._replace(num_pass=1, num_fail=0),
self.ENTRY._replace(num_pass=1, num_fail=0),
self.ENTRY._replace(num_pass=1, num_fail=0, task="jsCore"),
self.ENTRY._replace(num_pass=1, num_fail=0, variant="linux-64-debug"),
self.ENTRY._replace(num_pass=1, num_fail=0, distro="rhel55"),
])
update_test_lifecycle.validate_config(config)
update_test_lifecycle.update_tags(summary_lifecycle, config, report, tests)
updated_tags = self.assert_has_only_js_tests(lifecycle)
self.assertEqual(updated_tags, collections.OrderedDict())
class TestCombinationHelpers(unittest.TestCase):
def test_from_entry(self):
entry = test_failures._ReportEntry(
"testA", "taskA", "variantA", "distroA",
datetime.date.today(),
datetime.date.today(), 0, 0)
combination = update_test_lifecycle._test_combination_from_entry(
entry, test_failures.Report.TEST)
self.assertEqual(combination, ("testA",))
combination = update_test_lifecycle._test_combination_from_entry(
entry, test_failures.Report.TEST_TASK)
self.assertEqual(combination, ("testA", "taskA"))
combination = update_test_lifecycle._test_combination_from_entry(
entry, test_failures.Report.TEST_TASK_VARIANT)
self.assertEqual(combination, ("testA", "taskA", "variantA"))
combination = update_test_lifecycle._test_combination_from_entry(
entry, test_failures.Report.TEST_TASK_VARIANT_DISTRO)
self.assertEqual(combination, ("testA", "taskA", "variantA", "distroA"))
def test_make_from_tag(self):
test = "testA"
combination = update_test_lifecycle._test_combination_from_tag(
test, "unreliable")
self.assertEqual(combination, ("testA",))
combination = update_test_lifecycle._test_combination_from_tag(
test, "unreliable|taskA")
self.assertEqual(combination, ("testA", "taskA"))
combination = update_test_lifecycle._test_combination_from_tag(
test, "unreliable|taskA|variantA")
self.assertEqual(combination, ("testA", "taskA", "variantA"))
combination = update_test_lifecycle._test_combination_from_tag(
test, "unreliable|taskA|variantA|distroA")
self.assertEqual(combination, ("testA", "taskA", "variantA", "distroA"))
class TestCleanUpTags(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.evg = MockEvergreenConfig(["task1", "task2", "task3"],
{"variant1": {"tasks": ["task1", "task2"],
"distros": ["distro1"]},
"variant2": {"tasks": ["task3"],
"distros": ["distro2"]}})
def test_is_unreliable_tag_relevant(self):
self.assertTrue(update_test_lifecycle._is_tag_still_relevant(self.evg, "unreliable"))
def test_is_unknown_task_relevant(self):
self.assertFalse(update_test_lifecycle._is_tag_still_relevant(
self.evg, "unreliable|task_unknown"))
def test_is_known_task_relevant(self):
self.assertTrue(update_test_lifecycle._is_tag_still_relevant(
self.evg, "unreliable|task1"))
self.assertTrue(update_test_lifecycle._is_tag_still_relevant(
self.evg, "unreliable|task2"))
self.assertTrue(update_test_lifecycle._is_tag_still_relevant(
self.evg, "unreliable|task3"))
def test_is_unknown_variant_relevant(self):
self.assertFalse(update_test_lifecycle._is_tag_still_relevant(
self.evg, "unreliable|task1|variant3"
))
def test_is_unknown_task_variant_relevant(self):
self.assertFalse(update_test_lifecycle._is_tag_still_relevant(
self.evg, "unreliable|task3|variant1"))
self.assertFalse(update_test_lifecycle._is_tag_still_relevant(
self.evg, "unreliable|task1|variant2"))
def test_is_known_task_variant_relevant(self):
self.assertTrue(update_test_lifecycle._is_tag_still_relevant(
self.evg, "unreliable|task1|variant1"))
self.assertTrue(update_test_lifecycle._is_tag_still_relevant(
self.evg, "unreliable|task2|variant1"))
self.assertTrue(update_test_lifecycle._is_tag_still_relevant(
self.evg, "unreliable|task3|variant2"))
def test_is_unknown_task_variant_distro_relevant(self):
self.assertFalse(update_test_lifecycle._is_tag_still_relevant(
self.evg, "unreliable|task1|variant1|distro2"))
self.assertFalse(update_test_lifecycle._is_tag_still_relevant(
self.evg, "unreliable|task3|variant2|distro1"))
def test_is_known_task_variant_distro_relevant(self):
self.assertTrue(update_test_lifecycle._is_tag_still_relevant(
self.evg, "unreliable|task1|variant1|distro1"))
self.assertTrue(update_test_lifecycle._is_tag_still_relevant(
self.evg, "unreliable|task3|variant2|distro2"))
class MockEvergreenConfig(object):
def __init__(self, tasks, variants):
self.task_names = tasks
self.variants = {}
for name, fields in variants.items():
self.variants[name] = MockVariant(fields["tasks"], fields["distros"])
def get_variant(self, variant_name):
return self.variants.get(variant_name)
class MockVariant(object):
def __init__(self, task_names, distros):
self.task_names = task_names
self.distros = distros
class TestJiraIssueCreator(unittest.TestCase):
def test_description(self):
data = {"js_test": {"testfile1": {"tag1": 0.1, "tag2": 0.2},
"testfile2": {"tag1": 0.1, "tag3": 0.3}}}
desc = update_test_lifecycle.JiraIssueCreator._make_updated_tags_description(data)
expected = ("- *js_test*\n"
"-- {{testfile1}}\n"
"--- {{tag1}} (0.10)\n"
"--- {{tag2}} (0.20)\n"
"-- {{testfile2}}\n"
"--- {{tag1}} (0.10)\n"
"--- {{tag3}} (0.30)")
self.assertEqual(expected, desc)
def test_description_empty(self):
data = {}
desc = update_test_lifecycle.JiraIssueCreator._make_updated_tags_description(data)
expected = "_None_"
self.assertEqual(expected, desc)
def test_clean_up_description(self):
data = {"js_test": {"testfile1": ["tag1", "tag2"],
"testfile2": []}}
desc = update_test_lifecycle.JiraIssueCreator._make_tags_cleaned_up_description(data)
expected = ("- *js_test*\n"
"-- {{testfile1}}\n"
"--- {{tag1}}\n"
"--- {{tag2}}\n"
"-- {{testfile2}}\n"
"--- ALL (test file removed or renamed as part of an earlier commit)")
self.assertEqual(expected, desc)
def test_clean_up_description_empty(self):
data = {}
desc = update_test_lifecycle.JiraIssueCreator._make_tags_cleaned_up_description(data)
expected = "_None_"
self.assertEqual(expected, desc)
def test_truncate_description(self):
desc = "a" * (update_test_lifecycle.JiraIssueCreator._MAX_DESCRIPTION_SIZE - 1)
self.assertTrue(desc == update_test_lifecycle.JiraIssueCreator._truncate_description(desc))
desc += "a"
self.assertTrue(desc == update_test_lifecycle.JiraIssueCreator._truncate_description(desc))
desc += "a"
self.assertTrue(len(update_test_lifecycle.JiraIssueCreator._truncate_description(desc)) <=
update_test_lifecycle.JiraIssueCreator._MAX_DESCRIPTION_SIZE)
class TestTagsConfigWithChangelog(unittest.TestCase):
def setUp(self):
lifecycle = ci_tags.TagsConfig({"selector": {}})
self.summary_lifecycle = update_test_lifecycle.TagsConfigWithChangelog(lifecycle)
def test_add_tag(self):
self.summary_lifecycle.add_tag("js_test", "testfile1", "tag1", 0.1)
self.assertEqual({"js_test": {"testfile1": {"tag1": 0.1}}}, self.summary_lifecycle.added)
def test_remove_tag(self):
self.summary_lifecycle.lifecycle.add_tag("js_test", "testfile1", "tag1")
self.summary_lifecycle.remove_tag("js_test", "testfile1", "tag1", 0.1)
self.assertEqual({"js_test": {"testfile1": {"tag1": 0.1}}}, self.summary_lifecycle.removed)
def test_add_remove_tag(self):
self.summary_lifecycle.add_tag("js_test", "testfile1", "tag1", 0.1)
self.summary_lifecycle.remove_tag("js_test", "testfile1", "tag1", 0.4)
self.assertEqual({}, self.summary_lifecycle.added)
self.assertEqual({}, self.summary_lifecycle.removed)
def test_remove_add_tag(self):
self.summary_lifecycle.lifecycle.add_tag("js_test", "testfile1", "tag1")
self.summary_lifecycle.remove_tag("js_test", "testfile1", "tag1", 0.1)
self.summary_lifecycle.add_tag("js_test", "testfile1", "tag1", 0.1)
self.assertEqual({}, self.summary_lifecycle.added)
self.assertEqual({}, self.summary_lifecycle.removed)
| 44.2 | 100 | 0.652532 | 5,234 | 46,410 | 5.466183 | 0.047 | 0.052849 | 0.077036 | 0.047815 | 0.906466 | 0.894443 | 0.877246 | 0.853618 | 0.837784 | 0.823209 | 0 | 0.014832 | 0.246046 | 46,410 | 1,049 | 101 | 44.242135 | 0.802806 | 0.079746 | 0 | 0.695105 | 0 | 0 | 0.076991 | 0.024297 | 0 | 0 | 0 | 0 | 0.156643 | 1 | 0.085315 | false | 0.058741 | 0.011189 | 0.001399 | 0.114685 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
81059fea219c408e2dc4755437f08b018cbb621e | 1,829 | py | Python | hdwallet/exceptions.py | phoenixburton/xhdwallet | 37e450fd4eaceff25786b42c56946e29b041375d | [
"ISC"
] | 188 | 2020-10-29T14:26:16.000Z | 2022-03-29T12:18:42.000Z | hdwallet/exceptions.py | phoenixburton/xhdwallet | 37e450fd4eaceff25786b42c56946e29b041375d | [
"ISC"
] | 31 | 2020-12-22T16:22:27.000Z | 2022-03-28T08:29:10.000Z | hdwallet/exceptions.py | phoenixburton/xhdwallet | 37e450fd4eaceff25786b42c56946e29b041375d | [
"ISC"
] | 55 | 2021-03-08T04:35:24.000Z | 2022-03-17T18:40:12.000Z | #!/usr/bin/env python
from typing import Optional
class DerivationError(Exception):
def __init__(self, error_message: str, error_detail: Optional[str] = None):
self.error_message = error_message
self.error_detail = error_detail
def __str__(self):
if self.error_detail:
return f"{self.error_message}, {self.error_detail}"
return f"{self.error_message}"
class SemanticError(Exception):
def __init__(self, error_message: str, error_detail: Optional[str] = None):
self.error_message = error_message
self.error_detail = error_detail
def __str__(self):
if self.error_detail:
return f"{self.error_message}, {self.error_detail}"
return f"{self.error_message}"
class AddressError(Exception):
def __init__(self, error_message: str, error_detail: Optional[str] = None):
self.error_message = error_message
self.error_detail = error_detail
def __str__(self):
if self.error_detail:
return f"{self.error_message}, {self.error_detail}"
return f"{self.error_message}"
class SymbolError(Exception):
def __init__(self, error_message: str, error_detail: Optional[str] = None):
self.error_message = error_message
self.error_detail = error_detail
def __str__(self):
if self.error_detail:
return f"{self.error_message}, {self.error_detail}"
return f"{self.error_message}"
class NetworkError(Exception):
def __init__(self, error_message: str, error_detail: Optional[str] = None):
self.error_message = error_message
self.error_detail = error_detail
def __str__(self):
if self.error_detail:
return f"{self.error_message}, {self.error_detail}"
return f"{self.error_message}"
| 28.578125 | 79 | 0.673592 | 228 | 1,829 | 5.008772 | 0.114035 | 0.275832 | 0.28021 | 0.183888 | 0.906305 | 0.906305 | 0.906305 | 0.906305 | 0.906305 | 0.906305 | 0 | 0 | 0.22526 | 1,829 | 63 | 80 | 29.031746 | 0.805928 | 0.010935 | 0 | 0.853659 | 0 | 0 | 0.168695 | 0.058075 | 0 | 0 | 0 | 0 | 0 | 1 | 0.243902 | false | 0 | 0.02439 | 0 | 0.634146 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
8139f396363f53a54f6b7a6a11e2a40fbfd6bbaf | 160 | py | Python | bomberman/game/board_elements/__init__.py | NaIwo/BomberManAI | 971673a133bc578d503b950bf655e37eeb75641b | [
"MIT"
] | 1 | 2022-03-21T22:48:51.000Z | 2022-03-21T22:48:51.000Z | bomberman/game/board_elements/__init__.py | NaIwo/BomberManAI | 971673a133bc578d503b950bf655e37eeb75641b | [
"MIT"
] | null | null | null | bomberman/game/board_elements/__init__.py | NaIwo/BomberManAI | 971673a133bc578d503b950bf655e37eeb75641b | [
"MIT"
] | null | null | null | from bomberman.game.board_elements.player import Player
from bomberman.game.board_elements.bomb import Bomb
from bomberman.game.board_elements.coin import Coin
| 40 | 55 | 0.86875 | 24 | 160 | 5.666667 | 0.375 | 0.286765 | 0.375 | 0.485294 | 0.661765 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075 | 160 | 3 | 56 | 53.333333 | 0.918919 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
d4b6656ceaae9a4048614caece23d27139f3ce1d | 117 | py | Python | gym_gmazes/envs/__init__.py | perrin-isir/gym-gmazes | ceb94b4e07e6c75ba3315c0daabc210cf352ce4e | [
"BSD-3-Clause"
] | null | null | null | gym_gmazes/envs/__init__.py | perrin-isir/gym-gmazes | ceb94b4e07e6c75ba3315c0daabc210cf352ce4e | [
"BSD-3-Clause"
] | null | null | null | gym_gmazes/envs/__init__.py | perrin-isir/gym-gmazes | ceb94b4e07e6c75ba3315c0daabc210cf352ce4e | [
"BSD-3-Clause"
] | null | null | null | from gym_gmazes.envs.gmaze_dubins import GMazeDubins, GMazeGoalDubins
from gym_gmazes.envs.maze.maze import get_maze
| 39 | 69 | 0.871795 | 18 | 117 | 5.444444 | 0.611111 | 0.142857 | 0.265306 | 0.346939 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 117 | 2 | 70 | 58.5 | 0.907407 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
be0c89f10ee45d00418c8bbf34918ffafb060413 | 14,158 | py | Python | torchsparse/nn/modules/conv.py | ashawkey/torchsparse | 775120df77e8daea2011ff16580cbaa2c103540b | [
"MIT"
] | 1 | 2021-11-01T16:57:22.000Z | 2021-11-01T16:57:22.000Z | torchsparse/nn/modules/conv.py | ashawkey/torchsparse | 775120df77e8daea2011ff16580cbaa2c103540b | [
"MIT"
] | null | null | null | torchsparse/nn/modules/conv.py | ashawkey/torchsparse | 775120df77e8daea2011ff16580cbaa2c103540b | [
"MIT"
] | null | null | null | import math
import numpy as np
import torch
from torch import nn
from torchsparse.sparse_tensor import *
from torchsparse.utils import make_list
from ..functional import *
__all__ = ['Conv3d', 'DeformConv3d', 'Conv4d', 'DepthwiseConv4d', 'DepthwiseConv3d', 'Conv3_5d']
class Conv3d(nn.Module):
def __init__(self,
in_channels,
out_channels,
kernel_size=3,
stride=1,
dilation=1,
bias=False,
transpose=False):
super().__init__()
self.in_channels = in_channels
self.out_channels = out_channels
self.kernel_size = make_list(kernel_size)
self.stride = make_list(stride)
self.dilation = make_list(dilation)
if np.prod(self.kernel_size) > 1:
self.kernel = nn.Parameter(torch.zeros(np.prod(self.kernel_size), in_channels, out_channels))
else:
assert not transpose
self.kernel = nn.Parameter(torch.zeros(in_channels, out_channels))
self.bias = None if not bias else nn.Parameter(torch.zeros(out_channels))
self.t = transpose
self.init_weight()
    def __repr__(self):
        # kernel_size/stride/dilation are lists after make_list(), so use %s rather than %d.
        if not self.t:
            return 'Conv3d(in_channels=%d, out_channels=%d, kernel_size=%s, stride=%s, dilation=%s)' % (
                self.in_channels, self.out_channels, self.kernel_size,
                self.stride, self.dilation)
        else:
            return 'TransposedConv3d(in_channels=%d, out_channels=%d, kernel_size=%s, stride=%s, dilation=%s)' % (
                self.in_channels, self.out_channels, self.kernel_size,
                self.stride, self.dilation)
def init_weight(self):
std = 1. / math.sqrt(self.out_channels if self.t else self.in_channels * np.prod(self.kernel_size))
self.kernel.data.uniform_(-std, std)
if self.bias is not None:
self.bias.data.uniform_(-std, std)
def forward(self, inputs):
return conv3d(inputs,
self.kernel,
self.bias,
kernel_size=self.kernel_size,
stride=self.stride,
dilation=self.dilation,
transpose=self.t)
'''
class XConv3d(nn.Module):
def __init__(self,
in_channels: int,
out_channels: int,
kernel_size: int = 3,
stride: int = 1,
dilation: int = 1,
bias: bool = False,
transpose: bool = False) -> None:
super().__init__()
self.in_channels = in_channels = in_channels
self.out_channels = out_channels = out_channels
self.kernel_size = kernel_size
self.stride = stride
self.dilation = dilation
self.kernel = nn.Parameter(torch.zeros(self.kernel_size ** 3, in_channels, out_channels)) if self.kernel_size > 1 else \
nn.Parameter(torch.zeros(in_channels, out_channels))
self.bias = None if not bias else nn.Parameter(torch.zeros(out_channels))
self.t = transpose
self.init_weight()
if kernel_size == 1:
assert not transpose
def __repr__(self):
if not self.t:
return 'XConv3d(in_channels=%d, out_channels=%d, kernel_size=%d, stride=%d, dilation=%d)' % (
self.in_channels, self.out_channels, self.kernel_size,
self.stride, self.dilation)
else:
return 'XConv3d(in_channels=%d, out_channels=%d, kernel_size=%d, stride=%d, dilation=%d)' % (
self.in_channels, self.out_channels, self.kernel_size,
self.stride, self.dilation)
def init_weight(self):
std = 1. / math.sqrt(
self.out_channels if self.t else self.in_channels *
(self.kernel_size ** 3))
self.kernel.data.uniform_(-std, std)
if self.bias is not None:
self.bias.data.uniform_(-std, std)
def forward(self, inputs, space):
return xconv3d(inputs, space,
self.kernel,
self.bias,
stride=self.stride,
dilation=self.dilation,
transpose=self.t)
'''
class DeformConv3d(nn.Module):
def __init__(self,
in_channels,
out_channels,
kernel_size=3,
stride=1,
dilation=1,
bias=False,
transpose=False):
super().__init__()
self.in_channels = in_channels
self.out_channels = out_channels
self.kernel_size = make_list(kernel_size)
self.stride = make_list(stride)
self.dilation = make_list(dilation)
assert(np.prod(kernel_size) > 1)
assert(not transpose)
self.kernel = nn.Parameter(torch.zeros(np.prod(self.kernel_size), in_channels, out_channels))
        self.bias = None if not bias else nn.Parameter(torch.zeros(out_channels))
        self.t = transpose  # forward() reads self.t; transpose is asserted False above
self.pkernel = nn.Parameter(torch.zeros(np.prod(self.kernel_size), in_channels, 3 * np.prod(self.kernel_size)))
self.pkernel.register_hook(self._set_lr)
self.init_weight()
@staticmethod
def _set_lr(grad):
return grad * 0.1
    def __repr__(self):
        # kernel_size/stride/dilation are lists after make_list(), so use %s rather than %d.
        return 'DeformConv3d(in_channels=%d, out_channels=%d, kernel_size=%s, stride=%s, dilation=%s)' % (
            self.in_channels, self.out_channels, self.kernel_size,
            self.stride, self.dilation)
def init_weight(self):
std = 1. / math.sqrt(self.in_channels * np.prod(self.kernel_size))
self.kernel.data.uniform_(-std, std)
if self.bias is not None:
self.bias.data.uniform_(-std, std)
self.pkernel.data.zero_()
def forward(self, inputs):
return deformconv3d(inputs,
self.kernel,
self.bias,
self.pkernel,
kernel_size=self.kernel_size,
stride=self.stride,
dilation=self.dilation,
transpose=self.t)
class Conv4d(nn.Module):
def __init__(self,
in_channels,
out_channels,
kernel_size=3,
stride=1,
dilation=1,
bias=False,
transpose=False):
super().__init__()
self.in_channels = in_channels
self.out_channels = out_channels
self.kernel_size = make_list(kernel_size, 4)
self.stride = make_list(stride, 4)
self.dilation = make_list(dilation, 4)
if isinstance(self.kernel_size, str):
if self.kernel_size == 'hypercross':
self.kernel = nn.Parameter(torch.zeros(9, in_channels, out_channels))
else:
raise NotImplementedError
elif np.prod(self.kernel_size) > 1:
self.kernel = nn.Parameter(torch.zeros(np.prod(self.kernel_size), in_channels, out_channels))
else:
assert not transpose
self.kernel = nn.Parameter(torch.zeros(in_channels, out_channels))
self.bias = None if not bias else nn.Parameter(torch.zeros(out_channels))
self.t = transpose
self.init_weight()
if kernel_size == 1:
assert not transpose
def __repr__(self):
if not self.t:
return 'Conv4d(in_channels={}, out_channels={}, kernel_size={}, stride={}, dilation={})'.format(
self.in_channels, self.out_channels, self.kernel_size,
self.stride, self.dilation)
else:
return 'TransposedConv4d(in_channels={}, out_channels={}, kernel_size={}, stride={}, dilation={})'.format(
self.in_channels, self.out_channels, self.kernel_size,
self.stride, self.dilation)
def init_weight(self):
std = 1. / math.sqrt(self.out_channels if self.t else self.in_channels * self.kernel.shape[0])
self.kernel.data.uniform_(-std, std)
if self.bias is not None:
self.bias.data.uniform_(-std, std)
def forward(self, inputs):
return conv4d(inputs,
self.kernel,
self.bias,
kernel_size=self.kernel_size,
stride=self.stride,
dilation=self.dilation,
transpose=self.t)
# channel-separated (depth-wise group conv)
class DepthwiseConv3d(nn.Module):
def __init__(self,
in_channels,
kernel_size=3,
stride=1,
dilation=1,
bias=False,
transpose=False):
super().__init__()
self.in_channels = in_channels
self.kernel_size = make_list(kernel_size)
self.stride = make_list(stride)
self.dilation = make_list(dilation)
assert (np.prod(self.kernel_size) > 1)
self.kernel = nn.Parameter(torch.zeros(np.prod(self.kernel_size), in_channels, 1))
self.bias = None
self.t = transpose
self.init_weight()
    def __repr__(self):
        # kernel_size/stride/dilation are lists after make_list(), so use %s rather than %d.
        return 'DepthwiseConv3d(in_channels=%d, kernel_size=%s, stride=%s, dilation=%s)' % (
            self.in_channels, self.kernel_size,
            self.stride, self.dilation)
def init_weight(self):
std = 1. / math.sqrt(self.in_channels * np.prod(self.kernel_size))
self.kernel.data.uniform_(-std, std)
if self.bias is not None:
self.bias.data.uniform_(-std, std)
def forward(self, inputs):
return dwconv3d(inputs,
self.kernel,
self.bias,
kernel_size=self.kernel_size,
stride=self.stride,
dilation=self.dilation,
transpose=self.t)
class DepthwiseConv4d(nn.Module):
def __init__(self,
in_channels,
kernel_size=3,
stride=1,
dilation=1,
bias=False,
transpose=False):
super().__init__()
self.in_channels = in_channels
self.kernel_size = make_list(kernel_size, 4)
self.stride = make_list(stride, 4)
self.dilation = make_list(dilation, 4)
assert (np.prod(self.kernel_size) > 1)
self.kernel = nn.Parameter(torch.zeros(np.prod(self.kernel_size), in_channels, 1))
self.bias = None
self.t = transpose
self.init_weight()
    def __repr__(self):
        # kernel_size/stride/dilation are lists after make_list(), so use %s rather than %d.
        return 'DepthwiseConv4d(in_channels=%d, kernel_size=%s, stride=%s, dilation=%s)' % (
            self.in_channels, self.kernel_size,
            self.stride, self.dilation)
def init_weight(self):
std = 1. / math.sqrt(self.in_channels * np.prod(self.kernel_size))
self.kernel.data.uniform_(-std, std)
if self.bias is not None:
self.bias.data.uniform_(-std, std)
def forward(self, inputs):
return dwconv4d(inputs,
self.kernel,
self.bias,
kernel_size=self.kernel_size,
stride=self.stride,
dilation=self.dilation,
transpose=self.t)
class Conv3_5d(nn.Module):
def __init__(self,
in_channels,
out_channels,
kernel_size=3,
stride=1,
dilation=1,
bias=False,
transpose=False):
super().__init__()
self.in_channels = in_channels
self.out_channels = out_channels
assert isinstance(kernel_size, int)
assert isinstance(stride, int)
assert isinstance(dilation, int)
self.kernel_size = np.array([kernel_size, kernel_size, kernel_size])
self.stride = np.array([1, stride, stride, stride])
self.dilation = np.array([1, dilation, dilation, dilation])
if np.prod(self.kernel_size) > 1:
self.kernel = nn.Parameter(torch.zeros(np.prod(self.kernel_size), in_channels, out_channels))
else:
assert not transpose
self.kernel = nn.Parameter(torch.zeros(in_channels, out_channels))
self.bias = None if not bias else nn.Parameter(torch.zeros(out_channels))
self.t = transpose
self.init_weight()
    def __repr__(self):
        # Report the actual class name; kernel_size/stride/dilation are arrays, so use %s.
        if not self.t:
            return 'Conv3_5d(in_channels=%d, out_channels=%d, kernel_size=%s, stride=%s, dilation=%s)' % (
                self.in_channels, self.out_channels, self.kernel_size,
                self.stride, self.dilation)
        else:
            return 'TransposedConv3_5d(in_channels=%d, out_channels=%d, kernel_size=%s, stride=%s, dilation=%s)' % (
                self.in_channels, self.out_channels, self.kernel_size,
                self.stride, self.dilation)
def init_weight(self):
std = 1. / math.sqrt(self.out_channels if self.t else self.in_channels * np.prod(self.kernel_size))
self.kernel.data.uniform_(-std, std)
if self.bias is not None:
self.bias.data.uniform_(-std, std)
def forward(self, inputs):
return conv3_5d(inputs,
self.kernel,
self.bias,
kernel_size=self.kernel_size,
stride=self.stride,
dilation=self.dilation,
transpose=self.t) | 35.662469 | 129 | 0.542026 | 1,577 | 14,158 | 4.6487 | 0.060241 | 0.114582 | 0.089756 | 0.054017 | 0.858273 | 0.847497 | 0.826899 | 0.816396 | 0.816396 | 0.807666 | 0 | 0.009688 | 0.358455 | 14,158 | 397 | 130 | 35.662469 | 0.797424 | 0.002896 | 0 | 0.82397 | 0 | 0.018727 | 0.068803 | 0.021592 | 0 | 0 | 0 | 0 | 0.041199 | 1 | 0.093633 | false | 0 | 0.026217 | 0.037453 | 0.202247 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
07e5ec7bb12e3ac98b4fbe6b44190225310eadc9 | 122 | py | Python | Python/XML/XML1.py | abivilion/Hackerank-Solutions- | e195fb1fce1588171cf12d99d38da32ca5c8276a | [
"MIT"
] | null | null | null | Python/XML/XML1.py | abivilion/Hackerank-Solutions- | e195fb1fce1588171cf12d99d38da32ca5c8276a | [
"MIT"
] | null | null | null | Python/XML/XML1.py | abivilion/Hackerank-Solutions- | e195fb1fce1588171cf12d99d38da32ca5c8276a | [
"MIT"
] | null | null | null |
import xml.etree.ElementTree as etree


def get_attr_number(node):
    # Each attribute contributes one '=' to the serialized subtree.
    return etree.tostring(node).count(b'=')
| 17.428571 | 50 | 0.598361 | 15 | 122 | 4.733333 | 0.733333 | 0.366197 | 0.478873 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.254098 | 122 | 6 | 51 | 20.333333 | 0.78022 | 0.221311 | 0 | 0 | 0 | 0 | 0.012195 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
ed37368127218c1a7f8f9e81d2f2a08b5a101e48 | 73,629 | py | Python | pybind/slxos/v16r_1_00b/igmp_snooping_state/igmp_statistics/igmp_statistics_/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null | pybind/slxos/v16r_1_00b/igmp_snooping_state/igmp_statistics/igmp_statistics_/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null | pybind/slxos/v16r_1_00b/igmp_snooping_state/igmp_statistics/igmp_statistics_/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | 1 | 2021-11-05T22:15:42.000Z | 2021-11-05T22:15:42.000Z |
from operator import attrgetter
import pyangbind.lib.xpathhelper as xpathhelper
from pyangbind.lib.yangtypes import RestrictedPrecisionDecimalType, RestrictedClassType, TypedListType
from pyangbind.lib.yangtypes import YANGBool, YANGListType, YANGDynClass, ReferenceType
from pyangbind.lib.base import PybindBase
from decimal import Decimal
from bitarray import bitarray
import __builtin__
class igmp_statistics(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module brocade-mc-hms-operational - based on the path /igmp-snooping-state/igmp-statistics/igmp-statistics. Each member element of
the container is represented as a class variable - with a specific
YANG type.
"""
__slots__ = ('_pybind_generated_by', '_path_helper', '_yang_name', '_rest_name', '_extmethods', '__interface_name','__query_edge_recvd','__query_edge_sent','__query_edge_rx_errors','__v1_report_edge_recvd','__v1_report_edge_sent','__v1_report_edge_rx_errors','__v2_report_edge_recvd','__v2_report_edge_sent','__v2_report_edge_rx_errors','__v3_report_edge_recvd','__v3_report_edge_sent','__v3_report_edge_rx_errors','__grp_leave_edge_recvd','__grp_leave_edge_sent','__grp_leave_edge_rx_errors','__pim_hello_edge_recvd','__pim_hello_edge_sent','__pim_hello_edge_rx_errors','__error_unknown_types','__error_bad_length','__error_bad_checksum',)
_yang_name = 'igmp-statistics'
_rest_name = 'igmp-statistics'
_pybind_generated_by = 'container'
def __init__(self, *args, **kwargs):
path_helper_ = kwargs.pop("path_helper", None)
if path_helper_ is False:
self._path_helper = False
elif path_helper_ is not None and isinstance(path_helper_, xpathhelper.YANGPathHelper):
self._path_helper = path_helper_
elif hasattr(self, "_parent"):
path_helper_ = getattr(self._parent, "_path_helper", False)
self._path_helper = path_helper_
else:
self._path_helper = False
extmethods = kwargs.pop("extmethods", None)
if extmethods is False:
self._extmethods = False
elif extmethods is not None and isinstance(extmethods, dict):
self._extmethods = extmethods
elif hasattr(self, "_parent"):
extmethods = getattr(self._parent, "_extmethods", None)
self._extmethods = extmethods
else:
self._extmethods = False
self.__error_unknown_types = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="error-unknown-types", rest_name="error-unknown-types", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__pim_hello_edge_recvd = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="pim-hello-edge-recvd", rest_name="pim-hello-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__query_edge_recvd = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="query-edge-recvd", rest_name="query-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__error_bad_checksum = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="error-bad-checksum", rest_name="error-bad-checksum", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__v2_report_edge_recvd = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v2-report-edge-recvd", rest_name="v2-report-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__pim_hello_edge_sent = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="pim-hello-edge-sent", rest_name="pim-hello-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__error_bad_length = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="error-bad-length", rest_name="error-bad-length", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__v1_report_edge_sent = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v1-report-edge-sent", rest_name="v1-report-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__v1_report_edge_recvd = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v1-report-edge-recvd", rest_name="v1-report-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__v2_report_edge_sent = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v2-report-edge-sent", rest_name="v2-report-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__query_edge_rx_errors = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="query-edge-rx-errors", rest_name="query-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__interface_name = YANGDynClass(base=unicode, is_leaf=True, yang_name="interface-name", rest_name="interface-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='string', is_config=False)
self.__v2_report_edge_rx_errors = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v2-report-edge-rx-errors", rest_name="v2-report-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__v3_report_edge_rx_errors = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v3-report-edge-rx-errors", rest_name="v3-report-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__v1_report_edge_rx_errors = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v1-report-edge-rx-errors", rest_name="v1-report-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__pim_hello_edge_rx_errors = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="pim-hello-edge-rx-errors", rest_name="pim-hello-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__grp_leave_edge_sent = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="grp-leave-edge-sent", rest_name="grp-leave-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__query_edge_sent = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="query-edge-sent", rest_name="query-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__grp_leave_edge_rx_errors = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="grp-leave-edge-rx-errors", rest_name="grp-leave-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__grp_leave_edge_recvd = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="grp-leave-edge-recvd", rest_name="grp-leave-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__v3_report_edge_recvd = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v3-report-edge-recvd", rest_name="v3-report-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
self.__v3_report_edge_sent = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v3-report-edge-sent", rest_name="v3-report-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path()+[self._yang_name]
else:
return [u'igmp-snooping-state', u'igmp-statistics', u'igmp-statistics']
def _rest_path(self):
if hasattr(self, "_parent"):
if self._rest_name:
return self._parent._rest_path()+[self._rest_name]
else:
return self._parent._rest_path()
else:
return [u'igmp-snooping-state', u'igmp-statistics', u'igmp-statistics']
def _get_interface_name(self):
"""
Getter method for interface_name, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/interface_name (string)
YANG Description: interface_name
"""
return self.__interface_name
def _set_interface_name(self, v, load=False):
"""
Setter method for interface_name, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/interface_name (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_interface_name is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_interface_name() directly.
YANG Description: interface_name
"""
parent = getattr(self, "_parent", None)
if parent is not None and load is False:
raise AttributeError("Cannot set keys directly when" +
" within an instantiated list")
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="interface-name", rest_name="interface-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='string', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """interface_name must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="interface-name", rest_name="interface-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='string', is_config=False)""",
})
self.__interface_name = t
if hasattr(self, '_set'):
self._set()
def _unset_interface_name(self):
self.__interface_name = YANGDynClass(base=unicode, is_leaf=True, yang_name="interface-name", rest_name="interface-name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, is_keyval=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='string', is_config=False)
def _get_query_edge_recvd(self):
"""
Getter method for query_edge_recvd, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/query_edge_recvd (uint32)
YANG Description: query_edge_recvd
"""
return self.__query_edge_recvd
def _set_query_edge_recvd(self, v, load=False):
"""
Setter method for query_edge_recvd, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/query_edge_recvd (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_query_edge_recvd is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_query_edge_recvd() directly.
YANG Description: query_edge_recvd
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="query-edge-recvd", rest_name="query-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """query_edge_recvd must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="query-edge-recvd", rest_name="query-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__query_edge_recvd = t
if hasattr(self, '_set'):
self._set()
def _unset_query_edge_recvd(self):
self.__query_edge_recvd = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="query-edge-recvd", rest_name="query-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_query_edge_sent(self):
"""
Getter method for query_edge_sent, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/query_edge_sent (uint32)
YANG Description: query_edge_sent
"""
return self.__query_edge_sent
def _set_query_edge_sent(self, v, load=False):
"""
Setter method for query_edge_sent, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/query_edge_sent (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_query_edge_sent is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_query_edge_sent() directly.
YANG Description: query_edge_sent
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="query-edge-sent", rest_name="query-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """query_edge_sent must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="query-edge-sent", rest_name="query-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__query_edge_sent = t
if hasattr(self, '_set'):
self._set()
def _unset_query_edge_sent(self):
self.__query_edge_sent = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="query-edge-sent", rest_name="query-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_query_edge_rx_errors(self):
"""
Getter method for query_edge_rx_errors, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/query_edge_rx_errors (uint32)
YANG Description: query_edge_rx_errors
"""
return self.__query_edge_rx_errors
def _set_query_edge_rx_errors(self, v, load=False):
"""
Setter method for query_edge_rx_errors, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/query_edge_rx_errors (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_query_edge_rx_errors is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_query_edge_rx_errors() directly.
YANG Description: query_edge_rx_errors
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="query-edge-rx-errors", rest_name="query-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """query_edge_rx_errors must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="query-edge-rx-errors", rest_name="query-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__query_edge_rx_errors = t
if hasattr(self, '_set'):
self._set()
def _unset_query_edge_rx_errors(self):
self.__query_edge_rx_errors = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="query-edge-rx-errors", rest_name="query-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_v1_report_edge_recvd(self):
"""
Getter method for v1_report_edge_recvd, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/v1_report_edge_recvd (uint32)
YANG Description: v1_report_edge_recvd
"""
return self.__v1_report_edge_recvd
def _set_v1_report_edge_recvd(self, v, load=False):
"""
Setter method for v1_report_edge_recvd, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/v1_report_edge_recvd (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_v1_report_edge_recvd is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_v1_report_edge_recvd() directly.
YANG Description: v1_report_edge_recvd
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v1-report-edge-recvd", rest_name="v1-report-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """v1_report_edge_recvd must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v1-report-edge-recvd", rest_name="v1-report-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__v1_report_edge_recvd = t
if hasattr(self, '_set'):
self._set()
def _unset_v1_report_edge_recvd(self):
self.__v1_report_edge_recvd = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v1-report-edge-recvd", rest_name="v1-report-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_v1_report_edge_sent(self):
"""
Getter method for v1_report_edge_sent, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/v1_report_edge_sent (uint32)
YANG Description: v1_report_edge_sent
"""
return self.__v1_report_edge_sent
def _set_v1_report_edge_sent(self, v, load=False):
"""
Setter method for v1_report_edge_sent, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/v1_report_edge_sent (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_v1_report_edge_sent is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_v1_report_edge_sent() directly.
YANG Description: v1_report_edge_sent
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v1-report-edge-sent", rest_name="v1-report-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """v1_report_edge_sent must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v1-report-edge-sent", rest_name="v1-report-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__v1_report_edge_sent = t
if hasattr(self, '_set'):
self._set()
def _unset_v1_report_edge_sent(self):
self.__v1_report_edge_sent = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v1-report-edge-sent", rest_name="v1-report-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_v1_report_edge_rx_errors(self):
"""
Getter method for v1_report_edge_rx_errors, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/v1_report_edge_rx_errors (uint32)
YANG Description: v1_report_edge_rx_errors
"""
return self.__v1_report_edge_rx_errors
def _set_v1_report_edge_rx_errors(self, v, load=False):
"""
Setter method for v1_report_edge_rx_errors, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/v1_report_edge_rx_errors (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_v1_report_edge_rx_errors is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_v1_report_edge_rx_errors() directly.
YANG Description: v1_report_edge_rx_errors
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v1-report-edge-rx-errors", rest_name="v1-report-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """v1_report_edge_rx_errors must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v1-report-edge-rx-errors", rest_name="v1-report-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__v1_report_edge_rx_errors = t
if hasattr(self, '_set'):
self._set()
def _unset_v1_report_edge_rx_errors(self):
self.__v1_report_edge_rx_errors = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v1-report-edge-rx-errors", rest_name="v1-report-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_v2_report_edge_recvd(self):
"""
Getter method for v2_report_edge_recvd, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/v2_report_edge_recvd (uint32)
YANG Description: v2_report_edge_recvd
"""
return self.__v2_report_edge_recvd
def _set_v2_report_edge_recvd(self, v, load=False):
"""
Setter method for v2_report_edge_recvd, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/v2_report_edge_recvd (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_v2_report_edge_recvd is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_v2_report_edge_recvd() directly.
YANG Description: v2_report_edge_recvd
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v2-report-edge-recvd", rest_name="v2-report-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """v2_report_edge_recvd must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v2-report-edge-recvd", rest_name="v2-report-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__v2_report_edge_recvd = t
if hasattr(self, '_set'):
self._set()
def _unset_v2_report_edge_recvd(self):
self.__v2_report_edge_recvd = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v2-report-edge-recvd", rest_name="v2-report-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_v2_report_edge_sent(self):
"""
Getter method for v2_report_edge_sent, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/v2_report_edge_sent (uint32)
YANG Description: v2_report_edge_sent
"""
return self.__v2_report_edge_sent
def _set_v2_report_edge_sent(self, v, load=False):
"""
Setter method for v2_report_edge_sent, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/v2_report_edge_sent (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_v2_report_edge_sent is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_v2_report_edge_sent() directly.
YANG Description: v2_report_edge_sent
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v2-report-edge-sent", rest_name="v2-report-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """v2_report_edge_sent must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v2-report-edge-sent", rest_name="v2-report-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__v2_report_edge_sent = t
if hasattr(self, '_set'):
self._set()
def _unset_v2_report_edge_sent(self):
self.__v2_report_edge_sent = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v2-report-edge-sent", rest_name="v2-report-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_v2_report_edge_rx_errors(self):
"""
Getter method for v2_report_edge_rx_errors, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/v2_report_edge_rx_errors (uint32)
YANG Description: v2_report_edge_rx_errors
"""
return self.__v2_report_edge_rx_errors
def _set_v2_report_edge_rx_errors(self, v, load=False):
"""
Setter method for v2_report_edge_rx_errors, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/v2_report_edge_rx_errors (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_v2_report_edge_rx_errors is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_v2_report_edge_rx_errors() directly.
YANG Description: v2_report_edge_rx_errors
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v2-report-edge-rx-errors", rest_name="v2-report-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """v2_report_edge_rx_errors must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v2-report-edge-rx-errors", rest_name="v2-report-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__v2_report_edge_rx_errors = t
if hasattr(self, '_set'):
self._set()
def _unset_v2_report_edge_rx_errors(self):
self.__v2_report_edge_rx_errors = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v2-report-edge-rx-errors", rest_name="v2-report-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_v3_report_edge_recvd(self):
"""
Getter method for v3_report_edge_recvd, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/v3_report_edge_recvd (uint32)
YANG Description: v3_report_edge_recvd
"""
return self.__v3_report_edge_recvd
def _set_v3_report_edge_recvd(self, v, load=False):
"""
Setter method for v3_report_edge_recvd, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/v3_report_edge_recvd (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_v3_report_edge_recvd is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_v3_report_edge_recvd() directly.
YANG Description: v3_report_edge_recvd
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v3-report-edge-recvd", rest_name="v3-report-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """v3_report_edge_recvd must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v3-report-edge-recvd", rest_name="v3-report-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__v3_report_edge_recvd = t
if hasattr(self, '_set'):
self._set()
def _unset_v3_report_edge_recvd(self):
self.__v3_report_edge_recvd = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v3-report-edge-recvd", rest_name="v3-report-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_v3_report_edge_sent(self):
"""
Getter method for v3_report_edge_sent, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/v3_report_edge_sent (uint32)
YANG Description: v3_report_edge_sent
"""
return self.__v3_report_edge_sent
def _set_v3_report_edge_sent(self, v, load=False):
"""
Setter method for v3_report_edge_sent, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/v3_report_edge_sent (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_v3_report_edge_sent is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_v3_report_edge_sent() directly.
YANG Description: v3_report_edge_sent
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v3-report-edge-sent", rest_name="v3-report-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """v3_report_edge_sent must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v3-report-edge-sent", rest_name="v3-report-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__v3_report_edge_sent = t
if hasattr(self, '_set'):
self._set()
def _unset_v3_report_edge_sent(self):
self.__v3_report_edge_sent = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v3-report-edge-sent", rest_name="v3-report-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_v3_report_edge_rx_errors(self):
"""
Getter method for v3_report_edge_rx_errors, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/v3_report_edge_rx_errors (uint32)
YANG Description: v3_report_edge_rx_errors
"""
return self.__v3_report_edge_rx_errors
def _set_v3_report_edge_rx_errors(self, v, load=False):
"""
Setter method for v3_report_edge_rx_errors, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/v3_report_edge_rx_errors (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_v3_report_edge_rx_errors is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_v3_report_edge_rx_errors() directly.
YANG Description: v3_report_edge_rx_errors
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v3-report-edge-rx-errors", rest_name="v3-report-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """v3_report_edge_rx_errors must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v3-report-edge-rx-errors", rest_name="v3-report-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__v3_report_edge_rx_errors = t
if hasattr(self, '_set'):
self._set()
def _unset_v3_report_edge_rx_errors(self):
self.__v3_report_edge_rx_errors = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="v3-report-edge-rx-errors", rest_name="v3-report-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_grp_leave_edge_recvd(self):
"""
Getter method for grp_leave_edge_recvd, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/grp_leave_edge_recvd (uint32)
YANG Description: grp_leave_edge_recvd
"""
return self.__grp_leave_edge_recvd
def _set_grp_leave_edge_recvd(self, v, load=False):
"""
Setter method for grp_leave_edge_recvd, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/grp_leave_edge_recvd (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_grp_leave_edge_recvd is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_grp_leave_edge_recvd() directly.
YANG Description: grp_leave_edge_recvd
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="grp-leave-edge-recvd", rest_name="grp-leave-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """grp_leave_edge_recvd must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="grp-leave-edge-recvd", rest_name="grp-leave-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__grp_leave_edge_recvd = t
if hasattr(self, '_set'):
self._set()
def _unset_grp_leave_edge_recvd(self):
self.__grp_leave_edge_recvd = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="grp-leave-edge-recvd", rest_name="grp-leave-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_grp_leave_edge_sent(self):
"""
Getter method for grp_leave_edge_sent, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/grp_leave_edge_sent (uint32)
YANG Description: grp_leave_edge_sent
"""
return self.__grp_leave_edge_sent
def _set_grp_leave_edge_sent(self, v, load=False):
"""
Setter method for grp_leave_edge_sent, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/grp_leave_edge_sent (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_grp_leave_edge_sent is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_grp_leave_edge_sent() directly.
YANG Description: grp_leave_edge_sent
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="grp-leave-edge-sent", rest_name="grp-leave-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """grp_leave_edge_sent must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="grp-leave-edge-sent", rest_name="grp-leave-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__grp_leave_edge_sent = t
if hasattr(self, '_set'):
self._set()
def _unset_grp_leave_edge_sent(self):
self.__grp_leave_edge_sent = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="grp-leave-edge-sent", rest_name="grp-leave-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_grp_leave_edge_rx_errors(self):
"""
Getter method for grp_leave_edge_rx_errors, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/grp_leave_edge_rx_errors (uint32)
YANG Description: grp_leave_edge_rx_errors
"""
return self.__grp_leave_edge_rx_errors
def _set_grp_leave_edge_rx_errors(self, v, load=False):
"""
Setter method for grp_leave_edge_rx_errors, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/grp_leave_edge_rx_errors (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_grp_leave_edge_rx_errors is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_grp_leave_edge_rx_errors() directly.
YANG Description: grp_leave_edge_rx_errors
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="grp-leave-edge-rx-errors", rest_name="grp-leave-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """grp_leave_edge_rx_errors must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="grp-leave-edge-rx-errors", rest_name="grp-leave-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__grp_leave_edge_rx_errors = t
if hasattr(self, '_set'):
self._set()
def _unset_grp_leave_edge_rx_errors(self):
self.__grp_leave_edge_rx_errors = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="grp-leave-edge-rx-errors", rest_name="grp-leave-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_pim_hello_edge_recvd(self):
"""
Getter method for pim_hello_edge_recvd, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/pim_hello_edge_recvd (uint32)
YANG Description: pim_hello_edge_recvd
"""
return self.__pim_hello_edge_recvd
def _set_pim_hello_edge_recvd(self, v, load=False):
"""
Setter method for pim_hello_edge_recvd, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/pim_hello_edge_recvd (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_pim_hello_edge_recvd is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_pim_hello_edge_recvd() directly.
YANG Description: pim_hello_edge_recvd
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="pim-hello-edge-recvd", rest_name="pim-hello-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """pim_hello_edge_recvd must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="pim-hello-edge-recvd", rest_name="pim-hello-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__pim_hello_edge_recvd = t
if hasattr(self, '_set'):
self._set()
def _unset_pim_hello_edge_recvd(self):
self.__pim_hello_edge_recvd = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="pim-hello-edge-recvd", rest_name="pim-hello-edge-recvd", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_pim_hello_edge_sent(self):
"""
Getter method for pim_hello_edge_sent, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/pim_hello_edge_sent (uint32)
YANG Description: pim_hello_edge_sent
"""
return self.__pim_hello_edge_sent
def _set_pim_hello_edge_sent(self, v, load=False):
"""
Setter method for pim_hello_edge_sent, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/pim_hello_edge_sent (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_pim_hello_edge_sent is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_pim_hello_edge_sent() directly.
YANG Description: pim_hello_edge_sent
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="pim-hello-edge-sent", rest_name="pim-hello-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """pim_hello_edge_sent must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="pim-hello-edge-sent", rest_name="pim-hello-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__pim_hello_edge_sent = t
if hasattr(self, '_set'):
self._set()
def _unset_pim_hello_edge_sent(self):
self.__pim_hello_edge_sent = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="pim-hello-edge-sent", rest_name="pim-hello-edge-sent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_pim_hello_edge_rx_errors(self):
"""
Getter method for pim_hello_edge_rx_errors, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/pim_hello_edge_rx_errors (uint32)
YANG Description: pim_hello_edge_rx_errors
"""
return self.__pim_hello_edge_rx_errors
def _set_pim_hello_edge_rx_errors(self, v, load=False):
"""
Setter method for pim_hello_edge_rx_errors, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/pim_hello_edge_rx_errors (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_pim_hello_edge_rx_errors is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_pim_hello_edge_rx_errors() directly.
YANG Description: pim_hello_edge_rx_errors
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="pim-hello-edge-rx-errors", rest_name="pim-hello-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """pim_hello_edge_rx_errors must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="pim-hello-edge-rx-errors", rest_name="pim-hello-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__pim_hello_edge_rx_errors = t
if hasattr(self, '_set'):
self._set()
def _unset_pim_hello_edge_rx_errors(self):
self.__pim_hello_edge_rx_errors = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="pim-hello-edge-rx-errors", rest_name="pim-hello-edge-rx-errors", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_error_unknown_types(self):
"""
Getter method for error_unknown_types, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/error_unknown_types (uint32)
YANG Description: error_unknown_types
"""
return self.__error_unknown_types
def _set_error_unknown_types(self, v, load=False):
"""
Setter method for error_unknown_types, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/error_unknown_types (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_error_unknown_types is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_error_unknown_types() directly.
YANG Description: error_unknown_types
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="error-unknown-types", rest_name="error-unknown-types", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """error_unknown_types must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="error-unknown-types", rest_name="error-unknown-types", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
})
self.__error_unknown_types = t
if hasattr(self, '_set'):
self._set()
def _unset_error_unknown_types(self):
self.__error_unknown_types = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="error-unknown-types", rest_name="error-unknown-types", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
def _get_error_bad_length(self):
"""
Getter method for error_bad_length, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/error_bad_length (uint32)
YANG Description: error_bad_length
"""
return self.__error_bad_length
def _set_error_bad_length(self, v, load=False):
"""
Setter method for error_bad_length, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/error_bad_length (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_error_bad_length is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_error_bad_length() directly.
YANG Description: error_bad_length
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="error-bad-length", rest_name="error-bad-length", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """error_bad_length must be of a type compatible with uint32""",
'defined-type': "uint32",
          'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="error-bad-length", rest_name="error-bad-length", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
        })

    self.__error_bad_length = t
    if hasattr(self, '_set'):
      self._set()

  def _unset_error_bad_length(self):
    self.__error_bad_length = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="error-bad-length", rest_name="error-bad-length", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)

  def _get_error_bad_checksum(self):
    """
    Getter method for error_bad_checksum, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/error_bad_checksum (uint32)

    YANG Description: error_bad_checksum
    """
    return self.__error_bad_checksum

  def _set_error_bad_checksum(self, v, load=False):
    """
    Setter method for error_bad_checksum, mapped from YANG variable /igmp_snooping_state/igmp_statistics/igmp_statistics/error_bad_checksum (uint32)
    If this variable is read-only (config: false) in the
    source YANG file, then _set_error_bad_checksum is considered as a private
    method. Backends looking to populate this variable should
    do so via calling thisObj._set_error_bad_checksum() directly.

    YANG Description: error_bad_checksum
    """
    if hasattr(v, "_utype"):
      v = v._utype(v)
    try:
      t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="error-bad-checksum", rest_name="error-bad-checksum", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)
    except (TypeError, ValueError):
      raise ValueError({
          'error-string': """error_bad_checksum must be of a type compatible with uint32""",
          'defined-type': "uint32",
          'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="error-bad-checksum", rest_name="error-bad-checksum", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)""",
        })

    self.__error_bad_checksum = t
    if hasattr(self, '_set'):
      self._set()

  def _unset_error_bad_checksum(self):
    self.__error_bad_checksum = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="error-bad-checksum", rest_name="error-bad-checksum", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-mc-hms-operational', defining_module='brocade-mc-hms-operational', yang_type='uint32', is_config=False)

  interface_name = __builtin__.property(_get_interface_name)
  query_edge_recvd = __builtin__.property(_get_query_edge_recvd)
  query_edge_sent = __builtin__.property(_get_query_edge_sent)
  query_edge_rx_errors = __builtin__.property(_get_query_edge_rx_errors)
  v1_report_edge_recvd = __builtin__.property(_get_v1_report_edge_recvd)
  v1_report_edge_sent = __builtin__.property(_get_v1_report_edge_sent)
  v1_report_edge_rx_errors = __builtin__.property(_get_v1_report_edge_rx_errors)
  v2_report_edge_recvd = __builtin__.property(_get_v2_report_edge_recvd)
  v2_report_edge_sent = __builtin__.property(_get_v2_report_edge_sent)
  v2_report_edge_rx_errors = __builtin__.property(_get_v2_report_edge_rx_errors)
  v3_report_edge_recvd = __builtin__.property(_get_v3_report_edge_recvd)
  v3_report_edge_sent = __builtin__.property(_get_v3_report_edge_sent)
  v3_report_edge_rx_errors = __builtin__.property(_get_v3_report_edge_rx_errors)
  grp_leave_edge_recvd = __builtin__.property(_get_grp_leave_edge_recvd)
  grp_leave_edge_sent = __builtin__.property(_get_grp_leave_edge_sent)
  grp_leave_edge_rx_errors = __builtin__.property(_get_grp_leave_edge_rx_errors)
  pim_hello_edge_recvd = __builtin__.property(_get_pim_hello_edge_recvd)
  pim_hello_edge_sent = __builtin__.property(_get_pim_hello_edge_sent)
  pim_hello_edge_rx_errors = __builtin__.property(_get_pim_hello_edge_rx_errors)
  error_unknown_types = __builtin__.property(_get_error_unknown_types)
  error_bad_length = __builtin__.property(_get_error_bad_length)
  error_bad_checksum = __builtin__.property(_get_error_bad_checksum)

  _pyangbind_elements = {
      'interface_name': interface_name,
      'query_edge_recvd': query_edge_recvd,
      'query_edge_sent': query_edge_sent,
      'query_edge_rx_errors': query_edge_rx_errors,
      'v1_report_edge_recvd': v1_report_edge_recvd,
      'v1_report_edge_sent': v1_report_edge_sent,
      'v1_report_edge_rx_errors': v1_report_edge_rx_errors,
      'v2_report_edge_recvd': v2_report_edge_recvd,
      'v2_report_edge_sent': v2_report_edge_sent,
      'v2_report_edge_rx_errors': v2_report_edge_rx_errors,
      'v3_report_edge_recvd': v3_report_edge_recvd,
      'v3_report_edge_sent': v3_report_edge_sent,
      'v3_report_edge_rx_errors': v3_report_edge_rx_errors,
      'grp_leave_edge_recvd': grp_leave_edge_recvd,
      'grp_leave_edge_sent': grp_leave_edge_sent,
      'grp_leave_edge_rx_errors': grp_leave_edge_rx_errors,
      'pim_hello_edge_recvd': pim_hello_edge_recvd,
      'pim_hello_edge_sent': pim_hello_edge_sent,
      'pim_hello_edge_rx_errors': pim_hello_edge_rx_errors,
      'error_unknown_types': error_unknown_types,
      'error_bad_length': error_bad_length,
      'error_bad_checksum': error_bad_checksum,
  }
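The generated setter above rejects any value outside the YANG uint32 range by routing it through `RestrictedClassType`. A minimal stdlib-only sketch of that range check, independent of pyangbind (the helper name `set_uint32_leaf` is introduced here for illustration and is not part of the generated bindings):

```python
def set_uint32_leaf(value):
    """Mimic the 0..4294967295 restriction the generated setter enforces."""
    try:
        v = int(value)
    except (TypeError, ValueError):
        # same error shape as the generated code: a dict describing the type
        raise ValueError({
            'error-string': 'value must be of a type compatible with uint32',
            'defined-type': 'uint32',
        })
    if not (0 <= v <= 4294967295):
        raise ValueError({
            'error-string': 'value must be of a type compatible with uint32',
            'defined-type': 'uint32',
        })
    return v
```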
# --- old_algorithm/solution.py | repo: justinkwan1216/clown-coin-machine-simulation @ 28bb1eb35ee525504a7529569d03bb6fae16042a | license: MIT ---

def best_solution(array):
    case = 0
    solution = [5,4,4,4,3,3,3,3,4,4,6,6,2,2,2,2,5,5,6,5,5,3,3,3,6,6,6,6,1,7,8,1,5,5,5,3,3,3,3,3,3,3,3,3,3,3,3,3,6,6,7,5,5,5,5,4,7,6,6,6,8,1,1,9,5,4,4,4,4,3,3,3,3,3,3,3,3,3,3,3,6,6,5,5,5,5,4,4,6,6,6,6,2,2,8,9,5,5,5,5,5,4,4,4,5,5,5,5,3,3,3,3,6,6,5,5,5,5,5,5,7,6,6,6,8,7,9,9]
    solution2 = [5,4,4,4,3,3,3,3,4,4,6,6,2,2,2,2,5,5,6,5,5,3,3,3,6,6,6,6,1,7,8,1,5,5,5,3,3,3,3,3,3,3,3,3,3,3,3,3,6,6,7,5,5,5,5,4,7,6,6,6,8,1,1,9,5,4,4,4,4,3,3,3,3,3,3,3,3,3,3,3,6,6,5,5,5,5,4,4,6,6,6,6,2,2,8,2,5,5,5,5,5,4,4,4,5,5,5,5,3,3,3,3,6,6,5,5,5,5,5,5,7,6,6,6,8,7,9,9]
    solution3 = [5,4,4,4,3,3,3,3,4,4,6,6,2,2,2,2,5,5,6,5,5,3,3,3,6,6,6,6,1,7,8,1,5,5,5,3,3,3,3,3,3,3,3,3,3,3,3,3,6,6,7,5,5,5,5,4,7,6,6,6,8,1,1,1,5,4,4,4,4,3,3,3,3,3,3,3,3,3,3,3,6,6,5,5,5,5,4,4,6,6,6,6,2,2,8,2,5,5,5,5,5,4,4,4,5,5,5,5,3,3,3,3,6,6,5,5,5,5,5,5,7,6,6,6,8,7,8,9]
    solution4 = [5,4,4,4,3,3,3,3,4,4,4,4,2,2,2,2,6,5,6,4,5,4,4,3,7,6,6,6,1,7,8,1,5,5,5,4,3,4,3,3,5,6,3,3,3,3,3,3,6,6,6,5,5,5,5,4,7,6,6,6,1,1,1,1,5,4,4,4,4,3,3,3,5,3,3,3,3,3,3,3,6,6,5,5,5,5,4,4,7,6,6,6,2,2,8,2,5,5,5,5,5,4,4,4,5,6,6,5,3,3,3,3,6,6,6,5,5,5,5,5,7,6,6,6,8,7,8,9]
    # solution4 is symmetric
    for i in range(7):
        if array[i + 1] == 1:
            case += 2 ** (6 - i)
    return solution4[case]

# best_solution([0,1,1,1,0,1,1,1,0])
# 17  0 0 1 0 0 0 1
# 18  0 0 1 0 0 1 0
# 19  0 0 1 0 0 1 1
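`best_solution` maps the seven flags `array[1]`..`array[7]` to an index into a 128-entry lookup table by reading them as a big-endian 7-bit integer (first flag is the most significant bit; `array[0]` and `array[8]` are ignored). A sketch of just that index computation (`lookup_index` is a name introduced here for illustration):

```python
def lookup_index(array):
    # array[1]..array[7] hold 0/1 flags; the first flag is the most
    # significant bit, matching the loop in best_solution above
    case = 0
    for i in range(7):
        if array[i + 1] == 1:
            case += 2 ** (6 - i)
    return case
```

The trailing comments read the same way: `# 17  0 0 1 0 0 0 1` is the bit pattern 0b0010001 = 17, and the commented call `best_solution([0,1,1,1,0,1,1,1,0])` reads table slot 119.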
# --- astrobase/periodbase/abls.py | repo: pierfra-ro/astrobase @ b9f62c59a3ab9cdc1388d409fa281c26f1e6db6c | license: MIT ---

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
# abls.py - Waqas Bhatti (wbhatti@astro.princeton.edu) - Jan 2017
'''
Contains the Kovacs, et al. (2002) Box-Least-squared-Search period-search
algorithm implementation for periodbase. This uses the implementation in Astropy
3.1, so requires that version.
'''
#############
## LOGGING ##
#############
import logging
from astrobase import log_sub, log_fmt, log_date_fmt
DEBUG = False
if DEBUG:
    level = logging.DEBUG
else:
    level = logging.INFO

LOGGER = logging.getLogger(__name__)
logging.basicConfig(
    level=level,
    style=log_sub,
    format=log_fmt,
    datefmt=log_date_fmt,
)

LOGDEBUG = LOGGER.debug
LOGINFO = LOGGER.info
LOGWARNING = LOGGER.warning
LOGERROR = LOGGER.error
LOGEXCEPTION = LOGGER.exception

#############
## IMPORTS ##
#############

from multiprocessing import Pool, cpu_count
from math import fmod

from numpy import (
    nan as npnan, arange as nparange, array as nparray,
    isfinite as npisfinite, argmax as npargmax, linspace as nplinspace,
    ceil as npceil, argsort as npargsort, concatenate as npconcatenate
)

try:
    from astropy.stats import BoxLeastSquares
except ImportError:
    from astropy.timeseries import BoxLeastSquares

from astropy import units as u
###################
## LOCAL IMPORTS ##
###################
from ..lcmath import sigclip_magseries
from .utils import resort_by_time
############
## CONFIG ##
############
NCPUS = cpu_count()
#######################
## UTILITY FUNCTIONS ##
#######################
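Both period-finders below derive their frequency grid from the same heuristic when `autofreq=True`: the step size is a quarter of the minimum transit duration (in phase) divided by the observed time base, and the grid spans `1/endp` to `1/startp`. A stdlib-only sketch of that construction (the function name `auto_freq_grid` and the plain-list return are illustrative; the module builds the grid inline with numpy):

```python
from math import ceil

def auto_freq_grid(tmin, tmax, startp, endp, mintransitduration):
    # heuristic step size: 0.25 * (min transit duration in phase) / time base
    stepsize = 0.25 * mintransitduration / (tmax - tmin)
    minfreq, maxfreq = 1.0 / endp, 1.0 / startp
    nfreq = int(ceil((maxfreq - minfreq) / stepsize))
    return [minfreq + i * stepsize for i in range(nfreq)]
```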
def bls_serial_pfind(times, mags, errs,
                     magsarefluxes=False,
                     startp=0.1,  # search from 0.1 d to...
                     endp=100.0,  # ... 100.0 d -- don't search full timebase
                     stepsize=5.0e-4,
                     mintransitduration=0.01,  # minimum transit length in phase
                     maxtransitduration=0.4,   # maximum transit length in phase
                     ndurations=100,
                     autofreq=True,  # figure out f0, nf, and df automatically
                     blsobjective='likelihood',
                     blsmethod='fast',
                     blsoversample=10,
                     blsmintransits=3,
                     blsfreqfactor=10.0,
                     periodepsilon=0.1,
                     nbestpeaks=5,
                     sigclip=10.0,
                     endp_timebase_check=True,
                     verbose=True,
                     raiseonfail=False):
'''Runs the Box Least Squares Fitting Search for transit-shaped signals.
Based on the version of BLS in Astropy 3.1:
`astropy.stats.BoxLeastSquares`. If you don't have Astropy 3.1, this module
will fail to import. Note that by default, this implementation of
`bls_serial_pfind` doesn't use the `.autoperiod()` function from
`BoxLeastSquares` but uses the same auto frequency-grid generation as the
functions in `periodbase.kbls`. If you want to use Astropy's implementation,
set the value of `autofreq` kwarg to 'astropy'.
The dict returned from this function contains a `blsmodel` key, which is the
generated model from Astropy's BLS. Use the `.compute_stats()` method to
calculate the required stats like SNR, depth, duration, etc.
Parameters
----------
times,mags,errs : np.array
The magnitude/flux time-series to search for transits.
magsarefluxes : bool
If the input measurement values in `mags` and `errs` are in fluxes, set
this to True.
startp,endp : float
The minimum and maximum periods to consider for the transit search.
stepsize : float
The step-size in frequency to use when constructing a frequency grid for
the period search.
mintransitduration,maxtransitduration : float
The minimum and maximum transitdurations (in units of phase) to consider
for the transit search.
ndurations : int
The number of transit durations to use in the period-search.
autofreq : bool or str
If this is True, the values of `stepsize` and `nphasebins` will be
ignored, and these, along with a frequency-grid, will be determined
based on the following relations::
nphasebins = int(ceil(2.0/mintransitduration))
if nphasebins > 3000:
nphasebins = 3000
stepsize = 0.25*mintransitduration/(times.max()-times.min())
minfreq = 1.0/endp
maxfreq = 1.0/startp
nfreq = int(ceil((maxfreq - minfreq)/stepsize))
If this is False, you must set `startp`, `endp`, and `stepsize` as
appropriate.
If this is str == 'astropy', will use the
`astropy.stats.BoxLeastSquares.autoperiod()` function to calculate the
frequency grid instead of the kbls method.
blsobjective : {'likelihood','snr'}
Sets the type of objective to optimize in the `BoxLeastSquares.power()`
function.
blsmethod : {'fast','slow'}
Sets the type of method to use in the `BoxLeastSquares.power()`
function.
blsoversample : int
Sets the `oversample` kwarg for the `BoxLeastSquares.power()` function.
blsmintransits : int
Sets the `minimum_n_transit` kwarg for the `BoxLeastSquares.autoperiod()`
function.
blsfreqfactor : float
Sets the `frequency_factor` kwarg for the `BoxLeastSquares.autoperiod()`
function.
periodepsilon : float
The fractional difference between successive values of 'best' periods
when sorting by periodogram power to consider them as separate periods
(as opposed to part of the same periodogram peak). This is used to avoid
broad peaks in the periodogram and make sure the 'best' periods returned
are all actually independent.
nbestpeaks : int
The number of 'best' peaks to return from the periodogram results,
starting from the global maximum of the periodogram peak values.
sigclip : float or int or sequence of two floats/ints or None
If a single float or int, a symmetric sigma-clip will be performed using
the number provided as the sigma-multiplier to cut out from the input
time-series.
If a list of two ints/floats is provided, the function will perform an
'asymmetric' sigma-clip. The first element in this list is the sigma
value to use for fainter flux/mag values; the second element in this
list is the sigma value to use for brighter flux/mag values. For
example, `sigclip=[10., 3.]`, will sigclip out greater than 10-sigma
dimmings and greater than 3-sigma brightenings. Here the meaning of
"dimming" and "brightening" is set by *physics* (not the magnitude
system), which is why the `magsarefluxes` kwarg must be correctly set.
If `sigclip` is None, no sigma-clipping will be performed, and the
time-series (with non-finite elems removed) will be passed through to
the output.
endp_timebase_check : bool
If True, will check if the ``endp`` value is larger than the time-base
of the observations. If it is, will change the ``endp`` value such that
it is half of the time-base. If False, will allow an ``endp`` larger
than the time-base of the observations.
verbose : bool
If this is True, will indicate progress and details about the frequency
grid used for the period search.
raiseonfail : bool
If True, raises an exception if something goes wrong. Otherwise, returns
None.
Returns
-------
dict
This function returns a dict, referred to as an `lspinfo` dict in other
astrobase functions that operate on periodogram results. This is a
standardized format across all astrobase period-finders, and is of the
form below::
{'bestperiod': the best period value in the periodogram,
'bestlspval': the periodogram peak associated with the best period,
'nbestpeaks': the input value of nbestpeaks,
'nbestlspvals': nbestpeaks-size list of best period peak values,
'nbestperiods': nbestpeaks-size list of best periods,
'lspvals': the full array of periodogram powers,
'frequencies': the full array of frequencies considered,
'periods': the full array of periods considered,
'durations': the array of durations used to run BLS,
'blsresult': Astropy BLS result object (BoxLeastSquaresResult),
'blsmodel': Astropy BLS BoxLeastSquares object used for work,
'stepsize': the actual stepsize used,
'nfreq': the actual nfreq used,
'durations': the durations array used,
'mintransitduration': the input mintransitduration,
'maxtransitduration': the input maxtransitduration,
'method':'bls' -> the name of the period-finder method,
'kwargs':{ dict of all of the input kwargs for record-keeping}}
'''
# get rid of nans first and sigclip
stimes, smags, serrs = sigclip_magseries(times,
mags,
errs,
magsarefluxes=magsarefluxes,
sigclip=sigclip)
# resort by time
stimes, smags, serrs = resort_by_time(stimes, smags, serrs)
# make sure there are enough points to calculate a spectrum
if len(stimes) > 9 and len(smags) > 9 and len(serrs) > 9:
# if we're setting up everything automatically
if isinstance(autofreq, bool) and autofreq:
# use heuristic to figure out best timestep
stepsize = 0.25*mintransitduration/(stimes.max()-stimes.min())
# now figure out the frequencies to use
minfreq = 1.0/endp
maxfreq = 1.0/startp
nfreq = int(npceil((maxfreq - minfreq)/stepsize))
# say what we're using
if verbose:
LOGINFO('min P: %s, max P: %s, nfreq: %s, '
'minfreq: %s, maxfreq: %s' % (startp, endp, nfreq,
minfreq, maxfreq))
LOGINFO('autofreq = True: using AUTOMATIC values for '
'freq stepsize: %s, ndurations: %s, '
'min transit duration: %s, max transit duration: %s' %
(stepsize, ndurations,
mintransitduration, maxtransitduration))
use_autoperiod = False
elif isinstance(autofreq, bool) and not autofreq:
minfreq = 1.0/endp
maxfreq = 1.0/startp
nfreq = int(npceil((maxfreq - minfreq)/stepsize))
# say what we're using
if verbose:
LOGINFO('min P: %s, max P: %s, nfreq: %s, '
'minfreq: %s, maxfreq: %s' % (startp, endp, nfreq,
minfreq, maxfreq))
LOGINFO('autofreq = False: using PROVIDED values for '
'freq stepsize: %s, ndurations: %s, '
'min transit duration: %s, max transit duration: %s' %
(stepsize, ndurations,
mintransitduration, maxtransitduration))
use_autoperiod = False
elif isinstance(autofreq, str) and autofreq == 'astropy':
use_autoperiod = True
minfreq = 1.0/endp
maxfreq = 1.0/startp
else:
LOGERROR("unknown autofreq kwarg encountered. can't continue...")
return None
# check the minimum frequency
if ((minfreq < (1.0/(stimes.max() - stimes.min()))) and
endp_timebase_check):
LOGWARNING('the requested max P = %.3f is larger than '
'the time base of the observations = %.3f, '
' will make minfreq = 2 x 1/timebase'
% (endp, stimes.max() - stimes.min()))
minfreq = 2.0/(stimes.max() - stimes.min())
LOGWARNING('new minfreq: %s, maxfreq: %s' %
(minfreq, maxfreq))
# run BLS
try:
# astropy's BLS requires durations in units of time
durations = nplinspace(mintransitduration*startp,
maxtransitduration*startp,
ndurations)
# set up the correct units for the BLS model
if magsarefluxes:
blsmodel = BoxLeastSquares(
stimes*u.day,
smags*u.dimensionless_unscaled,
dy=serrs*u.dimensionless_unscaled
)
else:
blsmodel = BoxLeastSquares(
stimes*u.day,
smags*u.mag,
dy=serrs*u.mag
)
# use autoperiod if requested
if use_autoperiod:
periods = nparray(
blsmodel.autoperiod(
durations,
minimum_period=startp,
maximum_period=endp,
minimum_n_transit=blsmintransits,
frequency_factor=blsfreqfactor
)
)
nfreq = periods.size
if verbose:
LOGINFO(
"autofreq = 'astropy', used .autoperiod() with "
"minimum_n_transit = %s, freq_factor = %s "
"to generate the frequency grid" %
(blsmintransits, blsfreqfactor)
)
LOGINFO('stepsize = %.5f, nfreq = %s, minfreq = %.5f, '
'maxfreq = %.5f, ndurations = %s' %
(abs(1.0/periods[1] - 1.0/periods[0]),
nfreq,
1.0/periods.max(),
1.0/periods.min(),
durations.size))
# otherwise, use kbls method
else:
frequencies = minfreq + nparange(nfreq)*stepsize
periods = 1.0/frequencies
if nfreq > 5.0e5:
if verbose:
LOGWARNING('more than 5.0e5 frequencies to go through; '
'this will take a while. '
'you might want to use the '
'abls.bls_parallel_pfind function instead')
# run the periodogram
blsresult = blsmodel.power(
periods*u.day,
durations*u.day,
objective=blsobjective,
method=blsmethod,
oversample=blsoversample
)
# get the peak values
lsp = nparray(blsresult.power)
# find the nbestpeaks for the periodogram: 1. sort the lsp array
# by highest value first 2. go down the values until we find
# five values that are separated by at least periodepsilon in
# period
# make sure to get only the finite peaks in the periodogram
# this is needed because BLS may produce infs for some peaks
finitepeakind = npisfinite(lsp)
finlsp = lsp[finitepeakind]
finperiods = periods[finitepeakind]
# make sure that finlsp has finite values before we work on it
try:
bestperiodind = npargmax(finlsp)
except ValueError:
LOGERROR('no finite periodogram values '
'for this mag series, skipping...')
return {'bestperiod':npnan,
'bestlspval':npnan,
'nbestpeaks':nbestpeaks,
'nbestinds':None,
'nbestlspvals':None,
'nbestperiods':None,
'lspvals':None,
'periods':None,
'durations':None,
'method':'bls',
'blsresult':None,
'blsmodel':None,
'kwargs':{'startp':startp,
'endp':endp,
'stepsize':stepsize,
'mintransitduration':mintransitduration,
'maxtransitduration':maxtransitduration,
'ndurations':ndurations,
'blsobjective':blsobjective,
'blsmethod':blsmethod,
'blsoversample':blsoversample,
'blsntransits':blsmintransits,
'blsfreqfactor':blsfreqfactor,
'autofreq':autofreq,
'periodepsilon':periodepsilon,
'nbestpeaks':nbestpeaks,
'sigclip':sigclip,
'magsarefluxes':magsarefluxes}}
sortedlspind = npargsort(finlsp)[::-1]
sortedlspperiods = finperiods[sortedlspind]
sortedlspvals = finlsp[sortedlspind]
# now get the nbestpeaks
nbestperiods, nbestlspvals, nbestinds, peakcount = (
[finperiods[bestperiodind]],
[finlsp[bestperiodind]],
[bestperiodind],
1
)
prevperiod = sortedlspperiods[0]
# find the best nbestpeaks in the lsp and their periods
for period, lspval, ind in zip(sortedlspperiods,
sortedlspvals,
sortedlspind):
if peakcount == nbestpeaks:
break
perioddiff = abs(period - prevperiod)
bestperiodsdiff = [abs(period - x) for x in nbestperiods]
# print('prevperiod = %s, thisperiod = %s, '
# 'perioddiff = %s, peakcount = %s' %
# (prevperiod, period, perioddiff, peakcount))
# this ensures that this period is different from the last
# period and from all the other existing best periods by
# periodepsilon to make sure we jump to an entire different
# peak in the periodogram
if (perioddiff > (periodepsilon*prevperiod) and
all(x > (periodepsilon*period)
for x in bestperiodsdiff)):
nbestperiods.append(period)
nbestlspvals.append(lspval)
nbestinds.append(ind)
peakcount = peakcount + 1
prevperiod = period
# generate the return dict
resultdict = {
'bestperiod':finperiods[bestperiodind],
'bestlspval':finlsp[bestperiodind],
'nbestpeaks':nbestpeaks,
'nbestinds':nbestinds,
'nbestlspvals':nbestlspvals,
'nbestperiods':nbestperiods,
'lspvals':lsp,
'frequencies':frequencies,
'periods':periods,
'durations':durations,
'blsresult':blsresult,
'blsmodel':blsmodel,
'stepsize':stepsize,
'nfreq':nfreq,
'mintransitduration':mintransitduration,
'maxtransitduration':maxtransitduration,
'method':'bls',
'kwargs':{'startp':startp,
'endp':endp,
'stepsize':stepsize,
'mintransitduration':mintransitduration,
'maxtransitduration':maxtransitduration,
'ndurations':ndurations,
'blsobjective':blsobjective,
'blsmethod':blsmethod,
'blsoversample':blsoversample,
'blsntransits':blsmintransits,
'blsfreqfactor':blsfreqfactor,
'autofreq':autofreq,
'periodepsilon':periodepsilon,
'nbestpeaks':nbestpeaks,
'sigclip':sigclip,
'magsarefluxes':magsarefluxes}
}
return resultdict
except Exception:
LOGEXCEPTION('BLS failed!')
if raiseonfail:
raise
return {'bestperiod':npnan,
'bestlspval':npnan,
'nbestinds':None,
'nbestpeaks':nbestpeaks,
'nbestlspvals':None,
'nbestperiods':None,
'lspvals':None,
'periods':None,
'durations':None,
'blsresult':None,
'blsmodel':None,
'stepsize':stepsize,
'nfreq':nfreq,
'mintransitduration':mintransitduration,
'maxtransitduration':maxtransitduration,
'method':'bls',
'kwargs':{'startp':startp,
'endp':endp,
'stepsize':stepsize,
'mintransitduration':mintransitduration,
'maxtransitduration':maxtransitduration,
'ndurations':ndurations,
'blsobjective':blsobjective,
'blsmethod':blsmethod,
'blsoversample':blsoversample,
'blsntransits':blsmintransits,
'blsfreqfactor':blsfreqfactor,
'autofreq':autofreq,
'periodepsilon':periodepsilon,
'nbestpeaks':nbestpeaks,
'sigclip':sigclip,
'magsarefluxes':magsarefluxes}}
else:
LOGERROR('no good detections for these times and mags, skipping...')
return {'bestperiod':npnan,
'bestlspval':npnan,
'nbestinds':None,
'nbestpeaks':nbestpeaks,
'nbestlspvals':None,
'nbestperiods':None,
'lspvals':None,
'periods':None,
'durations':None,
'blsresult':None,
'blsmodel':None,
'stepsize':stepsize,
'nfreq':None,
'nphasebins':None,
'mintransitduration':mintransitduration,
'maxtransitduration':maxtransitduration,
'method':'bls',
'kwargs':{'startp':startp,
'endp':endp,
'stepsize':stepsize,
'mintransitduration':mintransitduration,
'maxtransitduration':maxtransitduration,
'ndurations':ndurations,
'blsobjective':blsobjective,
'blsmethod':blsmethod,
'blsoversample':blsoversample,
'blsntransits':blsmintransits,
'blsfreqfactor':blsfreqfactor,
'autofreq':autofreq,
'periodepsilon':periodepsilon,
'nbestpeaks':nbestpeaks,
'sigclip':sigclip,
'magsarefluxes':magsarefluxes}}
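The peak-selection loop in `bls_serial_pfind` walks the periodogram sorted by power and accepts a period only if it differs fractionally by more than `periodepsilon` from the previously examined period and from every already-accepted one. That logic can be isolated as a small pure-Python routine (`select_independent_peaks` is a name introduced here for illustration; the real code operates on the sorted numpy arrays inline):

```python
def select_independent_peaks(periods_sorted, powers_sorted,
                             nbestpeaks=5, periodepsilon=0.1):
    # periods_sorted/powers_sorted are ordered by power, highest first;
    # the top peak is always accepted
    best_periods = [periods_sorted[0]]
    best_powers = [powers_sorted[0]]
    prevperiod = periods_sorted[0]

    for period, power in zip(periods_sorted[1:], powers_sorted[1:]):
        if len(best_periods) == nbestpeaks:
            break
        perioddiff = abs(period - prevperiod)
        # accept only peaks separated by more than periodepsilon
        # (fractionally) from the previous period and from every
        # already-accepted best period -- this jumps over broad peaks
        if (perioddiff > periodepsilon * prevperiod and
                all(abs(period - x) > periodepsilon * period
                    for x in best_periods)):
            best_periods.append(period)
            best_powers.append(power)
        prevperiod = period

    return best_periods, best_powers
```

For example, two near-duplicate periods at 10.0 d and 9.95 d collapse into one peak, while 5.0 d and 3.3 d survive as independent peaks.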
def _parallel_bls_worker(task):
    '''
    This wraps Astropy's BoxLeastSquares for use with bls_parallel_pfind below.

    `task` is a tuple::

        task[0] = times
        task[1] = mags
        task[2] = errs
        task[3] = magsarefluxes
        task[4] = minfreq
        task[5] = nfreq
        task[6] = stepsize
        task[7] = ndurations
        task[8] = mintransitduration
        task[9] = maxtransitduration
        task[10] = blsobjective
        task[11] = blsmethod
        task[12] = blsoversample

    '''

    try:

        times, mags, errs = task[:3]
        magsarefluxes = task[3]
        minfreq, nfreq, stepsize = task[4:7]
        ndurations, mintransitduration, maxtransitduration = task[7:10]
        blsobjective, blsmethod, blsoversample = task[10:]

        frequencies = minfreq + nparange(nfreq)*stepsize
        periods = 1.0/frequencies

        # astropy's BLS requires durations in units of time
        durations = nplinspace(mintransitduration*periods.min(),
                               maxtransitduration*periods.min(),
                               ndurations)

        # set up the correct units for the BLS model
        if magsarefluxes:
            blsmodel = BoxLeastSquares(
                times*u.day,
                mags*u.dimensionless_unscaled,
                dy=errs*u.dimensionless_unscaled
            )
        else:
            blsmodel = BoxLeastSquares(
                times*u.day,
                mags*u.mag,
                dy=errs*u.mag
            )

        blsresult = blsmodel.power(
            periods*u.day,
            durations*u.day,
            objective=blsobjective,
            method=blsmethod,
            oversample=blsoversample
        )

        return {
            'blsresult': blsresult,
            'blsmodel': blsmodel,
            'durations': durations,
            'power': nparray(blsresult.power)
        }

    except Exception:

        LOGEXCEPTION('BLS for frequency chunk: (%.6f, %.6f) failed.' %
                     (frequencies[0], frequencies[-1]))

        return {
            'blsresult': None,
            'blsmodel': None,
            'durations': durations,
            'power': nparray([npnan for x in range(nfreq)]),
        }
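`bls_parallel_pfind` (below) splits the full frequency grid into contiguous chunks, one per worker task, where each task carries its chunk's starting frequency (`task[4]`) and point count (`task[5]`). One way to sketch that split (`chunk_freq_grid` is a hypothetical helper introduced here; astrobase builds the task tuples inline and its exact chunk-size arithmetic may differ):

```python
def chunk_freq_grid(minfreq, stepsize, nfreq, nworkers):
    # split the nfreq-point grid into contiguous chunks, one per worker;
    # each chunk is (chunk_minfreq, chunk_nfreq), mirroring task[4]/task[5]
    chunksize = -(-nfreq // nworkers)  # ceiling division
    chunks = []
    for start in range(0, nfreq, chunksize):
        chunks.append((minfreq + start * stepsize,
                       min(chunksize, nfreq - start)))
    return chunks
```

Because each worker runs BLS on its own sub-grid, the combined spectrum can differ from a single-shot run at the 1e-3 level, as the docstring below notes.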
def bls_parallel_pfind(
        times, mags, errs,
        magsarefluxes=False,
        startp=0.1,  # by default, search from 0.1 d to...
        endp=100.0,  # ... 100.0 d -- don't search full timebase
        stepsize=1.0e-4,
        mintransitduration=0.01,  # minimum transit length in phase
        maxtransitduration=0.4,   # maximum transit length in phase
        ndurations=100,
        autofreq=True,  # figure out f0, nf, and df automatically
        blsobjective='likelihood',
        blsmethod='fast',
        blsoversample=5,
        blsmintransits=3,
        blsfreqfactor=10.0,
        nbestpeaks=5,
        periodepsilon=0.1,
        sigclip=10.0,
        endp_timebase_check=True,
        verbose=True,
        nworkers=None,
):
'''Runs the Box Least Squares Fitting Search for transit-shaped signals.
Breaks up the full frequency space into chunks and passes them to parallel
BLS workers.
Based on the version of BLS in Astropy 3.1:
`astropy.stats.BoxLeastSquares`. If you don't have Astropy 3.1, this module
will fail to import. Note that by default, this implementation of
`bls_parallel_pfind` doesn't use the `.autoperiod()` function from
`BoxLeastSquares` but uses the same auto frequency-grid generation as the
functions in `periodbase.kbls`. If you want to use Astropy's implementation,
set the value of `autofreq` kwarg to 'astropy'. The generated period array
will then be broken up into chunks and sent to the individual workers.
NOTE: the combined BLS spectrum produced by this function is not identical
to that produced by running BLS in one shot for the entire frequency
space. There are differences on the order of 1.0e-3 or so in the respective
peak values, but peaks appear at the same frequencies for both methods. This
is likely due to different aliasing caused by smaller chunks of the
frequency space used by the parallel workers in this function. When in
doubt, confirm results for this parallel implementation by comparing to
those from the serial implementation above.
In particular, when you want to get reliable estimates of the SNR, transit
depth, duration, etc. that Astropy's BLS gives you, rerun `bls_serial_pfind`
with `startp`, and `endp` close to the best period you want to characterize
the transit at. The dict returned from that function contains a `blsmodel`
key, which is the generated model from Astropy's BLS. Use the
`.compute_stats()` method to calculate the required stats.
Parameters
----------
times,mags,errs : np.array
The magnitude/flux time-series to search for transits.
magsarefluxes : bool
If the input measurement values in `mags` and `errs` are in fluxes, set
this to True.
startp,endp : float
The minimum and maximum periods to consider for the transit search.
stepsize : float
The step-size in frequency to use when constructing a frequency grid for
the period search.
mintransitduration,maxtransitduration : float
The minimum and maximum transitdurations (in units of phase) to consider
for the transit search.
ndurations : int
The number of transit durations to use in the period-search.
autofreq : bool or str
If this is True, the values of `stepsize` and `nphasebins` will be
ignored, and these, along with a frequency-grid, will be determined
based on the following relations::
nphasebins = int(ceil(2.0/mintransitduration))
if nphasebins > 3000:
nphasebins = 3000
stepsize = 0.25*mintransitduration/(times.max()-times.min())
minfreq = 1.0/endp
maxfreq = 1.0/startp
nfreq = int(ceil((maxfreq - minfreq)/stepsize))
If this is False, you must set `startp`, `endp`, and `stepsize` as
appropriate.
If this is str == 'astropy', will use the
`astropy.stats.BoxLeastSquares.autoperiod()` function to calculate the
frequency grid instead of the kbls method.
blsobjective : {'likelihood','snr'}
Sets the type of objective to optimize in the `BoxLeastSquares.power()`
function.
blsmethod : {'fast','slow'}
Sets the type of method to use in the `BoxLeastSquares.power()`
function.
blsoversample : int
Sets the `oversample` kwarg for the `BoxLeastSquares.power()` function.
blsmintransits : int
Sets the `minimum_n_transit` kwarg for the `BoxLeastSquares.autoperiod()`
function.
blsfreqfactor : float
Sets the `frequency_factor` kwarg for the `BoxLeastSquares.autoperiod()`
function.
periodepsilon : float
The fractional difference between successive values of 'best' periods
when sorting by periodogram power to consider them as separate periods
(as opposed to part of the same periodogram peak). This is used to avoid
broad peaks in the periodogram and make sure the 'best' periods returned
are all actually independent.
nbestpeaks : int
The number of 'best' peaks to return from the periodogram results,
starting from the global maximum of the periodogram peak values.
sigclip : float or int or sequence of two floats/ints or None
If a single float or int, a symmetric sigma-clip will be performed using
the number provided as the sigma-multiplier to cut out from the input
time-series.
If a list of two ints/floats is provided, the function will perform an
'asymmetric' sigma-clip. The first element in this list is the sigma
value to use for fainter flux/mag values; the second element in this
list is the sigma value to use for brighter flux/mag values. For
example, `sigclip=[10., 3.]`, will sigclip out greater than 10-sigma
dimmings and greater than 3-sigma brightenings. Here the meaning of
"dimming" and "brightening" is set by *physics* (not the magnitude
system), which is why the `magsarefluxes` kwarg must be correctly set.
If `sigclip` is None, no sigma-clipping will be performed, and the
time-series (with non-finite elems removed) will be passed through to
the output.
endp_timebase_check : bool
If True, will check if the ``endp`` value is larger than the time-base
of the observations. If it is, will change the ``endp`` value such that
it is half of the time-base. If False, will allow an ``endp`` larger
than the time-base of the observations.
verbose : bool
If this is True, will indicate progress and details about the frequency
grid used for the period search.
nworkers : int or None
The number of parallel workers to launch for period-search. If None,
nworkers = NCPUS.
Returns
-------
dict
This function returns a dict, referred to as an `lspinfo` dict in other
astrobase functions that operate on periodogram results. This is a
standardized format across all astrobase period-finders, and is of the
form below::
{'bestperiod': the best period value in the periodogram,
'bestlspval': the periodogram peak associated with the best period,
'nbestpeaks': the input value of nbestpeaks,
'nbestlspvals': nbestpeaks-size list of best period peak values,
'nbestperiods': nbestpeaks-size list of best periods,
'lspvals': the full array of periodogram powers,
'frequencies': the full array of frequencies considered,
'periods': the full array of periods considered,
'durations': the array of durations used to run BLS,
'blsresult': Astropy BLS result object (BoxLeastSquaresResult),
'blsmodel': Astropy BLS BoxLeastSquares object used for work,
'stepsize': the actual stepsize used,
'nfreq': the actual nfreq used,
'mintransitduration': the input mintransitduration,
'maxtransitduration': the input maxtransitduration,
'method':'bls' -> the name of the period-finder method,
'kwargs':{ dict of all of the input kwargs for record-keeping}}
'''
# get rid of nans first and sigclip
stimes, smags, serrs = sigclip_magseries(times,
mags,
errs,
magsarefluxes=magsarefluxes,
sigclip=sigclip)
# resort by time
stimes, smags, serrs = resort_by_time(stimes, smags, serrs)
# make sure there are enough points to calculate a spectrum
if len(stimes) > 9 and len(smags) > 9 and len(serrs) > 9:
# if we're setting up everything automatically
if isinstance(autofreq, bool) and autofreq:
# use heuristic to figure out best timestep
stepsize = 0.25*mintransitduration/(stimes.max()-stimes.min())
# now figure out the frequencies to use
minfreq = 1.0/endp
maxfreq = 1.0/startp
nfreq = int(npceil((maxfreq - minfreq)/stepsize))
# say what we're using
if verbose:
LOGINFO('min P: %s, max P: %s, nfreq: %s, '
'minfreq: %s, maxfreq: %s' % (startp, endp, nfreq,
minfreq, maxfreq))
LOGINFO('autofreq = True: using AUTOMATIC values for '
'freq stepsize: %s, ndurations: %s, '
'min transit duration: %s, max transit duration: %s' %
(stepsize, ndurations,
mintransitduration, maxtransitduration))
use_autoperiod = False
elif isinstance(autofreq, bool) and not autofreq:
minfreq = 1.0/endp
maxfreq = 1.0/startp
nfreq = int(npceil((maxfreq - minfreq)/stepsize))
# say what we're using
if verbose:
LOGINFO('min P: %s, max P: %s, nfreq: %s, '
'minfreq: %s, maxfreq: %s' % (startp, endp, nfreq,
minfreq, maxfreq))
LOGINFO('autofreq = False: using PROVIDED values for '
'freq stepsize: %s, ndurations: %s, '
'min transit duration: %s, max transit duration: %s' %
(stepsize, ndurations,
mintransitduration, maxtransitduration))
use_autoperiod = False
elif isinstance(autofreq, str) and autofreq == 'astropy':
use_autoperiod = True
minfreq = 1.0/endp
maxfreq = 1.0/startp
else:
LOGERROR("unknown autofreq kwarg encountered. can't continue...")
return None
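As a sanity check, the automatic frequency-grid heuristic above (the `autofreq = True` branch) can be reproduced with toy numbers; the baseline and search settings below are illustrative, not from the source:

```python
import numpy as np

# Illustrative inputs: a 100-day baseline with typical BLS search settings
timebase = 100.0                 # stimes.max() - stimes.min()
mintransitduration = 0.01        # minimum transit duration as a phase fraction
startp, endp = 1.0, 50.0         # period search range in days

# stepsize = 0.25 * min transit duration / time-base (the heuristic above)
stepsize = 0.25 * mintransitduration / timebase
minfreq, maxfreq = 1.0 / endp, 1.0 / startp
nfreq = int(np.ceil((maxfreq - minfreq) / stepsize))

print(stepsize, nfreq)
```

Smaller minimum transit durations or longer baselines shrink the stepsize and inflate `nfreq`, which is why the parallel chunking further down matters.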
# check the minimum frequency
if ((minfreq < (1.0/(stimes.max() - stimes.min()))) and
endp_timebase_check):
LOGWARNING('the requested max P = %.3f is larger than '
'the time base of the observations = %.3f, '
'will make minfreq = 2 x 1/timebase'
% (endp, stimes.max() - stimes.min()))
minfreq = 2.0/(stimes.max() - stimes.min())
LOGWARNING('new minfreq: %s, maxfreq: %s' %
(minfreq, maxfreq))
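The `endp_timebase_check` clamp can likewise be sketched standalone (toy values, not from the source): requesting a maximum period longer than the time-base collapses it to half the time-base.

```python
import numpy as np

stimes = np.linspace(0.0, 100.0, 500)       # toy time-series, 100-day baseline
timebase = stimes.max() - stimes.min()

endp = 300.0                                # requested max period > time-base
minfreq = 1.0 / endp

# Same clamp as above: fall back to minfreq = 2/timebase,
# i.e. a maximum searchable period of half the time-base
if minfreq < 1.0 / timebase:
    minfreq = 2.0 / timebase

print(minfreq)  # 0.02 -> max period of 50 days
```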
#############################
## NOW RUN BLS IN PARALLEL ##
#############################
# fix number of CPUs if needed
if not nworkers or nworkers > NCPUS:
nworkers = NCPUS
if verbose:
LOGINFO('using %s workers...' % nworkers)
# check if autoperiod is True and get the correct period-grid
if use_autoperiod:
# astropy's BLS requires durations in units of time
durations = nplinspace(mintransitduration*startp,
maxtransitduration*startp,
ndurations)
# set up the correct units for the BLS model
if magsarefluxes:
blsmodel = BoxLeastSquares(
stimes*u.day,
smags*u.dimensionless_unscaled,
dy=serrs*u.dimensionless_unscaled
)
else:
blsmodel = BoxLeastSquares(
stimes*u.day,
smags*u.mag,
dy=serrs*u.mag
)
periods = nparray(
blsmodel.autoperiod(
durations*u.day,
minimum_period=startp,
maximum_period=endp,
minimum_n_transit=blsmintransits,
frequency_factor=blsfreqfactor
)
)
frequencies = 1.0/periods
nfreq = frequencies.size
if verbose:
LOGINFO(
"autofreq = 'astropy', used .autoperiod() with "
"minimum_n_transit = %s, freq_factor = %s "
"to generate the frequency grid" %
(blsmintransits, blsfreqfactor)
)
LOGINFO('stepsize = %s, nfreq = %s, minfreq = %.5f, '
'maxfreq = %.5f, ndurations = %s' %
(abs(frequencies[1] - frequencies[0]),
nfreq,
1.0/periods.max(),
1.0/periods.min(),
durations.size))
del blsmodel
del durations
# otherwise, use kbls method
else:
frequencies = minfreq + nparange(nfreq)*stepsize
# break up the tasks into chunks
csrem = int(fmod(nfreq, nworkers))
csint = int(nfreq/nworkers)
chunk_minfreqs, chunk_nfreqs = [], []
for x in range(nworkers):
this_minfreqs = frequencies[x*csint]
# handle usual nfreqs
if x < (nworkers - 1):
this_nfreqs = frequencies[x*csint:x*csint+csint].size
else:
this_nfreqs = frequencies[x*csint:x*csint+csint+csrem].size
chunk_minfreqs.append(this_minfreqs)
chunk_nfreqs.append(this_nfreqs)
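The chunking above can be verified in isolation; this sketch (toy grid sizes, not from the source) shows the remainder frequencies being folded into the last worker's chunk so the chunks tile the grid exactly once:

```python
import numpy as np
from math import fmod

nfreq, nworkers = 103, 4
minfreq, stepsize = 0.01, 1e-4
frequencies = minfreq + np.arange(nfreq) * stepsize

csrem = int(fmod(nfreq, nworkers))      # leftover frequencies: 103 % 4 = 3
csint = int(nfreq / nworkers)           # base chunk size: 25

chunk_minfreqs, chunk_nfreqs = [], []
for x in range(nworkers):
    chunk_minfreqs.append(frequencies[x * csint])
    if x < (nworkers - 1):
        chunk_nfreqs.append(frequencies[x * csint:x * csint + csint].size)
    else:
        # the last worker also picks up the remainder
        chunk_nfreqs.append(frequencies[x * csint:x * csint + csint + csrem].size)

print(chunk_nfreqs)  # [25, 25, 25, 28]
```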
# populate the tasks list
#
# task[0] = times
# task[1] = mags
# task[2] = errs
# task[3] = magsarefluxes
# task[4] = minfreq
# task[5] = nfreq
# task[6] = stepsize
# task[7] = ndurations
# task[8] = mintransitduration
# task[9] = maxtransitduration
# task[10] = blsobjective
# task[11] = blsmethod
# task[12] = blsoversample
tasks = [(stimes, smags, serrs, magsarefluxes,
chunk_minf, chunk_nf, stepsize,
ndurations, mintransitduration, maxtransitduration,
blsobjective, blsmethod, blsoversample)
for (chunk_minf, chunk_nf)
in zip(chunk_minfreqs, chunk_nfreqs)]
if verbose:
for ind, task in enumerate(tasks):
LOGINFO('worker %s: minfreq = %.6f, nfreqs = %s' %
(ind+1, task[4], task[5]))
LOGINFO('running...')
# start the pool
pool = Pool(nworkers)
results = pool.map(_parallel_bls_worker, tasks)
pool.close()
pool.join()
del pool
# now concatenate the output lsp arrays
lsp = npconcatenate([x['power'] for x in results])
periods = 1.0/frequencies
# find the nbestpeaks for the periodogram: 1. sort the lsp array
# by highest value first 2. go down the values until we find
# five values that are separated by at least periodepsilon in
# period
# make sure to get only the finite peaks in the periodogram
# this is needed because BLS may produce infs for some peaks
finitepeakind = npisfinite(lsp)
finlsp = lsp[finitepeakind]
finperiods = periods[finitepeakind]
# make sure that finlsp has finite values before we work on it
try:
bestperiodind = npargmax(finlsp)
except ValueError:
LOGERROR('no finite periodogram values '
'for this mag series, skipping...')
return {'bestperiod':npnan,
'bestlspval':npnan,
'nbestpeaks':nbestpeaks,
'nbestinds':None,
'nbestlspvals':None,
'nbestperiods':None,
'lspvals':None,
'periods':None,
'durations':None,
'method':'bls',
'blsresult':None,
'blsmodel':None,
'kwargs':{'startp':startp,
'endp':endp,
'stepsize':stepsize,
'mintransitduration':mintransitduration,
'maxtransitduration':maxtransitduration,
'ndurations':ndurations,
'blsobjective':blsobjective,
'blsmethod':blsmethod,
'blsoversample':blsoversample,
'autofreq':autofreq,
'periodepsilon':periodepsilon,
'nbestpeaks':nbestpeaks,
'sigclip':sigclip,
'magsarefluxes':magsarefluxes}}
sortedlspind = npargsort(finlsp)[::-1]
sortedlspperiods = finperiods[sortedlspind]
sortedlspvals = finlsp[sortedlspind]
# now get the nbestpeaks
nbestperiods, nbestlspvals, nbestinds, peakcount = (
[finperiods[bestperiodind]],
[finlsp[bestperiodind]],
[bestperiodind],
1
)
prevperiod = sortedlspperiods[0]
# find the best nbestpeaks in the lsp and their periods
for period, lspval, ind in zip(sortedlspperiods,
sortedlspvals,
sortedlspind):
if peakcount == nbestpeaks:
break
perioddiff = abs(period - prevperiod)
bestperiodsdiff = [abs(period - x) for x in nbestperiods]
# this ensures that this period is different from the last
# period and from all the other existing best periods by
# periodepsilon to make sure we jump to an entire different
# peak in the periodogram
if (perioddiff > (periodepsilon*prevperiod) and
all(x > (periodepsilon*period)
for x in bestperiodsdiff)):
nbestperiods.append(period)
nbestlspvals.append(lspval)
nbestinds.append(ind)
peakcount = peakcount + 1
prevperiod = period
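The peak-selection loop above can be exercised on a synthetic periodogram; this sketch (toy arrays) shows near-duplicate periods within `periodepsilon` of an accepted peak being skipped:

```python
import numpy as np

# Toy periodogram: 10.05 and 5.02 are aliases of stronger nearby peaks
periods = np.array([10.0, 10.05, 5.0, 5.02, 3.3, 2.0])
lspvals = np.array([0.9, 0.85, 0.7, 0.65, 0.5, 0.4])
nbestpeaks, periodepsilon = 3, 0.1

order = np.argsort(lspvals)[::-1]
sortedper, sortedval = periods[order], lspvals[order]

nbestperiods, nbestlspvals = [float(sortedper[0])], [float(sortedval[0])]
prevperiod, peakcount = sortedper[0], 1
for period, lspval in zip(sortedper, sortedval):
    if peakcount == nbestpeaks:
        break
    perioddiff = abs(period - prevperiod)
    bestperiodsdiff = [abs(period - x) for x in nbestperiods]
    # require separation from the previous period AND all accepted peaks
    if (perioddiff > periodepsilon * prevperiod and
            all(x > periodepsilon * period for x in bestperiodsdiff)):
        nbestperiods.append(float(period))
        nbestlspvals.append(float(lspval))
        peakcount += 1
    prevperiod = period

print(nbestperiods)  # [10.0, 5.0, 3.3]
```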
# generate the return dict
resultdict = {
'bestperiod':finperiods[bestperiodind],
'bestlspval':finlsp[bestperiodind],
'nbestpeaks':nbestpeaks,
'nbestinds':nbestinds,
'nbestlspvals':nbestlspvals,
'nbestperiods':nbestperiods,
'lspvals':lsp,
'frequencies':frequencies,
'periods':periods,
'durations':[x['durations'] for x in results],
'blsresult':[x['blsresult'] for x in results],
'blsmodel':[x['blsmodel'] for x in results],
'stepsize':stepsize,
'nfreq':nfreq,
'mintransitduration':mintransitduration,
'maxtransitduration':maxtransitduration,
'method':'bls',
'kwargs':{'startp':startp,
'endp':endp,
'stepsize':stepsize,
'mintransitduration':mintransitduration,
'maxtransitduration':maxtransitduration,
'ndurations':ndurations,
'blsobjective':blsobjective,
'blsmethod':blsmethod,
'blsoversample':blsoversample,
'autofreq':autofreq,
'periodepsilon':periodepsilon,
'nbestpeaks':nbestpeaks,
'sigclip':sigclip,
'magsarefluxes':magsarefluxes}
}
return resultdict
else:
LOGERROR('no good detections for these times and mags, skipping...')
return {'bestperiod':npnan,
'bestlspval':npnan,
'nbestinds':None,
'nbestpeaks':nbestpeaks,
'nbestlspvals':None,
'nbestperiods':None,
'lspvals':None,
'periods':None,
'durations':None,
'blsresult':None,
'blsmodel':None,
'stepsize':stepsize,
'nfreq':None,
'mintransitduration':mintransitduration,
'maxtransitduration':maxtransitduration,
'method':'bls',
'kwargs':{'startp':startp,
'endp':endp,
'stepsize':stepsize,
'mintransitduration':mintransitduration,
'maxtransitduration':maxtransitduration,
'ndurations':ndurations,
'blsobjective':blsobjective,
'blsmethod':blsmethod,
'blsoversample':blsoversample,
'autofreq':autofreq,
'periodepsilon':periodepsilon,
'nbestpeaks':nbestpeaks,
'sigclip':sigclip,
'magsarefluxes':magsarefluxes}}
import pytest
import numpy as np
import numpy.testing as npt
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from scipy.stats import logistic
from lifelines import ExponentialFitter, WeibullFitter, WeibullAFTFitter
from delicatessen import MEstimator
from delicatessen.estimating_equations import (ee_mean, ee_mean_variance, ee_mean_robust,
# Regression models
ee_regression, ee_robust_regression, ee_ridge_regression,
# Survival models
ee_exponential_model, ee_exponential_measure, ee_weibull_model,
ee_weibull_measure, ee_aft_weibull, ee_aft_weibull_measure,
# Dose-Response
ee_2p_logistic, ee_3p_logistic, ee_4p_logistic, ee_effective_dose_delta,
# Causal inference
ee_gformula, ee_ipw, ee_aipw)
from delicatessen.data import load_inderjit
np.random.seed(236461)
class TestEstimatingEquationsBase:
def test_mean(self):
"""Tests mean with the built-in estimating equation.
"""
# Data set
y = np.array([5, 1, 2, 4, 2, 4, 5, 7, 11, 1, 6, 3, 4, 6])
def psi1(theta):
return y - theta
mcee = MEstimator(psi1, init=[0, ])
mcee.estimate()
def psi2(theta):
return ee_mean(theta, y=y)
mpee = MEstimator(psi2, init=[0, ])
mpee.estimate()
# Checking mean estimate
npt.assert_allclose(mcee.theta,
mpee.theta,
atol=1e-6)
# Checking variance estimates
npt.assert_allclose(mcee.asymptotic_variance,
mpee.asymptotic_variance,
atol=1e-6)
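For the mean, the sandwich (M-estimation) variance that `MEstimator` computes has a simple closed form worth keeping in mind: with psi(y, theta) = y - theta the bread is 1 and the meat is the mean squared residual. A minimal numpy-only sketch:

```python
import numpy as np

y = np.array([5, 1, 2, 4, 2, 4, 5, 7, 11, 1, 6, 3, 4, 6], dtype=float)
n = y.size

# Root of sum_i (y_i - theta) = 0 is the sample mean
theta_hat = y.mean()

# Sandwich variance: bread B = -mean(d psi / d theta) = 1,
# meat M = mean(psi^2) evaluated at theta_hat
bread = 1.0
meat = np.mean((y - theta_hat) ** 2)
asymptotic_variance = meat / bread**2
variance = asymptotic_variance / n       # finite-sample variance of theta_hat

print(round(theta_hat, 4))  # 4.3571
```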
def test_mean_robust(self):
y = [-10, -1, 2, 3, -2, 0, 3, 5, 12]
yk = [-6, -1, 2, 3, -2, 0, 3, 5, 6]
def psi(theta):
return ee_mean_robust(theta=theta, y=y, k=6)
mestimator = MEstimator(psi, init=[0, ])
mestimator.estimate()
# Checking mean estimate
npt.assert_allclose(mestimator.theta[0],
np.mean(yk),
atol=1e-6)
def test_mean_variance(self):
"""Tests mean-variance with the built-in estimating equations.
"""
# Data set
y = np.array([5, 1, 2, 4, 2, 4, 5, 7, 11, 1, 6, 3, 4, 6])
def psi1(theta):
return y - theta[0], (y - theta[0]) ** 2 - theta[1]
mcee = MEstimator(psi1, init=[0, 0, ])
mcee.estimate()
def psi2(theta):
return ee_mean_variance(theta=theta, y=y)
mpee = MEstimator(psi2, init=[0, 0, ])
mpee.estimate()
# Checking mean estimate
npt.assert_allclose(mcee.theta,
mpee.theta,
atol=1e-6)
# Checking variance estimates
npt.assert_allclose(mcee.asymptotic_variance,
mpee.asymptotic_variance,
atol=1e-6)
class TestEstimatingEquationsRegression:
def test_error_regression(self):
"""Test for error raised when incorrect regression name is provided
"""
n = 100
data = pd.DataFrame()
data['x1'] = np.random.normal(size=n)
data['x2'] = data['x1'] + np.random.normal(scale=0.1, size=n)
data['c'] = 1
data['y'] = 5 + data['x1'] + np.random.normal(size=n)
Xvals = np.asarray(data[['c', 'x1', 'x2']])
yvals = np.asarray(data['y'])
def psi(theta):
return ee_regression(theta, X=Xvals, y=yvals, model=748)
estr = MEstimator(psi, init=[5, 1, 1])
with pytest.raises(ValueError, match="The model argument"):
estr.estimate(solver='lm')
def psi(theta):
return ee_regression(theta, X=Xvals, y=yvals, model='magic')
estr = MEstimator(psi, init=[5, 1, 1])
with pytest.raises(ValueError, match="Invalid input"):
estr.estimate(solver='lm')
def test_ols(self):
"""Tests linear regression with the built-in estimating equation.
"""
n = 500
data = pd.DataFrame()
data['X'] = np.random.normal(size=n)
data['Z'] = np.random.normal(size=n)
data['Y'] = 0.5 + 2*data['X'] - 1*data['Z'] + np.random.normal(loc=0, size=n)
data['C'] = 1
def psi_builtin_regression(theta):
return ee_regression(theta,
X=data[['C', 'X', 'Z']], y=data['Y'],
model='linear')
mpee = MEstimator(psi_builtin_regression, init=[0.1, 0.1, 0.1])
mpee.estimate()
# Statsmodels function equivalent
glm = smf.glm("Y ~ X + Z", data).fit(cov_type="HC1")
# Checking mean estimate
npt.assert_allclose(mpee.theta,
np.asarray(glm.params),
atol=1e-6)
# Checking variance estimates
npt.assert_allclose(mpee.variance,
np.asarray(glm.cov_params()),
atol=1e-6)
# Checking confidence interval estimates
npt.assert_allclose(mpee.confidence_intervals(),
np.asarray(glm.conf_int()),
atol=1e-6)
def test_wls(self):
"""Tests weighted linear regression by-hand with a single estimating equation.
"""
n = 500
data = pd.DataFrame()
data['X'] = np.random.normal(size=n)
data['Z'] = np.random.normal(size=n)
data['Y'] = 0.5 + 2 * data['X'] - 1 * data['Z'] + np.random.normal(loc=0, size=n)
data['C'] = 1
data['w'] = np.random.uniform(1, 10, size=n)
def psi_regression(theta):
return ee_regression(theta,
X=data[['C', 'X', 'Z']], y=data['Y'],
model='linear', weights=data['w'])
mestimator = MEstimator(psi_regression, init=[0.1, 0.1, 0.1])
mestimator.estimate()
# Comparing to statsmodels GLM (with robust covariance)
glm = smf.glm("Y ~ X + Z", data, freq_weights=data['w']).fit(cov_type="cluster",
cov_kwds={"groups": data.index,
"use_correction": False})
# Checking mean estimate
npt.assert_allclose(mestimator.theta,
np.asarray(glm.params),
atol=1e-6)
# Checking variance estimates
npt.assert_allclose(mestimator.variance,
np.asarray(glm.cov_params()),
atol=1e-6)
# Checking confidence interval estimates
npt.assert_allclose(mestimator.confidence_intervals(),
np.asarray(glm.conf_int()),
atol=1e-6)
def test_ridge_ols(self):
"""Tests the ridge (L2) variation of the linear regression built-in estimating equation
"""
n = 1000
data = pd.DataFrame()
data['x1'] = np.random.normal(size=n)
data['x2'] = data['x1'] + np.random.normal(scale=0.1, size=n)
data['c'] = 1
data['y'] = 5 + data['x1'] + np.random.normal(size=n)
Xvals = np.asarray(data[['c', 'x1', 'x2']])
yvals = np.asarray(data['y'])
# Penalty of 0.5
def psi(theta):
return ee_ridge_regression(theta, X=Xvals, y=yvals, model='linear', penalty=0.5, weights=None)
estr = MEstimator(psi, init=[5, 1, 1])
estr.estimate(solver='lm')
ridge = sm.OLS(yvals, Xvals).fit_regularized(L1_wt=0., alpha=0.5 / Xvals.shape[0])
# Checking mean estimate
npt.assert_allclose(estr.theta,
np.asarray(ridge.params),
atol=1e-6)
# Penalty of 5.0
def psi(theta):
return ee_ridge_regression(theta, X=Xvals, y=yvals, model='linear', penalty=5.0, weights=None)
estr = MEstimator(psi, init=[5, 1, 1])
estr.estimate(solver='lm')
ridge = sm.OLS(yvals, Xvals).fit_regularized(L1_wt=0., alpha=5. / Xvals.shape[0])
# Checking mean estimate
npt.assert_allclose(estr.theta,
np.asarray(ridge.params),
atol=1e-6)
# Testing array of penalty terms
def psi(theta):
return ee_ridge_regression(theta, X=Xvals, y=yvals, model='linear', penalty=[0., 5., 2.], weights=None)
estr = MEstimator(psi, init=[5, 1, 1])
estr.estimate(solver='lm')
ridge = sm.OLS(yvals, Xvals).fit_regularized(L1_wt=0., alpha=np.array([0., 5., 2.]) / Xvals.shape[0])
# Checking mean estimate
npt.assert_allclose(estr.theta,
np.asarray(ridge.params),
atol=1e-6)
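The `alpha = penalty / n` rescaling used in the comparisons above reflects statsmodels normalizing its objective by the sample size; the ridge estimating equation itself has a closed form that is easy to check directly. A numpy-only sketch on synthetic data (the penalized-score convention in the comment is an assumption, not quoted from delicatessen):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = 5 + X[:, 1] + rng.normal(size=n)
penalty = 0.5

# Assumed penalized normal equations: X'(y - Xb) - penalty*b = 0,
# whose closed-form root is b = (X'X + penalty*I)^{-1} X'y
b = np.linalg.solve(X.T @ X + penalty * np.eye(X.shape[1]), X.T @ y)

# The penalized score is (numerically) zero at the closed-form solution
score = X.T @ (y - X @ b) - penalty * b
print(np.allclose(score, 0.0))  # True
```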
def test_ridge_wls(self):
"""Tests the ridge (L2) variation of the weighted linear regression built-in estimating equation
"""
n = 1000
data = pd.DataFrame()
data['x1'] = np.random.normal(size=n)
data['x2'] = data['x1'] + np.random.normal(scale=0.1, size=n)
data['c'] = 1
data['y'] = 5 + data['x1'] + np.random.normal(size=n)
Xvals = np.asarray(data[['c', 'x1', 'x2']])
yvals = np.asarray(data['y'])
weights = np.random.uniform(0.1, 2.5, size=n)
# Penalty of 0.5
def psi(theta):
return ee_ridge_regression(theta, X=Xvals, y=yvals, model='linear', penalty=0.5, weights=weights)
estr = MEstimator(psi, init=[5, 1, 1])
estr.estimate(solver='lm')
wridge = sm.WLS(yvals, Xvals, weights=weights).fit_regularized(L1_wt=0., alpha=0.5 / Xvals.shape[0])
# Checking mean estimate
npt.assert_allclose(estr.theta,
np.asarray(wridge.params),
atol=1e-6)
# Penalty of 5.0
def psi(theta):
return ee_ridge_regression(theta, X=Xvals, y=yvals, model='linear', penalty=5.0, weights=weights)
estr = MEstimator(psi, init=[5, 1, 1])
estr.estimate(solver='lm')
wridge = sm.WLS(yvals, Xvals, weights=weights).fit_regularized(L1_wt=0., alpha=5. / Xvals.shape[0])
# Checking mean estimate
npt.assert_allclose(estr.theta,
np.asarray(wridge.params),
atol=1e-6)
# Testing array of penalty terms
def psi(theta):
return ee_ridge_regression(theta, X=Xvals, y=yvals, model='linear', penalty=[0., 5., 2.], weights=weights)
estr = MEstimator(psi, init=[5, 1, 1])
estr.estimate(solver='lm')
wridge = sm.WLS(yvals, Xvals, weights=weights).fit_regularized(L1_wt=0.,
alpha=np.array([0., 5., 2.]) / Xvals.shape[0])
# Checking mean estimate
npt.assert_allclose(estr.theta,
np.asarray(wridge.params),
atol=1e-6)
def test_error_ridge(self):
n = 1000
data = pd.DataFrame()
data['x1'] = np.random.normal(size=n)
data['x2'] = data['x1'] + np.random.normal(scale=0.1, size=n)
data['c'] = 1
data['y'] = 5 + data['x1'] + np.random.normal(size=n)
Xvals = np.asarray(data[['c', 'x1', 'x2']])
yvals = np.asarray(data['y'])
def psi(theta):
return ee_ridge_regression(theta, X=Xvals, y=yvals, model='linear',
penalty=[0.5, 5.], weights=None)
estr = MEstimator(psi, init=[5, 1, 1])
with pytest.raises(ValueError, match="The penalty term must"):
estr.estimate(solver='lm')
def test_logistic(self):
n = 1000
data = pd.DataFrame()
data['X'] = np.random.normal(size=n)
data['Z'] = np.random.normal(size=n)
data['Y'] = np.random.binomial(n=1, p=logistic.cdf(0.5 + 2*data['X'] - 1*data['Z']), size=n)
data['C'] = 1
def psi_builtin_regression(theta):
return ee_regression(theta,
X=data[['C', 'X', 'Z']], y=data['Y'],
model='logistic')
mpee = MEstimator(psi_builtin_regression, init=[0., 0., 0.])
mpee.estimate()
# Comparing to statsmodels GLM (with robust covariance)
glm = smf.glm("Y ~ X + Z", data, family=sm.families.Binomial()).fit(cov_type="HC1")
# Checking mean estimate
npt.assert_allclose(mpee.theta,
np.asarray(glm.params),
atol=1e-6)
# Checking variance estimates
npt.assert_allclose(mpee.variance,
np.asarray(glm.cov_params()),
atol=1e-6)
# Checking confidence interval estimates
npt.assert_allclose(mpee.confidence_intervals(),
np.asarray(glm.conf_int()),
atol=1e-6)
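What `ee_regression(..., model='logistic')` stacks is, up to summation, the familiar GLM score; solving it directly with a generic root-finder illustrates the M-estimation view. A scipy-only sketch on synthetic data:

```python
import numpy as np
from scipy.optimize import root
from scipy.special import expit

rng = np.random.default_rng(1)
n = 1000
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
y = rng.binomial(1, expit(0.5 + 2 * X[:, 1] - X[:, 2]))

def score(beta):
    # Logistic score equations: X'(y - expit(X @ beta)) = 0
    return X.T @ (y - expit(X @ beta))

beta_hat = root(score, x0=np.zeros(3), method='lm').x
print(np.allclose(score(beta_hat), 0.0, atol=1e-6))  # True
```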
def test_weighted_logistic(self):
"""Tests weighted logistic regression by-hand with a single estimating equation.
"""
n = 500
data = pd.DataFrame()
data['X'] = np.random.normal(size=n)
data['Z'] = np.random.normal(size=n)
data['Y'] = np.random.binomial(n=1, p=logistic.cdf(0.5 + 2*data['X'] - 1*data['Z']), size=n)
data['C'] = 1
data['w'] = np.random.uniform(1, 10, size=n)
def psi_regression(theta):
return ee_regression(theta,
X=data[['C', 'X', 'Z']], y=data['Y'],
model='logistic', weights=data['w'])
mestimator = MEstimator(psi_regression, init=[0., 0., 0.])
mestimator.estimate()
# Comparing to statsmodels GLM (with robust covariance)
glm = smf.glm("Y ~ X + Z", data, freq_weights=data['w'],
family=sm.families.Binomial()).fit(cov_type="cluster",
cov_kwds={"groups": data.index,
"use_correction": False})
# Checking mean estimate
npt.assert_allclose(mestimator.theta,
np.asarray(glm.params),
atol=1e-6)
# Checking variance estimates
npt.assert_allclose(mestimator.variance,
np.asarray(glm.cov_params()),
atol=1e-6)
# Checking confidence interval estimates
npt.assert_allclose(mestimator.confidence_intervals(),
np.asarray(glm.conf_int()),
atol=1e-6)
def test_ridge_logistic(self):
"""Tests ridge logistic regression by-hand with a single estimating equation.
"""
n = 1000
data = pd.DataFrame()
data['X'] = np.random.normal(size=n)
data['Z'] = np.random.normal(size=n)
data['Y'] = np.random.binomial(n=1, p=logistic.cdf(0.5 + 2*data['X'] - 1*data['Z']), size=n)
data['C'] = 1
Xvals = np.asarray(data[['C', 'X', 'Z']])
yvals = np.asarray(data['Y'])
def psi_regression(theta):
return ee_ridge_regression(theta,
X=Xvals, y=yvals,
model='logistic', penalty=0.5, weights=None)
mestimator = MEstimator(psi_regression, init=[0., 0., 0.])
mestimator.estimate(solver='lm', tolerance=1e-12)
f = sm.families.Binomial()
lgt = sm.GLM(yvals, Xvals, family=f).fit_regularized(L1_wt=0., alpha=0.5 / Xvals.shape[0])
# Checking mean estimate
npt.assert_allclose(mestimator.theta,
np.asarray(lgt.params),
atol=1e-4)
def psi_regression(theta):
return ee_ridge_regression(theta,
X=Xvals, y=yvals,
model='logistic', penalty=5., weights=None)
mestimator = MEstimator(psi_regression, init=[0., 0., 0.])
mestimator.estimate(solver='hybr', tolerance=1e-12)
f = sm.families.Binomial()
lgt = sm.GLM(yvals, Xvals, family=f).fit_regularized(L1_wt=0., alpha=5. / Xvals.shape[0])
# Checking mean estimate
npt.assert_allclose(mestimator.theta,
np.asarray(lgt.params),
atol=1e-4)
def psi_regression(theta):
return ee_ridge_regression(theta,
X=Xvals, y=yvals,
model='logistic', penalty=[0., 5., 2.], weights=None)
mestimator = MEstimator(psi_regression, init=[0., 0., 0.])
mestimator.estimate(solver='hybr', tolerance=1e-12)
f = sm.families.Binomial()
lgt = sm.GLM(yvals, Xvals, family=f).fit_regularized(L1_wt=0., alpha=np.array([0., 5., 2.]) / Xvals.shape[0])
# Checking mean estimate
npt.assert_allclose(mestimator.theta,
np.asarray(lgt.params),
atol=1e-4)
def test_poisson(self):
"""Tests Poisson regression by-hand with a single estimating equation.
"""
np.random.seed(20212345)
n = 500
data = pd.DataFrame()
data['X'] = np.random.normal(size=n)
data['Z'] = np.random.normal(size=n)
data['Y'] = np.random.poisson(lam=np.exp(1 + 2*data['X'] - 1*data['Z']), size=n)
data['C'] = 1
def psi_regression(theta):
return ee_regression(theta,
X=data[['C', 'X', 'Z']], y=data['Y'],
model='poisson')
mestimator = MEstimator(psi_regression, init=[0., 0., 0.])
mestimator.estimate(solver='lm')
# Comparing to statsmodels GLM (with robust covariance)
glm = smf.glm("Y ~ X + Z", data, family=sm.families.Poisson()).fit(cov_type="HC1")
# Checking mean estimate
npt.assert_allclose(mestimator.theta,
np.asarray(glm.params),
atol=1e-6)
# Checking variance estimates
npt.assert_allclose(mestimator.variance,
np.asarray(glm.cov_params()),
atol=1e-6)
# Checking confidence interval estimates
npt.assert_allclose(mestimator.confidence_intervals(),
np.asarray(glm.conf_int()),
atol=1e-6)
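The Poisson case is the same pattern with a log link; a scipy-only sketch on synthetic data (the clip guarding `exp` overflow is a safety assumption here, not part of delicatessen):

```python
import numpy as np
from scipy.optimize import root

rng = np.random.default_rng(3)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = rng.poisson(np.exp(1 + 0.5 * X[:, 1]))

def score(beta):
    # Poisson score equations: X'(y - exp(X @ beta)) = 0
    lam = np.exp(np.clip(X @ beta, -30.0, 30.0))   # clip to avoid overflow
    return X.T @ (y - lam)

beta_hat = root(score, x0=np.zeros(2), method='lm').x
print(np.allclose(score(beta_hat), 0.0, atol=1e-6))
```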
def test_weighted_poisson(self):
"""Tests weighted Poisson regression by-hand with a single estimating equation.
"""
np.random.seed(1234)
n = 500
data = pd.DataFrame()
data['X'] = np.random.normal(size=n)
data['Z'] = np.random.normal(size=n)
data['Y'] = np.random.poisson(lam=np.exp(1 + 2*data['X'] - 1*data['Z']), size=n)
data['C'] = 1
data['w'] = np.random.uniform(1, 3, size=n)
def psi_regression(theta):
return ee_regression(theta,
X=data[['C', 'X', 'Z']], y=data['Y'],
model='poisson', weights=data['w'])
mestimator = MEstimator(psi_regression, init=[0., 0., 0.])
mestimator.estimate(solver='lm')
# Comparing to statsmodels GLM (with robust covariance)
glm = smf.glm("Y ~ X + Z", data, freq_weights=data['w'],
family=sm.families.Poisson()).fit(cov_type="cluster",
cov_kwds={"groups": data.index,
"use_correction": False})
# Checking mean estimate
npt.assert_allclose(mestimator.theta,
np.asarray(glm.params),
atol=1e-6)
# Checking variance estimates
npt.assert_allclose(mestimator.variance,
np.asarray(glm.cov_params()),
atol=1e-6)
# Checking confidence interval estimates
npt.assert_allclose(mestimator.confidence_intervals(),
np.asarray(glm.conf_int()),
atol=1e-6)
def test_ridge_poisson(self):
"""Tests ridge Poisson regression by-hand with a single estimating equation.
"""
n = 1000
data = pd.DataFrame()
data['X'] = np.random.normal(size=n)
data['Z'] = np.random.normal(size=n)
data['Y'] = np.random.poisson(lam=np.exp(1 + 2*data['X'] - 1*data['Z']), size=n)
data['C'] = 1
Xvals = np.asarray(data[['C', 'X', 'Z']])
yvals = np.asarray(data['Y'])
def psi_regression(theta):
return ee_ridge_regression(theta,
X=Xvals, y=yvals,
model='poisson', penalty=0.5, weights=None)
mestimator = MEstimator(psi_regression, init=[0., 0., 0.])
mestimator.estimate(solver='lm', tolerance=1e-12)
f = sm.families.Poisson()
lgt = sm.GLM(yvals, Xvals, family=f).fit_regularized(L1_wt=0., alpha=0.5 / Xvals.shape[0])
# Checking mean estimate
npt.assert_allclose(mestimator.theta,
np.asarray(lgt.params),
atol=1e-6)
def psi_regression(theta):
return ee_ridge_regression(theta,
X=Xvals, y=yvals,
model='poisson', penalty=2.5, weights=None)
mestimator = MEstimator(psi_regression, init=[0., 0., 0.])
mestimator.estimate(solver='lm', tolerance=1e-12)
lgt = sm.GLM(yvals, Xvals, family=f).fit_regularized(L1_wt=0., alpha=2.5 / Xvals.shape[0])
# Checking mean estimate
npt.assert_allclose(mestimator.theta,
np.asarray(lgt.params),
atol=1e-6)
def psi_regression(theta):
return ee_ridge_regression(theta,
X=Xvals, y=yvals,
model='poisson', penalty=[0., 5., 2.5], weights=None)
mestimator = MEstimator(psi_regression, init=[0., 0., 0.])
mestimator.estimate(solver='lm', tolerance=1e-12)
lgt = sm.GLM(yvals, Xvals, family=f).fit_regularized(L1_wt=0., alpha=np.asarray([0., 5., 2.5]) / Xvals.shape[0])
# Checking mean estimate
npt.assert_allclose(mestimator.theta,
np.asarray(lgt.params),
atol=1e-6)
def test_ridge_wpoisson(self):
"""Tests weighted ridge Poisson regression by-hand with a single estimating equation.
"""
n = 1000
data = pd.DataFrame()
data['X'] = np.random.normal(size=n)
data['Z'] = np.random.normal(size=n)
data['Y'] = np.random.poisson(lam=np.exp(1 + 2*data['X'] - 1*data['Z']), size=n)
data['C'] = 1
Xvals = np.asarray(data[['C', 'X', 'Z']])
yvals = np.asarray(data['Y'])
weights = np.random.uniform(0.5, 2, size=n)
def psi_regression(theta):
return ee_ridge_regression(theta,
X=Xvals, y=yvals,
model='poisson', penalty=0.5, weights=weights)
mestimator = MEstimator(psi_regression, init=[0., 0., 0.])
mestimator.estimate(solver='lm', tolerance=1e-12)
f = sm.families.Poisson()
lgt = sm.GLM(yvals, Xvals, family=f, freq_weights=weights).fit_regularized(L1_wt=0., alpha=0.5 / Xvals.shape[0])
# Checking mean estimate
npt.assert_allclose(mestimator.theta,
np.asarray(lgt.params),
atol=1e-5)
def psi_regression(theta):
return ee_ridge_regression(theta,
X=Xvals, y=yvals,
model='poisson', penalty=2.5, weights=weights)
mestimator = MEstimator(psi_regression, init=[0., 0., 0.])
mestimator.estimate(solver='lm', tolerance=1e-12)
lgt = sm.GLM(yvals, Xvals, family=f, freq_weights=weights).fit_regularized(L1_wt=0., alpha=2.5 / Xvals.shape[0])
# Checking mean estimate
npt.assert_allclose(mestimator.theta,
np.asarray(lgt.params),
atol=2e-5)
def psi_regression(theta):
return ee_ridge_regression(theta,
X=Xvals, y=yvals,
model='poisson', penalty=[0., 5., 2.5], weights=weights)
mestimator = MEstimator(psi_regression, init=[0., 0., 0.])
mestimator.estimate(solver='lm', tolerance=1e-12)
lgt = sm.GLM(yvals, Xvals, family=f, freq_weights=weights).fit_regularized(L1_wt=0.,
alpha=np.asarray([0., 5., 2.5]
) / Xvals.shape[0])
# Checking mean estimate
npt.assert_allclose(mestimator.theta,
np.asarray(lgt.params),
atol=5e-4)
class TestEstimatingEquationsSurvival:
@pytest.fixture
def surv_data(self):
np.random.seed(1123211)
n = 200
d = pd.DataFrame()
d['C'] = np.random.weibull(a=1, size=n)
d['C'] = np.where(d['C'] > 5, 5, d['C'])
d['T'] = 0.8 * np.random.weibull(a=0.75, size=n)
d['delta'] = np.where(d['T'] < d['C'], 1, 0)
d['t'] = np.where(d['delta'] == 1, d['T'], d['C'])
return np.asarray(d['t']), np.asarray(d['delta'])
@pytest.fixture
def data(self):
np.random.seed(131313131)
n = 200
d = pd.DataFrame()
d['X'] = np.random.binomial(n=1, p=0.5, size=n)
d['W'] = np.random.binomial(n=1, p=0.5, size=n)
d['T'] = (1 / 1.25 + 1 / np.exp(0.5) * d['X']) * np.random.weibull(a=0.75, size=n)
d['C'] = np.random.weibull(a=1, size=n)
d['C'] = np.where(d['C'] > 10, 10, d['C'])
d['delta'] = np.where(d['T'] < d['C'], 1, 0)
d['t'] = np.where(d['delta'] == 1, d['T'], d['C'])
d['weight'] = np.random.uniform(1, 5, size=n)
return d
def test_exponential_model(self, surv_data):
"""Tests exponential model estimating equation to lifelines.
"""
times, events = surv_data
def psi(theta):
return ee_exponential_model(theta=theta[0],
t=times, delta=events)
mestimator = MEstimator(psi, init=[1.])
mestimator.estimate(solver="lm")
exf = ExponentialFitter()
exf.fit(times, events)
results = np.asarray(exf.summary[['coef', 'se(coef)', 'coef lower 95%', 'coef upper 95%']])
# Checking mean estimate
npt.assert_allclose(1 / mestimator.theta[0],
np.asarray(results[0, 0]),
atol=1e-5)
# No robust variance for lifeline's ExponentialFitter, so not checking against
# Checking variance estimates
# npt.assert_allclose(np.sqrt(np.diag(mestimator.variance)),
# np.asarray(results[0, 1]),
# atol=1e-6)
# Checking confidence interval estimates
# npt.assert_allclose(mestimator.confidence_intervals(),
# np.asarray(results[0, 2:]),
# atol=1e-5)
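For the censored exponential model there is also a closed form useful as an independent check: the MLE of the rate is events over total exposure, and its reciprocal is the scale that lifelines' `ExponentialFitter` reports as `coef`. A numpy-only sketch on synthetic censored data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
event_times = rng.exponential(scale=2.0, size=n)    # true event times
censor_times = rng.exponential(scale=4.0, size=n)   # censoring times

delta = (event_times < censor_times).astype(int)    # 1 = event observed
t = np.minimum(event_times, censor_times)           # observed follow-up

# Censored exponential MLE: rate = total events / total exposure time
rate_hat = delta.sum() / t.sum()
scale_hat = 1.0 / rate_hat                          # should be near 2.0 here

print(round(scale_hat, 3))
```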
    def test_exponential_survival(self, surv_data):
        """Tests the exponential measures estimating equation against lifelines."""
        times, events = surv_data

        def psi(theta):
            ee_exp = ee_exponential_model(theta=theta[0],
                                          t=times, delta=events)
            ee_surv = ee_exponential_measure(theta[1:], scale=theta[0],
                                             times=[0.5, 1, 2, 3], n=times.shape[0],
                                             measure="survival")
            return np.vstack((ee_exp, ee_surv))

        mestimator = MEstimator(psi, init=[1., 0.5, 0.5, 0.5, 0.5])
        mestimator.estimate(solver="lm")

        exf = ExponentialFitter()
        exf.fit(times, events)
        results = np.asarray(exf.survival_function_at_times(times=[0.5, 1, 2, 3]))

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta[1:],
                            results,
                            atol=1e-5)

    def test_exponential_risk(self, surv_data):
        """Tests the exponential measures estimating equation against lifelines."""
        times, events = surv_data

        def psi(theta):
            ee_exp = ee_exponential_model(theta=theta[0],
                                          t=times, delta=events)
            ee_surv = ee_exponential_measure(theta[1:], scale=theta[0],
                                             times=[0.5, 1, 2, 3], n=times.shape[0],
                                             measure="risk")
            return np.vstack((ee_exp, ee_surv))

        mestimator = MEstimator(psi, init=[1., 0.5, 0.5, 0.5, 0.5])
        mestimator.estimate(solver="lm")

        exf = ExponentialFitter()
        exf.fit(times, events)
        results = exf.cumulative_density_at_times(times=[0.5, 1, 2, 3])

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta[1:],
                            results,
                            atol=1e-5)

    def test_exponential_hazard(self, surv_data):
        """Tests the exponential measures estimating equation against lifelines."""
        times, events = surv_data

        def psi(theta):
            ee_exp = ee_exponential_model(theta=theta[0],
                                          t=times, delta=events)
            ee_surv = ee_exponential_measure(theta[1:], scale=theta[0],
                                             times=[0.5, 1, 2, 3], n=times.shape[0],
                                             measure="hazard")
            return np.vstack((ee_exp, ee_surv))

        mestimator = MEstimator(psi, init=[1., 0.5, 0.5, 0.5, 0.5])
        mestimator.estimate(solver="lm")

        exf = ExponentialFitter()
        exf.fit(times, events)
        results = np.asarray(exf.summary['coef'])[0]

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta[1:],
                            [1/results]*4,
                            atol=1e-5)

    def test_exponential_cumulative_hazard(self, surv_data):
        """Tests the exponential measures estimating equation against lifelines."""
        times, events = surv_data

        def psi(theta):
            ee_exp = ee_exponential_model(theta=theta[0],
                                          t=times, delta=events)
            ee_surv = ee_exponential_measure(theta[1:], scale=theta[0],
                                             times=[0.5, 1, 2, 3], n=times.shape[0],
                                             measure="cumulative_hazard")
            return np.vstack((ee_exp, ee_surv))

        mestimator = MEstimator(psi, init=[1., 0.5, 0.5, 0.5, 0.5])
        mestimator.estimate(solver="lm")

        exf = ExponentialFitter()
        exf.fit(times, events)
        results = exf.cumulative_hazard_at_times(times=[0.5, 1, 2, 3])

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta[1:],
                            results,
                            atol=1e-5)

    def test_exponential_density(self, surv_data):
        """Tests the exponential measures estimating equation against lifelines."""
        times, events = surv_data

        def psi(theta):
            ee_exp = ee_exponential_model(theta=theta[0],
                                          t=times, delta=events)
            ee_surv = ee_exponential_measure(theta[1:], scale=theta[0],
                                             times=[0.5, 1, 2, 3], n=times.shape[0],
                                             measure="density")
            return np.vstack((ee_exp, ee_surv))

        mestimator = MEstimator(psi, init=[1., 0.5, 0.5, 0.5, 0.5])
        mestimator.estimate(solver="lm")

        # NOTICE: lifelines fails here (some problem with the derivative), so the
        # comparison is skipped; the density measure is still covered by the
        # Weibull density prediction (so not a testing coverage problem)
        # exf = ExponentialFitter()
        # exf.fit(times, events)
        # results = exf.density_at_times(times=[0.5, 1, 2, 3])
        #
        # # Checking mean estimate
        # npt.assert_allclose(mestimator.theta[1:],
        #                     results,
        #                     atol=1e-5)
        pass
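Since the lifelines comparison is skipped above, the exponential density can still be sanity-checked on its own: the density equals hazard times survival, `f(t) = lam * exp(-lam * t)`, which must also match a numerical derivative of the risk. A minimal standalone sketch (the `lam` and `times` values here are illustrative assumptions, not the fixture's estimates):

```python
import numpy as np

# Hedged sketch (not delicatessen's or lifelines' code): for the exponential
# model the density is f(t) = hazard * survival = lam * exp(-lam * t), which
# a central-difference derivative of the risk F(t) = 1 - exp(-lam * t)
# should reproduce.
lam = 1.2                                   # assumed rate estimate
times = np.array([0.5, 1.0, 2.0, 3.0])
density = lam * np.exp(-lam * times)

eps = 1e-6
risk = lambda t: 1 - np.exp(-lam * t)
numeric_density = (risk(times + eps) - risk(times - eps)) / (2 * eps)
assert np.allclose(density, numeric_density, atol=1e-6)
```

This mirrors what a lifelines `density_at_times` comparison would verify, without depending on lifelines' internal derivative.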
    def test_weibull_model(self, surv_data):
        """Tests the Weibull model estimating equation against lifelines."""
        times, events = surv_data

        def psi(theta):
            return ee_weibull_model(theta=theta,
                                    t=times, delta=events)

        mestimator = MEstimator(psi, init=[1., 1.])
        mestimator.estimate(solver="lm")

        wbf = WeibullFitter()
        wbf.fit(times, events)
        results = np.asarray(wbf.summary[['coef', 'se(coef)', 'coef lower 95%', 'coef upper 95%']])

        # Checking mean estimate
        npt.assert_allclose([(1 / mestimator.theta[0])**(1/mestimator.theta[1]), mestimator.theta[1]],
                            np.asarray(results[:, 0]),
                            atol=1e-4)

        # No robust variance for lifelines' WeibullFitter, so not checking against it
        # Checking variance estimates
        # npt.assert_allclose(np.sqrt(np.diag(mestimator.variance)),
        #                     np.asarray(results[0, 1]),
        #                     atol=1e-6)
        # Checking confidence interval estimates
        # npt.assert_allclose(mestimator.confidence_intervals(),
        #                     np.asarray(results[0, 2:]),
        #                     atol=1e-5)

    def test_weibull_survival(self, surv_data):
        """Tests the Weibull measures estimating equation against lifelines."""
        times, events = surv_data

        def psi(theta):
            ee_wbl = ee_weibull_model(theta=theta[0:2],
                                      t=times, delta=events)
            ee_surv = ee_weibull_measure(theta[2:], scale=theta[0], shape=theta[1],
                                         times=[0.5, 1, 2, 3], n=times.shape[0],
                                         measure="survival")
            return np.vstack((ee_wbl, ee_surv))

        mestimator = MEstimator(psi, init=[1., 1., 0.5, 0.5, 0.5, 0.5])
        mestimator.estimate(solver="lm")

        wbf = WeibullFitter()
        wbf.fit(times, events)
        results = np.asarray(wbf.survival_function_at_times(times=[0.5, 1, 2, 3]))

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta[2:],
                            results,
                            atol=1e-5)

    def test_weibull_risk(self, surv_data):
        """Tests the Weibull measures estimating equation against lifelines."""
        times, events = surv_data

        def psi(theta):
            ee_wbl = ee_weibull_model(theta=theta[0:2],
                                      t=times, delta=events)
            ee_surv = ee_weibull_measure(theta[2:], scale=theta[0], shape=theta[1],
                                         times=[0.5, 1, 2, 3], n=times.shape[0],
                                         measure="risk")
            return np.vstack((ee_wbl, ee_surv))

        mestimator = MEstimator(psi, init=[1., 1., 0.5, 0.5, 0.5, 0.5])
        mestimator.estimate(solver="lm")

        wbf = WeibullFitter()
        wbf.fit(times, events)
        results = np.asarray(wbf.cumulative_density_at_times(times=[0.5, 1, 2, 3]))

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta[2:],
                            results,
                            atol=1e-5)

    def test_weibull_hazard(self, surv_data):
        """Tests the Weibull measures estimating equation against lifelines."""
        times, events = surv_data

        def psi(theta):
            ee_wbl = ee_weibull_model(theta=theta[0:2],
                                      t=times, delta=events)
            ee_surv = ee_weibull_measure(theta[2:], scale=theta[0], shape=theta[1],
                                         times=[0.5, 1, 2, 3], n=times.shape[0],
                                         measure="hazard")
            return np.vstack((ee_wbl, ee_surv))

        mestimator = MEstimator(psi, init=[1., 1., 0.5, 0.5, 0.5, 0.5])
        mestimator.estimate(solver="lm")

        wbf = WeibullFitter()
        wbf.fit(times, events)
        results = np.asarray(wbf.hazard_at_times(times=[0.5, 1, 2, 3]))

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta[2:],
                            results,
                            atol=1e-4)

    def test_weibull_cumulative_hazard(self, surv_data):
        """Tests the Weibull measures estimating equation against lifelines."""
        times, events = surv_data

        def psi(theta):
            ee_wbl = ee_weibull_model(theta=theta[0:2],
                                      t=times, delta=events)
            ee_surv = ee_weibull_measure(theta[2:], scale=theta[0], shape=theta[1],
                                         times=[0.5, 1, 2, 3], n=times.shape[0],
                                         measure="cumulative_hazard")
            return np.vstack((ee_wbl, ee_surv))

        mestimator = MEstimator(psi, init=[1., 1., 0.5, 0.5, 0.5, 0.5])
        mestimator.estimate(solver="lm")

        wbf = WeibullFitter()
        wbf.fit(times, events)
        results = np.asarray(wbf.cumulative_hazard_at_times(times=[0.5, 1, 2, 3]))

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta[2:],
                            results,
                            atol=1e-4)

    def test_weibull_density(self, surv_data):
        """Tests the Weibull measures estimating equation against lifelines."""
        times, events = surv_data

        def psi(theta):
            ee_wbl = ee_weibull_model(theta=theta[0:2],
                                      t=times, delta=events)
            ee_surv = ee_weibull_measure(theta[2:], scale=theta[0], shape=theta[1],
                                         times=[0.5, 1, 2, 3], n=times.shape[0],
                                         measure="density")
            return np.vstack((ee_wbl, ee_surv))

        mestimator = MEstimator(psi, init=[1., 1., 0.5, 0.5, 0.5, 0.5])
        mestimator.estimate(solver="lm")

        wbf = WeibullFitter()
        wbf.fit(times, events)
        results = np.asarray(wbf.density_at_times(times=[0.5, 1, 2, 3]))

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta[2:],
                            results,
                            atol=1e-5)
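The Weibull model test above asserts `(1 / theta[0]) ** (1 / theta[1])` against lifelines' reported scale, which encodes a reparameterization between the two libraries. A hedged sketch of that mapping (the parameter values are assumed for illustration, and the `S(t) = exp(-lam * t**gamma)` form is an assumption about the rate-style parameterization, not delicatessen's documented API):

```python
import numpy as np

# If theta = (lam, gamma) parameterizes survival as S(t) = exp(-lam * t**gamma),
# while lifelines' WeibullFitter reports (lambda_, rho_) with
# S(t) = exp(-(t / lambda_) ** rho_), then matching the two forms gives
# rho_ = gamma and lambda_ = (1 / lam) ** (1 / gamma) -- the quantity the
# assertion computes.
lam, gamma = 0.8, 1.5                        # assumed estimates
lifelines_lambda = (1 / lam) ** (1 / gamma)

t = 2.0
s_rate_form = np.exp(-lam * t ** gamma)
s_lifelines_form = np.exp(-(t / lifelines_lambda) ** gamma)
assert np.isclose(s_rate_form, s_lifelines_form)
```

Both parameterizations give identical survival curves, so comparing transformed point estimates is valid.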
    def test_weibull_aft(self, data):
        """Tests the Weibull AFT estimating equation against lifelines."""
        def psi(theta):
            return ee_aft_weibull(theta=theta,
                                  t=data['t'], delta=data['delta'], X=data[['X', 'W']])

        # M-estimator with built-in Weibull AFT
        mestimator = MEstimator(psi, init=[-.5, 0.7, 0., 0.])
        mestimator.estimate(solver="lm")

        # Weibull AFT from lifelines for comparison
        waft = WeibullAFTFitter()
        waft.fit(data[['X', 'W', 't', 'delta']], 't', 'delta',
                 ancillary=False, robust=True)
        results = np.asarray(waft.summary[['coef', 'se(coef)', 'coef lower 95%', 'coef upper 95%']])
        results = results[[2, 1, 0, 3], :]

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta,
                            np.asarray(results[:, 0]),
                            atol=1e-5)

        # Checking variance estimates
        npt.assert_allclose(np.sqrt(np.diag(mestimator.variance)),
                            np.asarray(results[:, 1]),
                            atol=1e-6)

        # Checking confidence interval estimates
        npt.assert_allclose(mestimator.confidence_intervals(),
                            np.asarray(results[:, 2:]),
                            atol=1e-5)

    def test_weighted_weibull_aft(self, data):
        """Tests the weighted Weibull AFT estimating equation against lifelines."""
        def psi(theta):
            return ee_aft_weibull(theta=theta, weights=data['weight'],
                                  t=data['t'], delta=data['delta'], X=data[['X', 'W']])

        # M-estimator with built-in Weibull AFT
        mestimator = MEstimator(psi, init=[0., 0., 0., 0.])
        mestimator.estimate(solver="lm")

        # Weibull AFT from lifelines for comparison
        waft = WeibullAFTFitter()
        waft.fit(data[['X', 'W', 't', 'delta', 'weight']], 't', 'delta',
                 weights_col='weight', ancillary=False, robust=True)
        results = np.asarray(waft.summary[['coef', 'se(coef)', 'coef lower 95%', 'coef upper 95%']])
        results = results[[2, 1, 0, 3], :]

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta,
                            np.asarray(results[:, 0]),
                            atol=1e-5)

        # No variance check, since lifelines uses a different estimator
    def test_weibull_aft_survival(self, data):
        """Tests predicted survival at several time points for the Weibull AFT estimating equation against lifelines."""
        # Times to evaluate and covariate pattern to examine
        times_to_eval = [1, 1.25, 3, 5]
        dta = data.copy()
        dta['X'] = 1
        dta['W'] = 1

        def psi(theta):
            aft = ee_aft_weibull(theta=theta[0:4], t=data['t'], delta=data['delta'], X=data[['X', 'W']])
            pred_surv_t = ee_aft_weibull_measure(theta=theta[4:], X=dta[['X', 'W']],
                                                 times=times_to_eval, measure='survival',
                                                 mu=theta[0], beta=theta[1:3], sigma=theta[3])
            return np.vstack((aft, pred_surv_t))

        # M-estimator with built-in Weibull AFT
        mestimator = MEstimator(psi, init=[-.5, 0.7, 0., -.2, ] + [0.5, ]*len(times_to_eval))
        mestimator.estimate(solver="lm")

        # Predictions from Weibull AFT from lifelines for comparison
        waft = WeibullAFTFitter()
        waft.fit(data[['X', 'W', 't', 'delta']], 't', 'delta', ancillary=False, robust=True)
        preds = waft.predict_survival_function(dta.iloc[0], times=times_to_eval)

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta[4:],
                            np.asarray(preds).T[0],
                            atol=1e-5)

    def test_weibull_aft_risk(self, data):
        """Tests predicted risk at several time points for the Weibull AFT estimating equation against lifelines."""
        # Times to evaluate and covariate pattern to examine
        times_to_eval = [1, 1.25, 3, 5]
        dta = data.copy()
        dta['X'] = 1
        dta['W'] = 1

        def psi(theta):
            aft = ee_aft_weibull(theta=theta[0:4], t=data['t'], delta=data['delta'], X=data[['X', 'W']])
            pred_surv_t = ee_aft_weibull_measure(theta=theta[4:], X=dta[['X', 'W']],
                                                 times=times_to_eval, measure='risk',
                                                 mu=theta[0], beta=theta[1:3], sigma=theta[3])
            return np.vstack((aft, pred_surv_t))

        # M-estimator with built-in Weibull AFT
        mestimator = MEstimator(psi, init=[-.5, 0.7, 0., -.2, ] + [0.5, ]*len(times_to_eval))
        mestimator.estimate(solver="lm")

        # Predictions from Weibull AFT from lifelines for comparison
        waft = WeibullAFTFitter()
        waft.fit(data[['X', 'W', 't', 'delta']], 't', 'delta', ancillary=False, robust=True)
        preds = 1 - waft.predict_survival_function(dta.iloc[0], times=times_to_eval)

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta[4:],
                            np.asarray(preds).T[0],
                            atol=1e-5)

    def test_weibull_aft_density(self, data):
        """Tests predicted density at several time points for the Weibull AFT estimating equation against lifelines."""
        # Times to evaluate and covariate pattern to examine
        times_to_eval = [1, 1.25, 3, 5]
        dta = data.copy()
        dta['X'] = 1
        dta['W'] = 1

        def psi(theta):
            aft = ee_aft_weibull(theta=theta[0:4], t=data['t'], delta=data['delta'], X=data[['X', 'W']])
            pred_surv_t = ee_aft_weibull_measure(theta=theta[4:], X=dta[['X', 'W']],
                                                 times=times_to_eval, measure='density',
                                                 mu=theta[0], beta=theta[1:3], sigma=theta[3])
            return np.vstack((aft, pred_surv_t))

        # M-estimator with built-in Weibull AFT
        mestimator = MEstimator(psi, init=[-.5, 0.7, 0., -.2, ] + [0.5, ]*len(times_to_eval))
        mestimator.estimate(solver="lm")

        # Predictions from Weibull AFT from lifelines for comparison
        waft = WeibullAFTFitter()
        waft.fit(data[['X', 'W', 't', 'delta']], 't', 'delta', ancillary=False, robust=True)
        preds = (waft.predict_survival_function(dta.iloc[0], times=times_to_eval)
                 * waft.predict_hazard(dta.iloc[0], times=times_to_eval))

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta[4:],
                            np.asarray(preds).T[0],
                            atol=1e-5)

    def test_weibull_aft_hazard(self, data):
        """Tests predicted hazard at several time points for the Weibull AFT estimating equation against lifelines."""
        # Times to evaluate and covariate pattern to examine
        times_to_eval = [1, 1.25, 3, 5]
        dta = data.copy()
        dta['X'] = 1
        dta['W'] = 1

        def psi(theta):
            aft = ee_aft_weibull(theta=theta[0:4], t=data['t'], delta=data['delta'], X=data[['X', 'W']])
            pred_surv_t = ee_aft_weibull_measure(theta=theta[4:], X=dta[['X', 'W']],
                                                 times=times_to_eval, measure='hazard',
                                                 mu=theta[0], beta=theta[1:3], sigma=theta[3])
            return np.vstack((aft, pred_surv_t))

        # M-estimator with built-in Weibull AFT
        mestimator = MEstimator(psi, init=[-.5, 0.7, 0., -.2, ] + [0.5, ]*len(times_to_eval))
        mestimator.estimate(solver="lm")

        # Predictions from Weibull AFT from lifelines for comparison
        waft = WeibullAFTFitter()
        waft.fit(data[['X', 'W', 't', 'delta']], 't', 'delta', ancillary=False, robust=True)
        preds = waft.predict_hazard(dta.iloc[0], times=times_to_eval)

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta[4:],
                            np.asarray(preds).T[0],
                            atol=1e-5)

    def test_weibull_aft_cumulative_hazard(self, data):
        """Tests predicted cumulative hazard at several time points for the Weibull AFT estimating equation against lifelines."""
        # Times to evaluate and covariate pattern to examine
        times_to_eval = [1, 1.25, 3, 5]
        dta = data.copy()
        dta['X'] = 1
        dta['W'] = 1

        def psi(theta):
            aft = ee_aft_weibull(theta=theta[0:4], t=data['t'], delta=data['delta'], X=data[['X', 'W']])
            pred_surv_t = ee_aft_weibull_measure(theta=theta[4:], X=dta[['X', 'W']],
                                                 times=times_to_eval, measure='cumulative_hazard',
                                                 mu=theta[0], beta=theta[1:3], sigma=theta[3])
            return np.vstack((aft, pred_surv_t))

        # M-estimator with built-in Weibull AFT
        mestimator = MEstimator(psi, init=[-.5, 0.7, 0., -.2, ] + [0.5, ]*len(times_to_eval))
        mestimator.estimate(solver="lm")

        # Predictions from Weibull AFT from lifelines for comparison
        waft = WeibullAFTFitter()
        waft.fit(data[['X', 'W', 't', 'delta']], 't', 'delta', ancillary=False, robust=True)
        preds = waft.predict_cumulative_hazard(dta.iloc[0], times=times_to_eval)

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta[4:],
                            np.asarray(preds).T[0],
                            atol=1e-4)
class TestEstimatingEquationsDoseResponse:

    def test_4pl(self):
        """Test the 4 parameter log-logistic model using Inderjit et al. (2002).

        Compares against R's drc library:

            library(drc)
            library(sandwich)
            library(lmtest)

            data(ryegrass)
            rgll4 = drm(rootl ~ conc, data=ryegrass, fct=LL.4())
            coeftest(rgll4, vcov=sandwich)
        """
        d = load_inderjit()
        dose_data = d[:, 1]
        resp_data = d[:, 0]

        def psi(theta):
            return ee_4p_logistic(theta=theta, X=dose_data, y=resp_data)

        # Optimization procedure
        mestimator = MEstimator(psi, init=[0, 2, 1, 10])
        mestimator.estimate(solver='lm')

        # R optimization from Ritz et al.
        comparison_theta = np.asarray([0.48141, 3.05795, 2.98222, 7.79296])
        comparison_var = np.asarray([0.12779, 0.26741, 0.47438, 0.15311])

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta,
                            comparison_theta,
                            atol=1e-5)

        # Checking variance estimate
        npt.assert_allclose(np.diag(mestimator.variance)**0.5,
                            comparison_var,
                            atol=1e-4)

    def test_3pl(self):
        """Test the 3 parameter log-logistic model using Inderjit et al. (2002).

        Compares against R's drc library:

            library(drc)
            library(sandwich)
            library(lmtest)

            data(ryegrass)
            rgll3 = drm(rootl ~ conc, data=ryegrass, fct=LL.3())
            coeftest(rgll3, vcov=sandwich)
        """
        d = load_inderjit()
        dose_data = d[:, 1]
        resp_data = d[:, 0]

        def psi(theta):
            return ee_3p_logistic(theta=theta, X=dose_data, y=resp_data,
                                  lower=0)

        # Optimization procedure
        mestimator = MEstimator(psi, init=[2, 1, 10])
        mestimator.estimate(solver='lm')

        # R optimization from Ritz et al.
        comparison_theta = np.asarray([3.26336, 2.47033, 7.85543])
        comparison_var = np.asarray([0.26572, 0.29238, 0.15397])

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta,
                            comparison_theta,
                            atol=1e-5)

        # Checking variance estimate
        npt.assert_allclose(np.diag(mestimator.variance)**0.5,
                            comparison_var,
                            atol=1e-5)

    def test_2pl(self):
        """Test the 2 parameter log-logistic model using Inderjit et al. (2002).

        Compares against R's drc library:

            library(drc)
            library(sandwich)
            library(lmtest)

            data(ryegrass)
            rgll2 = drm(rootl ~ conc, data=ryegrass, fct=LL.2(upper=8))
            coeftest(rgll2, vcov=sandwich)
        """
        d = load_inderjit()
        dose_data = d[:, 1]
        resp_data = d[:, 0]

        def psi(theta):
            return ee_2p_logistic(theta=theta, X=dose_data, y=resp_data,
                                  lower=0, upper=8)

        # Optimization procedure
        mestimator = MEstimator(psi, init=[2, 1])
        mestimator.estimate(solver='lm')

        # R optimization from Ritz et al.
        comparison_theta = np.asarray([3.19946, 2.38220])
        comparison_var = np.asarray([0.24290, 0.27937])

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta,
                            comparison_theta,
                            atol=1e-5)

        # Checking variance estimate
        npt.assert_allclose(np.diag(mestimator.variance)**0.5,
                            comparison_var,
                            atol=1e-5)

    def test_3pl_ed_delta(self):
        """Test the ED(alpha) calculation with the 3 parameter log-logistic model using Inderjit et al. (2002).

        Compares against R's drc library:

            library(drc)
            library(sandwich)

            data(ryegrass)
            rgll3 = drm(rootl ~ conc, data=ryegrass, fct=LL.3())
            ED(rgll3, c(5, 10, 50), interval='delta', vcov=sandwich)
        """
        d = load_inderjit()
        dose_data = d[:, 1]
        resp_data = d[:, 0]

        def psi(theta):
            lower_limit = 0
            pl3 = ee_3p_logistic(theta=theta, X=dose_data, y=resp_data,
                                 lower=lower_limit)
            ed05 = ee_effective_dose_delta(theta[3], y=resp_data, delta=0.05,
                                           steepness=theta[0], ed50=theta[1],
                                           lower=lower_limit, upper=theta[2])
            ed10 = ee_effective_dose_delta(theta[4], y=resp_data, delta=0.10,
                                           steepness=theta[0], ed50=theta[1],
                                           lower=lower_limit, upper=theta[2])
            ed50 = ee_effective_dose_delta(theta[5], y=resp_data, delta=0.50,
                                           steepness=theta[0], ed50=theta[1],
                                           lower=lower_limit, upper=theta[2])
            return np.vstack((pl3,
                              ed05,
                              ed10,
                              ed50))

        # Optimization procedure
        mestimator = MEstimator(psi, init=[2, 1, 10, 1, 1, 2])
        mestimator.estimate(solver='lm')

        # R optimization from Ritz et al.
        comparison_theta = np.asarray([0.99088, 1.34086, 3.26336])
        comparison_var = np.asarray([0.12397, 0.13134, 0.26572])

        # Checking mean estimate
        npt.assert_allclose(mestimator.theta[-3:],
                            comparison_theta,
                            atol=1e-5)

        # Checking variance estimate
        npt.assert_allclose(np.diag(mestimator.variance)[-3:]**0.5,
                            comparison_var,
                            atol=1e-5)
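The ED(alpha) tests above estimate the dose at which the response drops by a given fraction. For a log-logistic curve with the lower limit fixed at 0, the effective dose has a simple closed form; a hedged sketch (the `f(x) = upper / (1 + (x/ed50)**steepness)` form and the numeric values are assumptions chosen to roughly match the drc output, not delicatessen's internal parameterization):

```python
import numpy as np

# Assumed 3-parameter log-logistic response with lower limit 0:
#   f(x) = upper / (1 + (x / ed50) ** steepness)
def ll3(x, steepness, ed50, upper):
    return upper / (1 + (x / ed50) ** steepness)

steepness, ed50, upper = 2.47, 3.26, 7.86   # illustrative estimates
delta = 0.50

# Solving f(ED) = (1 - delta) * upper gives
#   ED = ed50 * (delta / (1 - delta)) ** (1 / steepness),
# so ED(0.50) collapses to ed50 itself.
ed = ed50 * (delta / (1 - delta)) ** (1 / steepness)
assert np.isclose(ll3(ed, steepness, ed50, upper), (1 - delta) * upper)
assert np.isclose(ed, ed50)
```

This explains why the ED50 estimate in the comparison vector coincides with the fitted ed50 parameter from the 3PL test.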
class TestEstimatingEquationsCausal:

    @pytest.fixture
    def causal_data(self):
        np.random.seed(1205811)
        n = 1000
        df = pd.DataFrame()
        # Covariates
        df['W'] = np.random.binomial(1, p=0.5, size=n)
        df['A'] = np.random.binomial(1, p=(0.25 + 0.5 * df['W']), size=n)
        df['C'] = 1
        # Potential outcomes
        df['Ya0'] = np.random.binomial(1, p=(0.75 - 0.5 * df['W']), size=n)
        df['Ya1'] = np.random.binomial(1, p=(0.75 - 0.5 * df['W'] - 0.1 * 1), size=n)
        # Applying causal consistency
        df['Y'] = (1 - df['A']) * df['Ya0'] + df['A'] * df['Ya1']
        return df

    def test_gformula(self, causal_data):
        d1 = causal_data.copy()
        d1['A'] = 1
        d0 = causal_data.copy()
        d0['A'] = 0

        # M-estimation
        def psi(theta):
            return ee_gformula(theta,
                               y=causal_data['Y'],
                               X=causal_data[['C', 'A', 'W']],
                               X1=d1[['C', 'A', 'W']],
                               X0=d0[['C', 'A', 'W']])

        mestimator = MEstimator(psi, init=[0., 0.5, 0.5, 0., 0., 0.])
        mestimator.estimate(solver='lm')

        # By-hand g-formula with statsmodels
        glm = sm.GLM(causal_data['Y'], causal_data[['C', 'A', 'W']],
                     family=sm.families.Binomial()).fit()
        cd = causal_data[['C', 'A', 'W']].copy()
        cd['A'] = 1
        ya1 = glm.predict(cd)
        cd['A'] = 0
        ya0 = glm.predict(cd)

        # Checking logistic coefficients (nuisance model estimates)
        npt.assert_allclose(mestimator.theta[3:],
                            np.asarray(glm.params),
                            atol=1e-6)

        # Checking mean estimates
        npt.assert_allclose(mestimator.theta[0],
                            np.mean(ya1) - np.mean(ya0),
                            atol=1e-6)
        npt.assert_allclose(mestimator.theta[1],
                            np.mean(ya1),
                            atol=1e-6)
        npt.assert_allclose(mestimator.theta[2],
                            np.mean(ya0),
                            atol=1e-6)

    def test_gcomp_bad_dimensions_error(self, causal_data):
        d1 = causal_data.copy()
        d1['A'] = 1
        d0 = causal_data.copy()
        d0['A'] = 0

        # M-estimation
        def psi(theta):
            return ee_gformula(theta,
                               y=causal_data['Y'],
                               X=causal_data[['C', 'A', 'W']],
                               X1=d1[['C', 'W']])

        mestimator = MEstimator(psi, init=[0.5, 0., 0., 0.])
        with pytest.raises(ValueError, match="The dimensions of X and X1"):
            mestimator.estimate(solver='lm')

        def psi(theta):
            return ee_gformula(theta,
                               y=causal_data['Y'],
                               X=causal_data[['C', 'A', 'W']],
                               X1=d1[['C', 'A', 'W']],
                               X0=d0[['C', 'A']])

        mestimator = MEstimator(psi, init=[0., 0.5, 0.5, 0., 0., 0.])
        with pytest.raises(ValueError, match="The dimensions of X and X0"):
            mestimator.estimate(solver='lm')
    def test_ipw(self, causal_data):
        # M-estimation
        def psi(theta):
            return ee_ipw(theta,
                          y=causal_data['Y'],
                          A=causal_data['A'],
                          W=causal_data[['C', 'W']])

        mestimator = MEstimator(psi, init=[0., 0.5, 0.5, 0., 0.])
        mestimator.estimate(solver='lm')

        # By-hand IPW estimator with statsmodels
        glm = sm.GLM(causal_data['A'], causal_data[['C', 'W']],
                     family=sm.families.Binomial()).fit()
        pi = glm.predict()
        ya1 = causal_data['A'] * causal_data['Y'] / pi
        ya0 = (1-causal_data['A']) * causal_data['Y'] / (1-pi)

        # Checking logistic coefficients (nuisance model estimates)
        npt.assert_allclose(mestimator.theta[3:],
                            np.asarray(glm.params),
                            atol=1e-6)

        # Checking mean estimates
        npt.assert_allclose(mestimator.theta[0],
                            np.mean(ya1) - np.mean(ya0),
                            atol=1e-6)
        npt.assert_allclose(mestimator.theta[1],
                            np.mean(ya1),
                            atol=1e-6)
        npt.assert_allclose(mestimator.theta[2],
                            np.mean(ya0),
                            atol=1e-6)

    def test_ipw_truncate(self, causal_data):
        # M-estimation
        def psi(theta):
            return ee_ipw(theta,
                          y=causal_data['Y'],
                          A=causal_data['A'],
                          W=causal_data[['C', 'W']],
                          truncate=(0.1, 0.5))

        mestimator = MEstimator(psi, init=[0., 0.5, 0.5, 0., 0.])
        mestimator.estimate(solver='lm')

        # By-hand IPW estimator with statsmodels
        glm = sm.GLM(causal_data['A'], causal_data[['C', 'W']],
                     family=sm.families.Binomial()).fit()
        pi = glm.predict()
        pi = np.clip(pi, 0.1, 0.5)
        ya1 = causal_data['A'] * causal_data['Y'] / pi
        ya0 = (1-causal_data['A']) * causal_data['Y'] / (1-pi)

        # Checking logistic coefficients (nuisance model estimates)
        npt.assert_allclose(mestimator.theta[3:],
                            np.asarray(glm.params),
                            atol=1e-6)

        # Checking mean estimates
        npt.assert_allclose(mestimator.theta[0],
                            np.mean(ya1) - np.mean(ya0),
                            atol=1e-6)
        npt.assert_allclose(mestimator.theta[1],
                            np.mean(ya1),
                            atol=1e-6)
        npt.assert_allclose(mestimator.theta[2],
                            np.mean(ya0),
                            atol=1e-6)

    def test_ipw_truncate_error(self, causal_data):
        # M-estimation
        def psi(theta):
            return ee_ipw(theta,
                          y=causal_data['Y'],
                          A=causal_data['A'],
                          W=causal_data[['C', 'W']],
                          truncate=(0.99, 0.01))

        mestimator = MEstimator(psi, init=[0., 0.5, 0.5, 0., 0.])
        with pytest.raises(ValueError, match="truncate values"):
            mestimator.estimate()
    def test_aipw(self, causal_data):
        d1 = causal_data.copy()
        d1['A'] = 1
        d0 = causal_data.copy()
        d0['A'] = 0

        # M-estimation
        def psi_builtin_regression(theta):
            return ee_aipw(theta,
                           y=causal_data['Y'],
                           A=causal_data['A'],
                           W=causal_data[['C', 'W']],
                           X=causal_data[['C', 'A', 'W']],
                           X1=d1[['C', 'A', 'W']],
                           X0=d0[['C', 'A', 'W']])

        mestimator = MEstimator(psi_builtin_regression, init=[0., 0.5, 0.5,    # Parameters of interest
                                                              0., 0.,          # Treatment nuisance model
                                                              0., 0., 0.])     # Outcome nuisance model
        mestimator.estimate(solver='lm', tolerance=1e-12)

        # By-hand AIPW estimator with statsmodels
        pi_m = sm.GLM(causal_data['A'], causal_data[['C', 'W']],
                      family=sm.families.Binomial()).fit()
        y_m = sm.GLM(causal_data['Y'], causal_data[['C', 'A', 'W']],
                     family=sm.families.Binomial()).fit()

        # Generating predictions from the nuisance models
        pi = pi_m.predict()
        cd = causal_data[['C', 'A', 'W']].copy()
        cd['A'] = 1
        ya1 = y_m.predict(cd)
        cd['A'] = 0
        ya0 = y_m.predict(cd)

        # AIPW estimator
        ya1_star = causal_data['Y'] * causal_data['A'] / pi - ya1 * (causal_data['A'] - pi) / pi
        ya0_star = causal_data['Y'] * (1-causal_data['A']) / (1-pi) - ya0 * (pi - causal_data['A']) / (1-pi)

        # AIPW variance estimator
        var_ate = np.nanvar((ya1_star - ya0_star) - np.mean(ya1_star - ya0_star), ddof=1) / causal_data.shape[0]
        var_r1 = np.nanvar(ya1_star - np.mean(ya1_star), ddof=1) / causal_data.shape[0]
        var_r0 = np.nanvar(ya0_star - np.mean(ya0_star), ddof=1) / causal_data.shape[0]

        # Checking logistic coefficients (nuisance model estimates)
        npt.assert_allclose(mestimator.theta[3:5],
                            np.asarray(pi_m.params),
                            atol=1e-6)
        npt.assert_allclose(mestimator.theta[5:],
                            np.asarray(y_m.params),
                            atol=1e-6)

        # Checking mean estimates
        npt.assert_allclose(mestimator.theta[0],
                            np.mean(ya1_star) - np.mean(ya0_star),
                            atol=1e-6)
        npt.assert_allclose(mestimator.theta[1],
                            np.mean(ya1_star),
                            atol=1e-6)
        npt.assert_allclose(mestimator.theta[2],
                            np.mean(ya0_star),
                            atol=1e-6)

        # Checking variance estimates
        npt.assert_allclose(mestimator.variance[0, 0],
                            var_ate,
                            atol=1e-6)
        npt.assert_allclose(mestimator.variance[1, 1],
                            var_r1,
                            atol=1e-6)
        npt.assert_allclose(mestimator.variance[2, 2],
                            var_r0,
                            atol=1e-6)
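The by-hand variance checks in test_aipw rest on a standard identity: for a mean-type parameter whose estimating function is `psi(y; theta) = y - theta`, the sandwich variance reduces to the variance of the centered pseudo-outcomes divided by n. A minimal sketch under that assumption (the pseudo-outcomes here are simulated for illustration, not the AIPW quantities above):

```python
import numpy as np

# For psi(y; theta) = y - theta, the "bread" (-d psi / d theta averaged) is 1
# and the "meat" is the mean squared centered pseudo-outcome, so the sandwich
# variance of the mean equals var(pseudo-outcomes) / n.
rng = np.random.default_rng(0)
ya_star = rng.normal(loc=0.3, scale=1.0, size=500)   # assumed pseudo-outcomes
n = ya_star.shape[0]

bread = 1.0
meat = np.mean((ya_star - np.mean(ya_star)) ** 2)
sandwich = meat / (bread ** 2) / n
by_hand = np.var(ya_star - np.mean(ya_star), ddof=0) / n
assert np.isclose(sandwich, by_hand)
```

Up to the ddof=1 finite-sample correction used in the test, this is exactly the influence-function variance being compared against `mestimator.variance`.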
# ---------------------------------------------------------------------------
# File: tests/unit/test_retcodes_pass_pass.py
# Repo: ldav1s/pepper (Apache-2.0)
# ---------------------------------------------------------------------------
# -*- coding: utf-8 -*-
# Import Python Libraries
from __future__ import print_function, unicode_literals, absolute_import
import sys

# Import Pepper Libraries
import pepper
from mock import patch, MagicMock

PAYLOAD = {
    "return": [
        {
            "ezh.msk.ru": {
                "jid": "20180414193904158892",
                "ret": "pass",
                "retcode": 0
            },
            "saltstack.ezh.msk.ru": {
                "jid": "20180414193904158892",
                "ret": "pass",
                "retcode": 0
            }
        }
    ]
}


@patch('pepper.cli.PepperCli.login', MagicMock(side_effect=lambda arg: None))
@patch('pepper.cli.PepperCli.low', MagicMock(side_effect=lambda api, load: PAYLOAD))
def test_default():
    sys.argv = ['pepper', 'minion_id', 'request']
    ret_code = pepper.script.Pepper()()
    assert ret_code == 0


@patch('pepper.cli.PepperCli.login', MagicMock(side_effect=lambda arg: None))
@patch('pepper.cli.PepperCli.low', MagicMock(side_effect=lambda api, load: PAYLOAD))
def test_fail_any():
    sys.argv = ['pepper', '--fail-any', 'minion_id', 'request']
    ret_code = pepper.script.Pepper()()
    assert ret_code == 0


@patch('pepper.cli.PepperCli.login', MagicMock(side_effect=lambda arg: None))
@patch('pepper.cli.PepperCli.low', MagicMock(side_effect=lambda api, load: PAYLOAD))
def test_fail_any_none():
    sys.argv = ['pepper', '--fail-any-none', 'minion_id', 'request']
    ret_code = pepper.script.Pepper()()
    assert ret_code == 0


@patch('pepper.cli.PepperCli.login', MagicMock(side_effect=lambda arg: None))
@patch('pepper.cli.PepperCli.low', MagicMock(side_effect=lambda api, load: PAYLOAD))
def test_fail_all():
    sys.argv = ['pepper', '--fail-all', 'minion_id', 'request']
    ret_code = pepper.script.Pepper()()
    assert ret_code == 0


@patch('pepper.cli.PepperCli.login', MagicMock(side_effect=lambda arg: None))
@patch('pepper.cli.PepperCli.low', MagicMock(side_effect=lambda api, load: PAYLOAD))
def test_fail_all_none():
    sys.argv = ['pepper', '--fail-all-none', 'minion_id', 'request']
    ret_code = pepper.script.Pepper()()
    assert ret_code == 0
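All of these tests stub the API so every minion returns `retcode` 0, and then assert the overall exit code is 0 regardless of the failure-aggregation flag. A hedged sketch of the exit-code semantics the suite appears to assume (this `aggregate` helper is hypothetical, not pepper's actual implementation):

```python
# Hypothetical aggregation of per-minion retcodes into one process exit code:
# "any" mode fails if any minion failed; "all" mode fails only if every
# minion failed.  With all retcodes at 0, both modes must return 0, which is
# what each test above checks for its respective flag.
def aggregate(retcodes, mode="any"):
    failures = [rc for rc in retcodes if rc != 0]
    if mode == "any":
        return 1 if failures else 0
    return 1 if retcodes and len(failures) == len(retcodes) else 0

assert aggregate([0, 0], mode="any") == 0
assert aggregate([0, 0], mode="all") == 0
assert aggregate([0, 1], mode="any") == 1
assert aggregate([0, 1], mode="all") == 0
```

The mocked PAYLOAD above corresponds to the all-zero case, so every flag variant should exit 0.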
# ---------------------------------------------------------------------------
# File: fhirclient/models/practitioner_tests.py
# Repo: JamesSkane/smart_resources (Apache-2.0)
# ---------------------------------------------------------------------------
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# Generated from FHIR 0.5.0.5149 on 2015-07-06.
# 2015, SMART Health IT.
import os
import io
import unittest
import json

from . import practitioner
from .fhirdate import FHIRDate


class PractitionerTests(unittest.TestCase):
    def instantiate_from(self, filename):
        datadir = os.environ.get('FHIR_UNITTEST_DATADIR') or ''
        with io.open(os.path.join(datadir, filename), 'r', encoding='utf-8') as handle:
            js = json.load(handle)
            self.assertEqual("Practitioner", js["resourceType"])
        return practitioner.Practitioner(js)

    def testPractitioner1(self):
        inst = self.instantiate_from("pract-uslab-example1.json")
        self.assertIsNotNone(inst, "Must have instantiated a Practitioner instance")
        self.implPractitioner1(inst)

        js = inst.as_json()
        self.assertEqual("Practitioner", js["resourceType"])
        inst2 = practitioner.Practitioner(js)
        self.implPractitioner1(inst2)

    def implPractitioner1(self, inst):
        self.assertEqual(inst.id, "uslab-example1")
        self.assertEqual(inst.identifier[0].system, "https://nppes.cms.hhs.gov/NPPES/")
        self.assertEqual(inst.identifier[0].use, "official")
        self.assertEqual(inst.identifier[0].value, "4444444445")
        self.assertEqual(inst.name.family[0], "Bloodraw")
        self.assertEqual(inst.name.given[0], "Leanard")
        self.assertEqual(inst.name.given[1], "T")
        self.assertEqual(inst.name.suffix[0], "Jr")
        self.assertEqual(inst.telecom[0].system, "phone")
        self.assertEqual(inst.telecom[0].value, "(555)7771234 ext.11")
        self.assertEqual(inst.text.status, "generated")

    def testPractitioner2(self):
        inst = self.instantiate_from("pract-uslab-example2.json")
        self.assertIsNotNone(inst, "Must have instantiated a Practitioner instance")
        self.implPractitioner2(inst)

        js = inst.as_json()
        self.assertEqual("Practitioner", js["resourceType"])
        inst2 = practitioner.Practitioner(js)
        self.implPractitioner2(inst2)

    def implPractitioner2(self, inst):
        self.assertEqual(inst.address[0].city, "Boston")
        self.assertEqual(inst.address[0].country, "USA")
        self.assertEqual(inst.address[0].extension[0].extension[0].url, "http://example.org//iso21090-SC-coding")
        self.assertEqual(inst.address[0].extension[0].extension[0].valueCoding.code, "42043")
        self.assertEqual(inst.address[0].extension[0].extension[0].valueCoding.system, "https://www.census.gov/geo/reference")
        self.assertEqual(inst.address[0].extension[0].url, "http://example.org/us-core-county")
        self.assertEqual(inst.address[0].line[0], "100 Medical Drive")
        self.assertEqual(inst.address[0].line[1], "Suite 6")
        self.assertEqual(inst.address[0].postalCode, "01236")
        self.assertEqual(inst.address[0].state, "MA")
        self.assertEqual(inst.address[0].use, "work")
        self.assertEqual(inst.id, "uslab-example2")
        self.assertEqual(inst.identifier[0].system, "https://nppes.cms.hhs.gov/NPPES/")
        self.assertEqual(inst.identifier[0].use, "official")
        self.assertEqual(inst.identifier[0].value, "121121121")
        self.assertEqual(inst.name.family[0], "Lookafter")
        self.assertEqual(inst.name.given[0], "Bill")
        self.assertEqual(inst.name.given[1], "T")
        self.assertEqual(inst.name.suffix[0], "Jr")
        self.assertEqual(inst.telecom[0].system, "phone")
        self.assertEqual(inst.telecom[0].value, "(617)5551234 ext.12")
        self.assertEqual(inst.telecom[1].system, "email")
        self.assertEqual(inst.telecom[1].value, "docbill@healthedatainc.com")
        self.assertEqual(inst.text.status, "generated")

    def testPractitioner3(self):
        inst = self.instantiate_from("pract-uslab-example3.json")
        self.assertIsNotNone(inst, "Must have instantiated a Practitioner instance")
        self.implPractitioner3(inst)

        js = inst.as_json()
        self.assertEqual("Practitioner", js["resourceType"])
        inst2 = practitioner.Practitioner(js)
        self.implPractitioner3(inst2)

    def implPractitioner3(self, inst):
        self.assertEqual(inst.id, "uslab-example3")
        self.assertEqual(inst.identifier[0].system, "https://nppes.cms.hhs.gov/NPPES/")
        self.assertEqual(inst.identifier[0].use, "official")
        self.assertEqual(inst.identifier[0].value, "1234567893")
        self.assertEqual(inst.name.family[0], "House")
        self.assertEqual(inst.name.given[0], "Gregory")
        self.assertEqual(inst.name.given[1], "F")
        self.assertEqual(inst.name.suffix[0], "PhD")
        self.assertEqual(inst.telecom[0].system, "phone")
        self.assertEqual(inst.telecom[0].value, "555 777 1234 11")
        self.assertEqual(inst.text.status, "generated")

    def testPractitioner4(self):
        inst = self.instantiate_from("practitioner-example-f001-evdb.json")
        self.assertIsNotNone(inst, "Must have instantiated a Practitioner instance")
        self.implPractitioner4(inst)

        js = inst.as_json()
        self.assertEqual("Practitioner", js["resourceType"])
        inst2 = practitioner.Practitioner(js)
        self.implPractitioner4(inst2)

    def implPractitioner4(self, inst):
        self.assertEqual(inst.address[0].city, "Den Burg")
        self.assertEqual(inst.address[0].country, "NLD")
        self.assertEqual(inst.address[0].line[0], "Galapagosweg 91")
        self.assertEqual(inst.address[0].postalCode, "9105 PZ")
        self.assertEqual(inst.address[0].use, "work")
        self.assertEqual(inst.birthDate.date, FHIRDate("1975-12-07").date)
        self.assertEqual(inst.birthDate.as_json(), "1975-12-07")
        self.assertEqual(inst.gender, "male")
        self.assertEqual(inst.id, "f001")
        self.assertEqual(inst.identifier[0].system, "urn:oid:2.16.528.1.1007.3.1")
        self.assertEqual(inst.identifier[0].use, "official")
        self.assertEqual(inst.identifier[0].value, "938273695")
        self.assertEqual(inst.identifier[1].system, "urn:oid:2.16.840.1.113883.2.4.6.3")
self.assertEqual(inst.identifier[1].use, "usual")
self.assertEqual(inst.identifier[1].value, "129IDH4OP733")
self.assertEqual(inst.name.family[0], "van den broek")
self.assertEqual(inst.name.given[0], "Eric")
self.assertEqual(inst.name.suffix[0], "MD")
self.assertEqual(inst.name.use, "official")
self.assertEqual(inst.practitionerRole[0].role.coding[0].code, "01.000")
self.assertEqual(inst.practitionerRole[0].role.coding[0].display, "Arts")
self.assertEqual(inst.practitionerRole[0].role.coding[0].system, "urn:oid:2.16.840.1.113883.2.4.15.111")
self.assertEqual(inst.practitionerRole[0].role.text, "Care role")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].code, "01.018")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].display, "Ear-, Nose and Throat")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].system, "urn:oid:2.16.840.1.113883.2.4.15.111")
self.assertEqual(inst.practitionerRole[0].specialty[0].text, "specialisation")
self.assertEqual(inst.telecom[0].system, "phone")
self.assertEqual(inst.telecom[0].use, "work")
self.assertEqual(inst.telecom[0].value, "0205568263")
self.assertEqual(inst.telecom[1].system, "email")
self.assertEqual(inst.telecom[1].use, "work")
self.assertEqual(inst.telecom[1].value, "E.M.vandenbroek@bmc.nl")
self.assertEqual(inst.telecom[2].system, "fax")
self.assertEqual(inst.telecom[2].use, "work")
self.assertEqual(inst.telecom[2].value, "0205664440")
self.assertEqual(inst.text.status, "generated")
def testPractitioner5(self):
inst = self.instantiate_from("practitioner-example-f002-pv.json")
self.assertIsNotNone(inst, "Must have instantiated a Practitioner instance")
self.implPractitioner5(inst)
js = inst.as_json()
self.assertEqual("Practitioner", js["resourceType"])
inst2 = practitioner.Practitioner(js)
self.implPractitioner5(inst2)
def implPractitioner5(self, inst):
self.assertEqual(inst.address[0].city, "Den Burg")
self.assertEqual(inst.address[0].country, "NLD")
self.assertEqual(inst.address[0].line[0], "Galapagosweg 91")
self.assertEqual(inst.address[0].postalCode, "9105 PZ")
self.assertEqual(inst.address[0].use, "work")
self.assertEqual(inst.birthDate.date, FHIRDate("1979-04-29").date)
self.assertEqual(inst.birthDate.as_json(), "1979-04-29")
self.assertEqual(inst.gender, "male")
self.assertEqual(inst.id, "f002")
self.assertEqual(inst.identifier[0].system, "urn:oid:2.16.528.1.1007.3.1")
self.assertEqual(inst.identifier[0].use, "official")
self.assertEqual(inst.identifier[0].value, "730291637")
self.assertEqual(inst.identifier[1].system, "urn:oid:2.16.840.1.113883.2.4.6.3")
self.assertEqual(inst.identifier[1].use, "usual")
self.assertEqual(inst.identifier[1].value, "174BIP3JH438")
self.assertEqual(inst.name.family[0], "Voigt")
self.assertEqual(inst.name.given[0], "Pieter")
self.assertEqual(inst.name.suffix[0], "MD")
self.assertEqual(inst.name.use, "official")
self.assertEqual(inst.practitionerRole[0].role.coding[0].code, "01.000")
self.assertEqual(inst.practitionerRole[0].role.coding[0].display, "Arts")
self.assertEqual(inst.practitionerRole[0].role.coding[0].system, "urn:oid:2.16.840.1.113883.2.4.15.111")
self.assertEqual(inst.practitionerRole[0].role.text, "Care role")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].code, "01.011")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].display, "Cardiothoracal surgery")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].system, "urn:oid:2.16.840.1.113883.2.4.15.111")
self.assertEqual(inst.practitionerRole[0].specialty[0].text, "specialisation")
self.assertEqual(inst.telecom[0].system, "phone")
self.assertEqual(inst.telecom[0].use, "work")
self.assertEqual(inst.telecom[0].value, "0205569336")
self.assertEqual(inst.telecom[1].system, "email")
self.assertEqual(inst.telecom[1].use, "work")
self.assertEqual(inst.telecom[1].value, "p.voigt@bmc.nl")
self.assertEqual(inst.telecom[2].system, "fax")
self.assertEqual(inst.telecom[2].use, "work")
self.assertEqual(inst.telecom[2].value, "0205669382")
self.assertEqual(inst.text.status, "generated")
def testPractitioner6(self):
inst = self.instantiate_from("practitioner-example-f003-mv.json")
self.assertIsNotNone(inst, "Must have instantiated a Practitioner instance")
self.implPractitioner6(inst)
js = inst.as_json()
self.assertEqual("Practitioner", js["resourceType"])
inst2 = practitioner.Practitioner(js)
self.implPractitioner6(inst2)
def implPractitioner6(self, inst):
self.assertEqual(inst.address[0].city, "Amsterdam")
self.assertEqual(inst.address[0].country, "NLD")
self.assertEqual(inst.address[0].line[0], "Galapagosweg 91")
self.assertEqual(inst.address[0].postalCode, "1105 AZ")
self.assertEqual(inst.address[0].use, "work")
self.assertEqual(inst.birthDate.date, FHIRDate("1963-07-01").date)
self.assertEqual(inst.birthDate.as_json(), "1963-07-01")
self.assertEqual(inst.communication[0].coding[0].code, "nl")
self.assertEqual(inst.communication[0].coding[0].display, "Dutch")
self.assertEqual(inst.communication[0].coding[0].system, "urn:oid:2.16.840.1.113883.6.121")
self.assertEqual(inst.gender, "male")
self.assertEqual(inst.id, "f003")
self.assertEqual(inst.identifier[0].system, "urn:oid:2.16.528.1.1007.3.1")
self.assertEqual(inst.identifier[0].use, "official")
self.assertEqual(inst.identifier[0].value, "846100293")
self.assertEqual(inst.identifier[1].system, "urn:oid:2.16.840.1.113883.2.4.6.3")
self.assertEqual(inst.identifier[1].use, "usual")
self.assertEqual(inst.identifier[1].value, "243HID3RT938")
self.assertEqual(inst.name.family[0], "Versteegh")
self.assertEqual(inst.name.given[0], "Marc")
self.assertEqual(inst.name.suffix[0], "MD")
self.assertEqual(inst.name.use, "official")
self.assertEqual(inst.practitionerRole[0].role.coding[0].code, "01.000")
self.assertEqual(inst.practitionerRole[0].role.coding[0].display, "Arts")
self.assertEqual(inst.practitionerRole[0].role.coding[0].system, "urn:oid:2.16.840.1.113883.2.4.15.111")
self.assertEqual(inst.practitionerRole[0].role.text, "Care role")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].code, "01.011")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].display, "Cardiothoracal surgery")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].system, "urn:oid:2.16.840.1.113883.2.4.15.111")
self.assertEqual(inst.practitionerRole[0].specialty[0].text, "specialisation")
self.assertEqual(inst.telecom[0].system, "phone")
self.assertEqual(inst.telecom[0].use, "work")
self.assertEqual(inst.telecom[0].value, "0205562431")
self.assertEqual(inst.telecom[1].system, "email")
self.assertEqual(inst.telecom[1].use, "work")
self.assertEqual(inst.telecom[1].value, "m.versteegh@bmc.nl")
self.assertEqual(inst.telecom[2].system, "fax")
self.assertEqual(inst.telecom[2].use, "work")
self.assertEqual(inst.telecom[2].value, "0205662948")
self.assertEqual(inst.text.status, "generated")
def testPractitioner7(self):
inst = self.instantiate_from("practitioner-example-f004-rb.json")
self.assertIsNotNone(inst, "Must have instantiated a Practitioner instance")
self.implPractitioner7(inst)
js = inst.as_json()
self.assertEqual("Practitioner", js["resourceType"])
inst2 = practitioner.Practitioner(js)
self.implPractitioner7(inst2)
def implPractitioner7(self, inst):
self.assertEqual(inst.address[0].city, "Amsterdam")
self.assertEqual(inst.address[0].country, "NLD")
self.assertEqual(inst.address[0].line[0], "Galapagosweg 91")
self.assertEqual(inst.address[0].postalCode, "1105 AZ")
self.assertEqual(inst.address[0].use, "work")
self.assertEqual(inst.birthDate.date, FHIRDate("1980-02-04").date)
self.assertEqual(inst.birthDate.as_json(), "1980-02-04")
self.assertEqual(inst.communication[0].coding[0].code, "nl")
self.assertEqual(inst.communication[0].coding[0].display, "Netherlands")
self.assertEqual(inst.communication[0].coding[0].system, "urn:oid:2.16.840.1.113883.6.121")
self.assertEqual(inst.communication[0].text, "Language")
self.assertEqual(inst.gender, "male")
self.assertEqual(inst.id, "f004")
self.assertEqual(inst.identifier[0].system, "urn:oid:2.16.528.1.1007.3.1")
self.assertEqual(inst.identifier[0].use, "official")
self.assertEqual(inst.identifier[0].value, "118265112")
self.assertEqual(inst.identifier[1].system, "urn:oid:2.16.840.1.113883.2.4.6.3")
self.assertEqual(inst.identifier[1].use, "usual")
self.assertEqual(inst.identifier[1].value, "523ASA1LK927")
self.assertEqual(inst.name.family[0], "Briet")
self.assertEqual(inst.name.given[0], "Ronald")
self.assertEqual(inst.name.suffix[0], "MD")
self.assertEqual(inst.name.use, "official")
self.assertEqual(inst.practitionerRole[0].role.coding[0].code, "01.000")
self.assertEqual(inst.practitionerRole[0].role.coding[0].display, "Arts")
self.assertEqual(inst.practitionerRole[0].role.coding[0].system, "urn:oid:2.16.840.1.113883.2.4.15.111")
self.assertEqual(inst.practitionerRole[0].role.text, "Care role")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].code, "01.018")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].display, "Ear-, Nose and Throat")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].system, "urn:oid:2.16.840.1.113883.2.4.15.111")
self.assertEqual(inst.practitionerRole[0].specialty[0].text, "specialisation")
self.assertEqual(inst.telecom[0].system, "phone")
self.assertEqual(inst.telecom[0].use, "work")
self.assertEqual(inst.telecom[0].value, "0205569273")
self.assertEqual(inst.telecom[1].system, "email")
self.assertEqual(inst.telecom[1].use, "work")
self.assertEqual(inst.telecom[1].value, "r.briet@bmc.nl")
self.assertEqual(inst.telecom[2].system, "fax")
self.assertEqual(inst.telecom[2].use, "work")
self.assertEqual(inst.telecom[2].value, "0205664440")
self.assertEqual(inst.text.status, "generated")
def testPractitioner8(self):
inst = self.instantiate_from("practitioner-example-f005-al.json")
self.assertIsNotNone(inst, "Must have instantiated a Practitioner instance")
self.implPractitioner8(inst)
js = inst.as_json()
self.assertEqual("Practitioner", js["resourceType"])
inst2 = practitioner.Practitioner(js)
self.implPractitioner8(inst2)
def implPractitioner8(self, inst):
self.assertEqual(inst.address[0].city, "Amsterdam")
self.assertEqual(inst.address[0].country, "NLD")
self.assertEqual(inst.address[0].line[0], "Galapagosweg 9")
self.assertEqual(inst.address[0].postalCode, "1105 AZ")
self.assertEqual(inst.address[0].use, "work")
self.assertEqual(inst.birthDate.date, FHIRDate("1959-03-11").date)
self.assertEqual(inst.birthDate.as_json(), "1959-03-11")
self.assertEqual(inst.communication[0].coding[0].code, "fr")
self.assertEqual(inst.communication[0].coding[0].display, "France")
self.assertEqual(inst.communication[0].coding[0].system, "urn:oid:2.16.840.1.113883.6.121")
self.assertEqual(inst.gender, "female")
self.assertEqual(inst.id, "f005")
self.assertEqual(inst.identifier[0].system, "urn:oid:2.16.528.1.1007.3.1")
self.assertEqual(inst.identifier[0].use, "official")
self.assertEqual(inst.identifier[0].value, "118265112")
self.assertEqual(inst.identifier[1].system, "urn:oid:2.16.840.1.113883.2.4.6.3")
self.assertEqual(inst.identifier[1].use, "usual")
self.assertEqual(inst.identifier[1].value, "191REW8WE916")
self.assertEqual(inst.name.family[0], "Anne")
self.assertEqual(inst.name.given[0], "Langeveld")
self.assertEqual(inst.name.suffix[0], "MD")
self.assertEqual(inst.name.use, "official")
self.assertEqual(inst.practitionerRole[0].role.coding[0].code, "01.000")
self.assertEqual(inst.practitionerRole[0].role.coding[0].display, "Arts")
self.assertEqual(inst.practitionerRole[0].role.coding[0].system, "urn:oid:2.16.840.1.113883.2.4.15.111")
self.assertEqual(inst.practitionerRole[0].role.text, "Care role")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].code, "01.018")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].display, "Keel- neus- en oorarts")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].system, "urn:oid:2.16.840.1.113883.2.4.15.111")
self.assertEqual(inst.practitionerRole[0].specialty[0].text, "specialisation")
self.assertEqual(inst.telecom[0].system, "phone")
self.assertEqual(inst.telecom[0].use, "work")
self.assertEqual(inst.telecom[0].value, "0205563847")
self.assertEqual(inst.telecom[1].system, "email")
self.assertEqual(inst.telecom[1].use, "work")
self.assertEqual(inst.telecom[1].value, "a.langeveld@bmc.nl")
self.assertEqual(inst.telecom[2].system, "fax")
self.assertEqual(inst.telecom[2].use, "work")
self.assertEqual(inst.telecom[2].value, "0205668916")
self.assertEqual(inst.text.status, "generated")
def testPractitioner9(self):
inst = self.instantiate_from("practitioner-example-f006-rvdb.json")
self.assertIsNotNone(inst, "Must have instantiated a Practitioner instance")
self.implPractitioner9(inst)
js = inst.as_json()
self.assertEqual("Practitioner", js["resourceType"])
inst2 = practitioner.Practitioner(js)
self.implPractitioner9(inst2)
def implPractitioner9(self, inst):
self.assertEqual(inst.address[0].city, "Den Burg")
self.assertEqual(inst.address[0].country, "NLD")
self.assertEqual(inst.address[0].line[0], "Galapagosweg 91")
self.assertEqual(inst.address[0].postalCode, "9105 PZ")
self.assertEqual(inst.address[0].use, "work")
self.assertEqual(inst.birthDate.date, FHIRDate("1975-12-07").date)
self.assertEqual(inst.birthDate.as_json(), "1975-12-07")
self.assertEqual(inst.gender, "male")
self.assertEqual(inst.id, "f006")
self.assertEqual(inst.identifier[0].system, "urn:oid:2.16.528.1.1007.3.1")
self.assertEqual(inst.identifier[0].use, "official")
self.assertEqual(inst.identifier[0].value, "937223645")
self.assertEqual(inst.identifier[1].system, "urn:oid:2.16.840.1.113883.2.4.6.3")
self.assertEqual(inst.identifier[1].use, "usual")
self.assertEqual(inst.identifier[1].value, "134IDY41W988")
self.assertEqual(inst.name.family[0], "van den Berk")
self.assertEqual(inst.name.given[0], "Rob")
self.assertEqual(inst.name.suffix[0], "MD")
self.assertEqual(inst.name.use, "official")
self.assertEqual(inst.practitionerRole[0].role.coding[0].code, "01.000")
self.assertEqual(inst.practitionerRole[0].role.coding[0].display, "Arts")
self.assertEqual(inst.practitionerRole[0].role.coding[0].system, "urn:oid:2.16.840.1.113883.2.4.15.111")
self.assertEqual(inst.practitionerRole[0].role.text, "Care role")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].code, "17.000")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].display, "Pharmacist")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].system, "urn:oid:2.16.840.1.113883.2.4.15.111")
self.assertEqual(inst.practitionerRole[0].specialty[0].text, "specialisation")
self.assertEqual(inst.telecom[0].system, "phone")
self.assertEqual(inst.telecom[0].use, "work")
self.assertEqual(inst.telecom[0].value, "0205569288")
self.assertEqual(inst.telecom[1].system, "email")
self.assertEqual(inst.telecom[1].use, "work")
self.assertEqual(inst.telecom[1].value, "R.A.vandenberk@bmc.nl")
self.assertEqual(inst.telecom[2].system, "fax")
self.assertEqual(inst.telecom[2].use, "work")
self.assertEqual(inst.telecom[2].value, "0205664987")
self.assertEqual(inst.text.status, "generated")
def testPractitioner10(self):
inst = self.instantiate_from("practitioner-example-f007-sh.json")
self.assertIsNotNone(inst, "Must have instantiated a Practitioner instance")
self.implPractitioner10(inst)
js = inst.as_json()
self.assertEqual("Practitioner", js["resourceType"])
inst2 = practitioner.Practitioner(js)
self.implPractitioner10(inst2)
def implPractitioner10(self, inst):
self.assertEqual(inst.address[0].city, "Den Burg")
self.assertEqual(inst.address[0].country, "NLD")
self.assertEqual(inst.address[0].line[0], "Galapagosweg 91")
self.assertEqual(inst.address[0].postalCode, "9105 PZ")
self.assertEqual(inst.address[0].use, "work")
self.assertEqual(inst.birthDate.date, FHIRDate("1971-11-07").date)
self.assertEqual(inst.birthDate.as_json(), "1971-11-07")
self.assertEqual(inst.gender, "female")
self.assertEqual(inst.id, "f007")
self.assertEqual(inst.identifier[0].system, "urn:oid:2.16.528.1.1007.3.1")
self.assertEqual(inst.identifier[0].use, "official")
self.assertEqual(inst.identifier[0].value, "874635264")
self.assertEqual(inst.identifier[1].system, "urn:oid:2.16.840.1.113883.2.4.6.3")
self.assertEqual(inst.identifier[1].use, "usual")
self.assertEqual(inst.identifier[1].value, "567IUI51C154")
self.assertEqual(inst.name.family[0], "Heps")
self.assertEqual(inst.name.given[0], "Simone")
self.assertEqual(inst.name.suffix[0], "MD")
self.assertEqual(inst.name.use, "official")
self.assertEqual(inst.practitionerRole[0].role.coding[0].code, "01.000")
self.assertEqual(inst.practitionerRole[0].role.coding[0].display, "Arts")
self.assertEqual(inst.practitionerRole[0].role.coding[0].system, "urn:oid:2.16.840.1.113883.2.4.15.111")
self.assertEqual(inst.practitionerRole[0].role.text, "Care role")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].code, "01.015")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].display, "Physician")
self.assertEqual(inst.practitionerRole[0].specialty[0].coding[0].system, "urn:oid:2.16.840.1.113883.2.4.15.111")
self.assertEqual(inst.practitionerRole[0].specialty[0].text, "specialisation")
self.assertEqual(inst.telecom[0].system, "phone")
self.assertEqual(inst.telecom[0].use, "work")
self.assertEqual(inst.telecom[0].value, "020556936")
self.assertEqual(inst.telecom[1].system, "email")
self.assertEqual(inst.telecom[1].use, "work")
self.assertEqual(inst.telecom[1].value, "S.M.Heps@bmc.nl")
self.assertEqual(inst.telecom[2].system, "fax")
self.assertEqual(inst.telecom[2].use, "work")
self.assertEqual(inst.telecom[2].value, "0205669283")
self.assertEqual(inst.text.status, "generated")
| 57.753813 | 126 | 0.670338 | 3,333 | 26,509 | 5.322532 | 0.089709 | 0.275648 | 0.337373 | 0.104059 | 0.884667 | 0.879538 | 0.837993 | 0.783822 | 0.765333 | 0.751184 | 0 | 0.072178 | 0.165868 | 26,509 | 458 | 127 | 57.879913 | 0.730101 | 0.004263 | 0 | 0.610048 | 1 | 0.050239 | 0.16654 | 0.053695 | 0 | 0 | 0 | 0 | 0.803828 | 1 | 0.050239 | false | 0 | 0.014354 | 0 | 0.069378 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
# V1/R1/N2HF_weapons_v1_r1.py — repo N2HF-OFFICIAL/n2hf @ de4a26a3b70082cd2375cc3fe7a5c2fd09cec085, MIT license
from PIL import Image
import os
import numpy as np
import pandas as pd
from random import seed
from random import randint
dirname = os.path.dirname('')
dimensions = 600, 600
def make_weapons(mw, mw2, mw3):
# mw2: number of weapon images to generate; mw3: offset added to the per-image RNG seed (mw is not used in this excerpt)
count_dangpa = 0
count_jangchang = 0
count_oweldo = 0
count_hekgakgung = 0
for x in range(0, mw2):
b=mw3
seed(x+b)
## suffix 1 is the darker shade, suffix 2 the lighter shade
## a: silver/steel parts
# a1 color
a1 = (randint(150, 200), randint(150, 200), randint(150, 200))
c=randint(0,500)
seed(c)
# a2 color
a2 = (a1[0]+randint(30, 56), a1[1]+randint(30, 56), a1[2]+randint(30, 56))
d = randint(501,1000)
seed(d)
## b: dark iron parts
# b1 color
b1 = (randint(0, 100), randint(0, 100), randint(0, 100))
e=randint(1001,1500)
seed(e)
# b2 color
b2 = (b1[0]+randint(30, 60), b1[1]+randint(30, 60), b1[2]+randint(30, 60))
f = randint(1501,2000)
seed(f)
## c: wooden parts
# c1 color
c1 = (randint(60, 120), randint(0, 50), randint(0, 50))
g=randint(2001,2500)
seed(g)
# c2 color
c2 = (c1[0]+randint(30, 60), c1[1]+randint(30, 60), c1[2]+randint(30, 60))
h = randint(2501,3000)
seed(h)
## d: white parts
# d1 color
d1 = (randint(150, 200), randint(150, 200), randint(150, 200))
i=randint(3001,3500)
seed(i)
# d2 color
d2 = (d1[0]+randint(30, 60), d1[1]+randint(30, 60), d1[2]+randint(30, 60))
j=randint(3501,4000)
seed(j)
## e: gold/yellow parts
# e1 color
e1 = (randint(180, 230), randint(100, 200), randint(0, 50))
k = randint(4001,4500)
seed(k)
# e2 color
e2 = (e1[0]+randint(0, 26), e1[1]+randint(20, 50), e1[2]+randint(0, 30))
l = randint(5001,9695500)
seed(l)
## f: tassel parts (one of the red, green, or blue ranges)
# f1 color
f1 = (randint(0, 200), randint(0, 200), randint(0, 200))
m = randint(5501,6000)
seed(m)
# f2 color
f2 = (f1[0]+randint(0, 50), f1[1]+randint(0, 50), f1[2]+randint(0, 50))
n = randint(6501,7000)
seed(n)
## background colors
bg1 = randint(0,300)
if bg1 < 100: # 194, 164, 29 /
# bg1: yellowish to orange, e.g. 194, 164, 29
bg1_1 = (randint(110, 150), randint(55, 95), randint(0, 25))
o = randint(7501,8000)
seed(o)
bg1_2 = (randint(150, 190), randint(75, 115), randint(0, 20))
p = randint(8001,8500)
seed(p)
bg1_3 = (randint(175, 215), randint(145, 185), randint(10, 50))
q = randint(8501,9000)
seed(q)
elif 100 <= bg1 < 200: # 141, 179, 173 / 45, 125, 112 / 0, 80, 67
bg1_3 = (randint(125, 155), randint(165, 195), randint(160, 190))
r = randint(9001,9500)
seed(r)
bg1_2 = (randint(30, 60), randint(110, 140), randint(95, 125))
s = randint(9501,10000)
seed(s)
bg1_1 = (randint(0, 30), randint(65, 95), randint(50, 80))
t = randint(10001,10500)
seed(t)
else: # 127, 173, 218 / 50, 97, 143 / 1, 40, 78
bg1_3 = (randint(115, 145), randint(160, 190), randint(205, 235))
r = randint(9001,9500)
seed(r)
bg1_2 = (randint(35, 65), randint(85, 115), randint(130, 160))
s = randint(9501,10000)
seed(s)
bg1_1 = (randint(0, 30), randint(25, 55), randint(65, 95))
t = randint(10001,10500)
seed(t)
bg2 = randint(0,300)
if bg2 < 100: # 166, 148, 105 / 121, 61, 0 / 97, 69, 0
bg2_3 = (randint(150, 180), randint(130, 160), randint(90, 120))
r = randint(9001,9500)
seed(r)
bg2_2 = (randint(105, 135), randint(45, 75), randint(0, 30))
s = randint(9501,10000)
seed(s)
bg2_1 = (randint(80, 110), randint(55, 85), randint(0, 30))
t = randint(10001,10500)
seed(t)
elif 100 <= bg2 < 200: # 141, 179, 173 / 45, 125, 112 / 0, 80, 67
bg2_3 = (randint(125, 155), randint(165, 195), randint(160, 190))
r = randint(9001,9500)
seed(r)
bg2_2 = (randint(30, 60), randint(110, 140), randint(95, 125))
s = randint(9501,10000)
seed(s)
bg2_1 = (randint(0, 30), randint(65, 95), randint(50, 80))
t = randint(10001,10500)
seed(t)
else: # 127, 173, 218 / 50, 97, 143 / 1, 40, 78
bg2_3 = (randint(115, 145), randint(160, 190), randint(205, 235))
r = randint(9001,9500)
seed(r)
bg2_2 = (randint(35, 65), randint(85, 115), randint(130, 160))
s = randint(9501,10000)
seed(s)
bg2_1 = (randint(0, 30), randint(25, 55), randint(65, 95))
t = randint(10001,10500)
seed(t)
bg3 = randint(0,300)
if bg3 < 100:
# bg3_1: brown, e.g. 179, 86, 5 / bg3_2: ochre, e.g. 222, 161, 108
bg3_1 = (randint(150, 200), randint(70, 100), randint(0, 10))
u = randint(10501,11000)
seed(u)
bg3_2 = (bg3_1[0]+randint(30, 55), bg3_1[1]+randint(30, 55), bg3_1[2]+randint(30, 55))
v = randint(11001,11500)
seed(v)
elif 100 <= bg3 <200 : # 127, 173, 218 / 50, 97, 143
bg3_2 = (randint(115, 145), randint(160, 190), randint(205, 235))
r = randint(9001,9500)
seed(r)
bg3_1 = (randint(35, 65), randint(85, 115), randint(130, 160))
s = randint(9501,10000)
seed(s)
else: # 141, 179, 173 / 45, 125, 112
bg3_2 = (randint(125, 155), randint(165, 195), randint(160, 190))
r = randint(9001,9500)
seed(r)
bg3_1 = (randint(30, 60), randint(110, 140), randint(95, 125))
s = randint(9501,10000)
seed(s)
## g: arrow shaft (black)
g = (0, 0, 0)
w = randint(11500,12001)
seed(w)
## h: cloth wrapping at the middle of the bow
h = (randint(200, 256), randint(200, 256), randint(200, 256))
x = randint(12001,12501)
seed(x)
bg = (randint(150, 200), randint(70, 100), randint(0, 10))
DangPa1 = [
[bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3],
[bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3],
[bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3],
[bg1_3, bg1_3, bg1_1, a1, a2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_2, bg1_3, a1, a2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2],
[bg1_3, bg1_3, bg1_3, a1, a1, a2, a2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, a1, a2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1],
[bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a2, bg1_3, bg1_3, a1, a2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_1, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a2, a1, a2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_2, bg1_3, a1, a1, a1, b2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, a1, a1, a2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, a2, a1, b1, f1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, a1, a2, bg1_2, bg1_2, bg1_2, bg1_2, a2, a1, b1, f1, b1, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_2, a1, a2, a2, a2, a2, a1, bg1_3, bg1_3, f2, b1, b1, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, a1, a1, a1, a1, bg1_3, bg1_3, bg1_3, f2, bg1_3, b1, b1, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_1, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, f2, f2, bg1_3, bg1_3, b1, b1, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, f1, bg1_3, bg1_3, bg1_3, b1, b1, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, bg1_1, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, bg1_1, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, f2, f2, f1, bg1_3, bg1_3, bg1_3, bg1_3, b1, b1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_2, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, f1, f2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_1, bg1_2, bg1_2],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, f1, f1, f2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_2, bg1_2, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, bg1_3, bg1_3, f2, f2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_1, bg1_1, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3],
[bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_2, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_1, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2],
[bg1_1, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, c1, c1, c2, bg1_3, bg1_2, bg1_2, bg1_3, bg1_2, bg1_1, bg1_1, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2],
[bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, c1, c1, c2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_3, bg1_3, bg1_1, bg1_1, bg1_2, bg1_2, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1],
[bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_1, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, bg1_1, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, c1, c1, c2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_1, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_3, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, c1, c1, c2, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, c1, b1, bg1_3, bg1_3, bg1_3],
[bg1_1, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, b1, bg1_3, bg1_3],
[bg1_3, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
]
DangPa2 = [
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1],
[bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_2, bg2_3, bg2_3, a1, a2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_3, bg2_3, bg2_2, a1, a1, a2, a2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, a1, a2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2],
[bg2_3, bg2_2, bg2_2, bg2_2, a1, a1, a2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, a1, a2, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3],
[bg2_2, bg2_2, bg2_2, bg2_2, a1, a1, a1, a2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, a1, a2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3],
[bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, a1, a1, a2, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, a1, a2, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, a1, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, a1, a2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, a1, a1, a2, bg2_3, bg2_3, a1, a2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, a1, a1, a2, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, a1, a1, a1, b2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3],
[bg2_3, a1, a1, a2, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, a2, a1, b1, f1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1],
[bg2_1, bg2_1, bg2_1, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, a2, a1, b1, f1, b1, b2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, a1, a2, a2, a2, a2, a1, bg2_3, bg2_3, f2, b1, b1, b2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a1, a1, a1, a1, bg2_3, bg2_3, bg2_3, f2, bg2_3, b1, b1, b2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, f2, f2, bg2_3, bg2_3, b1, b1, b2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, f2, f1, bg2_3, bg2_3, bg2_3, b1, b1, b2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, f2, f2, f2, f1, bg2_1, bg2_1, bg2_1, bg2_3, b1, b1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, f2, f1, f2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, f2, f1, f1, f2, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, f2, bg2_3, bg2_3, f2, f2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, f2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1],
[bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2],
[bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, c1, c1, c2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3],
[bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_2, bg2_2, bg2_2, bg2_3],
[bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, c1, c1, c2, bg2_2, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, c1, b1, bg2_3, bg2_3, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, b1, bg2_2, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3],
]
DangPa3 = [
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, a2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_2, a1, a2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, a1, a2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, a1, a1, a2, a2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, a1, a2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_1, bg3_1, a1, a1, a2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, a1, a2, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, a1, a1, a1, a2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, a1, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, a1, a1, a2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, a1, a2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a1, a1, a2, bg3_2, bg3_2, bg3_2, bg3_2, a1, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, a1, a1, a2, bg3_1, bg3_1, a1, a2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a1, a1, a2, a1, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a1, a1, a1, b2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, a1, a1, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a2, a1, b1, f1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_1, a1, a2, bg3_1, bg3_1, bg3_1, bg3_1, a2, a1, b1, f1, b1, b2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_2, a1, a2, a2, a2, a2, a1, bg3_2, bg3_2, f2, b1, b1, b2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, a1, a1, a1, a1, bg3_1, bg3_1, bg3_1, f2, bg3_1, b1, b1, b2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, f2, f2, bg3_2, bg3_2, b1, b1, b2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, f2, f1, bg3_1, bg3_1, bg3_1, b1, b1, b2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, f2, f2, f2, f1, bg3_2, bg3_2, bg3_2, bg3_2, b1, b1, c2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, f2, f1, f2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, c1, c1, c2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, f2, f1, f1, f2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, c1, c1, c2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, f2, bg3_2, bg3_1, f2, f2, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, c1, c1, c2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, f2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, c1, c1, c2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, c1, c1, c2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, c1, c1, c2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, c1, c1, c2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, c1, c1, c2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c1, c1, c2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, c1, c1, c2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, c1, c1, c2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, c1, c1, c2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, c1, c1, c2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, c1, c1, c2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, c1, c1, c2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, c1, c1, c2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, c1, c1, c2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, c1, b1, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, b1, bg3_2, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
]
JangChang1 = [
[bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3],
[bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3],
[bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3],
[bg1_3, bg1_3, bg1_1, a2, a2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2],
[bg1_3, bg1_3, bg1_3, a1, a1, a2, a2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1],
[bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a2, a2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a1, a2, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_1, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_2, a1, a1, a1, a1, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, a1, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, a1, a1, a2, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, b2, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, b1, b1, b1, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_1, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, b1, b1, b1, f1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f1, b1, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, bg1_1, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, bg1_1, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, b1, b1, d2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_2, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, bg1_3, d1, d1, d2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_1, bg1_2, bg1_2],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, f2, bg1_3, bg1_3, d1, d1, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_2, bg1_2, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, f1, bg1_3, bg1_3, bg1_3, b1, b1, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, f2, f2, f1, bg1_3, bg1_3, bg1_3, bg1_3, b1, b1, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, f1, f2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, b1, b1, d2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, f2, f1, f1, f2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, d1, d1, d2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_2, bg1_2, f2, bg1_3, bg1_3, f2, f2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, d1, d1, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, f2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, b1, b1, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, b1, b1, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_1, bg1_1, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, b1, b1, d2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3],
[bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_2, bg1_3, d1, d1, d2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, d1, d1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_1, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2],
[bg1_1, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, c1, c1, c2, bg1_3, bg1_2, bg1_2, bg1_3, bg1_2, bg1_1, bg1_1, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2],
[bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, c1, c1, c2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_3, bg1_3, bg1_1, bg1_1, bg1_2, bg1_2, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1],
[bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_1, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, bg1_1, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, c1, c1, c2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_1, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_3, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, c1, c1, c2, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, c1, b1, bg1_3, bg1_3, bg1_3],
[bg1_1, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, b1, bg1_3, bg1_3],
[bg1_3, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
]
JangChang2 = [
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1],
[bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_2, bg2_3, bg2_3, a2, a2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_3, bg2_3, bg2_2, a1, a1, a2, a2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2],
[bg2_3, bg2_2, bg2_2, bg2_2, a1, a1, a2, a2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3],
[bg2_2, bg2_2, bg2_2, bg2_2, a1, a1, a1, a2, a2, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3],
[bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, a1, a1, a1, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, a1, a1, a1, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, a1, a1, a1, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, a1, a1, a1, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, a1, a1, a1, a1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, a1, a1, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1],
[bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a1, a1, a2, b2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a1, a1, b2, b2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, b1, b1, b1, b2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, b1, b1, b1, f1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, f1, b1, b2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, f2, b1, b1, d2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, f2, bg2_1, d1, d1, d2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, f2, f2, bg2_3, bg2_1, d1, d1, b2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, f2, f1, bg2_3, bg2_3, bg2_1, b1, b1, b2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, f2, f2, f2, f1, bg2_2, bg2_3, bg2_1, bg2_3, b1, b1, b2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, f2, f1, f2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, b1, b1, d2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, f2, f1, f1, f2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, d1, d1, d2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, f2, bg2_2, bg2_2, f2, f2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, d1, d1, b2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, f2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, b1, b1, b2, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, b1, b1, b2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, b1, b1, d2, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, d1, d1, d2, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, d1, d1, c2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1],
[bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2],
[bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, c1, c1, c2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3],
[bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_2, bg2_2, bg2_2, bg2_3],
[bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, c1, c1, c2, bg2_2, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, c1, b1, bg2_3, bg2_3, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, b1, bg2_2, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3],
]
JangChang3 = [
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_2, a2, a2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, a1, a1, a2, a2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_1, bg3_1, a1, a1, a2, a2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, a1, a1, a1, a2, a2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, a1, a1, a1, a1, a2, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a1, a1, a1, a1, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, a1, a1, a1, a1, a2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a1, a1, a1, a1, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a1, a1, a1, a1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a1, a1, a1, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, a1, a1, a2, b2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a1, a1, b2, b2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, b1, b1, b1, b2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, b1, b1, b1, f1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, f1, b1, b2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, f2, b1, b1, d2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, f2, bg3_1, d1, d1, d2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, f2, f2, bg3_1, bg3_2, d1, d1, b2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, f2, f1, bg3_1, bg3_2, bg3_1, b1, b1, b2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, f2, f2, f2, f1, bg3_2, bg3_2, bg3_1, bg3_2, b1, b1, b2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, f2, f1, f2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, b1, b1, d2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, f2, f1, f1, f2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, d1, d1, d2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, f2, bg3_1, bg3_1, f2, f2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, d1, d1, b2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, f2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, b1, b1, b2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, b1, b1, b2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, b1, b1, d2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, d1, d1, d2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, d1, d1, c2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, c1, c1, c2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, c1, c1, c2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, c1, c1, c2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, c1, c1, c2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, c1, c1, c2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, c1, c1, c2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, c1, c1, c2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, c1, c1, c2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, c1, b1, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, b1, bg3_2, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
]
OwelDo1 = [
[bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3],
[bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3],
[bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, a1, a2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3],
[bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_2, a1, a2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2],
[bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, a1, a1, a1, a2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a1, a2, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a1, a1, a1, a2, bg1_3, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a1, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_1, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a1, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, a1, a1, a1, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, a1, a1, a1, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, a1, a1, a2, a1, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_2, a1, a1, a2, a2, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, a1, a1, a1, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_1, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a1, a1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a1, a1, a2, bg1_3, bg1_3, e2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, bg1_1, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, bg1_1, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, a1, a1, a2, e2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_2, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a1, e2, e2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_1, bg1_2, bg1_2],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, e1, e1, e1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_2, bg1_2, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, e1, bg1_3, bg1_3, c1, c1, f1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, f2, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, f1, bg1_2, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, f2, f2, f1, bg1_3, bg1_2, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_1, bg1_1, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, f2, f1, f2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3],
[bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_1, bg1_1, f2, f1, f1, f2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_2, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, bg1_3, bg1_3, f2, f2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_1, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, c1, c1, c2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2],
[bg1_1, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, c1, c1, c2, bg1_3, bg1_2, bg1_2, bg1_3, bg1_2, bg1_1, bg1_1, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2],
[bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, c1, c1, c2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_3, bg1_3, bg1_1, bg1_1, bg1_2, bg1_2, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, b2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1],
[bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, b1, b1, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_1, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, b1, b1, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, b1, b1, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, bg1_1, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, b1, b1, b2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_1, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, b1, b1, b2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_3, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, b1, b1, b2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, b1, b1, b2, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, b1, b1, bg1_3, bg1_3, bg1_3],
[bg1_1, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, b1, bg1_3, bg1_3],
[bg1_3, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
]
OwelDo2 = [
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1],
[bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, a1, a2, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, a1, a2, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, a1, a1, a1, a2, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2],
[bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, a1, a1, a1, a2, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3],
[bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, a1, a1, a1, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3],
[bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, a1, a1, a1, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, a1, a1, a1, a1, a1, a2, bg2_3, a1, a2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, a1, a1, a1, a1, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, a1, a1, a1, a1, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, a1, a1, a1, a1, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, a1, a1, a1, a1, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1],
[bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, a1, a1, a2, a1, a1, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a1, a1, a2, a2, a1, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a1, a1, a1, a1, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a1, a1, a1, a1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a1, a1, a1, a1, a2, bg2_3, bg2_3, e2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, a1, a1, a1, a2, e2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, a1, e2, e2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, e1, e1, e1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, e1, bg2_3, bg2_1, c1, c1, f1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, f1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, f2, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, f2, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, f2, f2, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, f2, f1, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, f2, f2, f2, f1, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, f2, f1, f2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, f2, f1, f1, f2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, f2, bg2_3, bg2_1, f2, f2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, f2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, b2, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1],
[bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, b1, b1, b2, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, b1, b1, b2, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, b1, b1, b2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, b1, b1, b2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2],
[bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, b1, b1, b2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3],
[bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, b1, b1, b2, bg2_2, bg2_2, bg2_2, bg2_3],
[bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, b1, b1, b2, bg2_2, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, b1, b1, bg2_3, bg2_3, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, b1, bg2_2, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3],
]
OwelDo3 = [
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, a2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, a1, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, a1, a2, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, a1, a1, a1, a2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, a1, a1, a1, a2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a1, a1, a1, a1, a2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, a1, a1, a1, a1, a2, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a1, a1, a1, a1, a1, a2, bg3_2, a1, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, a1, a1, a1, a1, a1, a2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a1, a1, a1, a1, a1, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a1, a1, a1, a1, a1, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a1, a1, a1, a1, a1, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, a1, a1, a2, a1, a1, a1, a2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a1, a1, a2, a2, a1, a1, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, a1, a1, a1, a1, a1, a2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a1, a1, a1, a1, a2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, a1, a1, a1, a1, a2, bg3_1, bg3_2, e2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a1, a1, a1, a2, e2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, a1, e2, e2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, e1, e1, e1, c2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, e1, bg3_2, bg3_1, c1, c1, f1, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, f1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, f2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, f2, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, f2, f2, bg3_2, bg3_1, c1, c1, c2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, f2, f1, bg3_2, bg3_1, bg3_2, c1, c1, c2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, f2, f2, f2, f1, bg3_2, bg3_1, bg3_2, bg3_2, c1, c1, c2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, f2, f1, f2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, c1, c1, c2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, f2, f1, f1, f2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, c1, c1, c2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, f2, bg3_2, bg3_2, f2, f2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c1, c1, c2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, f2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, c1, c1, c2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, c1, c1, c2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, c1, c1, c2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, c1, c1, c2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c1, c1, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, c1, c1, c2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c1, c1, b2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, b1, b1, b2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, b1, b1, b2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, b1, b1, b2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, b1, b1, b2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, b1, b1, b2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, b1, b1, b2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, b1, b1, b2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, b1, b1, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, b1, bg3_2, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
]
HekGakGung1 = [
[bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3],
[bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3],
[bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3],
[bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, c1, c1, c1, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2],
[bg1_3, bg1_3, bg1_3, bg1_1, c2, c2, c2, c1, c1, c1, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, c1, f2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, f2, c1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_1, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_2, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_2, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f1, bg1_1, f1, g, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_1, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f1, f1, g, f2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, f1, f2, bg1_1, bg1_1, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, bg1_1, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f1, f1, f2, f2, f2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_2, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, f2, bg1_1, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_1, bg1_2, bg1_2],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, g, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_2, bg1_2, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, g, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, g, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, g, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, c2, c1, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, g, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, c2, c1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, g, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, c2, c1, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, g, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, c1, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_1, bg1_3, bg1_3, g, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_1, bg1_1, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c2, c2, c1, c1, f2, h, bg1_1, bg1_3, bg1_3, g, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3],
[bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, c2, f2, h, h, h, bg1_3, g, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, h, h, f2, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, f2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, g, c2, c1, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, g, bg1_3, bg1_3, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, g, bg1_3, bg1_3, bg1_3, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_1, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, g, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, a2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2],
[bg1_1, bg1_3, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, g, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, a2, bg1_2, bg1_1, bg1_1, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2],
[bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, g, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, a2, bg1_3, bg1_3, bg1_1, bg1_1, bg1_2, bg1_2, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, a2, g, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, a2, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1],
[bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, a2, a1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c1, c1, c1, c1, c1, f2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_1, a2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, c1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c1, c1, c1, c1, c1, c1, c1, c1, c2, c2, c2, c2, c2, c2, f2, c1, c1, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_3, bg1_3, bg1_3, bg1_1, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c2, c1, c1, c1, c1, c1, c2, c2, c2, c2, c2, c2, c2, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, c2, c1, c1, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_1, bg1_1, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, c2, c2, c2, c2, c2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, c2, c1, c1, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_1, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, c2, c1, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_3, bg1_1, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_1, bg1_2, bg1_2, bg1_3, bg1_3, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, c2, c1, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, c2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_1, bg1_2, bg1_2, bg1_2, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_2, bg1_3, bg1_2, bg1_2, bg1_2, bg1_2, bg1_2, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_1, bg1_1, bg1_1, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
[bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3, bg1_3],
]
HekGakGung2 = [
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1],
[bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, c1, c1, c1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_3, bg2_3, bg2_2, bg2_3, c2, c2, c2, c1, c1, c1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2],
[bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, c2, c1, c1, f2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3],
[bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, c2, f2, c1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3],
[bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, c2, c1, bg2_3, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, c2, c1, bg2_3, bg2_3, a2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, c2, c1, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, c2, c1, bg2_3, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, c2, c1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, c2, c1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1],
[bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, c2, c1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c2, c1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c2, c1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, f1, bg2_3, f1, g, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c2, c1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, f1, f1, g, f2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c2, c1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, f2, f1, f2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c2, c1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, f2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, f1, f1, f2, f2, f2, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c2, c1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, f2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, f2, f2, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c2, c1, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_3, bg2_3, g, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, c2, c1, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a2, bg2_3, g, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, c2, c1, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, g, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, c2, c1, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, g, bg2_3, a2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, c2, c1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, g, bg2_3, bg2_3, bg2_1, a2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, c2, c1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, g, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, a2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, c2, c1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, g, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, a2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, c2, c1, c1, c1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, g, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, a2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, c2, c2, c2, c1, c1, f2, h, bg2_3, bg2_1, bg2_3, g, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, a2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, c2, f2, h, h, h, bg2_1, g, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, a2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, h, h, f2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, a2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, f2, c1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, g, c2, c1, c1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, g, bg2_3, bg2_3, c2, c1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, g, bg2_3, bg2_3, bg2_3, c2, c1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, g, bg2_3, bg2_3, bg2_3, bg2_3, c2, c1, c1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, g, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c2, c1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3],
[bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, g, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c2, c1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, a2, bg2_3, bg2_1, bg2_1, bg2_1, bg2_1, bg2_1, bg2_3],
[bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, a2, g, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c2, c1, c1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1],
[bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, a2, a1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, c2, c1, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, c1, c1, c1, c1, c1, c1, c1, f2, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, a2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, c2, c1, c1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, c1, c1, c1, c1, c1, c1, c1, c1, c2, c2, c2, c2, c2, c2, f2, c1, c1, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, c2, c2, c1, c1, c1, c1, c1, c2, c2, c2, c2, c2, c2, c2, c2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, c2, c1, c1, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, c2, c2, c2, c2, c2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, c2, c1, c1, bg2_3, bg2_3, bg2_2],
[bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, c2, c1, bg2_2, bg2_3, bg2_3],
[bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_2, c2, c1, bg2_2, bg2_2, bg2_3],
[bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, c2, bg2_2, bg2_2, bg2_3, bg2_3],
[bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_2, bg2_3, bg2_3, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_2, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_2, bg2_2, bg2_3, bg2_3, bg2_3, bg2_2, bg2_2],
[bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_3, bg2_1, bg2_3, bg2_3, bg2_3, bg2_2, bg2_3, bg2_2, bg2_3, bg2_3],
]
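# The maps above are plain lists of tile rows and are consumed positionally,
# so every row must have the same width. A minimal sketch of a sanity check
# (the helper name `grid_size` is hypothetical, not part of this file):

```python
def grid_size(grid):
    """Return (rows, cols) of a rectangular tile map, or raise ValueError."""
    widths = {len(row) for row in grid}
    if len(widths) != 1:
        raise ValueError(f"rows differ in width: {sorted(widths)}")
    return len(grid), widths.pop()

# Example with a dummy tile standing in for bg2_3 etc.:
t = "bg2_3"
demo = [[t] * 50 for _ in range(46)]
assert grid_size(demo) == (46, 50)
```

# Running this once per map (e.g. grid_size(HekGakGung2)) catches a row of
# the wrong length before it surfaces as an index error at render time.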
HekGakGung3 = [
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, c1, c1, c1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, c2, c2, c2, c1, c1, c1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, c2, c1, c1, f2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c2, f2, c1, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, c2, c1, bg3_1, a2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c2, c1, bg3_2, bg3_2, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, c2, c1, bg3_1, bg3_1, bg3_1, a2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c2, c1, bg3_2, bg3_2, bg3_2, bg3_2, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c2, c1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c2, c1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, c2, c1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, a2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c2, c1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, c2, c1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, a2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, f1, bg3_2, f1, g, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c2, c1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, f1, f1, g, f2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, c2, c1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, a2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, f2, f1, f2, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, c2, c1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, f2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, f1, f1, f2, f2, f2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, c2, c1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, f2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, f2, f2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, c2, c1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, a2, bg3_2, bg3_2, bg3_2, g, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, c2, c1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, a2, bg3_2, g, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, c2, c1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, g, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, c2, c1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, g, bg3_2, a2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, c2, c1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, g, bg3_2, bg3_2, bg3_2, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, c2, c1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, g, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, a2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c2, c1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, g, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, c2, c1, c1, c1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, g, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, a2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, c2, c2, c2, c1, c1, f2, h, bg3_2, bg3_2, bg3_2, g, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, a2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, c2, f2, h, h, h, bg3_1, g, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, a2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, h, h, f2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, a2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, f2, c1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, a2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, g, c2, c1, c1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, g, bg3_1, bg3_1, c2, c1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, a2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, g, bg3_2, bg3_2, bg3_2, c2, c1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, g, bg3_1, bg3_1, bg3_1, bg3_1, c2, c1, c1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, a2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, g, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c2, c1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, g, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, c2, c1, bg3_1, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, a2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a2, g, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c2, c1, c1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a2, a1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c2, c1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c1, c1, c1, c1, c1, c1, c1, f2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, a2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c2, c1, c1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c1, c1, c1, c1, c1, c1, c1, c1, c2, c2, c2, c2, c2, c2, f2, c1, c1, bg3_2, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, c2, c2, c1, c1, c1, c1, c1, c2, c2, c2, c2, c2, c2, c2, c2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, c2, c1, c1, bg3_1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c2, c2, c2, c2, c2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, c2, c1, c1, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, c2, c1, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, c2, c1, bg3_2, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, c2, bg3_2, bg3_1, bg3_1, bg3_1],
[bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_1],
[bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_2, bg3_1, bg3_2, bg3_1, bg3_2, bg3_2],
[bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_1, bg3_2, bg3_1, bg3_1, bg3_1],
]
HekGakGung = [
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, c1, c1, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, c2, c2, c2, c1, c1, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, c2, c1, c1, f2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, c2, f2, c1, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, f1, bg, f1, g, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, f1, f1, g, f2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, f2, f1, f2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, f2, bg, bg, bg, bg, bg, bg, f1, f1, f2, f2, f2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, f2, bg, bg, bg, bg, bg, f2, f2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, g, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, g, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, g, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, g, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, g, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, g, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, g, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, c1, c1, bg, bg, bg, bg, bg, bg, bg, bg, g, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c2, c2, c1, c1, f2, h, bg, bg, bg, g, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, f2, h, h, h, bg, g, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, h, h, f2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, f2, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, g, c2, c1, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, g, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, g, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, g, bg, bg, bg, bg, c2, c1, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, g, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, g, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, g, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, a1, bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, c1, c1, c1, c1, c1, c1, c1, f2, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, a2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, c1, bg, bg, bg, bg, bg, c1, c1, c1, c1, c1, c1, c1, c1, c2, c2, c2, c2, c2, c2, f2, c1, c1, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c2, c1, c1, c1, c1, c1, c2, c2, c2, c2, c2, c2, c2, c2, bg, bg, bg, bg, bg, bg, c2, c1, c1, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c2, c2, c2, c2, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, c1, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, c1, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, c2, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
[bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg, bg],
]
y = randint(12501,13000)
seed(y)
if mw == "dangpa":
z = randint(13001,13500)
seed(z)
zq = randint(0,300)
if zq < 100:
pixels = DangPa1
p = "dangpa"
count_dangpa += 1
elif 100 <= zq < 200:
pixels = DangPa2
p = "dangpa"
count_dangpa += 1
else:
pixels = DangPa3
p = "dangpa"
count_dangpa += 1
elif mw == "jangchang":
aa = randint(12501,13000)
seed(aa)
zw = randint(0,300)
if zw < 100:
pixels = JangChang1
p = "jangchang"
count_jangchang += 1
elif 100 <= zw < 200:
pixels = JangChang2
p = "jangchang"
count_jangchang += 1
else:
pixels = JangChang3
p = "jangchang"
count_jangchang += 1
elif mw == "oweldo":
ab = randint(13001,13500)
seed(ab)
ze = randint(0,300)
if ze < 100:
pixels = OwelDo1
p = "oweldo"
count_oweldo += 1
elif 100 <= ze < 200:
pixels = OwelDo2
p = "oweldo"
count_oweldo += 1
else:
pixels = OwelDo3
p = "oweldo"
count_oweldo += 1
else:
ac = randint(13501,14000)
seed(ac)
zr = randint(0,300)
if zr < 100:
pixels = HekGakGung1
p = "hekgakgung"
count_hekgakgung += 1
elif 100 <= zr < 200:
pixels = HekGakGung2
p = "hekgakgung"
count_hekgakgung += 1
else:
pixels = HekGakGung3
p = "hekgakgung"
count_hekgakgung += 1
array = np.array(pixels, dtype=np.uint8)
new_image = Image.fromarray(array)
new_image = new_image.resize(dimensions, resample=0)
if p == "dangpa":
imgname = dirname + '/weapons/' + p + '_' + (str(count_dangpa)) + '.png'
elif p == "jangchang":
imgname = dirname + '/weapons/' + p + '_' + (str(count_jangchang)) + '.png'
elif p == "oweldo":
imgname = dirname + '/weapons/' + p + '_' + (str(count_oweldo)) + '.png'
else:
imgname = dirname + '/weapons/' + p + '_' + (str(count_hekgakgung)) + '.png'
new_image.save(imgname)
make_weapons("dangpa", 50, 511)
#make_weapons("jangchang", 500, 2456)
#make_weapons("oweldo", 1500, 52228)
#make_weapons("hekgakgung", 200, 3563)
# --- imx/img/images.py from beo-teis/pyIMX (BSD-3-Clause) ---
] | 7 | 2019-03-06T14:32:52.000Z | 2021-09-15T14:33:36.000Z | # Copyright (c) 2017-2018 Martin Olejar
#
# SPDX-License-Identifier: BSD-3-Clause
# The BSD-3-Clause license for this file can be found in the LICENSE file included with this distribution
# or at https://spdx.org/licenses/BSD-3-Clause.html#licenseText
from io import BytesIO, BufferedReader, SEEK_END, SEEK_CUR
from .misc import read_raw_data, read_raw_segment
from .header import Header, Header2
from .segments import SegTag, SegIVT2, SegBDT, SegAPP, SegDCD, SegCSF, SegIVT3a, SegIVT3b, SegBDS3a, SegBDS3b, \
SegBIC1
########################################################################################################################
# i.MX Image Public Methods
########################################################################################################################
def parse(stream, step=0x100, size=None):
""" Common parser for all versions of i.MX boot images
:param stream: stream buffer to image
:param step: Image searching step
:param size: parsing size
    :return: the parsed boot image object
"""
if isinstance(stream, (bytes, bytearray)):
stream = BytesIO(stream)
if not isinstance(stream, (BufferedReader, BytesIO)):
raise TypeError(" Not correct value type: \"{}\" !".format(type(stream)))
# calculate stream size
start_index = stream.tell()
last_index = stream.seek(0, SEEK_END)
stream.seek(start_index)
if size:
last_index = min(start_index + size, last_index)
while start_index < (last_index - Header.SIZE):
raw = read_raw_data(stream, Header.SIZE, no_seek=True)
if raw[0] == SegTag.IVT2 and ((raw[1] << 8) | raw[2]) == SegIVT2.SIZE and raw[3] in (0x40, 0x41, 0x42):
return BootImg2.parse(stream)
elif raw[0] == SegTag.IVT2 and ((raw[1] << 8) | raw[2]) == SegIVT3b.SIZE and raw[3] in (0x43,):
return BootImg3b.parse(stream)
elif raw[0] == SegTag.IVT3 and ((raw[1] << 8) | raw[2]) == SegIVT3a.SIZE and raw[3] in (0x43,):
return BootImg3a.parse(stream)
elif raw[3] == SegTag.BIC1:
return BootImg4.parse(stream)
else:
start_index = stream.seek(step, SEEK_CUR)
raise Exception(' Not an i.MX Boot Image !')
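The version sniffing above peeks at a four-byte header: a tag byte, a big-endian 16-bit length, and a version parameter. A standalone sketch of that decode follows; the bytes 0xD1/0x0020/0x41 are assumed example values for an IVT2-style header, not constants taken from this module.

```python
import struct

# Hypothetical IVT2-style header bytes: tag, 16-bit big-endian length, param.
raw = bytes([0xD1, 0x00, 0x20, 0x41])

tag = raw[0]
length = (raw[1] << 8) | raw[2]   # same big-endian decode as parse() above
param = raw[3]

# struct performs the identical decode in one call.
assert (tag, length, param) == struct.unpack('>BHB', raw)
```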
########################################################################################################################
# i.MX Boot Image Classes
########################################################################################################################
class EnumAppType:
SCFW = 1
M4_0 = 2
M4_1 = 3
APP = 4
A35 = 4
A53 = 4
A72 = 5
SCD = 6
class BootImgBase(object):
""" IMX Boot Image Base """
@property
def dcd(self):
return self._dcd
@dcd.setter
def dcd(self, value):
assert isinstance(value, SegDCD)
self._dcd = value
def __init__(self, address, offset):
""" Initialize boot image object
:param address: The start address of img in target memory
:param offset: The IVT offset
:return: BootImage object
"""
self.offset = offset
self.address = address
self._dcd = None
def info(self):
raise NotImplementedError()
def add_image(self, data, img_type, address):
raise NotImplementedError()
def export(self):
raise NotImplementedError()
@classmethod
def parse(cls, stream, step=0x100, size=None):
raise NotImplementedError()
########################################################################################################################
# Boot Image V1 Segments (i.MX5)
########################################################################################################################
# Obsolete, will not be implemented
########################################################################################################################
# Boot Image V2 (i.MX6, i.MX7)
########################################################################################################################
class BootImg2(BootImgBase):
""" IMX Boot Image v2 """
# The value of CSF segment size
CSF_SIZE = 0x2000
# The align value of APP segment
APP_ALIGN = 0x1000
# The value of img head size
# offset | size
HEAD_SIZE = {0x400: 0xC00,
0x100: 0x300}
@property
def version(self):
return self._ivt.version
@version.setter
def version(self, value):
self._ivt.version = value
@property
def plugin(self):
return self._plg
@plugin.setter
def plugin(self, value):
assert isinstance(value, bool)
self._plg = value
@property
def ivt(self):
return self._ivt
@ivt.setter
def ivt(self, value):
assert isinstance(value, SegIVT2)
self._ivt = value
@property
def bdt(self):
return self._bdt
@bdt.setter
def bdt(self, value):
assert isinstance(value, SegBDT)
self._bdt = value
@property
def app(self):
return self._app
@app.setter
def app(self, value):
assert isinstance(value, SegAPP)
self._app = value
@property
def csf(self):
return self._csf
@csf.setter
def csf(self, value):
assert isinstance(value, SegCSF)
self._csf = value
@property
def size(self):
        total = self.ivt.space
        total += self.bdt.space
        total += self.dcd.space
        total += self.app.space
        total += self.csf.space
        return total
def __init__(self, address=0, offset=0x400, version=0x41, plugin=False):
""" Initialize boot image object
:param address: The start address of img in target memory
:param offset: The IVT offset
:param version: The version of boot img format
:return: BootImage object
"""
super().__init__(address, offset)
self._ivt = SegIVT2(version)
self._bdt = SegBDT()
self._app = SegAPP()
self._dcd = SegDCD()
self._csf = SegCSF()
self._plg = plugin
def _update(self):
""" Update Image Object """
# Set zero padding for IVT and BDT sections
self.ivt.padding = 0
self.bdt.padding = 0
# Calculate padding for DCD, APP and CSF sections
tmp_val = self.ivt.space + self.bdt.space + self.dcd.size
head_size = 0xC00 if self.offset not in self.HEAD_SIZE else self.HEAD_SIZE[self.offset]
self.dcd.padding = head_size - tmp_val
tmp_val = self.app.size % self.APP_ALIGN
self.app.padding = self.APP_ALIGN - tmp_val if tmp_val > 0 else 0
# Set IVT section
self.ivt.ivt_address = self.address + self.offset
self.ivt.bdt_address = self.ivt.ivt_address + self.ivt.space
if self.dcd.enabled:
self.ivt.dcd_address = self.ivt.bdt_address + self.bdt.space
self.ivt.app_address = self.ivt.dcd_address + self.dcd.space
else:
self.ivt.dcd_address = 0
self.ivt.app_address = self.ivt.bdt_address + self.bdt.space
if self.csf.enabled:
self.ivt.csf_address = self.ivt.app_address + self.app.space
self.csf.padding = self.CSF_SIZE - self.csf.size
else:
self.ivt.csf_address = 0
# Set BDT section
self.bdt.start = self.ivt.ivt_address - self.offset
self.bdt.length = self.size + self.offset
self.bdt.plugin = 1 if self.plugin else 0
def info(self):
self._update()
# Print IVT
msg = "#" * 60 + "\n"
msg += "# IVT (Image Vector Table)\n"
msg += "#" * 60 + "\n\n"
msg += self.ivt.info()
# Print DBI
msg += "#" * 60 + "\n"
msg += "# BDI (Boot Data Info)\n"
msg += "#" * 60 + "\n\n"
msg += self.bdt.info()
# Print DCD
if self.dcd.enabled:
msg += "#" * 60 + "\n"
msg += "# DCD (Device Config Data)\n"
msg += "#" * 60 + "\n\n"
msg += self.dcd.info()
# Print CSF
if self.csf.enabled:
msg += "#" * 60 + "\n"
msg += "# CSF (Code Signing Data)\n"
msg += "#" * 60 + "\n\n"
msg += self.csf.info()
return msg
def add_image(self, data, img_type=EnumAppType.APP, address=0):
""" Add specific image into the main boot image
:param data: Raw data of img
:param img_type: Type of img
:param address: address in RAM
"""
if img_type == EnumAppType.APP:
self.app.data = data
if address != 0:
self.address = address
else:
raise Exception('Unknown data type !')
def export(self):
""" Export image as bytes array
:return: bytes
"""
self._update()
data = self.ivt.export(True)
data += self.bdt.export(True)
data += self.dcd.export(True)
data += self.app.export(True)
data += self.csf.export(True)
return data
@classmethod
def parse(cls, stream, step=0x100, size=None):
""" Parse image from stream buffer or bytes array
:param stream: The stream buffer or bytes array
:param step: Image searching step
:param size: parsing size
:return: BootImg2 object
"""
if isinstance(stream, (bytes, bytearray)):
stream = BytesIO(stream)
if not isinstance(stream, (BufferedReader, BytesIO)):
raise TypeError(" Not correct value type: \"{}\" !".format(type(stream)))
header = None
start_index = stream.tell()
last_index = stream.seek(0, SEEK_END)
stream.seek(start_index)
if size:
last_index = min(start_index + size, last_index)
imx_image = False
while start_index < (last_index - Header.SIZE):
header = Header.parse(read_raw_data(stream, Header.SIZE, no_seek=True))
            # All three header fields must match (mirrors the module-level parse()).
            if header.tag == SegTag.IVT2 and \
                    header.length == SegIVT2.SIZE and \
                    header.param in (0x40, 0x41, 0x42, 0x43):
imx_image = True
break
else:
start_index = stream.seek(step, SEEK_CUR)
if not imx_image:
raise Exception(' Not an i.MX Boot Image !')
obj = cls(version=header.param)
img_size = last_index - start_index
if start_index > 0:
obj.offset = start_index
# Parse IVT
obj.ivt = SegIVT2.parse(read_raw_segment(stream, SegTag.IVT2))
# Parse BDT
obj.bdt = SegBDT.parse(read_raw_data(stream, SegBDT.SIZE))
obj.offset = obj.ivt.ivt_address - obj.bdt.start
obj.address = obj.bdt.start
        obj.plugin = bool(obj.bdt.plugin)
# Parse DCD
if obj.ivt.dcd_address:
obj.dcd = SegDCD.parse(read_raw_segment(stream, SegTag.DCD))
obj.dcd.padding = (obj.ivt.app_address - obj.ivt.dcd_address) - obj.dcd.size
# Parse APP
app_start = start_index + (obj.ivt.app_address - obj.ivt.ivt_address)
app_size = obj.ivt.csf_address - obj.ivt.app_address if obj.ivt.csf_address else \
obj.bdt.length - (obj.bdt.start - obj.ivt.app_address)
app_size = img_size - app_start if app_size > (img_size - app_start) else app_size
obj.app.data = read_raw_data(stream, app_size, app_start)
obj.app.padding = 0
# Parse CSF
if obj.ivt.csf_address:
csf_start = start_index + (obj.ivt.csf_address - obj.ivt.ivt_address)
obj.csf = SegCSF.parse(read_raw_segment(stream, SegTag.CSF, csf_start))
# obj.csf.padding = csf_start + obj.csf.size
return obj
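`BootImg2._update` rounds the APP segment up to `APP_ALIGN` and pads the DCD out to a fixed head size. The padding arithmetic is self-contained enough to sketch on its own; this is a paraphrase of the expressions in `_update`, not an import from this module.

```python
APP_ALIGN = 0x1000

def align_padding(size, align=APP_ALIGN):
    """Bytes of padding needed to round `size` up to the next `align` boundary."""
    rem = size % align
    return align - rem if rem > 0 else 0

# A 0x1234-byte APP segment needs 0xDCC padding bytes; aligned sizes need none.
assert align_padding(0x1234) == 0xDCC
assert align_padding(0x2000) == 0
```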
########################################################################################################################
# Boot Image V2b (i.MX8M)
########################################################################################################################
class BootImg8m(BootImgBase):
""" IMX Boot Image """
# The value of CSF segment size
CSF_SIZE = 0x2000
# The align value of APP segment
APP_ALIGN = 0x1000
# The value of img head size
# offset | size
HEAD_SIZE = {0x400: 0xC00,
0x100: 0x300}
@property
def version(self):
return self._ivt.version
@version.setter
def version(self, value):
self._ivt.version = value
@property
def plugin(self):
return self._plg
@plugin.setter
def plugin(self, value):
assert isinstance(value, bool)
self._plg = value
@property
def ivt(self):
return self._ivt
@ivt.setter
def ivt(self, value):
assert isinstance(value, SegIVT2)
self._ivt = value
@property
def bdt(self):
return self._bdt
@bdt.setter
def bdt(self, value):
assert isinstance(value, SegBDT)
self._bdt = value
@property
def app(self):
return self._app
@app.setter
def app(self, value):
assert isinstance(value, SegAPP)
self._app = value
@property
def csf(self):
return self._csf
@csf.setter
def csf(self, value):
assert isinstance(value, SegCSF)
self._csf = value
@property
def size(self):
        total = self.ivt.space
        total += self.bdt.space
        total += self.dcd.space
        total += self.app.space
        total += self.csf.space
        return total
def __init__(self, address=0, offset=0x400, version=0x41, plugin=False):
""" Initialize boot image object
:param address: The start address of img in target memory
:param offset: The IVT offset
:param version: The version of boot img format
:return: BootImage object
"""
super().__init__(address, offset)
self._ivt = SegIVT2(version)
self._bdt = SegBDT()
self._app = SegAPP()
self._dcd = SegDCD()
self._csf = SegCSF()
self._plg = plugin
def _update(self):
# Set zero padding for IVT and BDT sections
self.ivt.padding = 0
self.bdt.padding = 0
# Calculate padding for DCD, APP and CSF sections
tmp_val = self.ivt.space + self.bdt.space + self.dcd.size
head_size = 0xC00 if self.offset not in self.HEAD_SIZE else self.HEAD_SIZE[self.offset]
self.dcd.padding = head_size - tmp_val
tmp_val = self.app.size % self.APP_ALIGN
self.app.padding = self.APP_ALIGN - tmp_val if tmp_val > 0 else 0
# Set IVT section
self.ivt.ivt_address = self.address + self.offset
self.ivt.bdt_address = self.ivt.ivt_address + self.ivt.space
if self.dcd.enabled:
self.ivt.dcd_address = self.ivt.bdt_address + self.bdt.space
self.ivt.app_address = self.ivt.dcd_address + self.dcd.space
else:
self.ivt.dcd_address = 0
self.ivt.app_address = self.ivt.bdt_address + self.bdt.space
if self.csf.enabled:
self.ivt.csf_address = self.ivt.app_address + self.app.space
self.csf.padding = self.CSF_SIZE - self.csf.size
else:
self.ivt.csf_address = 0
# Set BDT section
self.bdt.start = self.ivt.ivt_address - self.offset
self.bdt.length = self.size + self.offset
self.bdt.plugin = 1 if self.plugin else 0
def info(self):
self._update()
# Print IVT
msg = "#" * 60 + "\n"
msg += "# IVT (Image Vector Table)\n"
msg += "#" * 60 + "\n\n"
msg += self.ivt.info()
# Print DBI
msg += "#" * 60 + "\n"
msg += "# BDI (Boot Data Info)\n"
msg += "#" * 60 + "\n\n"
msg += self.bdt.info()
# Print DCD
if self.dcd.enabled:
msg += "#" * 60 + "\n"
msg += "# DCD (Device Config Data)\n"
msg += "#" * 60 + "\n\n"
msg += self.dcd.info()
# Print CSF
if self.csf.enabled:
msg += "#" * 60 + "\n"
msg += "# CSF (Code Signing Data)\n"
msg += "#" * 60 + "\n\n"
msg += self.csf.info()
return msg
def add_image(self, data, img_type=EnumAppType.APP, address=0):
""" Add specific image into the main boot image
:param data: Raw data of img
:param img_type: Type of img
:param address: address in RAM
:return:
"""
if img_type == EnumAppType.APP:
self.app.data = data
if address != 0:
self.address = address
else:
raise Exception('Unknown data type !')
def export(self):
""" Export Image as bytes array
:return: bytes
"""
self._update()
data = self.ivt.export(True)
data += self.bdt.export(True)
data += self.dcd.export(True)
data += self.app.export(True)
data += self.csf.export(True)
return data
@classmethod
def parse(cls, stream, step=0x100, size=None):
""" Parse image from stream buffer or bytes array
:param stream: The stream buffer or bytes array
:param step: Image searching step
:param size: parsing size
        :return: BootImg8m object
"""
if isinstance(stream, (bytes, bytearray)):
stream = BytesIO(stream)
if not isinstance(stream, (BufferedReader, BytesIO)):
raise TypeError(" Not correct value type: \"{}\" !".format(type(stream)))
header = None
start_index = stream.tell()
last_index = stream.seek(0, SEEK_END)
stream.seek(start_index)
if size:
last_index = min(start_index + size, last_index)
imx_image = False
while start_index < (last_index - Header.SIZE):
header = Header.parse(read_raw_data(stream, Header.SIZE, no_seek=True))
            # All three header fields must match (mirrors the module-level parse()).
            if header.tag == SegTag.IVT2 and \
                    header.length == SegIVT2.SIZE and \
                    header.param in (0x40, 0x41, 0x42, 0x43):
imx_image = True
break
else:
start_index = stream.seek(step, SEEK_CUR)
if not imx_image:
raise Exception('Not an i.MX Boot Image!')
obj = cls(version=header.param)
img_size = last_index - start_index
if start_index > 0:
obj.offset = start_index
# Parse IVT
obj.ivt = SegIVT2.parse(read_raw_segment(stream, SegTag.IVT2))
# Parse BDT
obj.bdt = SegBDT.parse(read_raw_data(stream, SegBDT.SIZE))
obj.offset = obj.ivt.ivt_address - obj.bdt.start
obj.address = obj.bdt.start
obj.plugin = bool(obj.bdt.plugin)
# Parse DCD
if obj.ivt.dcd_address:
obj.dcd = SegDCD.parse(read_raw_segment(stream, SegTag.DCD))
obj.dcd.padding = (obj.ivt.app_address - obj.ivt.dcd_address) - obj.dcd.size
# Parse APP
app_start = start_index + (obj.ivt.app_address - obj.ivt.ivt_address)
app_size = obj.ivt.csf_address - obj.ivt.app_address if obj.ivt.csf_address else \
obj.bdt.length - (obj.bdt.start - obj.ivt.app_address)
app_size = img_size - app_start if app_size > (img_size - app_start) else app_size
obj.app.data = read_raw_data(stream, app_size, app_start)
obj.app.padding = 0
# Parse CSF
#if obj.ivt.csf_address:
# obj.csf = SegCSF.parse(buffer)
# obj.csf.padding = obj.bdt.length - ((obj.ivt.csf_address - obj.ivt.ivt_address) + obj.csf.size)
return obj
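The parser above locates the IVT by scanning the stream at a fixed step until a plausible header appears. A standalone sketch of that search pattern (the function name and the one-byte marker are illustrative stand-ins, not the real `Header` layout):

```python
def find_header(buf, marker, step=0x100):
    """Return the first step-aligned offset where `buf` starts with `marker`,
    or -1 when no candidate is found (illustrative stand-in for Header.parse)."""
    for offset in range(0, max(len(buf) - len(marker), 0) + 1, step):
        if buf[offset:offset + len(marker)] == marker:
            return offset
    return -1

# A marker placed one step into the buffer is found at offset 0x100.
blob = bytes(0x100) + b"\xd1\x00\x20\x40" + bytes(0x40)
print(hex(find_header(blob, b"\xd1")))  # -> 0x100
```

The real parser additionally cross-checks the header length and version parameter before accepting a candidate, since a single tag byte can occur by chance in image payloads.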
########################################################################################################################
# Boot Image V3a: i.MX8QXP-A0
########################################################################################################################
class BootImg3a(BootImgBase):
""" i.MX Boot Image v3a """
IMG_TYPE_CSF = 0x01
IMG_TYPE_SCD = 0x02
IMG_TYPE_EXEC = 0x03
IMG_TYPE_DATA = 0x04
SCFW_FLAGS_APP = 0x01355FC4
SCFW_FLAGS_M4_0 = 0x4a5162
SCFW_FLAGS_M4_1 = 0x4f52a3
SCFW_FLAGS_SCFW = 0x1
INITIAL_LOAD_ADDR_SCU_ROM = 0x2000e000
INITIAL_LOAD_ADDR_AP_ROM = 0x00110000
INITIAL_LOAD_ADDR_FLEXSPI = 0x08000000
# The value of CSF segment size
CSF_SIZE = 0x2000
# The align value of APP segment
IMG_AUTO_ALIGN = 0x10
SECTOR_SIZE = 0x200
APP_ALIGN = 0x1200
# The value of img head size
# offset | size
HEAD_SIZE = {0x400: 0xC400,
0x1000: 0x1400}
PADDING_VAL = 0x00
COUNT_OF_CONTAINERS = 2
@property
def plg(self):
return self._plg
@plg.setter
def plg(self, value):
assert isinstance(value, bool)
self._plg = value
@property
def ivt(self):
return self._ivt
@ivt.setter
def ivt(self, value):
assert isinstance(value, list) and isinstance(value[0], SegIVT3a)
self._ivt = value
@property
def bdt(self):
return self._bdt
@bdt.setter
def bdt(self, value):
assert isinstance(value, list) and isinstance(value[0], SegBDS3a)
self._bdt = value
@property
def app(self):
return self._app
@app.setter
def app(self, value):
self._app = value
@property
def csf(self):
return self._csf
@csf.setter
def csf(self, value):
assert isinstance(value, SegCSF)
self._csf = value
def __init__(self, address=0, offset=0x400, version=0x43):
""" Initialize boot image object
:param address: The start address of img in target memory
:param offset: The IVT offset
:param version: The version of boot img format
:return: BootImage object
"""
super().__init__(address, offset)
self._ivt = [SegIVT3a(version), SegIVT3a(version)]
self._ivt[0].next = self._ivt[0].size
self._ivt[0].version = 0x01
self._ivt[1].version = 0x01
self._bdt = [SegBDS3a(), SegBDS3a()]
self._app = [[SegAPP() for _ in range(SegBDS3a.IMAGES_MAX_COUNT)],
[SegAPP() for _ in range(SegBDS3a.IMAGES_MAX_COUNT)]]
self._dcd = SegDCD()
self._csf = SegCSF()
self._plg = False
if not isinstance(self.address, list):
self.address = [self.INITIAL_LOAD_ADDR_SCU_ROM, self.INITIAL_LOAD_ADDR_AP_ROM]
self._sdc_address = 0
@staticmethod
def _compute_padding(size, sector_size):
return ((size // sector_size + (size % sector_size > 0)) * sector_size) - size
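`_compute_padding` returns how many bytes are needed to round a size up to the next sector boundary, and zero when the size is already aligned. A minimal standalone check of that ceiling arithmetic (a copy of the method above, shown with sample values):

```python
def compute_padding(size, sector_size):
    # Same ceiling-division arithmetic as BootImg3a._compute_padding.
    return ((size // sector_size + (size % sector_size > 0)) * sector_size) - size

print(compute_padding(0x201, 0x200))  # one byte over a sector -> 0x1FF bytes of padding
print(compute_padding(0x400, 0x200))  # already aligned -> 0
```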
def _update(self):
# Set zero padding for IVT and BDT sections
for container in range(self.COUNT_OF_CONTAINERS):
self.ivt[container].padding = 0
self.bdt[container].padding = 0
# Set IVT section
self.ivt[container].ivt_address = self.address[container] + self.offset + \
container * self.ivt[container].size
self.ivt[container].bdt_address = self.ivt[container].ivt_address + \
self.ivt[container].space * (self.COUNT_OF_CONTAINERS - container) + \
container * self.bdt[container].size
if container == 0:
if self.dcd.enabled:
self.ivt[container].dcd_address = self.ivt[container].bdt_address + self.bdt[container].space * 2
if self.csf.enabled:
self.ivt[container].csf_address = self.ivt[container].dcd_address + self.dcd.space
else:
self.ivt[container].csf_address = 0
else:
self.ivt[container].dcd_address = 0
if self.csf.enabled:
self.ivt[container].csf_address = self.ivt[container].bdt_address + \
self.bdt[container].space * 2
else:
self.ivt[container].csf_address = 0
else:
self.ivt[container].dcd_address = 0
self.ivt[container].csf_address = 0
self.app[container][0].padding = self._compute_padding(self.bdt[container].images[0].image_size,
self.SECTOR_SIZE)
if self.bdt[container].images_count != 0:
self.bdt[container].boot_data_size = self.bdt[container].size
if container == 0:
self.bdt[container].images[0].image_source = self.APP_ALIGN
else:
last_image_index = self.bdt[container - 1].images_count - 1
last_image_address = self.bdt[container - 1].images[last_image_index].image_source
self.bdt[container].images[0].image_source = last_image_address + \
self.app[container - 1][last_image_index].space
for i in range(self.bdt[container].images_count - 1):
self.bdt[container].images[i + 1].image_source = self.bdt[container].images[i].image_source + \
self.app[container][i].space
self.app[container][i + 1].padding = self._compute_padding(self.bdt[container].images[i + 1].image_size,
self.SECTOR_SIZE)
if container == self.COUNT_OF_CONTAINERS - 1:
self.app[container][self.bdt[container].images_count - 1].padding = 0
def info(self):
self._update()
# Print IVT
msg = "#" * 60 + "\n"
msg += "# IVT (Image Vector Table)\n"
msg += "#" * 60 + "\n\n"
for index, ivt in enumerate(self.ivt):
msg += "-" * 60 + "\n"
msg += "- IVT[{}]\n".format(index)
msg += "-" * 60 + "\n\n"
msg += ivt.info()
# Print BDI
msg += "#" * 60 + "\n"
msg += "# BDI (Boot Data Info)\n"
msg += "#" * 60 + "\n\n"
for index, bdi in enumerate(self.bdt):
msg += "-" * 60 + "\n"
msg += "- BDI[{}]\n".format(index)
msg += "-" * 60 + "\n\n"
msg += bdi.info()
# Print DCD
if self.dcd.enabled:
msg += "#" * 60 + "\n"
msg += "# DCD (Device Config Data)\n"
msg += "#" * 60 + "\n\n"
msg += self.dcd.info()
# Print CSF
if self.csf.enabled:
msg += "#" * 60 + "\n"
msg += "# CSF (Code Signing Data)\n"
msg += "#" * 60 + "\n\n"
msg += self.csf.info()
return msg
def add_image(self, data, img_type=EnumAppType.APP, address=0):
""" Add specific image into the main boot image
:param data: Raw data of image
:param img_type: Type of image
:param address: address in RAM
"""
if img_type == EnumAppType.A35:
image_index = self.bdt[1].images_count
self.bdt[1].images[image_index].image_destination = address
self.bdt[1].images[image_index].image_entry = address
self.bdt[1].images[image_index].image_size = len(data)
self.bdt[1].images[image_index].rom_flags = 0
self.bdt[1].images[image_index].hab_flags = self.IMG_TYPE_EXEC
self.bdt[1].images[image_index].scfw_flags = self.SCFW_FLAGS_APP
self.bdt[1].images_count += 1
self.app[1][image_index].data = data
self.app[1][image_index].padding = self._compute_padding(len(data), self.SECTOR_SIZE)
elif img_type == EnumAppType.M4_0 or img_type == EnumAppType.M4_1:
image_index = self.bdt[0].images_count
self.bdt[0].images[image_index].image_destination = address
self.bdt[0].images[image_index].image_entry = address
self.bdt[0].images[image_index].image_size = len(data)
self.bdt[0].images[image_index].rom_flags = 0
self.bdt[0].images[image_index].hab_flags = self.IMG_TYPE_EXEC
self.bdt[0].images[image_index].scfw_flags = self.SCFW_FLAGS_M4_0 if img_type == EnumAppType.M4_0 else \
self.SCFW_FLAGS_M4_1
self.bdt[0].images_count += 1
self.app[0][image_index].data = data
self.app[0][image_index].padding = self._compute_padding(len(data), self.SECTOR_SIZE)
elif img_type == EnumAppType.SCFW:
image_index = self.bdt[0].images_count
self.bdt[0].images[image_index].image_destination = 0x1ffe0000
self.bdt[0].images[image_index].image_entry = 0x1ffe0000
self.bdt[0].images[image_index].image_size = len(data)
self.bdt[0].images[image_index].rom_flags = 0
self.bdt[0].images[image_index].hab_flags = self.IMG_TYPE_EXEC
self.bdt[0].images[image_index].scfw_flags = self.SCFW_FLAGS_SCFW
self.bdt[0].images_count += 1
self.app[0][image_index].data = data
self.app[0][image_index].padding = self._compute_padding(len(data), self.SECTOR_SIZE)
self._sdc_address = self.bdt[0].images[image_index].image_destination + len(data) + \
self._compute_padding(len(data), self.IMG_AUTO_ALIGN)
elif img_type == EnumAppType.SCD:
if self._sdc_address == 0:
raise Exception('SCFW has to be defined before SCD!')
image_index = self.bdt[0].images_count
self.bdt[0].images[image_index].image_destination = self._sdc_address
self.bdt[0].images[image_index].image_entry = 0
self.bdt[0].images[image_index].image_size = len(data)
self.bdt[0].images[image_index].rom_flags = 0
self.bdt[0].images[image_index].hab_flags = self.IMG_TYPE_SCD
self.bdt[0].images[image_index].scfw_flags = 0x1
self.bdt[0].images_count += 1
self._app[0][image_index].data = data
self._app[0][image_index].padding = self._compute_padding(len(data), self.SECTOR_SIZE)
else:
raise Exception('Unknown image type!')
def export(self):
""" Export Image as binary blob
:return: bytes
"""
self._update()
data = bytes()
data += self.ivt[0].export(True)
data += self.ivt[1].export(True)
data += self.bdt[0].export(True)
data += self.bdt[1].export(True)
data += self.dcd.export(True)
data += self.csf.export(True)
data += bytes([self.PADDING_VAL] * self._compute_padding(len(data), self.APP_ALIGN - self.offset))
for container in range(self.COUNT_OF_CONTAINERS):
for image in range(self.bdt[container].images_count):
data += self.app[container][image].export(True)
return data
@classmethod
def parse(cls, stream, step=0x100, size=None):
""" Parse image from stream buffer or bytes array
:param stream: The stream buffer or bytes array
:param step: Image searching step
:param size: parsing size
:return: BootImg3a object
"""
if isinstance(stream, (bytes, bytearray)):
stream = BytesIO(stream)
if not isinstance(stream, (BufferedReader, BytesIO)):
raise TypeError(" Not correct value type: \"{}\" !".format(type(stream)))
header = None
start_index = stream.tell()
last_index = stream.seek(0, SEEK_END)
stream.seek(start_index)
if size:
last_index = min(start_index + size, last_index)
imx_image = False
while start_index < (last_index - Header.SIZE):
header = Header.parse(read_raw_data(stream, Header.SIZE, no_seek=True))
if header.tag == SegTag.IVT3 or header.length == SegIVT3a.SIZE or \
header.param in (0x43,):
imx_image = True
break
else:
start_index = stream.seek(step, SEEK_CUR)
if not imx_image:
raise Exception('Not an i.MX Boot Image!')
obj = cls(version=header.param)
img_size = last_index - start_index
if start_index > 0:
obj.offset = start_index
# Parse IVT
obj.ivt[0] = SegIVT3a.parse(read_raw_segment(stream, SegTag.IVT3))
obj.ivt[1] = SegIVT3a.parse(read_raw_segment(stream, SegTag.IVT3))
# Parse BDT
obj.bdt[0] = SegBDS3a.parse(read_raw_data(stream, SegBDS3a.SIZE))
obj.bdt[1] = SegBDS3a.parse(read_raw_data(stream, SegBDS3a.SIZE))
# Parse DCD
if obj.ivt[0].dcd_address:
stream.seek(start_index + (obj.ivt[0].dcd_address - obj.ivt[0].ivt_address), 0)
obj.dcd = SegDCD.parse(read_raw_segment(stream, SegTag.DCD))
# Parse CSF
if obj.ivt[0].csf_address:
stream.seek(start_index + (obj.ivt[0].csf_address - obj.ivt[0].ivt_address), 0)
obj.csf = SegCSF.parse(read_raw_segment(stream, SegTag.CSF))
# Parse IMAGES
for container in range(obj.COUNT_OF_CONTAINERS):
for i in range(obj.bdt[container].images_count):
stream.seek(obj.bdt[container].images[i].image_source - obj.offset, 0)
obj.app[container][i].data = read_raw_data(stream, obj.bdt[container].images[i].image_size)
return obj
########################################################################################################################
# Boot Image V3b: i.MX8QM-A0
########################################################################################################################
class BootImg3b(BootImgBase):
""" IMX Boot Image v3b """
IMG_TYPE_CSF = 0x01
IMG_TYPE_SCD = 0x02
IMG_TYPE_EXEC = 0x03
IMG_TYPE_DATA = 0x04
SCFW_FLAGS_A53 = 0x1354014
SCFW_FLAGS_A72 = 0x1354065
SCFW_FLAGS_M4_0 = 0x4a5162
SCFW_FLAGS_M4_1 = 0x4f52a3
SCFW_FLAGS_SCFW = 0x1
INITIAL_LOAD_ADDR_SCU_ROM = 0x2000e000
INITIAL_LOAD_ADDR_AP_ROM = 0x00110000
INITIAL_LOAD_ADDR_FLEXSPI = 0x08000000
# The value of CSF segment size
CSF_SIZE = 0x2000
# The align value for img
IMG_AUTO_ALIGN = 0x10
# The align value for sector
SECTOR_SIZE = 0x200
# The align value of APP segment
APP_ALIGN = 0x1200
PADDING_VAL = 0x00
# The value of img head size
# offset | size
HEAD_SIZE = {0x400: 0xC400,
0x1000: 0x1400}
COUNT_OF_CONTAINERS = 2
@property
def plg(self):
return self._plg
@plg.setter
def plg(self, value):
assert isinstance(value, bool)
self._plg = value
@property
def ivt(self):
return self._ivt
@ivt.setter
def ivt(self, value):
assert isinstance(value, list)
assert len(value) == self.COUNT_OF_CONTAINERS
assert isinstance(value[0], SegIVT3b)
self._ivt = value
@property
def bdt(self):
return self._bdt
@bdt.setter
def bdt(self, value):
assert isinstance(value, list)
assert len(value) == self.COUNT_OF_CONTAINERS
assert isinstance(value[0], SegBDS3b)
self._bdt = value
@property
def app(self):
return self._app
@app.setter
def app(self, value):
self._app = value
@property
def scd(self):
return self._scd
@scd.setter
def scd(self, value):
self._scd = value
@property
def csf(self):
return self._csf
@csf.setter
def csf(self, value):
assert isinstance(value, SegCSF)
self._csf = value
def __init__(self, address=0, offset=0x400, version=0x43):
""" Initialize boot image object
:param address: The start address of img in target memory
:param offset: The IVT offset
:param version: The version of boot img format
:return: BootImage object
"""
super().__init__(address, offset)
self._ivt = [SegIVT3b(version), SegIVT3b(version)]
self._bdt = [SegBDS3b(), SegBDS3b()]
self._app = [[SegAPP() for _ in range(SegBDS3b.IMAGES_MAX_COUNT)],
[SegAPP() for _ in range(SegBDS3b.IMAGES_MAX_COUNT)]]
self._dcd = SegDCD()
self._scd = SegAPP()
self._csf = SegCSF()
self._plg = False
self._scd_address = 0
if not isinstance(self.address, list):
self.address = [self.INITIAL_LOAD_ADDR_SCU_ROM, self.INITIAL_LOAD_ADDR_AP_ROM]
@staticmethod
def _compute_padding(image_size, sector_size):
return ((image_size // sector_size + (image_size % sector_size > 0)) * sector_size) - image_size
def _update(self):
# Set zero padding for IVT and BDT sections
for container in range(self.COUNT_OF_CONTAINERS):
self.ivt[container].padding = 0
self.bdt[container].padding = 0
# Set IVT section
self.ivt[container].ivt_address = self.address[container] + self.offset + \
container * self.ivt[container].size
self.ivt[container].bdt_address = self.ivt[container].ivt_address + \
self.ivt[container].space * (2 - container) + \
container * self.bdt[container].size
if container == 0:
if self.dcd.enabled:
self.ivt[container].dcd_address = self.ivt[container].bdt_address + self.bdt[container].space * 2
if self.csf.enabled:
self.ivt[container].csf_address = self.ivt[container].dcd_address + self.dcd.space
else:
self.ivt[container].csf_address = 0
else:
self.ivt[container].dcd_address = 0
if self.csf.enabled:
self.ivt[container].csf_address = self.ivt[container].bdt_address + \
self.bdt[container].space * 2
else:
self.ivt[container].csf_address = 0
else:
self.ivt[container].dcd_address = 0
self.ivt[container].csf_address = 0
self.app[container][0].padding = self._compute_padding(self.bdt[container].images[0].image_size,
self.SECTOR_SIZE)
if self.bdt[container].images_count != 0:
self.bdt[container].boot_data_size = self.bdt[container].size
if container == 0:
self.bdt[container].images[0].image_source = self.APP_ALIGN
else:
last_image_index = self.bdt[container - 1].images_count - 1
last_image_address = self.bdt[container - 1].images[last_image_index].image_source
self.bdt[container].images[0].image_source = last_image_address + \
self.app[container - 1][last_image_index].space
next_image_address = 0
for i in range(self.bdt[container].images_count - 1):
self.bdt[container].images[i + 1].image_source = self.bdt[container].images[i].image_source + \
self.app[container][i].space
self.app[container][i + 1].padding = self._compute_padding(
self.bdt[container].images[i + 1].image_size, self.SECTOR_SIZE)
next_image_address = self.bdt[container].images[i + 1].image_source + self.app[container][i + 1].space
if container == 0:
if self.bdt[container].scd.image_destination != 0:
self.bdt[container].scd.image_source = next_image_address
self.scd.padding = self._compute_padding(self.bdt[0].scd.image_size, self.SECTOR_SIZE)
next_image_address += self.scd.space
# Set BDT section
if self.csf.enabled:
self.bdt[container].csf.image_source = next_image_address
self.csf.padding = self._compute_padding(self.bdt[0].csf.image_size, self.SECTOR_SIZE)
next_image_address += self.csf.space
def info(self):
self._update()
# Print IVT
msg = "#" * 60 + "\n"
msg += "# IVT (Image Vector Table)\n"
msg += "#" * 60 + "\n\n"
for index, ivt in enumerate(self.ivt):
msg += "-" * 60 + "\n"
msg += "- IVT[{}]\n".format(index)
msg += "-" * 60 + "\n\n"
msg += ivt.info()
# Print BDI
msg += "#" * 60 + "\n"
msg += "# BDI (Boot Data Info)\n"
msg += "#" * 60 + "\n\n"
for index, bdi in enumerate(self.bdt):
msg += "-" * 60 + "\n"
msg += "- BDI[{}]\n".format(index)
msg += "-" * 60 + "\n\n"
msg += bdi.info()
# Print DCD
if self.dcd.enabled:
msg += "#" * 60 + "\n"
msg += "# DCD (Device Config Data)\n"
msg += "#" * 60 + "\n\n"
msg += self.dcd.info()
# Print CSF
if self.csf.enabled:
msg += "#" * 60 + "\n"
msg += "# CSF (Code Signing Data)\n"
msg += "#" * 60 + "\n\n"
msg += self.csf.info()
return msg
def add_image(self, data, img_type=EnumAppType.APP, address=0):
""" Add specific image into the main boot image
:param data: Raw data of image
:param img_type: Type of image
:param address: address in RAM
"""
if img_type == EnumAppType.A53 or img_type == EnumAppType.A72:
image_index = self.bdt[1].images_count
self.app[1][image_index].data = data
self.bdt[1].images[image_index].image_destination = address
self.bdt[1].images[image_index].image_entry = address
self.bdt[1].images[image_index].image_size = len(data)
if img_type == EnumAppType.A53:
self.bdt[1].images[image_index].flags = self.SCFW_FLAGS_A53
elif img_type == EnumAppType.A72:
self.bdt[1].images[image_index].flags = self.SCFW_FLAGS_A72
self.app[1][image_index].padding = self._compute_padding(len(data), self.SECTOR_SIZE)
self.bdt[1].images_count += 1
elif img_type == EnumAppType.M4_0 or img_type == EnumAppType.M4_1:
image_index = self.bdt[0].images_count
self.app[0][image_index].data = data
self.bdt[0].images[image_index].image_destination = address
self.bdt[0].images[image_index].image_entry = address
self.bdt[0].images[image_index].image_size = len(data)
if img_type == EnumAppType.M4_0:
self.bdt[0].images[image_index].flags = self.SCFW_FLAGS_M4_0
elif img_type == EnumAppType.M4_1:
self.bdt[0].images[image_index].flags = self.SCFW_FLAGS_M4_1
self.app[0][image_index].padding = self._compute_padding(len(data), self.SECTOR_SIZE)
self.bdt[0].images_count += 1
elif img_type == EnumAppType.SCFW:
image_index = self.bdt[0].images_count
self.bdt[0].images[image_index].image_destination = 0x30fe0000
self.bdt[0].images[image_index].image_entry = 0x1ffe0000
self.bdt[0].images[image_index].image_size = len(data)
self.bdt[0].images[image_index].flags = self.SCFW_FLAGS_SCFW
self._scd_address = self.bdt[0].images[image_index].image_destination + len(data) + \
self._compute_padding(len(data), self.IMG_AUTO_ALIGN)
self.bdt[0].images_count += 1
self.app[0][image_index].data = data
self.app[0][image_index].padding = self._compute_padding(len(data), self.SECTOR_SIZE)
elif img_type == EnumAppType.SCD:
if self._scd_address == 0:
raise Exception('SCFW has to be defined before SCD!')
self.scd.data = data
self.scd.padding = self._compute_padding(len(data), self.SECTOR_SIZE)
self.bdt[0].scd.image_destination = self._scd_address
self.bdt[0].scd.image_entry = 0
self.bdt[0].scd.image_size = len(data)
self.ivt[0].scd_address = self.bdt[0].scd.image_destination
else:
raise Exception('Unknown image type!')
def export(self):
self._update()
# data = bytearray(self._offset)
data = bytes()
data += self.ivt[0].export(True)
data += self.ivt[1].export(True)
data += self.bdt[0].export(True)
data += self.bdt[1].export(True)
data += self.dcd.export(True)
data += bytes([self.PADDING_VAL] * self._compute_padding(len(data), self.APP_ALIGN - self.offset))
for container in range(self.COUNT_OF_CONTAINERS):
for i in range(self.bdt[container].images_count):
data += self.app[container][i].export(True)
if self.bdt[0].scd.image_source != 0:
data += self.scd.export(True)
if self.bdt[0].csf.image_source != 0:
data += self.csf.export(True)
return data
@classmethod
def parse(cls, stream, step=0x100, size=None):
""" Parse image from stream buffer or bytes array
:param stream: The stream buffer or bytes array
:param step: Image searching step
:param size: parsing size
:return: BootImg3b object
"""
if isinstance(stream, (bytes, bytearray)):
stream = BytesIO(stream)
if not isinstance(stream, (BufferedReader, BytesIO)):
raise TypeError(" Not correct value type: \"{}\" !".format(type(stream)))
header = None
start_index = stream.tell()
last_index = stream.seek(0, SEEK_END)
stream.seek(start_index)
if size:
last_index = min(start_index + size, last_index)
imx_image = False
while start_index < (last_index - Header.SIZE):
header = Header.parse(read_raw_data(stream, Header.SIZE, no_seek=True))
if header.tag == SegTag.IVT2 or header.length == SegIVT3b.SIZE or \
header.param in (0x43,):
imx_image = True
break
else:
start_index = stream.seek(step, SEEK_CUR)
if not imx_image:
raise Exception('Not an i.MX Boot Image!')
obj = cls(version=header.param)
img_size = last_index - start_index
if start_index > 0:
obj.offset = start_index
# Parse IVT
obj.ivt[0] = SegIVT3b.parse(read_raw_segment(stream, SegTag.IVT2))
obj.ivt[1] = SegIVT3b.parse(read_raw_segment(stream, SegTag.IVT2))
# Parse BDT
obj.bdt[0] = SegBDS3b.parse(read_raw_data(stream, SegBDS3b.SIZE))
obj.bdt[1] = SegBDS3b.parse(read_raw_data(stream, SegBDS3b.SIZE))
# Parse DCD
if obj.ivt[0].dcd_address:
stream.seek(start_index + (obj.ivt[0].dcd_address - obj.ivt[0].ivt_address), 0)
obj.dcd = SegDCD.parse(read_raw_segment(stream, SegTag.DCD))
# Parse IMAGES
for container in range(obj.COUNT_OF_CONTAINERS):
for i in range(obj.bdt[container].images_count):
stream.seek(obj.bdt[container].images[i].image_source - obj.offset, 0)
obj.app[container][i].data = read_raw_data(stream, obj.bdt[container].images[i].image_size)
# Parse SCD
if obj.bdt[0].scd.image_source != 0:
stream.seek(obj.bdt[0].scd.image_source - obj.offset, 0)
obj.scd.data = read_raw_data(stream, obj.bdt[0].scd.image_size)
# Parse CSF
if obj.bdt[0].csf.image_source != 0:
stream.seek(obj.bdt[0].csf.image_source - obj.offset, 0)
obj.csf = SegCSF.parse(read_raw_segment(stream, SegTag.CSF))
return obj
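The image-extraction loops above seek to each BDT entry's `image_source` minus the boot image offset and then read `image_size` bytes. A standalone sketch of that step with an in-memory stream (the entry tuples are illustrative stand-ins for the BDT image records):

```python
from io import BytesIO

def read_images(stream, entries, boot_offset):
    # entries: iterable of (image_source, image_size) pairs;
    # image_source is flash-absolute, hence the offset subtraction.
    blobs = []
    for source, size in entries:
        stream.seek(source - boot_offset, 0)
        blobs.append(stream.read(size))
    return blobs

stream = BytesIO(b"HEADERAPP1APP2")
print(read_images(stream, [(0x406, 4), (0x40A, 4)], 0x400))  # -> [b'APP1', b'APP2']
```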
########################################################################################################################
# Boot Image V4: i.MX8DM, i.MX8QM_B0, i.MX8QXP_B0
########################################################################################################################
class BootImg4(BootImgBase):
""" i.MX Boot Image v4 """
def __init__(self, address=0, offset=0x400):
""" Initialize boot image object
:param address: The start address of image in target memory
:param offset: The image offset
:return: BootImage object
"""
super().__init__(address, offset)
self._dcd = SegDCD()
self._cont1_header = SegBIC1()
self._cont2_header = SegBIC1()
self._cont1_data = []
self._cont2_data = []
def _update(self):
pass
def info(self):
self._update()
msg = ""
msg += "#" * 60 + "\n"
msg += "# Boot Images Container 1\n"
msg += "#" * 60 + "\n\n"
msg += self._cont1_header.info()
msg += "#" * 60 + "\n"
msg += "# Boot Images Container 2\n"
msg += "#" * 60 + "\n\n"
msg += self._cont2_header.info()
if self.dcd.enabled:
msg += "#" * 60 + "\n"
msg += "# DCD (Device Config Data)\n"
msg += "#" * 60 + "\n\n"
msg += self.dcd.info()
return msg
def add_image(self, data, img_type, address):
raise NotImplementedError()
def export(self):
self._update()
data = bytes()
data += self._cont1_header.export(True)
data += self._cont2_header.export(True)
# TODO: Complete Implementation
return data
@classmethod
def parse(cls, stream, step=0x100, size=None):
""" Parse image from stream buffer or bytes array
:param stream: The stream buffer or bytes array
:param step: Image searching step
:param size: parsing size
:return: BootImg4 object
"""
if isinstance(stream, (bytes, bytearray)):
stream = BytesIO(stream)
if not isinstance(stream, (BufferedReader, BytesIO)):
raise TypeError(" Not correct value type: \"{}\" !".format(type(stream)))
start_index = stream.tell()
last_index = stream.seek(0, SEEK_END)
stream.seek(start_index)
if size:
last_index = min(start_index + size, last_index)
imx_image = False
while start_index < (last_index - Header.SIZE):
header = Header2.parse(read_raw_data(stream, Header2.SIZE, no_seek=True))
if header.tag == SegTag.BIC1:
imx_image = True
break
else:
start_index = stream.seek(step, SEEK_CUR)
if not imx_image:
raise Exception('Not an i.MX Boot Image!')
img_size = last_index - start_index
obj = cls()
if start_index > 0:
obj.offset = start_index
# Parse Containers
obj._cont1_header = SegBIC1.parse(read_raw_data(stream, 0x400))
obj._cont2_header = SegBIC1.parse(read_raw_data(stream, 0x400))
# TODO: Complete Implementation
return obj
########################################################################################################################
# i.MX Kernel Image Classes
########################################################################################################################
class KernelImg(object):
""" IMX Kernel Image """
IMAGE_MIN_SIZE = 0x1000
@property
def address(self):
return self._ivt.app_address
@address.setter
def address(self, value):
self._ivt.app_address = value
@property
def version(self):
return self._ivt.version
@version.setter
def version(self, value):
self._ivt.version = value
@property
def app(self):
return self._app.data
@app.setter
def app(self, value):
assert isinstance(value, (bytes, bytearray))
self._app.data = value
@property
def csf(self):
return self._csf
@csf.setter
def csf(self, value):
assert isinstance(value, SegCSF)
self._csf = value
def __init__(self, address=0, app=None, csf=None, version=0x41):
self._ivt = SegIVT2(version)
self._ivt.app_address = address
self._app = SegAPP(app)
self._csf = SegCSF() if csf is None else csf
def __str__(self):
return self.info()
def __repr__(self):
return self.info()
def _update(self):
pass
def info(self):
pass
def export(self):
self._update()
data = self._app.export(True)
data += self._ivt.export(True)
data += self._csf.export(True)
return data
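Note the layout difference from the boot images above: `KernelImg.export` emits the application data first, then the IVT, then the CSF. A trivial standalone sketch of that concatenation order (stand-in byte strings):

```python
def kernel_layout(app, ivt, csf):
    # APP | IVT | CSF -- the order used by KernelImg.export above.
    return app + ivt + csf

print(kernel_layout(b"<app>", b"<ivt>", b"<csf>"))  # -> b'<app><ivt><csf>'
```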
@classmethod
def parse(cls, data):
assert isinstance(data, (bytes, bytearray))
assert len(data) > cls.IMAGE_MIN_SIZE
# TODO: Complete Implementation
\x6c\x69\x3e\x68\x61\x73\x68\x28\x30\x78\x38\x61\x30\x39\x31\x66\
\x30\x29\x3c\x2f\x72\x64\x66\x3a\x6c\x69\x3e\x0a\x3c\x72\x64\x66\
\x3a\x6c\x69\x3e\x64\x61\x79\x3c\x2f\x72\x64\x66\x3a\x6c\x69\x3e\
\x0a\x3c\x2f\x72\x64\x66\x3a\x42\x61\x67\x3e\x0a\x3c\x2f\x64\x63\
\x3a\x73\x75\x62\x6a\x65\x63\x74\x3e\x0a\x3c\x64\x63\x3a\x70\x75\
\x62\x6c\x69\x73\x68\x65\x72\x3e\x0a\x3c\x63\x63\x3a\x41\x67\x65\
\x6e\x74\x20\x72\x64\x66\x3a\x61\x62\x6f\x75\x74\x3d\x22\x68\x74\
\x74\x70\x3a\x2f\x2f\x77\x77\x77\x2e\x6f\x70\x65\x6e\x63\x6c\x69\
\x70\x61\x72\x74\x2e\x6f\x72\x67\x22\x3e\x0a\x3c\x64\x63\x3a\x74\
\x69\x74\x6c\x65\x3e\x4a\x6f\x6e\x20\x50\x68\x69\x6c\x6c\x69\x70\
\x73\x3c\x2f\x64\x63\x3a\x74\x69\x74\x6c\x65\x3e\x0a\x3c\x2f\x63\
\x63\x3a\x41\x67\x65\x6e\x74\x3e\x0a\x3c\x2f\x64\x63\x3a\x70\x75\
\x62\x6c\x69\x73\x68\x65\x72\x3e\x0a\x3c\x64\x63\x3a\x63\x72\x65\
\x61\x74\x6f\x72\x3e\x0a\x3c\x63\x63\x3a\x41\x67\x65\x6e\x74\x3e\
\x0a\x3c\x64\x63\x3a\x74\x69\x74\x6c\x65\x3e\x4a\x6f\x6e\x20\x50\
\x68\x69\x6c\x6c\x69\x70\x73\x3c\x2f\x64\x63\x3a\x74\x69\x74\x6c\
\x65\x3e\x0a\x3c\x2f\x63\x63\x3a\x41\x67\x65\x6e\x74\x3e\x0a\x3c\
\x2f\x64\x63\x3a\x63\x72\x65\x61\x74\x6f\x72\x3e\x0a\x3c\x64\x63\
\x3a\x72\x69\x67\x68\x74\x73\x3e\x0a\x3c\x63\x63\x3a\x41\x67\x65\
\x6e\x74\x3e\x0a\x3c\x64\x63\x3a\x74\x69\x74\x6c\x65\x3e\x4a\x6f\
\x6e\x20\x50\x68\x69\x6c\x6c\x69\x70\x73\x3c\x2f\x64\x63\x3a\x74\
\x69\x74\x6c\x65\x3e\x0a\x3c\x2f\x63\x63\x3a\x41\x67\x65\x6e\x74\
\x3e\x0a\x3c\x2f\x64\x63\x3a\x72\x69\x67\x68\x74\x73\x3e\x0a\x3c\
\x64\x63\x3a\x64\x61\x74\x65\x3e\x3c\x2f\x64\x63\x3a\x64\x61\x74\
\x65\x3e\x0a\x3c\x64\x63\x3a\x66\x6f\x72\x6d\x61\x74\x3e\x69\x6d\
\x61\x67\x65\x2f\x73\x76\x67\x2b\x78\x6d\x6c\x3c\x2f\x64\x63\x3a\
\x66\x6f\x72\x6d\x61\x74\x3e\x0a\x3c\x64\x63\x3a\x74\x79\x70\x65\
\x20\x72\x64\x66\x3a\x72\x65\x73\x6f\x75\x72\x63\x65\x3d\x22\x68\
\x74\x74\x70\x3a\x2f\x2f\x70\x75\x72\x6c\x2e\x6f\x72\x67\x2f\x64\
\x63\x2f\x64\x63\x6d\x69\x74\x79\x70\x65\x2f\x53\x74\x69\x6c\x6c\
\x49\x6d\x61\x67\x65\x22\x2f\x3e\x0a\x3c\x63\x63\x3a\x6c\x69\x63\
\x65\x6e\x73\x65\x20\x72\x64\x66\x3a\x72\x65\x73\x6f\x75\x72\x63\
\x65\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x65\x62\x2e\x72\x65\
\x73\x6f\x75\x72\x63\x65\x2e\x6f\x72\x67\x2f\x63\x63\x2f\x50\x75\
\x62\x6c\x69\x63\x44\x6f\x6d\x61\x69\x6e\x22\x2f\x3e\x0a\x3c\x64\
\x63\x3a\x6c\x61\x6e\x67\x75\x61\x67\x65\x3e\x65\x6e\x3c\x2f\x64\
\x63\x3a\x6c\x61\x6e\x67\x75\x61\x67\x65\x3e\x0a\x3c\x2f\x63\x63\
\x3a\x57\x6f\x72\x6b\x3e\x0a\x3c\x63\x63\x3a\x4c\x69\x63\x65\x6e\
\x73\x65\x20\x72\x64\x66\x3a\x61\x62\x6f\x75\x74\x3d\x22\x68\x74\
\x74\x70\x3a\x2f\x2f\x77\x65\x62\x2e\x72\x65\x73\x6f\x75\x72\x63\
\x65\x2e\x6f\x72\x67\x2f\x63\x63\x2f\x50\x75\x62\x6c\x69\x63\x44\
\x6f\x6d\x61\x69\x6e\x22\x3e\x0a\x3c\x63\x63\x3a\x70\x65\x72\x6d\
\x69\x74\x73\x20\x72\x64\x66\x3a\x72\x65\x73\x6f\x75\x72\x63\x65\
\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x65\x62\x2e\x72\x65\x73\
\x6f\x75\x72\x63\x65\x2e\x6f\x72\x67\x2f\x63\x63\x2f\x52\x65\x70\
\x72\x6f\x64\x75\x63\x74\x69\x6f\x6e\x22\x2f\x3e\x0a\x3c\x63\x63\
\x3a\x70\x65\x72\x6d\x69\x74\x73\x20\x72\x64\x66\x3a\x72\x65\x73\
\x6f\x75\x72\x63\x65\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x65\
\x62\x2e\x72\x65\x73\x6f\x75\x72\x63\x65\x2e\x6f\x72\x67\x2f\x63\
\x63\x2f\x44\x69\x73\x74\x72\x69\x62\x75\x74\x69\x6f\x6e\x22\x2f\
\x3e\x0a\x3c\x63\x63\x3a\x70\x65\x72\x6d\x69\x74\x73\x20\x72\x64\
\x66\x3a\x72\x65\x73\x6f\x75\x72\x63\x65\x3d\x22\x68\x74\x74\x70\
\x3a\x2f\x2f\x77\x65\x62\x2e\x72\x65\x73\x6f\x75\x72\x63\x65\x2e\
\x6f\x72\x67\x2f\x63\x63\x2f\x44\x65\x72\x69\x76\x61\x74\x69\x76\
\x65\x57\x6f\x72\x6b\x73\x22\x2f\x3e\x0a\x3c\x2f\x63\x63\x3a\x4c\
\x69\x63\x65\x6e\x73\x65\x3e\x0a\x3c\x2f\x72\x64\x66\x3a\x52\x44\
\x46\x3e\x0a\x3c\x2f\x6d\x65\x74\x61\x64\x61\x74\x61\x3e\x0a\x3c\
\x64\x65\x66\x73\x20\x69\x64\x3d\x22\x64\x65\x66\x73\x33\x22\x2f\
\x3e\x0a\x3c\x73\x6f\x64\x69\x70\x6f\x64\x69\x3a\x6e\x61\x6d\x65\
\x64\x76\x69\x65\x77\x20\x62\x6f\x72\x64\x65\x72\x63\x6f\x6c\x6f\
\x72\x3d\x22\x23\x36\x36\x36\x36\x36\x36\x22\x20\x62\x6f\x72\x64\
\x65\x72\x6f\x70\x61\x63\x69\x74\x79\x3d\x22\x31\x2e\x30\x22\x20\
\x69\x64\x3d\x22\x62\x61\x73\x65\x22\x20\x69\x6e\x6b\x73\x63\x61\
\x70\x65\x3a\x63\x75\x72\x72\x65\x6e\x74\x2d\x6c\x61\x79\x65\x72\
\x3d\x22\x6c\x61\x79\x65\x72\x31\x22\x20\x69\x6e\x6b\x73\x63\x61\
\x70\x65\x3a\x63\x78\x3d\x22\x35\x34\x39\x2e\x34\x30\x36\x37\x34\
\x22\x20\x69\x6e\x6b\x73\x63\x61\x70\x65\x3a\x63\x79\x3d\x22\x35\
\x39\x36\x2e\x30\x30\x31\x35\x39\x22\x20\x69\x6e\x6b\x73\x63\x61\
\x70\x65\x3a\x64\x6f\x63\x75\x6d\x65\x6e\x74\x2d\x75\x6e\x69\x74\
\x73\x3d\x22\x70\x78\x22\x20\x69\x6e\x6b\x73\x63\x61\x70\x65\x3a\
\x67\x75\x69\x64\x65\x2d\x62\x62\x6f\x78\x3d\x22\x74\x72\x75\x65\
\x22\x20\x69\x6e\x6b\x73\x63\x61\x70\x65\x3a\x70\x61\x67\x65\x6f\
\x70\x61\x63\x69\x74\x79\x3d\x22\x30\x2e\x30\x22\x20\x69\x6e\x6b\
\x73\x63\x61\x70\x65\x3a\x70\x61\x67\x65\x73\x68\x61\x64\x6f\x77\
\x3d\x22\x32\x22\x20\x69\x6e\x6b\x73\x63\x61\x70\x65\x3a\x77\x69\
\x6e\x64\x6f\x77\x2d\x68\x65\x69\x67\x68\x74\x3d\x22\x36\x31\x35\
\x22\x20\x69\x6e\x6b\x73\x63\x61\x70\x65\x3a\x77\x69\x6e\x64\x6f\
\x77\x2d\x77\x69\x64\x74\x68\x3d\x22\x38\x36\x36\x22\x20\x69\x6e\
\x6b\x73\x63\x61\x70\x65\x3a\x77\x69\x6e\x64\x6f\x77\x2d\x78\x3d\
\x22\x38\x38\x22\x20\x69\x6e\x6b\x73\x63\x61\x70\x65\x3a\x77\x69\
\x6e\x64\x6f\x77\x2d\x79\x3d\x22\x31\x31\x36\x22\x20\x69\x6e\x6b\
\x73\x63\x61\x70\x65\x3a\x7a\x6f\x6f\x6d\x3d\x22\x30\x2e\x33\x35\
\x30\x30\x30\x30\x30\x30\x22\x20\x70\x61\x67\x65\x63\x6f\x6c\x6f\
\x72\x3d\x22\x23\x66\x66\x66\x66\x66\x66\x22\x20\x73\x68\x6f\x77\
\x67\x75\x69\x64\x65\x73\x3d\x22\x74\x72\x75\x65\x22\x2f\x3e\x0a\
\x3c\x67\x20\x69\x64\x3d\x22\x6c\x61\x79\x65\x72\x31\x22\x20\x69\
\x6e\x6b\x73\x63\x61\x70\x65\x3a\x67\x72\x6f\x75\x70\x6d\x6f\x64\
\x65\x3d\x22\x6c\x61\x79\x65\x72\x22\x20\x69\x6e\x6b\x73\x63\x61\
\x70\x65\x3a\x6c\x61\x62\x65\x6c\x3d\x22\x4c\x61\x79\x65\x72\x20\
\x31\x22\x3e\x0a\x3c\x70\x61\x74\x68\x20\x64\x3d\x22\x4d\x20\x32\
\x36\x33\x2e\x34\x31\x35\x37\x30\x2c\x32\x33\x35\x2e\x31\x34\x35\
\x38\x38\x20\x43\x20\x31\x39\x37\x2e\x31\x37\x35\x37\x30\x2c\x32\
\x33\x35\x2e\x31\x34\x35\x38\x38\x20\x31\x34\x33\x2e\x34\x31\x35\
\x37\x35\x2c\x32\x38\x38\x2e\x39\x30\x35\x38\x37\x20\x31\x34\x33\
\x2e\x34\x31\x35\x37\x35\x2c\x33\x35\x35\x2e\x31\x34\x35\x38\x38\
\x20\x43\x20\x31\x34\x33\x2e\x34\x31\x35\x37\x35\x2c\x34\x38\x39\
\x2e\x39\x30\x31\x33\x39\x20\x32\x37\x39\x2e\x33\x34\x38\x39\x30\
\x2c\x35\x32\x35\x2e\x32\x33\x33\x31\x38\x20\x33\x37\x31\x2e\x39\
\x37\x38\x32\x30\x2c\x36\x35\x38\x2e\x34\x35\x33\x39\x32\x20\x43\
\x20\x34\x35\x39\x2e\x35\x35\x32\x34\x34\x2c\x35\x32\x36\x2e\x30\
\x35\x30\x35\x36\x20\x36\x30\x30\x2e\x35\x34\x30\x37\x30\x2c\x34\
\x38\x35\x2e\x35\x39\x39\x33\x32\x20\x36\x30\x30\x2e\x35\x34\x30\
\x37\x30\x2c\x33\x35\x35\x2e\x31\x34\x35\x38\x38\x20\x43\x20\x36\
\x30\x30\x2e\x35\x34\x30\x37\x30\x2c\x32\x38\x38\x2e\x39\x30\x35\
\x38\x38\x20\x35\x34\x36\x2e\x37\x38\x30\x38\x30\x2c\x32\x33\x35\
\x2e\x31\x34\x35\x38\x37\x20\x34\x38\x30\x2e\x35\x34\x30\x37\x30\
\x2c\x32\x33\x35\x2e\x31\x34\x35\x38\x38\x20\x43\x20\x34\x33\x32\
\x2e\x34\x39\x32\x38\x30\x2c\x32\x33\x35\x2e\x31\x34\x35\x38\x38\
\x20\x33\x39\x31\x2e\x31\x33\x39\x31\x30\x2c\x32\x36\x33\x2e\x35\
\x31\x36\x33\x31\x20\x33\x37\x31\x2e\x39\x37\x38\x32\x30\x2c\x33\
\x30\x34\x2e\x33\x33\x33\x33\x38\x20\x43\x20\x33\x35\x32\x2e\x38\
\x31\x37\x34\x30\x2c\x32\x36\x33\x2e\x35\x31\x36\x33\x30\x20\x33\
\x31\x31\x2e\x34\x36\x33\x37\x30\x2c\x32\x33\x35\x2e\x31\x34\x35\
\x38\x37\x20\x32\x36\x33\x2e\x34\x31\x35\x37\x30\x2c\x32\x33\x35\
\x2e\x31\x34\x35\x38\x38\x20\x7a\x20\x22\x20\x69\x64\x3d\x22\x70\
\x61\x74\x68\x37\x22\x20\x73\x6f\x64\x69\x70\x6f\x64\x69\x3a\x6e\
\x6f\x64\x65\x74\x79\x70\x65\x73\x3d\x22\x63\x63\x63\x63\x63\x63\
\x63\x22\x20\x73\x74\x79\x6c\x65\x3d\x22\x66\x69\x6c\x6c\x3a\x23\
\x65\x36\x30\x30\x30\x30\x3b\x66\x69\x6c\x6c\x2d\x6f\x70\x61\x63\
\x69\x74\x79\x3a\x31\x2e\x30\x30\x30\x30\x30\x30\x30\x3b\x73\x74\
\x72\x6f\x6b\x65\x3a\x23\x30\x30\x30\x30\x30\x30\x3b\x73\x74\x72\
\x6f\x6b\x65\x2d\x77\x69\x64\x74\x68\x3a\x31\x38\x2e\x37\x30\x30\
\x30\x30\x31\x3b\x73\x74\x72\x6f\x6b\x65\x2d\x6d\x69\x74\x65\x72\
\x6c\x69\x6d\x69\x74\x3a\x34\x2e\x30\x30\x30\x30\x30\x30\x30\x3b\
\x73\x74\x72\x6f\x6b\x65\x2d\x6f\x70\x61\x63\x69\x74\x79\x3a\x31\
\x2e\x30\x30\x30\x30\x30\x30\x30\x22\x2f\x3e\x0a\x3c\x70\x61\x74\
\x68\x20\x64\x3d\x22\x4d\x20\x32\x36\x35\x2e\x30\x30\x30\x30\x30\
\x2c\x32\x35\x33\x2e\x35\x39\x33\x37\x35\x20\x43\x20\x32\x30\x37\
\x2e\x30\x34\x30\x33\x33\x2c\x32\x35\x33\x2e\x35\x39\x33\x37\x35\
\x20\x31\x36\x30\x2e\x30\x30\x30\x30\x30\x2c\x33\x30\x30\x2e\x36\
\x33\x34\x30\x37\x20\x31\x36\x30\x2e\x30\x30\x30\x30\x30\x2c\x33\
\x35\x38\x2e\x35\x39\x33\x37\x35\x20\x43\x20\x31\x36\x30\x2e\x30\
\x30\x30\x30\x30\x2c\x34\x37\x36\x2e\x35\x30\x34\x31\x35\x20\x32\
\x37\x38\x2e\x39\x31\x38\x35\x37\x2c\x35\x30\x37\x2e\x34\x33\x32\
\x35\x31\x20\x33\x35\x39\x2e\x39\x36\x38\x37\x35\x2c\x36\x32\x34\
\x2e\x30\x30\x30\x30\x30\x20\x43\x20\x33\x36\x36\x2e\x35\x32\x38\
\x36\x38\x2c\x36\x31\x34\x2e\x30\x38\x32\x30\x35\x20\x32\x32\x30\
\x2e\x30\x30\x30\x30\x30\x2c\x34\x37\x38\x2e\x34\x37\x33\x30\x39\
\x20\x32\x32\x30\x2e\x30\x30\x30\x30\x30\x2c\x33\x37\x38\x2e\x35\
\x39\x33\x37\x35\x20\x43\x20\x32\x32\x30\x2e\x30\x30\x30\x30\x30\
\x2c\x33\x32\x30\x2e\x36\x33\x34\x30\x37\x20\x32\x36\x37\x2e\x30\
\x34\x30\x33\x33\x2c\x32\x37\x33\x2e\x35\x39\x33\x37\x35\x20\x33\
\x32\x35\x2e\x30\x30\x30\x30\x30\x2c\x32\x37\x33\x2e\x35\x39\x33\
\x37\x35\x20\x43\x20\x33\x32\x35\x2e\x35\x30\x34\x35\x33\x2c\x32\
\x37\x33\x2e\x35\x39\x33\x37\x35\x20\x33\x32\x35\x2e\x39\x39\x37\
\x31\x38\x2c\x32\x37\x33\x2e\x36\x34\x39\x31\x32\x20\x33\x32\x36\
\x2e\x35\x30\x30\x30\x30\x2c\x32\x37\x33\x2e\x36\x35\x36\x32\x35\
\x20\x43\x20\x33\x30\x39\x2e\x32\x32\x34\x33\x36\x2c\x32\x36\x31\
\x2e\x30\x37\x32\x38\x36\x20\x32\x38\x38\x2e\x30\x30\x35\x35\x37\
\x2c\x32\x35\x33\x2e\x35\x39\x33\x37\x34\x20\x32\x36\x35\x2e\x30\
\x30\x30\x30\x30\x2c\x32\x35\x33\x2e\x35\x39\x33\x37\x35\x20\x7a\
\x20\x22\x20\x69\x64\x3d\x22\x70\x61\x74\x68\x32\x32\x30\x22\x20\
\x73\x6f\x64\x69\x70\x6f\x64\x69\x3a\x6e\x6f\x64\x65\x74\x79\x70\
\x65\x73\x3d\x22\x63\x63\x63\x63\x63\x63\x63\x22\x20\x73\x74\x79\
\x6c\x65\x3d\x22\x66\x69\x6c\x6c\x3a\x23\x65\x36\x65\x36\x65\x36\
\x3b\x66\x69\x6c\x6c\x2d\x6f\x70\x61\x63\x69\x74\x79\x3a\x30\x2e\
\x36\x34\x35\x35\x36\x39\x36\x32\x3b\x73\x74\x72\x6f\x6b\x65\x3a\
\x6e\x6f\x6e\x65\x3b\x73\x74\x72\x6f\x6b\x65\x2d\x77\x69\x64\x74\
\x68\x3a\x31\x38\x2e\x37\x30\x30\x30\x30\x31\x3b\x73\x74\x72\x6f\
\x6b\x65\x2d\x6d\x69\x74\x65\x72\x6c\x69\x6d\x69\x74\x3a\x34\x2e\
\x30\x30\x30\x30\x30\x30\x30\x3b\x73\x74\x72\x6f\x6b\x65\x2d\x6f\
\x70\x61\x63\x69\x74\x79\x3a\x31\x2e\x30\x30\x30\x30\x30\x30\x30\
\x22\x2f\x3e\x0a\x3c\x2f\x67\x3e\x0a\x3c\x2f\x73\x76\x67\x3e\x0a\
\
\x00\x00\x0d\x2f\
\x3c\
\x3f\x78\x6d\x6c\x20\x76\x65\x72\x73\x69\x6f\x6e\x3d\x22\x31\x2e\
\x30\x22\x20\x65\x6e\x63\x6f\x64\x69\x6e\x67\x3d\x22\x55\x54\x46\
\x2d\x38\x22\x20\x73\x74\x61\x6e\x64\x61\x6c\x6f\x6e\x65\x3d\x22\
\x6e\x6f\x22\x3f\x3e\x0a\x3c\x21\x44\x4f\x43\x54\x59\x50\x45\x20\
\x73\x76\x67\x20\x50\x55\x42\x4c\x49\x43\x20\x22\x2d\x2f\x2f\x57\
\x33\x43\x2f\x2f\x44\x54\x44\x20\x53\x56\x47\x20\x32\x30\x30\x31\
\x30\x39\x30\x34\x2f\x2f\x45\x4e\x22\x0a\x22\x68\x74\x74\x70\x3a\
\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\x6f\x72\x67\x2f\x54\x52\x2f\
\x32\x30\x30\x31\x2f\x52\x45\x43\x2d\x53\x56\x47\x2d\x32\x30\x30\
\x31\x30\x39\x30\x34\x2f\x44\x54\x44\x2f\x73\x76\x67\x31\x30\x2e\
\x64\x74\x64\x22\x3e\x0a\x3c\x73\x76\x67\x20\x76\x69\x65\x77\x42\
\x6f\x78\x3d\x22\x2d\x31\x30\x20\x2d\x31\x30\x20\x31\x37\x38\x20\
\x31\x37\x38\x22\x20\x68\x65\x69\x67\x68\x74\x3d\x22\x31\x37\x37\
\x2e\x35\x32\x33\x22\x20\x69\x64\x3d\x22\x73\x76\x67\x31\x22\x20\
\x69\x6e\x6b\x73\x63\x61\x70\x65\x3a\x76\x65\x72\x73\x69\x6f\x6e\
\x3d\x22\x30\x2e\x34\x30\x22\x20\x73\x6f\x64\x69\x70\x6f\x64\x69\
\x3a\x64\x6f\x63\x62\x61\x73\x65\x3d\x22\x2f\x6d\x6e\x74\x2f\x64\
\x6f\x6e\x6e\x65\x65\x73\x2f\x30\x39\x2d\x4d\x65\x73\x5f\x69\x6d\
\x61\x67\x65\x73\x2f\x54\x72\x61\x76\x61\x75\x78\x2f\x54\x72\x61\
\x76\x61\x75\x78\x20\x76\x65\x63\x74\x6f\x72\x69\x65\x6c\x2f\x70\
\x69\x63\x74\x6f\x67\x72\x61\x6d\x6d\x65\x73\x2f\x73\xc3\xa9\x63\
\x75\x20\x53\x56\x47\x2f\x70\x72\x6f\x64\x75\x69\x74\x73\x20\x63\
\x68\x69\x6d\x69\x71\x75\x65\x73\x22\x20\x73\x6f\x64\x69\x70\x6f\
\x64\x69\x3a\x64\x6f\x63\x6e\x61\x6d\x65\x3d\x22\x58\x69\x49\x72\
\x72\x69\x74\x61\x6e\x74\x2e\x73\x76\x67\x22\x20\x73\x6f\x64\x69\
\x70\x6f\x64\x69\x3a\x76\x65\x72\x73\x69\x6f\x6e\x3d\x22\x30\x2e\
\x33\x32\x22\x20\x77\x69\x64\x74\x68\x3d\x22\x31\x35\x35\x2e\x39\
\x33\x32\x22\x20\x78\x6d\x6c\x6e\x73\x3d\x22\x68\x74\x74\x70\x3a\
\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\x6f\x72\x67\x2f\x32\x30\x30\
\x30\x2f\x73\x76\x67\x22\x20\x78\x6d\x6c\x6e\x73\x3a\x63\x63\x3d\
\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x65\x62\x2e\x72\x65\x73\x6f\
\x75\x72\x63\x65\x2e\x6f\x72\x67\x2f\x63\x63\x2f\x22\x20\x78\x6d\
\x6c\x6e\x73\x3a\x64\x63\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x70\
\x75\x72\x6c\x2e\x6f\x72\x67\x2f\x64\x63\x2f\x65\x6c\x65\x6d\x65\
\x6e\x74\x73\x2f\x31\x2e\x31\x2f\x22\x20\x78\x6d\x6c\x6e\x73\x3a\
\x69\x6e\x6b\x73\x63\x61\x70\x65\x3d\x22\x68\x74\x74\x70\x3a\x2f\
\x2f\x77\x77\x77\x2e\x69\x6e\x6b\x73\x63\x61\x70\x65\x2e\x6f\x72\
\x67\x2f\x6e\x61\x6d\x65\x73\x70\x61\x63\x65\x73\x2f\x69\x6e\x6b\
\x73\x63\x61\x70\x65\x22\x20\x78\x6d\x6c\x6e\x73\x3a\x72\x64\x66\
\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\
\x6f\x72\x67\x2f\x31\x39\x39\x39\x2f\x30\x32\x2f\x32\x32\x2d\x72\
\x64\x66\x2d\x73\x79\x6e\x74\x61\x78\x2d\x6e\x73\x23\x22\x20\x78\
\x6d\x6c\x6e\x73\x3a\x73\x6f\x64\x69\x70\x6f\x64\x69\x3d\x22\x68\
\x74\x74\x70\x3a\x2f\x2f\x73\x6f\x64\x69\x70\x6f\x64\x69\x2e\x73\
\x6f\x75\x72\x63\x65\x66\x6f\x72\x67\x65\x2e\x6e\x65\x74\x2f\x44\
\x54\x44\x2f\x73\x6f\x64\x69\x70\x6f\x64\x69\x2d\x30\x2e\x64\x74\
\x64\x22\x20\x78\x6d\x6c\x6e\x73\x3a\x78\x6c\x69\x6e\x6b\x3d\x22\
\x68\x74\x74\x70\x3a\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\x6f\x72\
\x67\x2f\x31\x39\x39\x39\x2f\x78\x6c\x69\x6e\x6b\x22\x3e\x0a\x3c\
\x6d\x65\x74\x61\x64\x61\x74\x61\x3e\x0a\x3c\x72\x64\x66\x3a\x52\
\x44\x46\x20\x78\x6d\x6c\x6e\x73\x3a\x63\x63\x3d\x22\x68\x74\x74\
\x70\x3a\x2f\x2f\x77\x65\x62\x2e\x72\x65\x73\x6f\x75\x72\x63\x65\
\x2e\x6f\x72\x67\x2f\x63\x63\x2f\x22\x20\x78\x6d\x6c\x6e\x73\x3a\
\x64\x63\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x70\x75\x72\x6c\x2e\
\x6f\x72\x67\x2f\x64\x63\x2f\x65\x6c\x65\x6d\x65\x6e\x74\x73\x2f\
\x31\x2e\x31\x2f\x22\x20\x78\x6d\x6c\x6e\x73\x3a\x72\x64\x66\x3d\
\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\x6f\
\x72\x67\x2f\x31\x39\x39\x39\x2f\x30\x32\x2f\x32\x32\x2d\x72\x64\
\x66\x2d\x73\x79\x6e\x74\x61\x78\x2d\x6e\x73\x23\x22\x3e\x0a\x3c\
\x63\x63\x3a\x57\x6f\x72\x6b\x20\x72\x64\x66\x3a\x61\x62\x6f\x75\
\x74\x3d\x22\x22\x3e\x0a\x3c\x64\x63\x3a\x74\x69\x74\x6c\x65\x3e\
\x49\x72\x72\x69\x74\x61\x6e\x74\x3c\x2f\x64\x63\x3a\x74\x69\x74\
\x6c\x65\x3e\x0a\x3c\x64\x63\x3a\x64\x65\x73\x63\x72\x69\x70\x74\
\x69\x6f\x6e\x3e\x70\x72\x6f\x64\x75\x69\x74\x20\x63\x68\x69\x6d\
\x69\x71\x75\x65\x3c\x2f\x64\x63\x3a\x64\x65\x73\x63\x72\x69\x70\
\x74\x69\x6f\x6e\x3e\x0a\x3c\x64\x63\x3a\x73\x75\x62\x6a\x65\x63\
\x74\x3e\x0a\x3c\x72\x64\x66\x3a\x42\x61\x67\x3e\x0a\x3c\x72\x64\
\x66\x3a\x6c\x69\x3e\x3c\x2f\x72\x64\x66\x3a\x6c\x69\x3e\x0a\x3c\
\x72\x64\x66\x3a\x6c\x69\x3e\x73\x79\x6d\x62\x6f\x6c\x3c\x2f\x72\
\x64\x66\x3a\x6c\x69\x3e\x0a\x3c\x72\x64\x66\x3a\x6c\x69\x3e\x73\
\x69\x67\x6e\x73\x5f\x61\x6e\x64\x5f\x73\x79\x6d\x62\x6f\x6c\x73\
\x3c\x2f\x72\x64\x66\x3a\x6c\x69\x3e\x0a\x3c\x2f\x72\x64\x66\x3a\
\x42\x61\x67\x3e\x0a\x3c\x2f\x64\x63\x3a\x73\x75\x62\x6a\x65\x63\
\x74\x3e\x0a\x3c\x64\x63\x3a\x70\x75\x62\x6c\x69\x73\x68\x65\x72\
\x3e\x0a\x3c\x63\x63\x3a\x41\x67\x65\x6e\x74\x20\x72\x64\x66\x3a\
\x61\x62\x6f\x75\x74\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x77\
\x77\x2e\x6f\x70\x65\x6e\x63\x6c\x69\x70\x61\x72\x74\x2e\x6f\x72\
\x67\x22\x3e\x0a\x3c\x64\x63\x3a\x74\x69\x74\x6c\x65\x3e\x79\x76\
\x65\x73\x20\x47\x55\x49\x4c\x4c\x4f\x55\x3c\x2f\x64\x63\x3a\x74\
\x69\x74\x6c\x65\x3e\x0a\x3c\x2f\x63\x63\x3a\x41\x67\x65\x6e\x74\
\x3e\x0a\x3c\x2f\x64\x63\x3a\x70\x75\x62\x6c\x69\x73\x68\x65\x72\
\x3e\x0a\x3c\x64\x63\x3a\x63\x72\x65\x61\x74\x6f\x72\x3e\x0a\x3c\
\x63\x63\x3a\x41\x67\x65\x6e\x74\x3e\x0a\x3c\x64\x63\x3a\x74\x69\
\x74\x6c\x65\x3e\x79\x76\x65\x73\x20\x47\x55\x49\x4c\x4c\x4f\x55\
\x3c\x2f\x64\x63\x3a\x74\x69\x74\x6c\x65\x3e\x0a\x3c\x2f\x63\x63\
\x3a\x41\x67\x65\x6e\x74\x3e\x0a\x3c\x2f\x64\x63\x3a\x63\x72\x65\
\x61\x74\x6f\x72\x3e\x0a\x3c\x64\x63\x3a\x72\x69\x67\x68\x74\x73\
\x3e\x0a\x3c\x63\x63\x3a\x41\x67\x65\x6e\x74\x3e\x0a\x3c\x64\x63\
\x3a\x74\x69\x74\x6c\x65\x3e\x79\x76\x65\x73\x20\x47\x55\x49\x4c\
\x4c\x4f\x55\x3c\x2f\x64\x63\x3a\x74\x69\x74\x6c\x65\x3e\x0a\x3c\
\x2f\x63\x63\x3a\x41\x67\x65\x6e\x74\x3e\x0a\x3c\x2f\x64\x63\x3a\
\x72\x69\x67\x68\x74\x73\x3e\x0a\x3c\x64\x63\x3a\x64\x61\x74\x65\
\x3e\x3c\x2f\x64\x63\x3a\x64\x61\x74\x65\x3e\x0a\x3c\x64\x63\x3a\
\x66\x6f\x72\x6d\x61\x74\x3e\x69\x6d\x61\x67\x65\x2f\x73\x76\x67\
\x2b\x78\x6d\x6c\x3c\x2f\x64\x63\x3a\x66\x6f\x72\x6d\x61\x74\x3e\
\x0a\x3c\x64\x63\x3a\x74\x79\x70\x65\x20\x72\x64\x66\x3a\x72\x65\
\x73\x6f\x75\x72\x63\x65\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x70\
\x75\x72\x6c\x2e\x6f\x72\x67\x2f\x64\x63\x2f\x64\x63\x6d\x69\x74\
\x79\x70\x65\x2f\x53\x74\x69\x6c\x6c\x49\x6d\x61\x67\x65\x22\x2f\
\x3e\x0a\x3c\x63\x63\x3a\x6c\x69\x63\x65\x6e\x73\x65\x20\x72\x64\
\x66\x3a\x72\x65\x73\x6f\x75\x72\x63\x65\x3d\x22\x68\x74\x74\x70\
\x3a\x2f\x2f\x77\x65\x62\x2e\x72\x65\x73\x6f\x75\x72\x63\x65\x2e\
\x6f\x72\x67\x2f\x63\x63\x2f\x50\x75\x62\x6c\x69\x63\x44\x6f\x6d\
\x61\x69\x6e\x22\x2f\x3e\x0a\x3c\x64\x63\x3a\x6c\x61\x6e\x67\x75\
\x61\x67\x65\x3e\x65\x6e\x3c\x2f\x64\x63\x3a\x6c\x61\x6e\x67\x75\
\x61\x67\x65\x3e\x0a\x3c\x2f\x63\x63\x3a\x57\x6f\x72\x6b\x3e\x0a\
\x3c\x63\x63\x3a\x4c\x69\x63\x65\x6e\x73\x65\x20\x72\x64\x66\x3a\
\x61\x62\x6f\x75\x74\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x65\
\x62\x2e\x72\x65\x73\x6f\x75\x72\x63\x65\x2e\x6f\x72\x67\x2f\x63\
\x63\x2f\x50\x75\x62\x6c\x69\x63\x44\x6f\x6d\x61\x69\x6e\x22\x3e\
\x0a\x3c\x63\x63\x3a\x70\x65\x72\x6d\x69\x74\x73\x20\x72\x64\x66\
\x3a\x72\x65\x73\x6f\x75\x72\x63\x65\x3d\x22\x68\x74\x74\x70\x3a\
\x2f\x2f\x77\x65\x62\x2e\x72\x65\x73\x6f\x75\x72\x63\x65\x2e\x6f\
\x72\x67\x2f\x63\x63\x2f\x52\x65\x70\x72\x6f\x64\x75\x63\x74\x69\
\x6f\x6e\x22\x2f\x3e\x0a\x3c\x63\x63\x3a\x70\x65\x72\x6d\x69\x74\
\x73\x20\x72\x64\x66\x3a\x72\x65\x73\x6f\x75\x72\x63\x65\x3d\x22\
\x68\x74\x74\x70\x3a\x2f\x2f\x77\x65\x62\x2e\x72\x65\x73\x6f\x75\
\x72\x63\x65\x2e\x6f\x72\x67\x2f\x63\x63\x2f\x44\x69\x73\x74\x72\
\x69\x62\x75\x74\x69\x6f\x6e\x22\x2f\x3e\x0a\x3c\x63\x63\x3a\x70\
\x65\x72\x6d\x69\x74\x73\x20\x72\x64\x66\x3a\x72\x65\x73\x6f\x75\
\x72\x63\x65\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x65\x62\x2e\
\x72\x65\x73\x6f\x75\x72\x63\x65\x2e\x6f\x72\x67\x2f\x63\x63\x2f\
\x44\x65\x72\x69\x76\x61\x74\x69\x76\x65\x57\x6f\x72\x6b\x73\x22\
\x2f\x3e\x0a\x3c\x2f\x63\x63\x3a\x4c\x69\x63\x65\x6e\x73\x65\x3e\
\x0a\x3c\x2f\x72\x64\x66\x3a\x52\x44\x46\x3e\x0a\x3c\x2f\x6d\x65\
\x74\x61\x64\x61\x74\x61\x3e\x0a\x3c\x73\x6f\x64\x69\x70\x6f\x64\
\x69\x3a\x6e\x61\x6d\x65\x64\x76\x69\x65\x77\x20\x62\x6f\x72\x64\
\x65\x72\x63\x6f\x6c\x6f\x72\x3d\x22\x23\x36\x36\x36\x36\x36\x36\
\x22\x20\x62\x6f\x72\x64\x65\x72\x6f\x70\x61\x63\x69\x74\x79\x3d\
\x22\x31\x2e\x30\x22\x20\x69\x64\x3d\x22\x62\x61\x73\x65\x22\x20\
\x69\x6e\x6b\x73\x63\x61\x70\x65\x3a\x63\x75\x72\x72\x65\x6e\x74\
\x2d\x6c\x61\x79\x65\x72\x3d\x22\x73\x76\x67\x31\x22\x20\x69\x6e\
\x6b\x73\x63\x61\x70\x65\x3a\x63\x78\x3d\x22\x36\x32\x2e\x33\x37\
\x32\x38\x30\x35\x22\x20\x69\x6e\x6b\x73\x63\x61\x70\x65\x3a\x63\
\x79\x3d\x22\x33\x34\x2e\x38\x36\x34\x35\x33\x37\x22\x20\x69\x6e\
\x6b\x73\x63\x61\x70\x65\x3a\x70\x61\x67\x65\x6f\x70\x61\x63\x69\
\x74\x79\x3d\x22\x30\x2e\x30\x22\x20\x69\x6e\x6b\x73\x63\x61\x70\
\x65\x3a\x70\x61\x67\x65\x73\x68\x61\x64\x6f\x77\x3d\x22\x32\x22\
\x20\x69\x6e\x6b\x73\x63\x61\x70\x65\x3a\x77\x69\x6e\x64\x6f\x77\
\x2d\x68\x65\x69\x67\x68\x74\x3d\x22\x31\x31\x32\x31\x22\x20\x69\
\x6e\x6b\x73\x63\x61\x70\x65\x3a\x77\x69\x6e\x64\x6f\x77\x2d\x77\
\x69\x64\x74\x68\x3d\x22\x31\x35\x39\x30\x22\x20\x69\x6e\x6b\x73\
\x63\x61\x70\x65\x3a\x77\x69\x6e\x64\x6f\x77\x2d\x78\x3d\x22\x32\
\x30\x30\x22\x20\x69\x6e\x6b\x73\x63\x61\x70\x65\x3a\x77\x69\x6e\
\x64\x6f\x77\x2d\x79\x3d\x22\x30\x22\x20\x69\x6e\x6b\x73\x63\x61\
\x70\x65\x3a\x7a\x6f\x6f\x6d\x3d\x22\x36\x2e\x36\x33\x39\x39\x38\
\x34\x39\x22\x20\x70\x61\x67\x65\x63\x6f\x6c\x6f\x72\x3d\x22\x23\
\x66\x66\x66\x66\x66\x66\x22\x2f\x3e\x0a\x3c\x64\x65\x66\x73\x20\
\x69\x64\x3d\x22\x64\x65\x66\x73\x32\x22\x3e\x0a\x3c\x6d\x61\x72\
\x6b\x65\x72\x20\x69\x64\x3d\x22\x41\x72\x72\x6f\x77\x45\x6e\x64\
\x22\x20\x6d\x61\x72\x6b\x65\x72\x48\x65\x69\x67\x68\x74\x3d\x22\
\x33\x22\x20\x6d\x61\x72\x6b\x65\x72\x55\x6e\x69\x74\x73\x3d\x22\
\x73\x74\x72\x6f\x6b\x65\x57\x69\x64\x74\x68\x22\x20\x6d\x61\x72\
\x6b\x65\x72\x57\x69\x64\x74\x68\x3d\x22\x34\x22\x20\x6f\x72\x69\
\x65\x6e\x74\x3d\x22\x61\x75\x74\x6f\x22\x20\x72\x65\x66\x58\x3d\
\x22\x30\x22\x20\x72\x65\x66\x59\x3d\x22\x35\x22\x20\x76\x69\x65\
\x77\x42\x6f\x78\x3d\x22\x30\x20\x30\x20\x31\x30\x20\x31\x30\x22\
\x3e\x0a\x3c\x70\x61\x74\x68\x20\x64\x3d\x22\x4d\x20\x30\x20\x30\
\x20\x4c\x20\x31\x30\x20\x35\x20\x4c\x20\x30\x20\x31\x30\x20\x7a\
\x22\x20\x69\x64\x3d\x22\x70\x61\x74\x68\x34\x22\x2f\x3e\x0a\x3c\
\x2f\x6d\x61\x72\x6b\x65\x72\x3e\x0a\x3c\x6d\x61\x72\x6b\x65\x72\
\x20\x69\x64\x3d\x22\x41\x72\x72\x6f\x77\x53\x74\x61\x72\x74\x22\
\x20\x6d\x61\x72\x6b\x65\x72\x48\x65\x69\x67\x68\x74\x3d\x22\x33\
\x22\x20\x6d\x61\x72\x6b\x65\x72\x55\x6e\x69\x74\x73\x3d\x22\x73\
\x74\x72\x6f\x6b\x65\x57\x69\x64\x74\x68\x22\x20\x6d\x61\x72\x6b\
\x65\x72\x57\x69\x64\x74\x68\x3d\x22\x34\x22\x20\x6f\x72\x69\x65\
\x6e\x74\x3d\x22\x61\x75\x74\x6f\x22\x20\x72\x65\x66\x58\x3d\x22\
\x31\x30\x22\x20\x72\x65\x66\x59\x3d\x22\x35\x22\x20\x76\x69\x65\
\x77\x42\x6f\x78\x3d\x22\x30\x20\x30\x20\x31\x30\x20\x31\x30\x22\
\x3e\x0a\x3c\x70\x61\x74\x68\x20\x64\x3d\x22\x4d\x20\x31\x30\x20\
\x30\x20\x4c\x20\x30\x20\x35\x20\x4c\x20\x31\x30\x20\x31\x30\x20\
\x7a\x22\x20\x69\x64\x3d\x22\x70\x61\x74\x68\x36\x22\x2f\x3e\x0a\
\x3c\x2f\x6d\x61\x72\x6b\x65\x72\x3e\x0a\x3c\x2f\x64\x65\x66\x73\
\x3e\x0a\x3c\x67\x20\x69\x64\x3d\x22\x67\x37\x22\x3e\x0a\x3c\x67\
\x20\x69\x64\x3d\x22\x67\x38\x22\x3e\x0a\x3c\x70\x61\x74\x68\x20\
\x64\x3d\x22\x4d\x20\x31\x35\x35\x2e\x39\x33\x32\x20\x31\x35\x35\
\x2e\x39\x33\x32\x4c\x20\x31\x35\x35\x2e\x39\x33\x32\x20\x30\x4c\
\x20\x30\x20\x30\x4c\x20\x30\x20\x31\x35\x35\x2e\x39\x33\x32\x4c\
\x20\x31\x35\x35\x2e\x39\x33\x32\x20\x31\x35\x35\x2e\x39\x33\x32\
\x7a\x22\x20\x69\x64\x3d\x22\x70\x61\x74\x68\x39\x22\x20\x73\x74\
\x79\x6c\x65\x3d\x22\x73\x74\x72\x6f\x6b\x65\x3a\x6e\x6f\x6e\x65\
\x3b\x20\x66\x69\x6c\x6c\x3a\x23\x30\x30\x30\x30\x30\x30\x22\x2f\
\x3e\x0a\x3c\x70\x61\x74\x68\x20\x64\x3d\x22\x4d\x20\x31\x35\x30\
\x2e\x38\x33\x20\x31\x35\x30\x2e\x38\x33\x4c\x20\x31\x35\x30\x2e\
\x38\x33\x20\x35\x2e\x31\x30\x31\x31\x4c\x20\x35\x2e\x31\x30\x31\
\x31\x20\x35\x2e\x31\x30\x31\x31\x4c\x20\x35\x2e\x31\x30\x31\x31\
\x20\x31\x35\x30\x2e\x38\x33\x4c\x20\x31\x35\x30\x2e\x38\x33\x20\
\x31\x35\x30\x2e\x38\x33\x7a\x22\x20\x69\x64\x3d\x22\x70\x61\x74\
\x68\x31\x30\x22\x20\x73\x74\x79\x6c\x65\x3d\x22\x73\x74\x72\x6f\
\x6b\x65\x3a\x6e\x6f\x6e\x65\x3b\x20\x66\x69\x6c\x6c\x3a\x23\x66\
\x66\x39\x39\x30\x30\x22\x2f\x3e\x0a\x3c\x2f\x67\x3e\x0a\x3c\x67\
\x20\x69\x64\x3d\x22\x67\x31\x31\x22\x3e\x0a\x3c\x70\x61\x74\x68\
\x20\x64\x3d\x22\x4d\x20\x31\x34\x30\x2e\x38\x32\x33\x20\x31\x31\
\x31\x2e\x37\x38\x33\x4c\x20\x34\x34\x2e\x33\x36\x37\x37\x20\x31\
\x34\x2e\x30\x37\x37\x31\x4c\x20\x31\x35\x2e\x31\x30\x38\x34\x20\
\x34\x34\x2e\x31\x34\x38\x39\x4c\x20\x31\x31\x31\x2e\x35\x36\x34\
\x20\x31\x34\x31\x2e\x38\x35\x34\x4c\x20\x31\x34\x30\x2e\x38\x32\
\x33\x20\x31\x31\x31\x2e\x37\x38\x33\x7a\x22\x20\x69\x64\x3d\x22\
\x70\x61\x74\x68\x31\x32\x22\x20\x73\x74\x79\x6c\x65\x3d\x22\x73\
\x74\x72\x6f\x6b\x65\x3a\x6e\x6f\x6e\x65\x3b\x20\x66\x69\x6c\x6c\
\x3a\x23\x30\x30\x30\x30\x30\x30\x22\x2f\x3e\x0a\x3c\x70\x61\x74\
\x68\x20\x64\x3d\x22\x4d\x20\x31\x31\x31\x2e\x37\x38\x33\x20\x31\
\x35\x2e\x31\x30\x38\x34\x4c\x20\x31\x34\x2e\x30\x37\x37\x31\x20\
\x31\x31\x31\x2e\x35\x36\x34\x4c\x20\x34\x34\x2e\x31\x34\x38\x39\
\x20\x31\x34\x30\x2e\x38\x32\x33\x4c\x20\x31\x34\x31\x2e\x38\x35\
\x35\x20\x34\x34\x2e\x33\x36\x37\x37\x4c\x20\x31\x31\x31\x2e\x37\
\x38\x33\x20\x31\x35\x2e\x31\x30\x38\x34\x7a\x22\x20\x69\x64\x3d\
\x22\x70\x61\x74\x68\x31\x33\x22\x20\x73\x74\x79\x6c\x65\x3d\x22\
\x73\x74\x72\x6f\x6b\x65\x3a\x6e\x6f\x6e\x65\x3b\x20\x66\x69\x6c\
\x6c\x3a\x23\x30\x30\x30\x30\x30\x30\x22\x2f\x3e\x0a\x3c\x2f\x67\
\x3e\x0a\x3c\x2f\x67\x3e\x0a\x3c\x2f\x73\x76\x67\x3e\x0a\
\x00\x00\x0c\x40\
\x3c\
\x3f\x78\x6d\x6c\x20\x76\x65\x72\x73\x69\x6f\x6e\x3d\x22\x31\x2e\
\x30\x22\x20\x65\x6e\x63\x6f\x64\x69\x6e\x67\x3d\x22\x75\x74\x66\
\x2d\x38\x22\x3f\x3e\x0a\x3c\x21\x2d\x2d\x20\x47\x65\x6e\x65\x72\
\x61\x74\x6f\x72\x3a\x20\x41\x64\x6f\x62\x65\x20\x49\x6c\x6c\x75\
\x73\x74\x72\x61\x74\x6f\x72\x20\x31\x30\x2c\x20\x53\x56\x47\x20\
\x45\x78\x70\x6f\x72\x74\x20\x50\x6c\x75\x67\x2d\x49\x6e\x20\x2e\
\x20\x53\x56\x47\x20\x56\x65\x72\x73\x69\x6f\x6e\x3a\x20\x33\x2e\
\x30\x2e\x30\x20\x42\x75\x69\x6c\x64\x20\x37\x36\x29\x20\x20\x2d\
\x2d\x3e\x3c\x73\x76\x67\x20\x65\x6e\x61\x62\x6c\x65\x2d\x62\x61\
\x63\x6b\x67\x72\x6f\x75\x6e\x64\x3d\x22\x6e\x65\x77\x20\x30\x20\
\x30\x20\x33\x34\x37\x20\x33\x34\x38\x22\x20\x68\x65\x69\x67\x68\
\x74\x3d\x22\x33\x34\x38\x22\x20\x69\x3a\x70\x61\x67\x65\x42\x6f\
\x75\x6e\x64\x73\x3d\x22\x30\x20\x37\x39\x32\x20\x36\x31\x32\x20\
\x30\x22\x20\x69\x3a\x72\x75\x6c\x65\x72\x4f\x72\x69\x67\x69\x6e\
\x3d\x22\x30\x20\x30\x22\x20\x69\x3a\x76\x69\x65\x77\x4f\x72\x69\
\x67\x69\x6e\x3d\x22\x31\x33\x31\x20\x35\x36\x37\x22\x20\x6f\x76\
\x65\x72\x66\x6c\x6f\x77\x3d\x22\x76\x69\x73\x69\x62\x6c\x65\x22\
\x20\x73\x70\x61\x63\x65\x3d\x22\x70\x72\x65\x73\x65\x72\x76\x65\
\x22\x20\x76\x69\x65\x77\x42\x6f\x78\x3d\x22\x2d\x32\x30\x20\x2d\
\x32\x30\x20\x33\x38\x37\x20\x33\x38\x38\x22\x20\x77\x69\x64\x74\
\x68\x3d\x22\x33\x34\x37\x22\x20\x78\x6d\x6c\x6e\x73\x3d\x22\x68\
\x74\x74\x70\x3a\x2f\x2f\x77\x77\x77\x2e\x77\x33\x2e\x6f\x72\x67\
\x2f\x32\x30\x30\x30\x2f\x73\x76\x67\x22\x20\x78\x6d\x6c\x6e\x73\
\x3a\x61\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x6e\x73\x2e\x61\x64\
\x6f\x62\x65\x2e\x63\x6f\x6d\x2f\x41\x64\x6f\x62\x65\x53\x56\x47\
\x56\x69\x65\x77\x65\x72\x45\x78\x74\x65\x6e\x73\x69\x6f\x6e\x73\
\x2f\x33\x2e\x30\x2f\x22\x20\x78\x6d\x6c\x6e\x73\x3a\x67\x72\x61\
\x70\x68\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x6e\x73\x2e\x61\x64\
\x6f\x62\x65\x2e\x63\x6f\x6d\x2f\x47\x72\x61\x70\x68\x73\x2f\x31\
\x2e\x30\x2f\x22\x20\x78\x6d\x6c\x6e\x73\x3a\x69\x3d\x22\x68\x74\
\x74\x70\x3a\x2f\x2f\x6e\x73\x2e\x61\x64\x6f\x62\x65\x2e\x63\x6f\
\x6d\x2f\x41\x64\x6f\x62\x65\x49\x6c\x6c\x75\x73\x74\x72\x61\x74\
\x6f\x72\x2f\x31\x30\x2e\x30\x2f\x22\x20\x78\x6d\x6c\x6e\x73\x3a\
\x78\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x6e\x73\x2e\x61\x64\x6f\
\x62\x65\x2e\x63\x6f\x6d\x2f\x45\x78\x74\x65\x6e\x73\x69\x62\x69\
\x6c\x69\x74\x79\x2f\x31\x2e\x30\x2f\x22\x20\x78\x6d\x6c\x6e\x73\
\x3a\x78\x6c\x69\x6e\x6b\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\
\x77\x77\x2e\x77\x33\x2e\x6f\x72\x67\x2f\x31\x39\x39\x39\x2f\x78\
\x6c\x69\x6e\x6b\x22\x3e\x0a\x3c\x6d\x65\x74\x61\x64\x61\x74\x61\
\x3e\x0a\x3c\x72\x64\x66\x3a\x52\x44\x46\x20\x78\x6d\x6c\x6e\x73\
\x3a\x63\x63\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x65\x62\x2e\
\x72\x65\x73\x6f\x75\x72\x63\x65\x2e\x6f\x72\x67\x2f\x63\x63\x2f\
\x22\x20\x78\x6d\x6c\x6e\x73\x3a\x64\x63\x3d\x22\x68\x74\x74\x70\
\x3a\x2f\x2f\x70\x75\x72\x6c\x2e\x6f\x72\x67\x2f\x64\x63\x2f\x65\
\x6c\x65\x6d\x65\x6e\x74\x73\x2f\x31\x2e\x31\x2f\x22\x20\x78\x6d\
\x6c\x6e\x73\x3a\x72\x64\x66\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\
\x77\x77\x77\x2e\x77\x33\x2e\x6f\x72\x67\x2f\x31\x39\x39\x39\x2f\
\x30\x32\x2f\x32\x32\x2d\x72\x64\x66\x2d\x73\x79\x6e\x74\x61\x78\
\x2d\x6e\x73\x23\x22\x3e\x0a\x3c\x63\x63\x3a\x57\x6f\x72\x6b\x20\
\x72\x64\x66\x3a\x61\x62\x6f\x75\x74\x3d\x22\x22\x3e\x0a\x3c\x64\
\x63\x3a\x74\x69\x74\x6c\x65\x3e\x4b\x65\x65\x70\x20\x54\x69\x64\
\x79\x20\x49\x6e\x73\x69\x64\x65\x3c\x2f\x64\x63\x3a\x74\x69\x74\
\x6c\x65\x3e\x0a\x3c\x64\x63\x3a\x64\x65\x73\x63\x72\x69\x70\x74\
\x69\x6f\x6e\x3e\x3c\x2f\x64\x63\x3a\x64\x65\x73\x63\x72\x69\x70\
\x74\x69\x6f\x6e\x3e\x0a\x3c\x64\x63\x3a\x73\x75\x62\x6a\x65\x63\
\x74\x3e\x0a\x3c\x72\x64\x66\x3a\x42\x61\x67\x3e\x0a\x3c\x72\x64\
\x66\x3a\x6c\x69\x3e\x3c\x2f\x72\x64\x66\x3a\x6c\x69\x3e\x0a\x3c\
\x72\x64\x66\x3a\x6c\x69\x3e\x73\x79\x6d\x62\x6f\x6c\x3c\x2f\x72\
\x64\x66\x3a\x6c\x69\x3e\x0a\x3c\x72\x64\x66\x3a\x6c\x69\x3e\x62\
\x69\x6e\x3c\x2f\x72\x64\x66\x3a\x6c\x69\x3e\x0a\x3c\x72\x64\x66\
\x3a\x6c\x69\x3e\x73\x69\x67\x6e\x73\x5f\x61\x6e\x64\x5f\x73\x79\
\x6d\x62\x6f\x6c\x73\x3c\x2f\x72\x64\x66\x3a\x6c\x69\x3e\x0a\x3c\
\x72\x64\x66\x3a\x6c\x69\x3e\x63\x6c\x65\x61\x6e\x3c\x2f\x72\x64\
\x66\x3a\x6c\x69\x3e\x0a\x3c\x72\x64\x66\x3a\x6c\x69\x3e\x72\x75\
\x62\x69\x73\x68\x3c\x2f\x72\x64\x66\x3a\x6c\x69\x3e\x0a\x3c\x72\
\x64\x66\x3a\x6c\x69\x3e\x74\x72\x61\x73\x68\x3c\x2f\x72\x64\x66\
\x3a\x6c\x69\x3e\x0a\x3c\x72\x64\x66\x3a\x6c\x69\x3e\x69\x6e\x73\
\x69\x64\x65\x3c\x2f\x72\x64\x66\x3a\x6c\x69\x3e\x0a\x3c\x72\x64\
\x66\x3a\x6c\x69\x3e\x67\x61\x72\x62\x61\x67\x65\x3c\x2f\x72\x64\
\x66\x3a\x6c\x69\x3e\x0a\x3c\x72\x64\x66\x3a\x6c\x69\x3e\x73\x69\
\x67\x6e\x3c\x2f\x72\x64\x66\x3a\x6c\x69\x3e\x0a\x3c\x2f\x72\x64\
\x66\x3a\x42\x61\x67\x3e\x0a\x3c\x2f\x64\x63\x3a\x73\x75\x62\x6a\
\x65\x63\x74\x3e\x0a\x3c\x64\x63\x3a\x70\x75\x62\x6c\x69\x73\x68\
\x65\x72\x3e\x0a\x3c\x63\x63\x3a\x41\x67\x65\x6e\x74\x20\x72\x64\
\x66\x3a\x61\x62\x6f\x75\x74\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\
\x77\x77\x77\x2e\x6f\x70\x65\x6e\x63\x6c\x69\x70\x61\x72\x74\x2e\
\x6f\x72\x67\x22\x3e\x0a\x3c\x64\x63\x3a\x74\x69\x74\x6c\x65\x3e\
\x4d\x61\x72\x74\x69\x6e\x20\x4f\x77\x65\x6e\x73\x3c\x2f\x64\x63\
\x3a\x74\x69\x74\x6c\x65\x3e\x0a\x3c\x2f\x63\x63\x3a\x41\x67\x65\
\x6e\x74\x3e\x0a\x3c\x2f\x64\x63\x3a\x70\x75\x62\x6c\x69\x73\x68\
\x65\x72\x3e\x0a\x3c\x64\x63\x3a\x63\x72\x65\x61\x74\x6f\x72\x3e\
\x0a\x3c\x63\x63\x3a\x41\x67\x65\x6e\x74\x3e\x0a\x3c\x64\x63\x3a\
\x74\x69\x74\x6c\x65\x3e\x4d\x61\x72\x74\x69\x6e\x20\x4f\x77\x65\
\x6e\x73\x3c\x2f\x64\x63\x3a\x74\x69\x74\x6c\x65\x3e\x0a\x3c\x2f\
\x63\x63\x3a\x41\x67\x65\x6e\x74\x3e\x0a\x3c\x2f\x64\x63\x3a\x63\
\x72\x65\x61\x74\x6f\x72\x3e\x0a\x3c\x64\x63\x3a\x72\x69\x67\x68\
\x74\x73\x3e\x0a\x3c\x63\x63\x3a\x41\x67\x65\x6e\x74\x3e\x0a\x3c\
\x64\x63\x3a\x74\x69\x74\x6c\x65\x3e\x4d\x61\x72\x74\x69\x6e\x20\
\x4f\x77\x65\x6e\x73\x3c\x2f\x64\x63\x3a\x74\x69\x74\x6c\x65\x3e\
\x0a\x3c\x2f\x63\x63\x3a\x41\x67\x65\x6e\x74\x3e\x0a\x3c\x2f\x64\
\x63\x3a\x72\x69\x67\x68\x74\x73\x3e\x0a\x3c\x64\x63\x3a\x64\x61\
\x74\x65\x3e\x3c\x2f\x64\x63\x3a\x64\x61\x74\x65\x3e\x0a\x3c\x64\
\x63\x3a\x66\x6f\x72\x6d\x61\x74\x3e\x69\x6d\x61\x67\x65\x2f\x73\
\x76\x67\x2b\x78\x6d\x6c\x3c\x2f\x64\x63\x3a\x66\x6f\x72\x6d\x61\
\x74\x3e\x0a\x3c\x64\x63\x3a\x74\x79\x70\x65\x20\x72\x64\x66\x3a\
\x72\x65\x73\x6f\x75\x72\x63\x65\x3d\x22\x68\x74\x74\x70\x3a\x2f\
\x2f\x70\x75\x72\x6c\x2e\x6f\x72\x67\x2f\x64\x63\x2f\x64\x63\x6d\
\x69\x74\x79\x70\x65\x2f\x53\x74\x69\x6c\x6c\x49\x6d\x61\x67\x65\
\x22\x2f\x3e\x0a\x3c\x63\x63\x3a\x6c\x69\x63\x65\x6e\x73\x65\x20\
\x72\x64\x66\x3a\x72\x65\x73\x6f\x75\x72\x63\x65\x3d\x22\x68\x74\
\x74\x70\x3a\x2f\x2f\x77\x65\x62\x2e\x72\x65\x73\x6f\x75\x72\x63\
\x65\x2e\x6f\x72\x67\x2f\x63\x63\x2f\x50\x75\x62\x6c\x69\x63\x44\
\x6f\x6d\x61\x69\x6e\x22\x2f\x3e\x0a\x3c\x64\x63\x3a\x6c\x61\x6e\
\x67\x75\x61\x67\x65\x3e\x65\x6e\x3c\x2f\x64\x63\x3a\x6c\x61\x6e\
\x67\x75\x61\x67\x65\x3e\x0a\x3c\x2f\x63\x63\x3a\x57\x6f\x72\x6b\
\x3e\x0a\x3c\x63\x63\x3a\x4c\x69\x63\x65\x6e\x73\x65\x20\x72\x64\
\x66\x3a\x61\x62\x6f\x75\x74\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\
\x77\x65\x62\x2e\x72\x65\x73\x6f\x75\x72\x63\x65\x2e\x6f\x72\x67\
\x2f\x63\x63\x2f\x50\x75\x62\x6c\x69\x63\x44\x6f\x6d\x61\x69\x6e\
\x22\x3e\x0a\x3c\x63\x63\x3a\x70\x65\x72\x6d\x69\x74\x73\x20\x72\
\x64\x66\x3a\x72\x65\x73\x6f\x75\x72\x63\x65\x3d\x22\x68\x74\x74\
\x70\x3a\x2f\x2f\x77\x65\x62\x2e\x72\x65\x73\x6f\x75\x72\x63\x65\
\x2e\x6f\x72\x67\x2f\x63\x63\x2f\x52\x65\x70\x72\x6f\x64\x75\x63\
\x74\x69\x6f\x6e\x22\x2f\x3e\x0a\x3c\x63\x63\x3a\x70\x65\x72\x6d\
\x69\x74\x73\x20\x72\x64\x66\x3a\x72\x65\x73\x6f\x75\x72\x63\x65\
\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x65\x62\x2e\x72\x65\x73\
\x6f\x75\x72\x63\x65\x2e\x6f\x72\x67\x2f\x63\x63\x2f\x44\x69\x73\
\x74\x72\x69\x62\x75\x74\x69\x6f\x6e\x22\x2f\x3e\x0a\x3c\x63\x63\
\x3a\x70\x65\x72\x6d\x69\x74\x73\x20\x72\x64\x66\x3a\x72\x65\x73\
\x6f\x75\x72\x63\x65\x3d\x22\x68\x74\x74\x70\x3a\x2f\x2f\x77\x65\
\x62\x2e\x72\x65\x73\x6f\x75\x72\x63\x65\x2e\x6f\x72\x67\x2f\x63\
\x63\x2f\x44\x65\x72\x69\x76\x61\x74\x69\x76\x65\x57\x6f\x72\x6b\
\x73\x22\x2f\x3e\x0a\x3c\x2f\x63\x63\x3a\x4c\x69\x63\x65\x6e\x73\
\x65\x3e\x0a\x3c\x2f\x72\x64\x66\x3a\x52\x44\x46\x3e\x0a\x3c\x2f\
\x6d\x65\x74\x61\x64\x61\x74\x61\x3e\x0a\x3c\x67\x20\x69\x3a\x64\
\x69\x6d\x6d\x65\x64\x50\x65\x72\x63\x65\x6e\x74\x3d\x22\x35\x30\
\x22\x20\x69\x3a\x6b\x6e\x6f\x63\x6b\x6f\x75\x74\x3d\x22\x4f\x66\
\x66\x22\x20\x69\x3a\x6c\x61\x79\x65\x72\x3d\x22\x79\x65\x73\x22\
\x20\x69\x3a\x72\x67\x62\x54\x72\x69\x6f\x3d\x22\x23\x34\x46\x30\
\x30\x38\x30\x30\x30\x46\x46\x46\x46\x22\x20\x69\x64\x3d\x22\x4c\
\x61\x79\x65\x72\x5f\x31\x22\x3e\x0a\x3c\x70\x61\x74\x68\x20\x64\
\x3d\x22\x4d\x33\x34\x37\x2c\x31\x37\x34\x63\x30\x2c\x39\x36\x2e\
\x30\x39\x38\x2d\x37\x37\x2e\x36\x37\x39\x2c\x31\x37\x34\x2d\x31\
\x37\x33\x2e\x35\x2c\x31\x37\x34\x43\x37\x37\x2e\x36\x37\x39\x2c\
\x33\x34\x38\x2c\x30\x2c\x32\x37\x30\x2e\x30\x39\x38\x2c\x30\x2c\
\x31\x37\x34\x20\x20\x20\x20\x43\x30\x2c\x37\x37\x2e\x39\x30\x32\
\x2c\x37\x37\x2e\x36\x37\x39\x2c\x30\x2c\x31\x37\x33\x2e\x35\x2c\
\x30\x43\x32\x36\x39\x2e\x33\x32\x31\x2c\x30\x2c\x33\x34\x37\x2c\
\x37\x37\x2e\x39\x30\x32\x2c\x33\x34\x37\x2c\x31\x37\x34\x7a\x22\
\x20\x66\x69\x6c\x6c\x3d\x22\x23\x31\x30\x41\x30\x34\x30\x22\x20\
\x69\x3a\x6b\x6e\x6f\x63\x6b\x6f\x75\x74\x3d\x22\x4f\x66\x66\x22\
\x2f\x3e\x0a\x3c\x70\x61\x74\x68\x20\x64\x3d\x22\x4d\x32\x33\x38\
\x2c\x35\x33\x63\x30\x2c\x31\x33\x2e\x38\x30\x37\x2d\x31\x31\x2e\
\x38\x36\x34\x2c\x32\x35\x2d\x32\x36\x2e\x35\x2c\x32\x35\x53\x31\
\x38\x35\x2c\x36\x36\x2e\x38\x30\x37\x2c\x31\x38\x35\x2c\x35\x33\
\x73\x31\x31\x2e\x38\x36\x34\x2d\x32\x35\x2c\x32\x36\x2e\x35\x2d\
\x32\x35\x20\x20\x20\x20\x53\x32\x33\x38\x2c\x33\x39\x2e\x31\x39\
\x33\x2c\x32\x33\x38\x2c\x35\x33\x7a\x22\x20\x66\x69\x6c\x6c\x3d\
\x22\x23\x46\x46\x46\x46\x46\x46\x22\x20\x69\x3a\x6b\x6e\x6f\x63\
\x6b\x6f\x75\x74\x3d\x22\x4f\x66\x66\x22\x2f\x3e\x0a\x3c\x70\x61\
\x74\x68\x20\x64\x3d\x22\x4d\x36\x36\x2c\x31\x37\x35\x63\x31\x2e\
\x30\x35\x35\x2c\x36\x2e\x33\x35\x35\x2c\x31\x39\x2e\x33\x33\x33\
\x2c\x31\x32\x36\x2e\x34\x31\x37\x2c\x31\x39\x2e\x33\x33\x33\x2c\
\x31\x32\x36\x2e\x34\x31\x37\x68\x36\x38\x2e\x33\x33\x33\x20\x20\
\x20\x20\x63\x30\x2c\x30\x2c\x31\x34\x2e\x31\x30\x35\x2d\x31\x32\
\x32\x2e\x35\x32\x34\x2c\x31\x34\x2e\x33\x33\x33\x2d\x31\x32\x36\
\x2e\x34\x31\x37\x63\x36\x2e\x32\x32\x34\x2d\x30\x2e\x36\x32\x32\
\x2c\x36\x2e\x36\x36\x37\x2d\x31\x33\x2d\x32\x2d\x31\x33\x63\x2d\
\x31\x32\x2e\x31\x36\x34\x2c\x30\x2d\x38\x39\x2e\x32\x30\x35\x2d\
\x30\x2e\x30\x35\x39\x2d\x39\x38\x2c\x30\x53\x36\x31\x2e\x31\x36\
\x37\x2c\x31\x37\x34\x2e\x34\x38\x37\x2c\x36\x36\x2c\x31\x37\x35\
\x7a\x22\x20\x66\x69\x6c\x6c\x3d\x22\x23\x46\x46\x46\x46\x46\x46\
\x22\x20\x69\x3a\x6b\x6e\x6f\x63\x6b\x6f\x75\x74\x3d\x22\x4f\x66\
\x66\x22\x2f\x3e\x0a\x3c\x70\x61\x74\x68\x20\x64\x3d\x22\x4d\x37\
\x38\x2c\x31\x34\x31\x63\x31\x37\x2e\x32\x39\x32\x2d\x35\x2e\x33\
\x32\x35\x2c\x32\x34\x2e\x31\x37\x39\x2d\x32\x33\x2e\x35\x33\x32\
\x2c\x32\x37\x2d\x33\x31\x63\x31\x34\x2e\x35\x31\x33\x2c\x36\x2e\
\x35\x39\x36\x2c\x34\x30\x2e\x33\x33\x33\x2c\x31\x32\x2e\x32\x36\
\x35\x2c\x35\x39\x2c\x38\x20\x20\x20\x20\x63\x33\x2e\x36\x38\x33\
\x2c\x31\x39\x2e\x34\x31\x39\x2d\x32\x38\x2e\x30\x34\x33\x2c\x31\
\x39\x2e\x33\x31\x2d\x32\x33\x2c\x33\x37\x43\x31\x33\x32\x2e\x35\
\x37\x37\x2c\x31\x34\x35\x2e\x37\x30\x35\x2c\x38\x39\x2e\x34\x30\
\x34\x2c\x31\x36\x37\x2e\x32\x39\x32\x2c\x37\x38\x2c\x31\x34\x31\
\x7a\x22\x20\x66\x69\x6c\x6c\x3d\x22\x23\x46\x46\x46\x46\x46\x46\
\x22\x20\x69\x3a\x6b\x6e\x6f\x63\x6b\x6f\x75\x74\x3d\x22\x4f\x66\
\x66\x22\x2f\x3e\x0a\x3c\x70\x61\x74\x68\x20\x64\x3d\x22\x4d\x31\
\x30\x33\x2c\x38\x32\x6c\x31\x33\x39\x2d\x31\x63\x2d\x30\x2e\x36\
\x2c\x33\x2e\x34\x32\x31\x2c\x33\x33\x2e\x36\x33\x33\x2c\x35\x37\
\x2e\x34\x39\x37\x2c\x32\x39\x2c\x36\x37\x63\x2d\x34\x2e\x30\x38\
\x39\x2c\x30\x2e\x34\x31\x38\x2d\x36\x37\x2c\x35\x2d\x36\x37\x2c\
\x35\x20\x20\x20\x20\x63\x36\x2e\x31\x30\x39\x2d\x39\x2e\x33\x37\
\x39\x2d\x31\x33\x2d\x34\x33\x2d\x31\x33\x2d\x34\x33\x4c\x31\x30\
\x33\x2c\x38\x32\x7a\x22\x20\x66\x69\x6c\x6c\x3d\x22\x23\x46\x46\
\x46\x46\x46\x46\x22\x20\x69\x3a\x6b\x6e\x6f\x63\x6b\x6f\x75\x74\
\x3d\x22\x4f\x66\x66\x22\x2f\x3e\x0a\x3c\x70\x61\x74\x68\x20\x64\
\x3d\x22\x4d\x32\x37\x30\x2c\x31\x35\x36\x6c\x2d\x36\x36\x2d\x33\
\x63\x30\x2c\x30\x2d\x32\x33\x2e\x35\x36\x35\x2c\x31\x34\x33\x2e\
\x33\x35\x35\x2d\x32\x34\x2c\x31\x34\x35\x73\x31\x2e\x38\x35\x35\
\x2c\x32\x2e\x35\x33\x36\x2c\x33\x2c\x31\x73\x35\x31\x2d\x38\x32\
\x2c\x35\x31\x2d\x38\x32\x20\x20\x20\x20\x73\x31\x39\x2e\x37\x35\
\x34\x2c\x38\x30\x2e\x37\x30\x31\x2c\x32\x30\x2c\x38\x32\x73\x33\
\x2e\x37\x32\x31\x2c\x31\x2e\x32\x30\x39\x2c\x34\x2c\x30\x53\x32\
\x37\x30\x2c\x31\x35\x36\x2c\x32\x37\x30\x2c\x31\x35\x36\x7a\x22\
\x20\x66\x69\x6c\x6c\x3d\x22\x23\x46\x46\x46\x46\x46\x46\x22\x20\
\x69\x3a\x6b\x6e\x6f\x63\x6b\x6f\x75\x74\x3d\x22\x4f\x66\x66\x22\
\x2f\x3e\x0a\x3c\x2f\x67\x3e\x0a\x3c\x2f\x73\x76\x67\x3e\x0a\
"
qt_resource_name = b"\
\x00\x06\
\x07\x03\x7d\xc3\
\x00\x69\
\x00\x6d\x00\x61\x00\x67\x00\x65\x00\x73\
\x00\x09\
\x08\x97\x87\xa7\
\x00\x68\
\x00\x65\x00\x61\x00\x72\x00\x74\x00\x2e\x00\x73\x00\x76\x00\x67\
\x00\x07\
\x08\x77\x5a\x07\
\x00\x62\
\x00\x61\x00\x64\x00\x2e\x00\x73\x00\x76\x00\x67\
\x00\x09\
\x08\x9b\xad\xc7\
\x00\x74\
\x00\x72\x00\x61\x00\x73\x00\x68\x00\x2e\x00\x73\x00\x76\x00\x67\
"
qt_resource_struct = b"\
\x00\x00\x00\x00\x00\x02\x00\x00\x00\x01\x00\x00\x00\x01\
\x00\x00\x00\x00\x00\x02\x00\x00\x00\x03\x00\x00\x00\x02\
\x00\x00\x00\x2a\x00\x00\x00\x00\x00\x01\x00\x00\x0f\x25\
\x00\x00\x00\x12\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\
\x00\x00\x00\x3e\x00\x00\x00\x00\x00\x01\x00\x00\x1c\x58\
"
def qInitResources():
    QtCore.qRegisterResourceData(0x01, qt_resource_struct, qt_resource_name, qt_resource_data)

def qCleanupResources():
    QtCore.qUnregisterResourceData(0x01, qt_resource_struct, qt_resource_name, qt_resource_data)
qInitResources()
9c234c40ae41398b94337fd4cce1abe7fcad4522 | 41490 | py | Python | tests/numpy/ufunc_test.py | tobiasholenstein/dace | 38fb56d12b59aa8dfe8bb1ff0068e29c5c75efc9 | ["BSD-3-Clause"]
import dace
import math
import numpy as np
import pytest
from common import compare_numpy_output
@compare_numpy_output(check_dtype=True)
def test_ufunc_add_ff(A: dace.float32[10], B: dace.float32[10]):
return np.add(A, B)
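Every test in this file follows the same pattern: a DaCe-typed function wrapped in `compare_numpy_output` from the local `common` module, which runs the function on random inputs and checks the result against plain NumPy. The harness itself is not shown here; the sketch below is a rough illustration of what such a decorator's core check is assumed to do — the name `compare_against_numpy` and its parameters are illustrative, not the real API.

```python
import numpy as np

def compare_against_numpy(dace_func, numpy_func, shape=(10,), dtype=np.float32,
                          non_zero=False, check_dtype=True):
    """Illustrative core of a compare_numpy_output-style check."""
    rng = np.random.default_rng(0)
    a = rng.random(shape).astype(dtype)
    b = rng.random(shape).astype(dtype)
    if non_zero:
        b[b == 0] = 1  # avoid division-by-zero style inputs
    out = dace_func(a, b)   # in the real harness, this runs the compiled DaCe program
    ref = numpy_func(a, b)  # reference result from plain NumPy
    assert np.allclose(out, ref, equal_nan=True)
    if check_dtype:
        assert out.dtype == ref.dtype
    return True
```

The `non_zero`, `positive`, and `check_dtype` keywords seen throughout the file map onto input filtering and the dtype assertion above; `validation_func` replaces the NumPy reference call entirely.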
@compare_numpy_output(check_dtype=True)
def test_ufunc_subtract_ff(A: dace.float32[10], B: dace.float32[10]):
return np.subtract(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_subtract_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.subtract(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_multiply_ff(A: dace.float32[10], B: dace.float32[10]):
return np.multiply(A, B)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_divide_ff(A: dace.float32[10], B: dace.float32[10]):
return np.divide(A, B)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_divide_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.divide(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_logaddexp_ff(A: dace.float32[10], B: dace.float32[10]):
return np.logaddexp(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_logaddexp2_ff(A: dace.float32[10], B: dace.float32[10]):
return np.logaddexp2(A, B)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_true_divide_ff(A: dace.float32[10], B: dace.float32[10]):
return np.true_divide(A, B)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_true_divide_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.true_divide(A, B)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_floor_divide_cc(A: dace.complex64[10], B: dace.complex64[10]):
return np.floor_divide(A, B)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_floor_divide_ff(A: dace.float32[10], B: dace.float32[10]):
return np.floor_divide(A, B)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_floor_divide_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.floor_divide(A, B)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_floor_divide_ss(A: dace.int32[10], B: dace.int32[10]):
return np.floor_divide(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_negative_f(A: dace.float32[10]):
return np.negative(A)
@compare_numpy_output(validation_func=lambda a: -a)
def test_ufunc_negative_u(A: dace.uint32[10]):
return np.negative(A)
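Note that `test_ufunc_negative_u` supplies a `validation_func` instead of `check_dtype`, presumably because negating an unsigned array does not produce negative values — it wraps around modulo 2**32:

```python
import numpy as np

# Negating unsigned integers wraps modulo 2**32 instead of going negative.
a = np.array([1, 2], dtype=np.uint32)
print(np.negative(a))  # [4294967295 4294967294]
```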
@compare_numpy_output(check_dtype=True)
def test_ufunc_positive_f(A: dace.float32[10]):
return np.positive(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_power_ff(A: dace.float32[10], B: dace.float32[10]):
return np.power(A, B)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_power_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.power(A, B)
@compare_numpy_output(non_zero=True, validation_func=lambda a, b: a**b)
def test_ufunc_power_ss(A: dace.int32[10], B: dace.int32[10]):
return np.power(A, B)
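The signed-integer power test above also swaps in a `validation_func` (`a**b`), likely because NumPy refuses negative integer exponents for integer dtypes outright:

```python
import numpy as np

# Integers raised to negative integer powers are a ValueError in NumPy.
try:
    np.power(np.array([2], dtype=np.int32), np.array([-1], dtype=np.int32))
    raised = False
except ValueError:
    raised = True
print(raised)  # True
```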
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_float_power_ff(A: dace.float32[10], B: dace.float32[10]):
return np.float_power(A, B)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_float_power_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.float_power(A, B)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_float_power_ss(A: dace.int32[10], B: dace.int32[10]):
return np.float_power(A, B)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_remainder_ff(A: dace.float32[10], B: dace.float32[10]):
return np.remainder(A, B)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_remainder_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.remainder(A, B)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_remainder_ss(A: dace.int32[10], B: dace.int32[10]):
return np.remainder(A, B)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_fmod_ff(A: dace.float32[10], B: dace.float32[10]):
return np.fmod(A, B)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_fmod_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.fmod(A, B)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_fmod_ss(A: dace.int32[10], B: dace.int32[10]):
return np.fmod(A, B)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_divmod_ff(A: dace.float32[10], B: dace.float32[10]):
Q, R = np.divmod(A, B)
return Q, R
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_divmod_uu(A: dace.uint32[10], B: dace.uint32[10]):
Q, R = np.divmod(A, B)
return Q, R
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_divmod_ss(A: dace.int32[10], B: dace.int32[10]):
Q, R = np.divmod(A, B)
return Q, R
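`np.divmod`, like `np.floor_divide` and `np.remainder` above, uses floor-division semantics, so the quotient rounds toward negative infinity and the remainder takes the sign of the divisor:

```python
import numpy as np

# Floor-division semantics: -7 // 3 == -3 and -7 % 3 == 2.
q, r = np.divmod(np.array([7, -7]), 3)
print(q)  # [ 2 -3]
print(r)  # [1 2]
```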
@compare_numpy_output(check_dtype=True)
def test_ufunc_absolute_c(A: dace.complex64[10]):
return np.absolute(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_absolute_f(A: dace.float32[10]):
return np.absolute(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_absolute_u(A: dace.uint32[10]):
return np.absolute(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_fabs_c(A: dace.complex64[10]):
return np.fabs(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_fabs_f(A: dace.float32[10]):
return np.fabs(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_fabs_u(A: dace.uint32[10]):
return np.fabs(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_rint_c(A: dace.complex64[10]):
return np.rint(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_rint_f(A: dace.float32[10]):
return np.rint(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_rint_u(A: dace.uint32[10]):
return np.rint(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_sign_c(A: dace.complex64[10]):
return np.sign(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_sign_f(A: dace.float32[10]):
return np.sign(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_sign_u(A: dace.uint32[10]):
return np.sign(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_heaviside_cc(A: dace.complex64[10], B: dace.complex64[10]):
return np.heaviside(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_heaviside_ff(A: dace.float32[10], B: dace.float32[10]):
return np.heaviside(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_heaviside_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.heaviside(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_conj_c(A: dace.complex64[10]):
return np.conj(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_conj_f(A: dace.float32[10]):
return np.conj(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_conj_u(A: dace.uint32[10]):
return np.conj(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_conjugate_c(A: dace.complex64[10]):
return np.conjugate(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_conjugate_f(A: dace.float32[10]):
return np.conjugate(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_conjugate_u(A: dace.uint32[10]):
return np.conjugate(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_exp_c(A: dace.complex64[10]):
return np.exp(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_exp_f(A: dace.float32[10]):
return np.exp(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_exp_u(A: dace.uint32[10]):
return np.exp(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_exp2_c(A: dace.complex64[10]):
return np.exp2(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_exp2_f(A: dace.float32[10]):
return np.exp2(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_exp2_u(A: dace.uint32[10]):
return np.exp2(A)
@compare_numpy_output(positive=True, check_dtype=True)
def test_ufunc_log_c(A: dace.complex64[10]):
return np.log(A)
@compare_numpy_output(positive=True, check_dtype=True)
def test_ufunc_log_f(A: dace.float32[10]):
return np.log(A)
@compare_numpy_output(positive=True, check_dtype=True)
def test_ufunc_log_u(A: dace.uint32[10]):
return np.log(A)
@compare_numpy_output(positive=True, check_dtype=True)
def test_ufunc_log2_c(A: dace.complex64[10]):
return np.log2(A)
@compare_numpy_output(positive=True, check_dtype=True)
def test_ufunc_log2_f(A: dace.float32[10]):
return np.log2(A)
@compare_numpy_output(positive=True, check_dtype=True)
def test_ufunc_log2_u(A: dace.uint32[10]):
return np.log2(A)
@compare_numpy_output(positive=True, check_dtype=True)
def test_ufunc_log10_c(A: dace.complex64[10]):
return np.log10(A)
@compare_numpy_output(positive=True, check_dtype=True)
def test_ufunc_log10_f(A: dace.float32[10]):
return np.log10(A)
@compare_numpy_output(positive=True, check_dtype=True)
def test_ufunc_log10_u(A: dace.uint32[10]):
return np.log10(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_expm1_c(A: dace.complex64[10]):
return np.expm1(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_expm1_f(A: dace.float32[10]):
return np.expm1(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_expm1_u(A: dace.uint32[10]):
return np.expm1(A)
@compare_numpy_output(positive=True, check_dtype=True)
def test_ufunc_log1p_c(A: dace.complex64[10]):
return np.log1p(A)
@compare_numpy_output(positive=True, check_dtype=True)
def test_ufunc_log1p_f(A: dace.float32[10]):
return np.log1p(A)
@compare_numpy_output(positive=True, check_dtype=True)
def test_ufunc_log1p_u(A: dace.uint32[10]):
return np.log1p(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_sqrt_c(A: dace.complex64[10]):
return np.sqrt(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_sqrt_f(A: dace.float32[10]):
return np.sqrt(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_sqrt_u(A: dace.uint32[10]):
return np.sqrt(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_square_c(A: dace.complex64[10]):
return np.square(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_square_f(A: dace.float32[10]):
return np.square(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_square_u(A: dace.uint32[10]):
return np.square(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_cbrt_c(A: dace.complex64[10]):
return np.cbrt(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_cbrt_f(A: dace.float32[10]):
return np.cbrt(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_cbrt_u(A: dace.uint32[10]):
return np.cbrt(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_reciprocal_c(A: dace.complex64[10]):
return np.reciprocal(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_reciprocal_f(A: dace.float32[10]):
return np.reciprocal(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_reciprocal_u(A: dace.uint32[10]):
return np.reciprocal(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_gcd_cc(A: dace.complex64[10], B: dace.complex64[10]):
return np.gcd(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_gcd_ff(A: dace.float32[10], B: dace.float32[10]):
return np.gcd(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_gcd_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.gcd(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_lcm_cc(A: dace.complex64[10], B: dace.complex64[10]):
return np.lcm(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_lcm_ff(A: dace.float32[10], B: dace.float32[10]):
return np.lcm(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_lcm_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.lcm(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_sin_c(A: dace.complex64[10]):
return np.sin(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_sin_f(A: dace.float32[10]):
return np.sin(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_sin_u(A: dace.uint32[10]):
return np.sin(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_cos_c(A: dace.complex64[10]):
return np.cos(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_cos_f(A: dace.float32[10]):
return np.cos(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_cos_u(A: dace.uint32[10]):
return np.cos(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_tan_c(A: dace.complex64[10]):
return np.tan(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_tan_f(A: dace.float32[10]):
return np.tan(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_tan_u(A: dace.uint32[10]):
return np.tan(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_arcsin_c(A: dace.complex64[10]):
return np.arcsin(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_arcsin_f(A: dace.float32[10]):
return np.arcsin(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_arcsin_u(A: dace.uint32[10]):
return np.arcsin(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_arccos_c(A: dace.complex64[10]):
return np.arccos(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_arccos_f(A: dace.float32[10]):
return np.arccos(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_arccos_u(A: dace.uint32[10]):
return np.arccos(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_arctan_c(A: dace.complex64[10]):
return np.arctan(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_arctan_f(A: dace.float32[10]):
return np.arctan(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_arctan_u(A: dace.uint32[10]):
return np.arctan(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_sinh_c(A: dace.complex64[10]):
return np.sinh(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_sinh_f(A: dace.float32[10]):
return np.sinh(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_sinh_u(A: dace.uint32[10]):
return np.sinh(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_cosh_c(A: dace.complex64[10]):
return np.cosh(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_cosh_f(A: dace.float32[10]):
return np.cosh(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_cosh_u(A: dace.uint32[10]):
return np.cosh(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_tanh_c(A: dace.complex64[10]):
return np.tanh(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_tanh_f(A: dace.float32[10]):
return np.tanh(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_tanh_u(A: dace.uint32[10]):
return np.tanh(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_arcsinh_c(A: dace.complex64[10]):
return np.arcsinh(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_arcsinh_f(A: dace.float32[10]):
return np.arcsinh(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_arcsinh_u(A: dace.uint32[10]):
return np.arcsinh(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_arccosh_c(A: dace.complex64[10]):
return np.arccosh(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_arccosh_f(A: dace.float32[10]):
return np.arccosh(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_arccosh_u(A: dace.uint32[10]):
return np.arccosh(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_arctanh_c(A: dace.complex64[10]):
return np.arctanh(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_arctanh_f(A: dace.float32[10]):
return np.arctanh(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_arctanh_u(A: dace.uint32[10]):
return np.arctanh(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_arctan2_cc(A: dace.complex64[10], B: dace.complex64[10]):
return np.arctan2(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_arctan2_ff(A: dace.float32[10], B: dace.float32[10]):
return np.arctan2(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_arctan2_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.arctan2(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_hypot_cc(A: dace.complex64[10], B: dace.complex64[10]):
return np.hypot(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_hypot_ff(A: dace.float32[10], B: dace.float32[10]):
return np.hypot(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_hypot_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.hypot(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_degrees_c(A: dace.complex64[10]):
return np.degrees(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_degrees_f(A: dace.float32[10]):
return np.degrees(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_degrees_u(A: dace.uint32[10]):
return np.degrees(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_rad2deg_c(A: dace.complex64[10]):
return np.rad2deg(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_rad2deg_f(A: dace.float32[10]):
return np.rad2deg(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_rad2deg_u(A: dace.uint32[10]):
return np.rad2deg(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_radians_c(A: dace.complex64[10]):
return np.radians(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_radians_f(A: dace.float32[10]):
return np.radians(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_radians_u(A: dace.uint32[10]):
return np.radians(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_deg2rad_c(A: dace.complex64[10]):
return np.deg2rad(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_deg2rad_f(A: dace.float32[10]):
return np.deg2rad(A)
@compare_numpy_output(non_zero=True, check_dtype=True)
def test_ufunc_deg2rad_u(A: dace.uint32[10]):
return np.deg2rad(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_bitwise_and_cc(A: dace.complex64[10], B: dace.complex64[10]):
return np.bitwise_and(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_bitwise_and_ff(A: dace.float32[10], B: dace.float32[10]):
return np.bitwise_and(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_bitwise_and_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.bitwise_and(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_bitwise_and_su(A: dace.int32[10], B: dace.uint32[10]):
return np.bitwise_and(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_bitwise_or_cc(A: dace.complex64[10], B: dace.complex64[10]):
return np.bitwise_or(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_bitwise_or_ff(A: dace.float32[10], B: dace.float32[10]):
return np.bitwise_or(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_bitwise_or_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.bitwise_or(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_bitwise_or_su(A: dace.int32[10], B: dace.uint32[10]):
return np.bitwise_or(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_bitwise_xor_cc(A: dace.complex64[10], B: dace.complex64[10]):
return np.bitwise_xor(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_bitwise_xor_ff(A: dace.float32[10], B: dace.float32[10]):
return np.bitwise_xor(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_bitwise_xor_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.bitwise_xor(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_bitwise_xor_su(A: dace.int32[10], B: dace.uint32[10]):
return np.bitwise_xor(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_invert_c(A: dace.complex64[10]):
return np.invert(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_invert_f(A: dace.float32[10]):
return np.invert(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_invert_u(A: dace.uint32[10]):
return np.invert(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_invert_s(A: dace.int32[10]):
return np.invert(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_left_shift_cc(A: dace.complex64[10], B: dace.complex64[10]):
return np.left_shift(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_left_shift_ff(A: dace.float32[10], B: dace.float32[10]):
return np.left_shift(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_left_shift_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.left_shift(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_left_shift_su(A: dace.int32[10], B: dace.uint32[10]):
return np.left_shift(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_right_shift_cc(A: dace.complex64[10], B: dace.complex64[10]):
return np.right_shift(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_right_shift_ff(A: dace.float32[10], B: dace.float32[10]):
return np.right_shift(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_right_shift_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.right_shift(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_right_shift_su(A: dace.int32[10], B: dace.uint32[10]):
return np.right_shift(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_greater_ff(A: dace.float32[10], B: dace.float32[10]):
return np.greater(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_greater_equal_ff(A: dace.float32[10], B: dace.float32[10]):
return np.greater_equal(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_less_ff(A: dace.float32[10], B: dace.float32[10]):
return np.less(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_less_equal_ff(A: dace.float32[10], B: dace.float32[10]):
return np.less_equal(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_equal_ff(A: dace.float32[10], B: dace.float32[10]):
return np.equal(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_not_equal_ff(A: dace.float32[10], B: dace.float32[10]):
return np.not_equal(A, B)
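The comparison ufuncs above all return boolean arrays regardless of the input dtype, which is exactly what `check_dtype=True` pins down:

```python
import numpy as np

a = np.array([1.0, 2.0], dtype=np.float32)
b = np.array([2.0, 2.0], dtype=np.float32)
print(np.less(a, b))        # [ True False]
print(np.less(a, b).dtype)  # bool
```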
@pytest.mark.skip
@compare_numpy_output(check_dtype=True)
def test_ufunc_logical_and_cc(A: dace.complex64[10], B: dace.complex64[10]):
return np.logical_and(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_logical_and_ff(A: dace.float32[10], B: dace.float32[10]):
return np.logical_and(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_logical_and_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.logical_and(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_logical_and_su(A: dace.int32[10], B: dace.uint32[10]):
return np.logical_and(A, B)
@pytest.mark.skip
@compare_numpy_output(check_dtype=True)
def test_ufunc_logical_or_cc(A: dace.complex64[10], B: dace.complex64[10]):
return np.logical_or(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_logical_or_ff(A: dace.float32[10], B: dace.float32[10]):
return np.logical_or(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_logical_or_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.logical_or(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_logical_or_su(A: dace.int32[10], B: dace.uint32[10]):
return np.logical_or(A, B)
@pytest.mark.skip
@compare_numpy_output(check_dtype=True)
def test_ufunc_logical_xor_cc(A: dace.complex64[10], B: dace.complex64[10]):
return np.logical_xor(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_logical_xor_ff(A: dace.float32[10], B: dace.float32[10]):
return np.logical_xor(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_logical_xor_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.logical_xor(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_logical_xor_su(A: dace.int32[10], B: dace.uint32[10]):
return np.logical_xor(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_maximum_ff(A: dace.float32[10], B: dace.float32[10]):
return np.maximum(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_maximum_nan_ff(A: dace.float32[10], B: dace.float32[10]):
C = np.true_divide(A, 0)
return np.maximum(C, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_fmax_ff(A: dace.float32[10], B: dace.float32[10]):
return np.fmax(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_fmax_nan_ff(A: dace.float32[10], B: dace.float32[10]):
C = np.true_divide(A, 0)
return np.fmax(C, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_minimum_ff(A: dace.float32[10], B: dace.float32[10]):
return np.minimum(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_minimum_nan_ff(A: dace.float32[10], B: dace.float32[10]):
C = np.true_divide(A, 0)
return np.minimum(C, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_fmin_ff(A: dace.float32[10], B: dace.float32[10]):
return np.fmin(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_fmin_nan_ff(A: dace.float32[10], B: dace.float32[10]):
C = np.true_divide(A, 0)
return np.fmin(C, B)
def test_ufunc_isfinite_c():
@compare_numpy_output(check_dtype=True)
def ufunc_isfinite_c(A: dace.complex64[10]):
A[0] = np.inf
A[1] = np.NaN
return np.isfinite(A)
args = dace.Config.get('compiler', 'cpu', 'args')
print(args)
if args.find('-ffast-math') >= 0:
new_args = args.replace('-ffast-math', '-fno-finite-math-only')
print(new_args)
dace.Config.set('compiler', 'cpu', 'args', value=new_args)
print(dace.Config.get('compiler', 'cpu', 'args'))
ufunc_isfinite_c()
dace.Config.set('compiler', 'cpu', 'args', value=args)
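Each of the `is*` tests below repeats the same get/patch/restore dance on the compiler arguments. One way to factor that out is a context manager that always restores the original flags, even if the body raises. This is only a sketch: `get_args`/`set_args` stand in for the real `dace.Config.get`/`dace.Config.set` calls so the example is self-contained.

```python
from contextlib import contextmanager


@contextmanager
def swapped_flag(get_args, set_args,
                 old='-ffast-math', new='-fno-finite-math-only'):
    """Temporarily replace compiler flag `old` with `new`; restore on exit."""
    original = get_args()
    try:
        if old in original:
            set_args(original.replace(old, new))
        yield
    finally:
        set_args(original)
```

With this helper, the body of each test shrinks to `with swapped_flag(...): ufunc_isfinite_c()`.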
def test_ufunc_isfinite_f():
@compare_numpy_output(check_dtype=True)
def ufunc_isfinite_f(A: dace.float32[10]):
A[0] = np.inf
A[1] = np.NaN
return np.isfinite(A)
args = dace.Config.get('compiler', 'cpu', 'args')
print(args)
if args.find('-ffast-math') >= 0:
new_args = args.replace('-ffast-math', '-fno-finite-math-only')
print(new_args)
dace.Config.set('compiler', 'cpu', 'args', value=new_args)
print(dace.Config.get('compiler', 'cpu', 'args'))
ufunc_isfinite_f()
dace.Config.set('compiler', 'cpu', 'args', value=args)
# NumPy accepts integer arrays in np.isfinite.
# However, assigning inf to an element of an integer array fails with
# "<class 'OverflowError'>: cannot convert float infinity to integer".
@pytest.mark.skip
@compare_numpy_output(validation_func=lambda a: np.isfinite(a))
def test_ufunc_isfinite_u(A: dace.uint32[10]):
A[0] = np.inf
A[1] = np.NaN
return np.isfinite(A)
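The comment above can be demonstrated directly: calling `np.isfinite` on an integer array is fine (integers are always finite), but storing `np.inf` into one raises before the ufunc is ever reached.

```python
import numpy as np

A = np.arange(10, dtype=np.uint32)
# Integer values are always finite, so isfinite is trivially all-True:
assert np.isfinite(A).all()

# Writing inf into an integer array is where things blow up:
raised = False
try:
    A[0] = np.inf
except OverflowError:
    raised = True
```

This is why the `_u` variants of the `is*` tests are skipped: the test body mutates `A` with `np.inf` before calling the ufunc.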
def test_ufunc_isinf_c():
@compare_numpy_output(check_dtype=True)
def ufunc_isinf_c(A: dace.complex64[10]):
A[0] = np.inf
A[1] = np.NaN
return np.isinf(A)
args = dace.Config.get('compiler', 'cpu', 'args')
print(args)
if args.find('-ffast-math') >= 0:
new_args = args.replace('-ffast-math', '-fno-finite-math-only')
print(new_args)
dace.Config.set('compiler', 'cpu', 'args', value=new_args)
print(dace.Config.get('compiler', 'cpu', 'args'))
ufunc_isinf_c()
dace.Config.set('compiler', 'cpu', 'args', value=args)
def test_ufunc_isinf_f():
@compare_numpy_output(check_dtype=True)
def ufunc_isinf_f(A: dace.float32[10]):
A[0] = np.inf
A[1] = np.NaN
return np.isinf(A)
args = dace.Config.get('compiler', 'cpu', 'args')
print(args)
if args.find('-ffast-math') >= 0:
new_args = args.replace('-ffast-math', '-fno-finite-math-only')
print(new_args)
dace.Config.set('compiler', 'cpu', 'args', value=new_args)
print(dace.Config.get('compiler', 'cpu', 'args'))
ufunc_isinf_f()
dace.Config.set('compiler', 'cpu', 'args', value=args)
# NumPy accepts integer arrays in np.isinf.
# However, assigning inf to an element of an integer array fails with
# "<class 'OverflowError'>: cannot convert float infinity to integer".
@pytest.mark.skip
@compare_numpy_output(validation_func=lambda a: np.isinf(a))
def test_ufunc_isinf_u(A: dace.uint32[10]):
A[0] = np.inf
A[1] = np.NaN
return np.isinf(A)
def test_ufunc_isnan_c():
@compare_numpy_output(check_dtype=True)
def ufunc_isnan_c(A: dace.complex64[10]):
A[0] = np.inf
A[1] = np.NaN
return np.isnan(A)
args = dace.Config.get('compiler', 'cpu', 'args')
print(args)
if args.find('-ffast-math') >= 0:
new_args = args.replace('-ffast-math', '-fno-finite-math-only')
print(new_args)
dace.Config.set('compiler', 'cpu', 'args', value=new_args)
print(dace.Config.get('compiler', 'cpu', 'args'))
ufunc_isnan_c()
dace.Config.set('compiler', 'cpu', 'args', value=args)
def test_ufunc_isnan_f():
@compare_numpy_output(check_dtype=True)
def ufunc_isnan_f(A: dace.float32[10]):
A[0] = np.inf
A[1] = np.NaN
return np.isnan(A)
args = dace.Config.get('compiler', 'cpu', 'args')
print(args)
if args.find('-ffast-math') >= 0:
new_args = args.replace('-ffast-math', '-fno-finite-math-only')
print(new_args)
dace.Config.set('compiler', 'cpu', 'args', value=new_args)
print(dace.Config.get('compiler', 'cpu', 'args'))
ufunc_isnan_f()
dace.Config.set('compiler', 'cpu', 'args', value=args)
# NumPy accepts integer arrays in np.isnan.
# However, assigning inf to an element of an integer array fails with
# "<class 'OverflowError'>: cannot convert float infinity to integer".
@pytest.mark.skip
@compare_numpy_output(validation_func=lambda a: np.isnan(a))
def test_ufunc_isnan_u(A: dace.uint32[10]):
A[0] = np.inf
A[1] = np.NaN
return np.isnan(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_signbit_c(A: dace.complex64[10]):
return np.signbit(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_signbit_f(A: dace.float32[10]):
return np.signbit(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_signbit_u(A: dace.uint32[10]):
return np.signbit(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_copysign_cc(A: dace.complex64[10], B: dace.complex64[10]):
return np.copysign(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_copysign_ff(A: dace.float32[10], B: dace.float32[10]):
return np.copysign(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_copysign_fd(A: dace.float32[10], B: dace.float64[10]):
return np.copysign(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_copysign_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.copysign(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_nextafter_cc(A: dace.complex64[10], B: dace.complex64[10]):
return np.nextafter(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_nextafter_ff(A: dace.float32[10], B: dace.float32[10]):
return np.nextafter(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_nextafter_fd(A: dace.float32[10], B: dace.float64[10]):
return np.nextafter(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_nextafter_uu(A: dace.uint32[10], B: dace.uint32[10]):
return np.nextafter(A, B)
@compare_numpy_output(validation_func=lambda a: np.nextafter(a, np.inf) - a)
def test_ufunc_spacing_c(A: dace.complex64[10]):
return np.spacing(A)
@compare_numpy_output(validation_func=lambda a: np.nextafter(a, np.inf) - a)
def test_ufunc_spacing_f(A: dace.float32[10]):
return np.spacing(A)
@compare_numpy_output(validation_func=lambda a: np.nextafter(a, np.inf) - a)
def test_ufunc_spacing_u(A: dace.uint32[10]):
return np.spacing(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_modf_c(A: dace.complex64[10]):
Q, R = np.modf(A)
return Q, R
@compare_numpy_output(check_dtype=True)
def test_ufunc_modf_f(A: dace.float32[10]):
    Q, R = np.modf(A)
return Q, R
@compare_numpy_output(check_dtype=True)
def test_ufunc_modf_u(A: dace.uint32[10]):
Q, R = np.modf(A)
return Q, R
@compare_numpy_output(check_dtype=True)
def test_ufunc_ldexp_cc(A: dace.complex64[10], B: dace.complex64[10]):
return np.ldexp(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_ldexp_fs(A: dace.float32[10], B: dace.int32[10]):
return np.ldexp(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_ldexp_ss(A: dace.int32[10], B: dace.int32[10]):
return np.ldexp(A, B)
@compare_numpy_output(check_dtype=True)
def test_ufunc_frexp_c(A: dace.complex64[10]):
Q, R = np.frexp(A)
return Q, R
@compare_numpy_output(check_dtype=True)
def test_ufunc_frexp_f(A: dace.float32[10]):
Q, R = np.frexp(A)
return Q, R
@compare_numpy_output(check_dtype=True)
def test_ufunc_frexp_u(A: dace.uint32[10]):
Q, R = np.frexp(A)
return Q, R
@compare_numpy_output(check_dtype=True)
def test_ufunc_floor_c(A: dace.complex64[10]):
return np.floor(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_floor_f(A: dace.float32[10]):
return np.floor(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_floor_u(A: dace.uint32[10]):
return np.floor(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_ceil_c(A: dace.complex64[10]):
return np.ceil(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_ceil_f(A: dace.float32[10]):
return np.ceil(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_ceil_u(A: dace.uint32[10]):
return np.ceil(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_trunc_c(A: dace.complex64[10]):
return np.trunc(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_trunc_f(A: dace.float32[10]):
return np.trunc(A)
@compare_numpy_output(check_dtype=True)
def test_ufunc_trunc_u(A: dace.uint32[10]):
return np.trunc(A)
if __name__ == "__main__":
test_ufunc_add_ff()
test_ufunc_subtract_ff()
test_ufunc_subtract_uu()
test_ufunc_multiply_ff()
test_ufunc_divide_ff()
test_ufunc_divide_uu()
test_ufunc_logaddexp_ff()
test_ufunc_logaddexp2_ff()
test_ufunc_true_divide_ff()
test_ufunc_true_divide_uu()
test_ufunc_floor_divide_cc()
test_ufunc_floor_divide_ff()
test_ufunc_floor_divide_uu()
test_ufunc_floor_divide_ss()
test_ufunc_negative_f()
test_ufunc_negative_u() # NumPy doesn't change unsigned to signed
test_ufunc_positive_f()
test_ufunc_power_ff()
test_ufunc_power_uu()
test_ufunc_power_ss() # DaCe implementation behaves like Python
test_ufunc_float_power_ff()
test_ufunc_float_power_uu()
test_ufunc_float_power_ss()
test_ufunc_remainder_ff()
test_ufunc_remainder_uu()
test_ufunc_remainder_ss()
test_ufunc_fmod_ff()
test_ufunc_fmod_uu()
test_ufunc_fmod_ss()
test_ufunc_divmod_ff()
test_ufunc_divmod_uu()
test_ufunc_divmod_ss()
test_ufunc_absolute_c()
test_ufunc_absolute_f()
test_ufunc_absolute_u()
test_ufunc_fabs_c()
test_ufunc_fabs_f()
test_ufunc_fabs_u()
test_ufunc_rint_c()
test_ufunc_rint_f()
test_ufunc_rint_u()
test_ufunc_sign_c()
test_ufunc_sign_f()
test_ufunc_sign_u()
test_ufunc_heaviside_cc()
test_ufunc_heaviside_ff()
test_ufunc_heaviside_uu()
test_ufunc_conj_c()
test_ufunc_conj_f()
test_ufunc_conj_u()
test_ufunc_conjugate_c()
test_ufunc_conjugate_f()
test_ufunc_conjugate_u()
test_ufunc_exp_c()
test_ufunc_exp_f()
test_ufunc_exp_u()
test_ufunc_exp2_c()
test_ufunc_exp2_f()
test_ufunc_exp2_u()
test_ufunc_log_c()
test_ufunc_log_f()
test_ufunc_log_u()
test_ufunc_log2_c()
test_ufunc_log2_f()
test_ufunc_log2_u()
test_ufunc_log10_c()
test_ufunc_log10_f()
test_ufunc_log10_u()
test_ufunc_expm1_c()
test_ufunc_expm1_f()
test_ufunc_expm1_u()
test_ufunc_log1p_c()
test_ufunc_log1p_f()
test_ufunc_log1p_u()
test_ufunc_sqrt_c()
test_ufunc_sqrt_f()
test_ufunc_sqrt_u()
test_ufunc_square_c()
test_ufunc_square_f()
test_ufunc_square_u()
test_ufunc_cbrt_c()
test_ufunc_cbrt_f()
test_ufunc_cbrt_u()
test_ufunc_reciprocal_c()
test_ufunc_reciprocal_f()
test_ufunc_reciprocal_u()
test_ufunc_gcd_cc()
test_ufunc_gcd_ff()
test_ufunc_gcd_uu()
test_ufunc_lcm_cc()
test_ufunc_lcm_ff()
test_ufunc_lcm_uu()
test_ufunc_sin_c()
test_ufunc_sin_f()
test_ufunc_sin_u()
test_ufunc_cos_c()
test_ufunc_cos_f()
test_ufunc_cos_u()
test_ufunc_tan_c()
test_ufunc_tan_f()
test_ufunc_tan_u()
test_ufunc_arcsin_c()
test_ufunc_arcsin_f()
test_ufunc_arcsin_u()
test_ufunc_arccos_c()
test_ufunc_arccos_f()
test_ufunc_arccos_u()
test_ufunc_arctan_c()
test_ufunc_arctan_f()
test_ufunc_arctan_u()
test_ufunc_sinh_c()
test_ufunc_sinh_f()
test_ufunc_sinh_u()
test_ufunc_cosh_c()
test_ufunc_cosh_f()
test_ufunc_cosh_u()
test_ufunc_tanh_c()
test_ufunc_tanh_f()
test_ufunc_tanh_u()
test_ufunc_arcsinh_c()
test_ufunc_arcsinh_f()
test_ufunc_arcsinh_u()
test_ufunc_arccosh_c()
test_ufunc_arccosh_f()
test_ufunc_arccosh_u()
test_ufunc_arctanh_c()
test_ufunc_arctanh_f()
test_ufunc_arctanh_u()
test_ufunc_arctan2_cc()
test_ufunc_arctan2_ff()
test_ufunc_arctan2_uu()
test_ufunc_hypot_cc()
test_ufunc_hypot_ff()
test_ufunc_hypot_uu()
test_ufunc_degrees_c()
test_ufunc_degrees_f()
test_ufunc_degrees_u()
test_ufunc_rad2deg_c()
test_ufunc_rad2deg_f()
test_ufunc_rad2deg_u()
test_ufunc_radians_c()
test_ufunc_radians_f()
test_ufunc_radians_u()
test_ufunc_deg2rad_c()
test_ufunc_deg2rad_f()
test_ufunc_deg2rad_u()
test_ufunc_bitwise_and_cc()
test_ufunc_bitwise_and_ff()
test_ufunc_bitwise_and_uu()
test_ufunc_bitwise_and_su()
test_ufunc_bitwise_or_cc()
test_ufunc_bitwise_or_ff()
test_ufunc_bitwise_or_uu()
test_ufunc_bitwise_or_su()
test_ufunc_bitwise_xor_cc()
test_ufunc_bitwise_xor_ff()
test_ufunc_bitwise_xor_uu()
test_ufunc_bitwise_xor_su()
test_ufunc_invert_c()
test_ufunc_invert_f()
test_ufunc_invert_u()
test_ufunc_invert_s()
test_ufunc_left_shift_cc()
test_ufunc_left_shift_ff()
test_ufunc_left_shift_uu()
test_ufunc_left_shift_su()
test_ufunc_right_shift_cc()
test_ufunc_right_shift_ff()
test_ufunc_right_shift_uu()
test_ufunc_right_shift_su()
test_ufunc_greater_ff()
test_ufunc_greater_equal_ff()
test_ufunc_less_ff()
test_ufunc_less_equal_ff()
test_ufunc_equal_ff()
test_ufunc_not_equal_ff()
# test_ufunc_logical_and_cc() # TODO: How to convert to bool?
test_ufunc_logical_and_ff()
test_ufunc_logical_and_uu()
test_ufunc_logical_and_su()
# test_ufunc_logical_or_cc() # TODO: How to convert to bool?
test_ufunc_logical_or_ff()
test_ufunc_logical_or_uu()
test_ufunc_logical_or_su()
# test_ufunc_logical_xor_cc() # TODO: How to convert to bool?
test_ufunc_logical_xor_ff()
test_ufunc_logical_xor_uu()
test_ufunc_logical_xor_su()
test_ufunc_maximum_ff()
test_ufunc_maximum_nan_ff()
test_ufunc_fmax_ff()
test_ufunc_fmax_nan_ff()
test_ufunc_minimum_ff()
test_ufunc_minimum_nan_ff()
test_ufunc_fmin_ff()
test_ufunc_fmin_nan_ff()
test_ufunc_isfinite_c()
test_ufunc_isfinite_f()
# test_ufunc_isfinite_u()
test_ufunc_isinf_c()
test_ufunc_isinf_f()
    # test_ufunc_isinf_u()
test_ufunc_isnan_c()
test_ufunc_isnan_f()
# test_ufunc_isnan_u()
test_ufunc_signbit_c()
test_ufunc_signbit_f()
test_ufunc_signbit_u()
test_ufunc_copysign_cc()
test_ufunc_copysign_ff()
test_ufunc_copysign_fd()
test_ufunc_copysign_uu()
test_ufunc_nextafter_cc()
test_ufunc_nextafter_ff()
test_ufunc_nextafter_fd()
test_ufunc_nextafter_uu()
test_ufunc_spacing_c() # Unclear formula
test_ufunc_spacing_f() # Unclear formula
test_ufunc_spacing_u() # Unclear formula
test_ufunc_modf_c()
test_ufunc_modf_f()
test_ufunc_modf_u()
test_ufunc_ldexp_cc()
test_ufunc_ldexp_fs()
test_ufunc_ldexp_ss()
test_ufunc_frexp_c()
test_ufunc_frexp_f()
test_ufunc_frexp_u()
test_ufunc_floor_c()
test_ufunc_floor_f()
test_ufunc_floor_u()
test_ufunc_ceil_c()
test_ufunc_ceil_f()
test_ufunc_ceil_u()
test_ufunc_trunc_c()
test_ufunc_trunc_f()
test_ufunc_trunc_u()
import boto3
import time
import json
import datetime
AWS_ACCESS_KEY_ID = ""
AWS_SECRET_ACCESS_KEY = ""
AWS_DEFAULT_REGION = "ap-northeast-2"
logs = boto3.client('logs',
aws_access_key_id=AWS_ACCESS_KEY_ID,
aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
region_name=AWS_DEFAULT_REGION)
s3 = boto3.client('s3',aws_access_key_id=AWS_ACCESS_KEY_ID,
aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
region_name=AWS_DEFAULT_REGION)
# Listing the DB instances
def DescribeDBInstances():
cnt = 0
output = []
next_token = ''
while True:
if next_token:
log = logs.filter_log_events(
logGroupName='all_region_cloudtrail',
filterPattern='{$.eventSource = "rds.amazonaws.com" && $.eventName="DescribeDBInstances"}',
nextToken=next_token
)
else:
log = logs.filter_log_events(
logGroupName='all_region_cloudtrail',
filterPattern='{$.eventSource = "rds.amazonaws.com" && $.eventName="DescribeDBInstances"}'
)
for i in log['events']:
result = {"id" : cnt, "timestamp": datetime.datetime.fromtimestamp(i['timestamp']/1000).strftime('%Y-%m-%d %H:%M:%S'), "message": i['message']}
cnt += 1
output.append(result)
if log.get("nextToken"):
next_token = log["nextToken"]
else:
print(cnt)
break
ret = json.dumps({"total" : cnt, "totalNotFiltered" : cnt, "rows" : output})
response = s3.put_object(Body=ret,
Bucket='threatitem',
Key='RDS/15' )
return response
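Every collector in this module repeats the same `nextToken` pagination loop around `filter_log_events`. That loop could be factored into a single generator; `filter_all_events` below is an illustrative name, not an existing helper in this codebase.

```python
def filter_all_events(logs_client, **kwargs):
    """Yield every matching CloudWatch Logs event, following nextToken pages."""
    token = None
    while True:
        params = dict(kwargs)
        if token:
            params['nextToken'] = token
        page = logs_client.filter_log_events(**params)
        yield from page.get('events', [])
        token = page.get('nextToken')
        if not token:
            break
```

Each function would then reduce to iterating `filter_all_events(logs, logGroupName=..., filterPattern=...)` and formatting the rows.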
# Deleting data inside the DB
def DeleteDBData():
cnt = 0
output = []
next_token = ''
while True:
if next_token:
log = logs.filter_log_events(
logGroupName='/aws/rds/instance/database-2/general',
filterPattern='?DELETE ?delete ?DROP',
nextToken=next_token
)
else:
log = logs.filter_log_events(
logGroupName='/aws/rds/instance/database-2/general',
filterPattern='?DELETE ?delete ?DROP'
)
for i in log['events']:
if u'innodb_txn_key' not in i['message']:
result = {"id" : cnt, "timestamp": datetime.datetime.fromtimestamp(i['timestamp']/1000).strftime('%Y-%m-%d %H:%M:%S'), "message": i['message']}
cnt += 1
output.append(result)
if log.get("nextToken"):
next_token = log["nextToken"]
else:
print(cnt)
break
ret = json.dumps({"total" : cnt, "totalNotFiltered" : cnt, "rows" : output})
response = s3.put_object(Body=ret,
Bucket='threatitem',
Key='RDS/16' )
return response
# Adding a DB user
def AddUser():
cnt = 0
output = []
next_token = ''
while True:
if next_token:
log = logs.filter_log_events(
logGroupName='/aws/rds/instance/database-2/general',
filterPattern='CREATE USER',
nextToken=next_token
)
else:
log = logs.filter_log_events(
logGroupName='/aws/rds/instance/database-2/general',
filterPattern='CREATE USER'
)
for i in log['events']:
result = {"id" : cnt, "timestamp": datetime.datetime.fromtimestamp(i['timestamp']/1000).strftime('%Y-%m-%d %H:%M:%S'), "message": i['message']}
cnt += 1
output.append(result)
if log.get("nextToken"):
next_token = log["nextToken"]
else:
print(cnt)
break
ret = json.dumps({"total" : cnt, "totalNotFiltered" : cnt, "rows" : output})
response = s3.put_object(Body=ret,
Bucket='threatitem',
Key='RDS/17' )
return response
# Modifying a DB user's privileges
def GrantAuth():
cnt = 0
output = []
next_token = ''
while True:
if next_token:
log = logs.filter_log_events(
logGroupName='/aws/rds/instance/database-2/general',
filterPattern='GRANT',
nextToken=next_token
)
else:
log = logs.filter_log_events(
logGroupName='/aws/rds/instance/database-2/general',
filterPattern='GRANT'
)
for i in log['events']:
result = {"id" : cnt, "timestamp": datetime.datetime.fromtimestamp(i['timestamp']/1000).strftime('%Y-%m-%d %H:%M:%S'), "message": i['message']}
cnt += 1
output.append(result)
if log.get("nextToken"):
next_token = log["nextToken"]
else:
print(cnt)
break
ret = json.dumps({"total" : cnt, "totalNotFiltered" : cnt, "rows" : output})
response = s3.put_object(Body=ret,
Bucket='threatitem',
Key='RDS/18' )
return response
# Denied RDS-related API calls
def RDSAPICall():
cnt = 0
output = []
next_token = ''
while True:
if next_token:
log = logs.filter_log_events(
logGroupName='all_region_cloudtrail',
                filterPattern='{$.eventSource = "rds.amazonaws.com" && $.eventName="DescribeDBInstances"}',
nextToken=next_token
)
else:
log = logs.filter_log_events(
logGroupName='all_region_cloudtrail',
                filterPattern='{$.eventSource = "rds.amazonaws.com" && $.eventName="DescribeDBInstances"}'
)
for i in log['events']:
msg_json = json.loads(i.get('message'))
if 'rds.amazonaws.com' in msg_json['eventSource']:
if '"errorCode":"AccessDenied"' in i['message']:
result = {"id" : cnt, "timestamp": datetime.datetime.fromtimestamp(i['timestamp']/1000).strftime('%Y-%m-%d %H:%M:%S'), "message": i['message']}
cnt += 1
output.append(result)
if log.get("nextToken"):
next_token = log["nextToken"]
else:
print(cnt)
break
ret = json.dumps({"total" : cnt, "totalNotFiltered" : cnt, "rows" : output})
response = s3.put_object(Body=ret,
Bucket='threatitem',
Key='RDS/19' )
return response
# Modifying logging-related parameter groups (+ nextToken paging)
def ModifyDBParameterGroup():
cnt = 0
output = []
next_token = ''
while True:
if next_token:
log = logs.filter_log_events(
logGroupName='all_region_cloudtrail',
filterPattern='{$.eventSource = "rds.amazonaws.com" && ($.eventName="ModifyDBParameterGroup" || $.eventName="ResetDBParameterGroup")}',
nextToken=next_token
)
else:
log = logs.filter_log_events(
logGroupName='all_region_cloudtrail',
filterPattern='{$.eventSource = "rds.amazonaws.com" && ($.eventName="ModifyDBParameterGroup" || $.eventName="ResetDBParameterGroup")}'
)
for i in log['events']:
            temp = str(i['message'])
            # NOTE: the hard-coded offsets below assume a fixed JSON layout of
            # the CloudTrail message and slice out the value that follows each
            # parameter name; they will silently break if the format changes.
if 'general_log' in temp:
if temp[temp.index('general_log') + 31:temp.index('general_log') + 32] != '1':
result = {"id" : cnt, "timestamp": datetime.datetime.fromtimestamp(i['timestamp']/1000).strftime('%Y-%m-%d %H:%M:%S'), "message": i['message']}
cnt += 1
output.append(result)
elif 'slow_query_log' in temp:
if temp[temp.index('slow_query_log') + 34:temp.index('slow_query_log') + 35] != '1':
result = {"id" : cnt, "timestamp": datetime.datetime.fromtimestamp(i['timestamp']/1000).strftime('%Y-%m-%d %H:%M:%S'), "message": i['message']}
cnt += 1
output.append(result)
elif 'log_output' in temp:
if temp[temp.index('log_output') + 30:temp.index('log_output') + 34] != 'FILE':
result = {"id" : cnt, "timestamp": datetime.datetime.fromtimestamp(i['timestamp']/1000).strftime('%Y-%m-%d %H:%M:%S'), "message": i['message']}
cnt += 1
output.append(result)
if log.get("nextToken"):
next_token = log["nextToken"]
else:
break
ret = json.dumps({"total" : cnt, "totalNotFiltered" : cnt, "rows" : output})
response = s3.put_object(Body=ret,
Bucket='threatitem',
Key='RDS/20' )
return response
# Deleting logging-related parameter groups
def DeleteDBParameterGroup():
cnt = 0
output = []
next_token = ''
while True:
if next_token:
log = logs.filter_log_events(
logGroupName='all_region_cloudtrail',
filterPattern='{$.eventSource = "rds.amazonaws.com" && $.eventName="DeleteDBParameterGroup"}',
nextToken=next_token
)
else:
log = logs.filter_log_events(
logGroupName='all_region_cloudtrail',
filterPattern='{$.eventSource = "rds.amazonaws.com" && $.eventName="DeleteDBParameterGroup"}'
)
for i in log['events']:
result = {"id" : cnt, "timestamp": datetime.datetime.fromtimestamp(i['timestamp']/1000).strftime('%Y-%m-%d %H:%M:%S'), "message": i['message']}
cnt += 1
output.append(result)
if log.get("nextToken"):
next_token = log["nextToken"]
else:
print(cnt)
break
ret = json.dumps({"total" : cnt, "totalNotFiltered" : cnt, "rows" : output})
response = s3.put_object(Body=ret,
Bucket='threatitem',
Key='RDS/21' )
return response
# Stopping a DB instance
def StopDBInstance():
cnt = 0
output = []
next_token = ''
while True:
if next_token:
log = logs.filter_log_events(
logGroupName='all_region_cloudtrail',
filterPattern='{$.eventSource = "rds.amazonaws.com" && $.eventName="StopDBInstance"}',
nextToken=next_token
)
else:
log = logs.filter_log_events(
logGroupName='all_region_cloudtrail',
filterPattern='{$.eventSource = "rds.amazonaws.com" && $.eventName="StopDBInstance"}'
)
for i in log['events']:
result = {"id" : cnt, "timestamp": datetime.datetime.fromtimestamp(i['timestamp']/1000).strftime('%Y-%m-%d %H:%M:%S'), "message": i['message']}
cnt += 1
output.append(result)
if log.get("nextToken"):
next_token = log["nextToken"]
else:
print(cnt)
break
ret = json.dumps({"total" : cnt, "totalNotFiltered" : cnt, "rows" : output})
response = s3.put_object(Body=ret,
Bucket='threatitem',
Key='RDS/22' )
return response
# Deleting a DB instance
def DeleteDBInstance():
cnt = 0
output = []
next_token = ''
while True:
if next_token:
log = logs.filter_log_events(
logGroupName='all_region_cloudtrail',
filterPattern='{$.eventSource = "rds.amazonaws.com" && $.eventName="DeleteDBInstance"}',
nextToken=next_token
)
else:
log = logs.filter_log_events(
logGroupName='all_region_cloudtrail',
filterPattern='{$.eventSource = "rds.amazonaws.com" && $.eventName="DeleteDBInstance"}'
)
for i in log['events']:
result = {"id" : cnt, "timestamp": datetime.datetime.fromtimestamp(i['timestamp']/1000).strftime('%Y-%m-%d %H:%M:%S'), "message": i['message']}
cnt += 1
output.append(result)
if log.get("nextToken"):
next_token = log["nextToken"]
else:
print(cnt)
break
ret = json.dumps({"total" : cnt, "totalNotFiltered" : cnt, "rows" : output})
response = s3.put_object(Body=ret,
Bucket='threatitem',
Key='RDS/23' )
return response
# API calls from potentially dangerous OSes
def SusOSAPI():
cnt = 0
output = []
next_token = ''
while True:
if next_token:
log = logs.filter_log_events(
logGroupName='all_region_cloudtrail',
filterPattern='{ $.eventSource = "rds.amazonaws.com" && ($.userAgent="*kali*" || $.userAgent="*parrot*" || $.userAgent="*pentoo*")}',
nextToken=next_token
)
else:
log = logs.filter_log_events(
logGroupName='all_region_cloudtrail',
filterPattern='{ $.eventSource = "rds.amazonaws.com" && ($.userAgent="*kali*" || $.userAgent="*parrot*" || $.userAgent="*pentoo*")}'
)
for i in log['events']:
result = {"id" : cnt, "timestamp": datetime.datetime.fromtimestamp(i['timestamp']/1000).strftime('%Y-%m-%d %H:%M:%S'), "message": i['message']}
cnt += 1
output.append(result)
if log.get("nextToken"):
next_token = log["nextToken"]
else:
print(cnt)
break
ret = json.dumps({"total" : cnt, "totalNotFiltered" : cnt, "rows" : output})
response = s3.put_object(Body=ret,
Bucket='threatitem',
Key='RDS/24' )
return response
# Modifying CloudWatch logging
def StopWatchLogs():
cnt = 0
output = []
next_token = ''
while True:
if next_token:
log = logs.filter_log_events(
logGroupName='all_region_cloudtrail',
filterPattern='{$.eventSource = "rds.amazonaws.com" && $.eventName="ModifyDBInstance"}',
nextToken=next_token
)
else:
log = logs.filter_log_events(
logGroupName='all_region_cloudtrail',
filterPattern='{$.eventSource = "rds.amazonaws.com" && $.eventName="ModifyDBInstance"}'
)
for i in log['events']:
temp = str(i['message'])
if 'disableLogTypes' in temp:
if temp[temp.index('disableLogTypes') + 17:temp.index('disableLogTypes') + 19] != "[]":
result = {"id" : cnt, "timestamp": datetime.datetime.fromtimestamp(i['timestamp']/1000).strftime('%Y-%m-%d %H:%M:%S'), "message": i['message']}
cnt += 1
output.append(result)
if log.get("nextToken"):
next_token = log["nextToken"]
else:
print(cnt)
break
ret = json.dumps({"total" : cnt, "totalNotFiltered" : cnt, "rows" : output})
response = s3.put_object(Body=ret,
Bucket='threatitem',
Key='RDS/25' )
return response
# Failed DB connection attempts
def RDSAccessDenied():
cnt = 0
output = []
next_token = ''
while True:
if next_token:
log = logs.filter_log_events(
logGroupName='/aws/rds/instance/database-2/general',
filterPattern='Connect Access denied',
nextToken=next_token
)
else:
log = logs.filter_log_events(
logGroupName='/aws/rds/instance/database-2/general',
filterPattern='Connect Access denied'
)
for i in log['events']:
result = {"id" : cnt, "timestamp": datetime.datetime.fromtimestamp(i['timestamp']/1000).strftime('%Y-%m-%d %H:%M:%S'), "message": i['message']}
cnt += 1
output.append(result)
if log.get("nextToken"):
next_token = log["nextToken"]
else:
print(cnt)
break
ret = json.dumps({"total" : cnt, "totalNotFiltered" : cnt, "rows" : output})
response = s3.put_object(Body=ret,
Bucket='threatitem',
Key='RDS/26' )
return response
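Every collector above also repeats the millisecond-timestamp formatting expression inline. It could be pulled into one helper; note the version here is pinned to UTC so its output is deterministic, whereas the module uses the machine's local time zone.

```python
from datetime import datetime, timezone


def event_time(ms: int) -> str:
    """Format a CloudWatch event timestamp (milliseconds since the epoch)
    as the 'YYYY-MM-DD HH:MM:SS' string used for the report rows."""
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc).strftime(
        '%Y-%m-%d %H:%M:%S')
```

Each row builder would then read `{"id": cnt, "timestamp": event_time(i['timestamp']), "message": i['message']}`.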
from TSL2561 import TSL2561
# -*- coding: utf-8 -*-
import lemoncheesecake.api as lcc
from lemoncheesecake.matching import check_that, is_integer, is_true, equal_to
from common.base_test import BaseTest
SUITE = {
"description": "Registration Api"
}
@lcc.prop("main", "type")
@lcc.prop("negative", "type")
@lcc.tags("api", "notice", "registration_api", "submit_registration_solution")
@lcc.suite("Registration API", rank=1)
class SubmitRegistrationSolution(BaseTest):
def __init__(self):
super().__init__()
self.__database_api_identifier = None
self.__registration_api_identifier = None
def setup_suite(self):
super().setup_suite()
lcc.set_step("Setup for {}".format(self.__class__.__name__))
self.__database_api_identifier = self.get_identifier("database")
self.__registration_api_identifier = self.get_identifier("registration")
lcc.log_info(
"API identifiers are: database='{}', registration='{}'".format(self.__database_api_identifier,
self.__registration_api_identifier))
@lcc.tags("submit_registration_solution")
@lcc.test("Check method submit_registration_solution of registration_api")
def method_main_check(self, get_random_integer, get_random_valid_account_name):
callback = get_random_integer
account_name = get_random_valid_account_name
generate_keys = self.generate_keys()
public_key = generate_keys[1]
lcc.set_step("Get 'request_registration_task' and solve")
response_id = self.send_request(self.get_request("request_registration_task"),
self.__registration_api_identifier)
pow_algorithm_data = self.get_response(response_id)["result"]
solution = self.solve_registration_task(pow_algorithm_data["block_id"],
pow_algorithm_data["rand_num"],
pow_algorithm_data["difficulty"])
check_that("registration task solution", solution, is_integer())
lcc.set_step("Check that 'submit_registration_solution' completed successfully")
account_params = [callback, account_name, public_key, public_key, solution, pow_algorithm_data["rand_num"]]
response_id = self.send_request(self.get_request("submit_registration_solution", account_params),
self.__registration_api_identifier)
result = self.get_response(response_id)["result"]
check_that("'submit_registration_solution' result", result, is_true())
response_id = self.send_request(self.get_request("get_account_by_name", [account_name]),
self.__database_api_identifier)
result = self.get_response(response_id)["result"]
check_that("new account name", result["name"], equal_to(account_name))
check_that("new account 'echorand_key'", result["echorand_key"], equal_to(public_key))
@lcc.prop("negative", "type")
@lcc.tags("api", "notice", "registration_api", "submit_registration_solution")
@lcc.suite("Negative testing of method 'submit_registration_solution'")
class NegativeTesting(BaseTest):
def __init__(self):
super().__init__()
self.__database_api_identifier = None
self.__registration_api_identifier = None
def setup_suite(self):
super().setup_suite()
lcc.set_step("Setup for {}".format(self.__class__.__name__))
self.__database_api_identifier = self.get_identifier("database")
self.__registration_api_identifier = self.get_identifier("registration")
lcc.log_info(
"API identifiers are: database='{}', registration='{}'".format(self.__database_api_identifier,
self.__registration_api_identifier))
def prepare_rand_num_and_task_solution(self):
lcc.set_step("Get 'request_registration_task' and solve")
response_id = self.send_request(self.get_request("request_registration_task"),
self.__registration_api_identifier)
pow_algorithm_data = self.get_response(response_id)["result"]
solution = self.solve_registration_task(pow_algorithm_data["block_id"],
pow_algorithm_data["rand_num"],
pow_algorithm_data["difficulty"])
return pow_algorithm_data["rand_num"], solution
@lcc.test("Register account with wrong 'account name'")
@lcc.depends_on("RegistrationApi.SubmitRegistrationSolution.SubmitRegistrationSolution.method_main_check")
def submit_registration_solution_with_wrong_account_name(self, get_random_integer, get_random_valid_account_name):
callback = get_random_integer
account_name = get_random_valid_account_name + "A"
generate_keys = self.generate_keys()
public_key = generate_keys[1]
rand_num, solution = self.prepare_rand_num_and_task_solution()
expected_error_message = "Assert Exception: is_valid_name( name ): "
lcc.set_step("Check that 'submit_registration_solution' crashes at each execution")
account_params = [callback, account_name, public_key, public_key, solution, rand_num]
response_id = self.send_request(self.get_request("submit_registration_solution", account_params),
self.__registration_api_identifier)
error = self.get_response(response_id, negative=True)["error"]
check_that("error message", error["message"], equal_to(expected_error_message))
response_id = self.send_request(self.get_request("get_account_by_name", [account_name]),
self.__database_api_identifier)
result = self.get_response(response_id)["result"]
check_that("account creation state", result, equal_to(None))
@lcc.test("Register account with wrong 'public key'")
@lcc.depends_on("RegistrationApi.SubmitRegistrationSolution.SubmitRegistrationSolution.method_main_check")
def submit_registration_solution_with_wrong_public_key(self, get_random_integer, get_random_valid_account_name):
callback = get_random_integer
account_name = get_random_valid_account_name
generate_keys = self.generate_keys()
public_key = generate_keys[1]
error_pk = public_key + '1'
rand_num, solution = self.prepare_rand_num_and_task_solution()
expected_error_message = "invalid eddsa key length: Eddsa public key length should be 32 bytes!"
lcc.set_step("Check that 'submit_registration_solution' crashes at each execution")
account_params = [callback, account_name, error_pk, public_key, solution, rand_num]
response_id = self.send_request(self.get_request("submit_registration_solution", account_params),
self.__registration_api_identifier)
error = self.get_response(response_id, negative=True)["error"]
check_that("error message", error["message"], equal_to(expected_error_message))
account_params = [callback, account_name, public_key, error_pk, solution, rand_num]
response_id = self.send_request(self.get_request("submit_registration_solution", account_params),
self.__registration_api_identifier)
error = self.get_response(response_id, negative=True)["error"]
check_that("error message", error["message"], equal_to(expected_error_message))
response_id = self.send_request(self.get_request("get_account_by_name", [account_name]),
self.__database_api_identifier)
result = self.get_response(response_id)["result"]
check_that("account creation state", result, equal_to(None))
@lcc.test("Register account with wrong 'rand_num'")
@lcc.depends_on("RegistrationApi.SubmitRegistrationSolution.SubmitRegistrationSolution.method_main_check")
def submit_registration_solution_with_wrong_rand_num(self, get_random_integer, get_random_valid_account_name):
callback = get_random_integer
account_name = get_random_valid_account_name
generate_keys = self.generate_keys()
public_key = generate_keys[1]
rand_num, solution = self.prepare_rand_num_and_task_solution()
rand_num = rand_num + "1"
expected_error_message = "Parse Error: Couldn't parse uint64_t"
lcc.set_step("Check that 'submit_registration_solution' crashes at each execution")
account_params = [callback, account_name, public_key, public_key, solution, rand_num]
response_id = self.send_request(self.get_request("submit_registration_solution", account_params),
self.__registration_api_identifier)
error = self.get_response(response_id, negative=True)["error"]
check_that("error message", error["message"], equal_to(expected_error_message))
response_id = self.send_request(self.get_request("get_account_by_name", [account_name]),
self.__database_api_identifier)
result = self.get_response(response_id)["result"]
check_that("account creation state", result, equal_to(None))
@lcc.test("Register account with wrong 'task solution'")
@lcc.depends_on("RegistrationApi.SubmitRegistrationSolution.SubmitRegistrationSolution.method_main_check")
def submit_registration_solution_with_wrong_solution(self, get_random_integer, get_random_valid_account_name):
callback = get_random_integer
account_name = get_random_valid_account_name
generate_keys = self.generate_keys()
public_key = generate_keys[1]
rand_num, solution = self.prepare_rand_num_and_task_solution()
solution = solution + 1
lcc.set_step("Check that 'submit_registration_solution' crashes at each execution")
account_params = [callback, account_name, public_key, public_key, solution, rand_num]
response_id = self.send_request(self.get_request("submit_registration_solution", account_params),
self.__registration_api_identifier)
result = self.get_response(response_id)["result"]
check_that("result", result, equal_to(False))
response_id = self.send_request(self.get_request("get_account_by_name", [account_name]),
self.__database_api_identifier)
result = self.get_response(response_id)["result"]
check_that("account creation state", result, equal_to(None))
| 59.524862 | 118 | 0.688509 | 1,224 | 10,774 | 5.607843 | 0.098039 | 0.035693 | 0.079545 | 0.059149 | 0.861888 | 0.852418 | 0.843677 | 0.837704 | 0.831148 | 0.831148 | 0 | 0.001664 | 0.219231 | 10,774 | 180 | 119 | 59.855556 | 0.814313 | 0.001949 | 0 | 0.740506 | 0 | 0 | 0.215887 | 0.08762 | 0 | 0 | 0 | 0 | 0.006329 | 1 | 0.063291 | false | 0 | 0.018987 | 0 | 0.101266 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
9c48f8b69754cab11f9eae5f4a60a4b657d4a83f | 37 | py | Python | script.py | cebradley/test-repo-course | 34243d891cfd4ed915d2e13f51b5ad4cdc12bcd3 | [
"MIT"
] | null | null | null | script.py | cebradley/test-repo-course | 34243d891cfd4ed915d2e13f51b5ad4cdc12bcd3 | [
"MIT"
] | null | null | null | script.py | cebradley/test-repo-course | 34243d891cfd4ed915d2e13f51b5ad4cdc12bcd3 | [
"MIT"
] | null | null | null | print('Whaddduuuuuuuuuuupppp?????/')
| 18.5 | 36 | 0.702703 | 2 | 37 | 13 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027027 | 37 | 1 | 37 | 37 | 0.722222 | 0 | 0 | 0 | 0 | 0 | 0.72973 | 0.72973 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
9c523eecbf0891adb2680a616f0227175fb615d9 | 943 | py | Python | zones/datasets/data.py | jgrss/zones | cb6495ab18e49111f31f7c2951d3b1d4abe2bab4 | [
"MIT"
] | 1 | 2021-03-27T03:01:58.000Z | 2021-03-27T03:01:58.000Z | zones/datasets/data.py | jgrss/zones | cb6495ab18e49111f31f7c2951d3b1d4abe2bab4 | [
"MIT"
] | 1 | 2020-01-08T01:21:19.000Z | 2020-01-16T00:21:42.000Z | zones/datasets/data.py | jgrss/zones | cb6495ab18e49111f31f7c2951d3b1d4abe2bab4 | [
"MIT"
] | 3 | 2019-11-12T17:25:24.000Z | 2022-03-08T08:30:28.000Z | import os
from pathlib import Path
def load_01_single():
file_path = Path(os.path.dirname(__file__))
raster = file_path / 'raster' / '01_single_band_utm.tif'
vector = file_path / 'vector' / '01_vector_wgs84.gpkg'
return raster, vector
def load_01_multi():
file_path = Path(os.path.dirname(__file__))
raster = file_path / 'raster' / '01_multi_band_utm.tif'
vector = file_path / 'vector' / '01_vector_wgs84.gpkg'
return raster, vector
def load_01_single_points():
file_path = Path(os.path.dirname(__file__))
raster = file_path / 'raster' / '01_single_band_utm.tif'
vector = file_path / 'vector' / '01_vector_points_wgs84.gpkg'
return raster, vector
def load_01_multi_points():
file_path = Path(os.path.dirname(__file__))
raster = file_path / 'raster' / '01_multi_band_utm.tif'
vector = file_path / 'vector' / '01_vector_points_wgs84.gpkg'
return raster, vector
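The four loaders differ only in the two file names; a parameterized helper (a sketch, assuming the same `raster`/`vector` directory layout — `_load` and its `base` parameter are not part of the zones package) would remove the copy-paste:

```python
import os
from pathlib import Path


def _load(raster_name, vector_name, base=None):
    """Resolve a raster/vector file pair under the datasets directory.

    `base` defaults to this module's directory; it is exposed as a parameter
    mainly so the helper can be exercised in isolation.
    """
    base = Path(base) if base is not None else Path(os.path.dirname(os.path.abspath(__file__)))
    return base / 'raster' / raster_name, base / 'vector' / vector_name


def load_01_single(base=None):
    return _load('01_single_band_utm.tif', '01_vector_wgs84.gpkg', base=base)
```

Each remaining loader then shrinks to a one-line call passing its own pair of file names.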
| 21.930233 | 65 | 0.696713 | 134 | 943 | 4.455224 | 0.156716 | 0.160804 | 0.060302 | 0.093802 | 0.916248 | 0.916248 | 0.916248 | 0.916248 | 0.916248 | 0.884422 | 0 | 0.04183 | 0.188759 | 943 | 42 | 66 | 22.452381 | 0.738562 | 0 | 0 | 0.727273 | 0 | 0 | 0.241782 | 0.148462 | 0 | 0 | 0 | 0 | 0 | 1 | 0.181818 | false | 0 | 0.090909 | 0 | 0.454545 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
9c5345c7504ed1501307aba2bacfcd1b5cbb866a | 14,818 | py | Python | bookwyrm/migrations/0020_auto_20201208_0213.py | daveross/bookwyrm | d3251f511c184ae6b94b191b33919849c81ef2c2 | [
"CC0-1.0"
] | null | null | null | bookwyrm/migrations/0020_auto_20201208_0213.py | daveross/bookwyrm | d3251f511c184ae6b94b191b33919849c81ef2c2 | [
"CC0-1.0"
] | null | null | null | bookwyrm/migrations/0020_auto_20201208_0213.py | daveross/bookwyrm | d3251f511c184ae6b94b191b33919849c81ef2c2 | [
"CC0-1.0"
] | null | null | null | # Generated by Django 3.0.7 on 2020-12-08 02:13
import bookwyrm.models.fields
from django.conf import settings
import django.core.validators
from django.db import migrations, models
import django.db.models.deletion
import django.utils.timezone
class Migration(migrations.Migration):
dependencies = [
('bookwyrm', '0019_auto_20201130_1939'),
]
operations = [
migrations.AlterField(
model_name='author',
name='aliases',
field=bookwyrm.models.fields.ArrayField(base_field=models.CharField(max_length=255), blank=True, default=list, size=None),
),
migrations.AlterField(
model_name='author',
name='bio',
field=bookwyrm.models.fields.TextField(blank=True, null=True),
),
migrations.AlterField(
model_name='author',
name='born',
field=bookwyrm.models.fields.DateTimeField(blank=True, null=True),
),
migrations.AlterField(
model_name='author',
name='died',
field=bookwyrm.models.fields.DateTimeField(blank=True, null=True),
),
migrations.AlterField(
model_name='author',
name='name',
field=bookwyrm.models.fields.CharField(max_length=255),
),
migrations.AlterField(
model_name='author',
name='openlibrary_key',
field=bookwyrm.models.fields.CharField(blank=True, max_length=255, null=True),
),
migrations.AlterField(
model_name='author',
name='wikipedia_link',
field=bookwyrm.models.fields.CharField(blank=True, max_length=255, null=True),
),
migrations.AlterField(
model_name='book',
name='authors',
field=bookwyrm.models.fields.ManyToManyField(to='bookwyrm.Author'),
),
migrations.AlterField(
model_name='book',
name='cover',
field=bookwyrm.models.fields.ImageField(blank=True, null=True, upload_to='covers/'),
),
migrations.AlterField(
model_name='book',
name='description',
field=bookwyrm.models.fields.TextField(blank=True, null=True),
),
migrations.AlterField(
model_name='book',
name='first_published_date',
field=bookwyrm.models.fields.DateTimeField(blank=True, null=True),
),
migrations.AlterField(
model_name='book',
name='goodreads_key',
field=bookwyrm.models.fields.CharField(blank=True, max_length=255, null=True),
),
migrations.AlterField(
model_name='book',
name='languages',
field=bookwyrm.models.fields.ArrayField(base_field=models.CharField(max_length=255), blank=True, default=list, size=None),
),
migrations.AlterField(
model_name='book',
name='librarything_key',
field=bookwyrm.models.fields.CharField(blank=True, max_length=255, null=True),
),
migrations.AlterField(
model_name='book',
name='openlibrary_key',
field=bookwyrm.models.fields.CharField(blank=True, max_length=255, null=True),
),
migrations.AlterField(
model_name='book',
name='published_date',
field=bookwyrm.models.fields.DateTimeField(blank=True, null=True),
),
migrations.AlterField(
model_name='book',
name='series',
field=bookwyrm.models.fields.CharField(blank=True, max_length=255, null=True),
),
migrations.AlterField(
model_name='book',
name='series_number',
field=bookwyrm.models.fields.CharField(blank=True, max_length=255, null=True),
),
migrations.AlterField(
model_name='book',
name='sort_title',
field=bookwyrm.models.fields.CharField(blank=True, max_length=255, null=True),
),
migrations.AlterField(
model_name='book',
name='subject_places',
field=bookwyrm.models.fields.ArrayField(base_field=models.CharField(max_length=255), blank=True, default=list, null=True, size=None),
),
migrations.AlterField(
model_name='book',
name='subjects',
field=bookwyrm.models.fields.ArrayField(base_field=models.CharField(max_length=255), blank=True, default=list, null=True, size=None),
),
migrations.AlterField(
model_name='book',
name='subtitle',
field=bookwyrm.models.fields.CharField(blank=True, max_length=255, null=True),
),
migrations.AlterField(
model_name='book',
name='title',
field=bookwyrm.models.fields.CharField(max_length=255),
),
migrations.AlterField(
model_name='boost',
name='boosted_status',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='boosters', to='bookwyrm.Status'),
),
migrations.AlterField(
model_name='comment',
name='book',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='bookwyrm.Edition'),
),
migrations.AlterField(
model_name='edition',
name='asin',
field=bookwyrm.models.fields.CharField(blank=True, max_length=255, null=True),
),
migrations.AlterField(
model_name='edition',
name='isbn_10',
field=bookwyrm.models.fields.CharField(blank=True, max_length=255, null=True),
),
migrations.AlterField(
model_name='edition',
name='isbn_13',
field=bookwyrm.models.fields.CharField(blank=True, max_length=255, null=True),
),
migrations.AlterField(
model_name='edition',
name='oclc_number',
field=bookwyrm.models.fields.CharField(blank=True, max_length=255, null=True),
),
migrations.AlterField(
model_name='edition',
name='pages',
field=bookwyrm.models.fields.IntegerField(blank=True, null=True),
),
migrations.AlterField(
model_name='edition',
name='parent_work',
field=bookwyrm.models.fields.ForeignKey(null=True, on_delete=django.db.models.deletion.PROTECT, related_name='editions', to='bookwyrm.Work'),
),
migrations.AlterField(
model_name='edition',
name='physical_format',
field=bookwyrm.models.fields.CharField(blank=True, max_length=255, null=True),
),
migrations.AlterField(
model_name='edition',
name='publishers',
field=bookwyrm.models.fields.ArrayField(base_field=models.CharField(max_length=255), blank=True, default=list, size=None),
),
migrations.AlterField(
model_name='favorite',
name='status',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='bookwyrm.Status'),
),
migrations.AlterField(
model_name='favorite',
name='user',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='image',
name='caption',
field=bookwyrm.models.fields.TextField(blank=True, null=True),
),
migrations.AlterField(
model_name='image',
name='image',
field=bookwyrm.models.fields.ImageField(blank=True, null=True, upload_to='status/'),
),
migrations.AlterField(
model_name='quotation',
name='book',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='bookwyrm.Edition'),
),
migrations.AlterField(
model_name='quotation',
name='quote',
field=bookwyrm.models.fields.TextField(),
),
migrations.AlterField(
model_name='review',
name='book',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='bookwyrm.Edition'),
),
migrations.AlterField(
model_name='review',
name='name',
field=bookwyrm.models.fields.CharField(max_length=255, null=True),
),
migrations.AlterField(
model_name='review',
name='rating',
field=bookwyrm.models.fields.IntegerField(blank=True, default=None, null=True, validators=[django.core.validators.MinValueValidator(1), django.core.validators.MaxValueValidator(5)]),
),
migrations.AlterField(
model_name='shelf',
name='name',
field=bookwyrm.models.fields.CharField(max_length=100),
),
migrations.AlterField(
model_name='shelf',
name='privacy',
field=bookwyrm.models.fields.CharField(choices=[('public', 'Public'), ('unlisted', 'Unlisted'), ('followers', 'Followers'), ('direct', 'Direct')], default='public', max_length=255),
),
migrations.AlterField(
model_name='shelf',
name='user',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='shelfbook',
name='added_by',
field=bookwyrm.models.fields.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='shelfbook',
name='book',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='bookwyrm.Edition'),
),
migrations.AlterField(
model_name='shelfbook',
name='shelf',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='bookwyrm.Shelf'),
),
migrations.AlterField(
model_name='status',
name='content',
field=bookwyrm.models.fields.TextField(blank=True, null=True),
),
migrations.AlterField(
model_name='status',
name='mention_books',
field=bookwyrm.models.fields.TagField(related_name='mention_book', to='bookwyrm.Edition'),
),
migrations.AlterField(
model_name='status',
name='mention_users',
field=bookwyrm.models.fields.TagField(related_name='mention_user', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='status',
name='published_date',
field=bookwyrm.models.fields.DateTimeField(default=django.utils.timezone.now),
),
migrations.AlterField(
model_name='status',
name='reply_parent',
field=bookwyrm.models.fields.ForeignKey(null=True, on_delete=django.db.models.deletion.PROTECT, to='bookwyrm.Status'),
),
migrations.AlterField(
model_name='status',
name='sensitive',
field=bookwyrm.models.fields.BooleanField(default=False),
),
migrations.AlterField(
model_name='status',
name='user',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='tag',
name='name',
field=bookwyrm.models.fields.CharField(max_length=100, unique=True),
),
migrations.AlterField(
model_name='userblocks',
name='user_object',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='userblocks_user_object', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='userblocks',
name='user_subject',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='userblocks_user_subject', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='userfollowrequest',
name='user_object',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='userfollowrequest_user_object', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='userfollowrequest',
name='user_subject',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='userfollowrequest_user_subject', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='userfollows',
name='user_object',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='userfollows_user_object', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='userfollows',
name='user_subject',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='userfollows_user_subject', to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='usertag',
name='book',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='bookwyrm.Edition'),
),
migrations.AlterField(
model_name='usertag',
name='tag',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='bookwyrm.Tag'),
),
migrations.AlterField(
model_name='usertag',
name='user',
field=bookwyrm.models.fields.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='work',
name='default_edition',
field=bookwyrm.models.fields.ForeignKey(null=True, on_delete=django.db.models.deletion.PROTECT, to='bookwyrm.Edition'),
),
migrations.AlterField(
model_name='work',
name='lccn',
field=bookwyrm.models.fields.CharField(blank=True, max_length=255, null=True),
),
]
| 41.858757 | 194 | 0.60683 | 1,482 | 14,818 | 5.930499 | 0.099865 | 0.108317 | 0.154739 | 0.221072 | 0.889976 | 0.877119 | 0.80669 | 0.772784 | 0.746388 | 0.733644 | 0 | 0.010628 | 0.269807 | 14,818 | 353 | 195 | 41.977337 | 0.801664 | 0.003037 | 0 | 0.74928 | 1 | 0 | 0.101821 | 0.01178 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.017291 | 0 | 0.025937 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
92e49bbc44f64be41ff263593c578399d6497b8d | 1,612 | py | Python | models/game.py | project-lolquiz/the-backend | f8a84bd19f400b7c3a2c9b2dfbe305871c1e866e | [
"MIT"
] | null | null | null | models/game.py | project-lolquiz/the-backend | f8a84bd19f400b7c3a2c9b2dfbe305871c1e866e | [
"MIT"
] | 19 | 2021-02-01T19:52:49.000Z | 2021-09-26T13:52:41.000Z | models/game.py | project-lolquiz/the-backend | f8a84bd19f400b7c3a2c9b2dfbe305871c1e866e | [
"MIT"
] | null | null | null | from app import get_db_connection
db = get_db_connection()
def get_all_types():
return GameType.query.all()
class GameType(db.Model):
__tablename__ = 'game_types'
id = db.Column(db.BIGINT, primary_key=True, unique=True, nullable=False, autoincrement=True)
name = db.Column(db.String, nullable=False)
description = db.Column(db.String, nullable=False)
created_at = db.Column(db.TIMESTAMP, nullable=False, default=db.func.now())
updated_at = db.Column(db.TIMESTAMP, nullable=False, default=db.func.now())
    # Python keeps only the last __init__ defined in a class body, so use
    # defaulted parameters instead of attempting constructor overloading.
    def __init__(self, _id=None, name=None, description=None):
        self.id = _id
        self.name = name
        self.description = description
def __repr__(self):
return '<GameType id={}|name={}|description={}>'.format(self.id, self.name, self.description)
def get_all_modes():
return GameMode.query.all()
class GameMode(db.Model):
__tablename__ = 'game_modes'
id = db.Column(db.BIGINT, primary_key=True, unique=True, nullable=False, autoincrement=True)
name = db.Column(db.String, nullable=False)
description = db.Column(db.String, nullable=False)
created_at = db.Column(db.TIMESTAMP, nullable=False, default=db.func.now())
updated_at = db.Column(db.TIMESTAMP, nullable=False, default=db.func.now())
    # Python keeps only the last __init__ defined in a class body, so use
    # defaulted parameters instead of attempting constructor overloading.
    def __init__(self, _id=None, name=None, description=None):
        self.id = _id
        self.name = name
        self.description = description
def __repr__(self):
return '<GameMode id={}|name={}|description={}>'.format(self.id, self.name, self.description)
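Python does not support constructor overloading: when `__init__` is defined twice in one class body, the second definition silently replaces the first, so the no-argument form becomes dead code. A standalone illustration (the `Demo` class is hypothetical, not part of this module):

```python
class Demo:
    def __init__(self):
        self.source = "no-arg"            # this binding is immediately lost...

    def __init__(self, source="kwargs"):  # ...because this statement rebinds __init__
        self.source = source


d = Demo()          # only the second __init__ ever runs
print(d.source)     # -> kwargs
```

A single `__init__` with defaulted parameters supports both call styles without the dead definition.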
| 31 | 101 | 0.67928 | 212 | 1,612 | 4.919811 | 0.20283 | 0.076702 | 0.095877 | 0.061361 | 0.799616 | 0.799616 | 0.799616 | 0.799616 | 0.799616 | 0.799616 | 0 | 0 | 0.184864 | 1,612 | 51 | 102 | 31.607843 | 0.79376 | 0 | 0 | 0.666667 | 0 | 0 | 0.060794 | 0.03598 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0.055556 | 0.027778 | 0.111111 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 9 |
92ec148696f42e81010efa13c59343d54d4b9608 | 179 | py | Python | util/__init__.py | gousaiyang/SE342-simple-image-processor | fb00e598b7c59e978b60d12dfbe76daeae67425d | [
"MIT"
] | 2 | 2019-07-29T15:45:36.000Z | 2019-11-26T13:29:23.000Z | util/__init__.py | gousaiyang/SE342-simple-image-processor | fb00e598b7c59e978b60d12dfbe76daeae67425d | [
"MIT"
] | null | null | null | util/__init__.py | gousaiyang/SE342-simple-image-processor | fb00e598b7c59e978b60d12dfbe76daeae67425d | [
"MIT"
] | null | null | null | from .file_util import *
from .tk_util import *
from .config import load_config, store_config
from .log import logger
from .image_util import *
from .version import Version
| 25.571429 | 46 | 0.77095 | 27 | 179 | 4.925926 | 0.444444 | 0.225564 | 0.315789 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173184 | 179 | 6 | 47 | 29.833333 | 0.898649 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
1312aed39887cff03296ec0787cf6d80cfb9d144 | 89 | py | Python | algorithms/shoebox/__init__.py | toastisme/dials | 6bc8ababc33bfe334513677f8adb65c0e90003f3 | [
"BSD-3-Clause"
] | 58 | 2015-10-15T09:28:20.000Z | 2022-03-28T20:09:38.000Z | algorithms/shoebox/__init__.py | toastisme/dials | 6bc8ababc33bfe334513677f8adb65c0e90003f3 | [
"BSD-3-Clause"
] | 1,741 | 2015-11-24T08:17:02.000Z | 2022-03-31T15:46:42.000Z | algorithms/shoebox/__init__.py | toastisme/dials | 6bc8ababc33bfe334513677f8adb65c0e90003f3 | [
"BSD-3-Clause"
] | 45 | 2015-10-14T13:44:16.000Z | 2022-03-22T14:45:56.000Z | from dials.algorithms.shoebox.masker import *
from dials_algorithms_shoebox_ext import *
| 29.666667 | 45 | 0.853933 | 12 | 89 | 6.083333 | 0.583333 | 0.246575 | 0.520548 | 0.712329 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.089888 | 89 | 2 | 46 | 44.5 | 0.901235 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
1335e923db2b15bcc6ad83ae134a283ba223f92a | 24,188 | py | Python | src/ebird/api/observations.py | StuartMacKay/ebird-api | 14b5c777548416a58abec05e25cd4b9a8e22f210 | [
"MIT"
] | 9 | 2020-05-16T20:26:33.000Z | 2021-11-02T06:24:46.000Z | src/ebird/api/observations.py | StuartMacKay/ebird-api | 14b5c777548416a58abec05e25cd4b9a8e22f210 | [
"MIT"
] | 17 | 2019-06-22T09:41:22.000Z | 2020-09-11T06:25:21.000Z | src/ebird/api/observations.py | ProjectBabbler/ebird-api | 14b5c777548416a58abec05e25cd4b9a8e22f210 | [
"MIT"
] | null | null | null | # pylint: disable=R0913,R0914
"""Functions for fetching information about what species have been seen."""
from ebird.api.utils import call
from ebird.api.validation import (
clean_areas,
clean_back,
clean_categories,
clean_detail,
clean_dist,
clean_hotspot,
clean_lat,
clean_lng,
clean_locale,
clean_max_observations,
clean_provisional,
clean_sort,
)
OBSERVATIONS_URL = "https://ebird.org/ws2.0/data/obs/%s/recent"
NOTABLE_OBSERVATIONS_URL = "https://ebird.org/ws2.0/data/obs/%s/recent/notable"
SPECIES_OBSERVATIONS_URL = "https://ebird.org/ws2.0/data/obs/%s/recent/%s"
NEARBY_OBSERVATIONS_URL = "https://ebird.org/ws2.0/data/obs/geo/recent"
NEARBY_NOTABLE_URL = "https://ebird.org/ws2.0/data/obs/geo/recent/notable"
NEARBY_SPECIES_URL = "https://ebird.org/ws2.0/data/obs/geo/recent/%s"
NEAREST_SPECIES_URL = "https://ebird.org/ws2.0/data/nearest/geo/recent/%s"
HISTORIC_OBSERVATIONS_URL = "https://ebird.org/ws2.0/data/obs/%s/historic/%s"
def get_observations(
token,
area,
back=14,
max_results=None,
locale="en",
provisional=False,
hotspot=False,
detail="simple",
category=None,
):
"""Get recent observations (up to 30 days ago) for a region or location.
    This maps to the end point in the eBird API 2.0,
    https://documenter.getpostman.com/view/664302/S1ENwy59?version=latest#3d2a17c1-2129-475c-b4c8-7d362d6000cd
:param token: the token needed to access the API.
:param area: a country, subnational1, subnational2 or location code
or a list of up to 10 codes. All codes must be same type.
:param back: the number of days in the past to include. Ranges from
1 to 30 with a default of 14 days.
:param max_results: the maximum number of observations to return from
1 to 10000. The default value is None which means all observations will
be returned.
:param locale: the language (to use) for the species common names. The
default of 'en' will use species names from the eBird/Clements checklist.
This can be any locale for which eBird has translations available. For a
complete list see, http://help.ebird.org/customer/portal/articles/1596582.
:param provisional: include records which have not yet been reviewed.
Either True or False, the default is False.
:param hotspot: return records only from hotspots, True or include both
hotspots and private locations, False (the default).
:param detail: return records in 'simple' or 'full' format. See the eBird
API documentation for a description of the fields.
:param category: one or more categories of species to return: 'domestic',
'form', 'hybrid', 'intergrade', 'issf', 'slash', 'species' or 'spuh'.
More than one value can be given in a comma-separated string. The default
value is None and records from all categories will be returned.
:return: the list of observations in simple format.
:raises ValueError: if any of the arguments fail the validation checks.
:raises URLError if there is an error with the connection to the
eBird site.
:raises HTTPError if the eBird API returns an error.
"""
cleaned = clean_areas(area)
url = OBSERVATIONS_URL % cleaned[0]
params = {
"back": clean_back(back),
"maxObservations": clean_max_observations(max_results),
"sppLocale": clean_locale(locale),
"includeProvisional": clean_provisional(provisional),
"hotspot": clean_hotspot(hotspot),
"detail": clean_detail(detail),
}
if category is not None:
params["cat"] = ",".join(clean_categories(category))
if len(cleaned) > 1:
params["r"] = ",".join(cleaned)
headers = {
"X-eBirdApiToken": token,
}
return call(url, params, headers)
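The multi-area handling above (first code in the URL path, the full list riding along in the `r` query parameter) can be sketched in isolation. `clean_areas` is assumed here to simply validate and listify the codes, and the URL is a simplified stand-in for `OBSERVATIONS_URL`:

```python
# Minimal sketch of the multi-area handling: the first code goes into the
# URL path; when several codes are given, the whole list travels in "r".
OBSERVATIONS_URL = "https://api.ebird.org/v2/data/obs/%s/recent"


def build_request(area):
    areas = [area] if isinstance(area, str) else list(area)
    url = OBSERVATIONS_URL % areas[0]
    params = {}
    if len(areas) > 1:
        params["r"] = ",".join(areas)
    return url, params


print(build_request(["US-NV", "US-AZ"]))
```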


def get_notable_observations(
    token, area, back=14, max_results=None, locale="en", hotspot=False, detail="simple"
):
    """Get recent observations of rare species for a region or location.

    Get all the recent observations (up to 30 days ago) of species classed
    as rare (locally or nationally) for a country, subnational1 region,
    subnational2 region or location.

    This maps to the endpoint in eBird API 2.0,
    https://documenter.getpostman.com/view/664302/S1ENwy59?version=latest#397b9b8c-4ab9-4136-baae-3ffa4e5b26e4

    :param token: the token needed to access the API.
    :param area: a country, subnational1, subnational2 or location code,
        or a list of up to 10 codes. All codes must be of the same type.
    :param back: the number of days in the past to include, from 1 to 30
        with a default of 14 days.
    :param max_results: the maximum number of observations to return, from
        1 to 10000. The default is None, which returns all observations.
    :param locale: the language to use for the species common names. The
        default of 'en' uses species names from the eBird/Clements
        checklist. This can be any locale for which eBird has translations
        available. For a complete list see
        http://help.ebird.org/customer/portal/articles/1596582.
    :param hotspot: return records only from hotspots (True) or from both
        hotspots and private locations (False, the default).
    :param detail: return records in 'simple' or 'full' format. See the
        eBird API documentation for a description of the fields.
    :return: the list of observations.
    :raises ValueError: if any of the arguments fail the validation checks.
    :raises URLError: if there is an error with the connection to the
        eBird site.
    :raises HTTPError: if the eBird API returns an error.
    """
    cleaned = clean_areas(area)
    url = NOTABLE_OBSERVATIONS_URL % cleaned[0]
    params = {
        "back": clean_back(back),
        "maxResults": clean_max_observations(max_results),
        "sppLocale": clean_locale(locale),
        "hotspot": clean_hotspot(hotspot),
        "detail": clean_detail(detail),
    }
    if len(cleaned) > 1:
        params["r"] = ",".join(cleaned)
    headers = {
        "X-eBirdApiToken": token,
    }
    return call(url, params, headers)


def get_species_observations(
    token,
    species,
    area,
    back=14,
    max_results=None,
    locale="en",
    provisional=False,
    hotspot=False,
    detail="simple",
    category=None,
):
    """Get recent observations of a given species in a region.

    Get all the recent observations (up to 30 days ago) of a species
    in a given region.

    This maps to the endpoint in eBird API 2.0,
    https://documenter.getpostman.com/view/664302/S1ENwy59?version=latest#755ce9ab-dc27-4cfc-953f-c69fb0f282d9

    :param token: the token needed to access the API.
    :param species: the scientific name of the species.
    :param area: a country, subnational1, subnational2 or location code,
        or a list of up to 10 codes. All codes must be of the same type.
    :param back: the number of days in the past to include, from 1 to 30
        with a default of 14 days.
    :param max_results: the maximum number of observations to return, from
        1 to 10000. The default is None, which returns all observations.
    :param locale: the language to use for the species common names. The
        default of 'en' uses species names from the eBird/Clements
        checklist. This can be any locale for which eBird has translations
        available. For a complete list see
        http://help.ebird.org/customer/portal/articles/1596582.
    :param provisional: include records which have not yet been reviewed.
        Either True or False; the default is False.
    :param hotspot: return records only from hotspots (True) or from both
        hotspots and private locations (False, the default).
    :param detail: return records in 'simple' or 'full' format. See the
        eBird API documentation for a description of the fields.
    :param category: one or more categories of species to return:
        'domestic', 'form', 'hybrid', 'intergrade', 'issf', 'slash',
        'species' or 'spuh'. More than one value can be given in a
        comma-separated string. The default is None, which returns records
        from all categories. The purpose of this parameter is unclear
        given that a species is already specified; it is not documented
        on the eBird API page but it is supported by the code.
    :return: the list of observations in simple format.
    :raises ValueError: if any of the arguments fail the validation checks.
    :raises URLError: if there is an error with the connection to the
        eBird site.
    :raises HTTPError: if the eBird API returns an error.
    """
    cleaned = clean_areas(area)
    url = SPECIES_OBSERVATIONS_URL % (cleaned[0], species)
    params = {
        "back": clean_back(back),
        "maxResults": clean_max_observations(max_results),
        "sppLocale": clean_locale(locale),
        "includeProvisional": clean_provisional(provisional),
        "hotspot": clean_hotspot(hotspot),
        "detail": clean_detail(detail),
    }
    if category is not None:
        params["cat"] = ",".join(clean_categories(category))
    if len(cleaned) > 1:
        params["r"] = ",".join(cleaned)
    headers = {
        "X-eBirdApiToken": token,
    }
    return call(url, params, headers)
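`SPECIES_OBSERVATIONS_URL` is assumed to be a two-slot `%`-format string (first slot the area code, second the species); the substitution works like this — the template below is a hypothetical stand-in, not the package's actual constant:

```python
# Hypothetical stand-in for SPECIES_OBSERVATIONS_URL, defined elsewhere
# in the real package.
SPECIES_OBSERVATIONS_URL = "https://api.ebird.org/v2/data/obs/%s/recent/%s"

# First slot: area code; second slot: species identifier.
url = SPECIES_OBSERVATIONS_URL % ("US-NV", "horlar")
print(url)  # -> https://api.ebird.org/v2/data/obs/US-NV/recent/horlar
```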


def get_nearby_observations(
    token,
    lat,
    lng,
    dist=25,
    back=14,
    max_results=None,
    locale="en",
    provisional=False,
    hotspot=False,
    category=None,
    sort="date",
):
    """Get nearby recent observations of each species.

    Get recent observations (up to 30 days ago) of each species from all
    locations in an area centered on a set of coordinates (latitude,
    longitude) and optional distance (up to 50km away, with a default
    distance of 25km).

    NOTE: Only the most recent record of each species is returned.

    This maps to the endpoint in eBird API 2.0,
    https://documenter.getpostman.com/view/664302/S1ENwy59?version=latest#62b5ffb3-006e-4e8a-8e50-21d90d036edc

    :param token: the token needed to access the API.
    :param lat: the latitude, which will be rounded to 2 decimal places.
    :param lng: the longitude, which will be rounded to 2 decimal places.
    :param dist: include all sites within this distance, from 0 to 50km
        with a default of 25km.
    :param back: the number of days in the past to include, from 1 to 30
        with a default of 14 days.
    :param max_results: the maximum number of observations to return, from
        1 to 10000. The default is None, which returns all observations.
    :param locale: the language to use for the species common names. The
        default of 'en' uses species names from the eBird/Clements
        checklist. This can be any locale for which eBird has translations
        available. For a complete list see
        http://help.ebird.org/customer/portal/articles/1596582.
    :param provisional: include records which have not yet been reviewed.
        Either True or False; the default is False.
    :param hotspot: only return observations from hotspots, i.e. exclude
        private locations. The default is False, so observations are
        returned from all locations.
    :param category: one or more categories of species to return:
        'domestic', 'form', 'hybrid', 'intergrade', 'issf', 'slash',
        'species' or 'spuh'. More than one value can be given in a
        comma-separated string. The default is None, which returns records
        from all categories.
    :param sort: sort the records by date ('date', the default) or by
        taxonomy ('species').
    :return: the list of observations in simple format.
    :raises ValueError: if any of the arguments fail the validation checks.
    :raises URLError: if there is an error with the connection to the
        eBird site.
    :raises HTTPError: if the eBird API returns an error.
    """
    params = {
        "lat": clean_lat(lat),
        "lng": clean_lng(lng),
        "dist": clean_dist(dist),
        "back": clean_back(back),
        "maxResults": clean_max_observations(max_results),
        "sppLocale": clean_locale(locale),
        "includeProvisional": clean_provisional(provisional),
        "hotspot": clean_hotspot(hotspot),
        "sort": clean_sort(sort),
    }
    if category is not None:
        params["cat"] = ",".join(clean_categories(category))
    headers = {
        "X-eBirdApiToken": token,
    }
    return call(NEARBY_OBSERVATIONS_URL, params, headers)
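The docstring promises coordinates rounded to two decimal places. A plausible sketch of `clean_lat` — an assumption; the real helper lives in the package's validation module — is:

```python
def clean_lat(lat):
    # Sketch: validate the range and round to 2 decimal places, returning
    # a string ready for use as a query parameter.
    value = float(lat)
    if not -90 <= value <= 90:
        raise ValueError("lat must be between -90 and 90: %r" % lat)
    return "%.2f" % value


print(clean_lat(36.169941))  # -> 36.17
```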


def get_nearby_species(
    token,
    species,
    lat,
    lng,
    dist=25,
    back=14,
    max_results=None,
    locale="en",
    provisional=False,
    hotspot=False,
    category=None,
):
    """Get the most recent observation of a species nearby.

    Get the most recent observation (up to 30 days ago) of a species seen
    at any location in an area centered on a set of coordinates (latitude,
    longitude) and optional distance (up to 50km away).

    This maps to the endpoint in eBird API 2.0,
    https://documenter.getpostman.com/view/664302/S1ENwy59?version=latest#20fb2c3b-ee7f-49ae-a912-9c3f16a40397

    :param token: the token needed to access the API.
    :param species: the scientific name of the species.
    :param lat: the latitude, which will be rounded to 2 decimal places.
    :param lng: the longitude, which will be rounded to 2 decimal places.
    :param dist: include all sites within this distance, from 0 to 50km
        with a default of 25km.
    :param back: the number of days in the past to include, from 1 to 30
        with a default of 14 days.
    :param max_results: the maximum number of observations to return, from
        1 to 10000. The default is None, which returns all observations.
    :param locale: the language to use for the species common names. The
        default of 'en' uses species names from the eBird/Clements
        checklist. This can be any locale for which eBird has translations
        available. For a complete list see
        http://help.ebird.org/customer/portal/articles/1596582.
    :param provisional: include records which have not yet been reviewed.
        Either True or False; the default is False.
    :param hotspot: only return observations from hotspots, i.e. exclude
        private locations. The default is False, so observations are
        returned from all locations.
    :param category: one or more categories of species to return:
        'domestic', 'form', 'hybrid', 'intergrade', 'issf', 'slash',
        'species' or 'spuh'. More than one value can be given in a
        comma-separated string. The default is None, which returns records
        from all categories. The purpose of this parameter is unclear
        given that a species is already specified; it is not documented
        on the eBird API page but it is supported by the code.
    :return: the list of observations in 'simple' format. See the eBird
        API documentation for a description of the fields.
    :raises ValueError: if any of the arguments fail the validation checks.
    :raises URLError: if there is an error with the connection to the
        eBird site.
    :raises HTTPError: if the eBird API returns an error.
    """
    url = NEARBY_SPECIES_URL % species
    params = {
        "lat": clean_lat(lat),
        "lng": clean_lng(lng),
        "dist": clean_dist(dist),
        "back": clean_back(back),
        "maxResults": clean_max_observations(max_results),
        "sppLocale": clean_locale(locale),
        "includeProvisional": clean_provisional(provisional),
        "hotspot": clean_hotspot(hotspot),
    }
    if category is not None:
        params["cat"] = ",".join(clean_categories(category))
    headers = {
        "X-eBirdApiToken": token,
    }
    return call(url, params, headers)
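The eBird API expects its boolean query flags as lowercase 'true'/'false' strings, so `clean_hotspot` and `clean_provisional` presumably convert like this — an assumption for illustration, not the package's actual code:

```python
def clean_boolean(value):
    # Sketch: accept a Python bool and return the lowercase string form
    # the eBird API query parameters expect.
    if not isinstance(value, bool):
        raise ValueError("expected True or False: %r" % value)
    return "true" if value else "false"


print(clean_boolean(False))  # -> false
```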


def get_nearby_notable(
    token,
    lat,
    lng,
    dist=25,
    back=14,
    max_results=None,
    locale="en",
    hotspot=False,
    detail="simple",
):
    """Get nearby recent observations of rare species.

    Get all the recent observations (up to 30 days ago) of locally or
    nationally rare species seen at locations in an area centered on a
    set of coordinates (latitude, longitude) and optional distance (up
    to 50km away).

    This maps to the endpoint in eBird API 2.0,
    https://documenter.getpostman.com/view/664302/S1ENwy59?version=latest#caa348bb-71f6-471c-b203-9e1643377cbc

    :param token: the token needed to access the API.
    :param lat: the latitude, which will be rounded to 2 decimal places.
    :param lng: the longitude, which will be rounded to 2 decimal places.
    :param dist: include all sites within this distance, from 0 to 50km
        with a default of 25km.
    :param back: the number of days in the past to include, from 1 to 30
        with a default of 14 days.
    :param max_results: the maximum number of observations to return, from
        1 to 10000. The default is None, which returns all observations.
    :param locale: the language to use for the species common names. The
        default of 'en' uses species names from the eBird/Clements
        checklist. This can be any locale for which eBird has translations
        available. For a complete list see
        http://help.ebird.org/customer/portal/articles/1596582.
    :param hotspot: only return observations from hotspots, i.e. exclude
        private locations. The default is False, so observations are
        returned from all locations.
    :param detail: return records in 'simple' or 'full' format. See the
        eBird API documentation for a description of the fields.
    :return: the list of observations.
    :raises ValueError: if any of the arguments fail the validation checks.
    :raises URLError: if there is an error with the connection to the
        eBird site.
    :raises HTTPError: if the eBird API returns an error.
    """
    params = {
        "lat": clean_lat(lat),
        "lng": clean_lng(lng),
        "dist": clean_dist(dist),
        "back": clean_back(back),
        "maxResults": clean_max_observations(max_results),
        "sppLocale": clean_locale(locale),
        "hotspot": clean_hotspot(hotspot),
        "detail": clean_detail(detail),
    }
    headers = {
        "X-eBirdApiToken": token,
    }
    return call(NEARBY_NOTABLE_URL, params, headers)
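`call()` is defined elsewhere in the package; a minimal stand-in that assembles the query string (dropping parameters that are None) and attaches the token header might look like this. The real function also performs the HTTP request and decodes the JSON response; the URL and token below are placeholders:

```python
from urllib.parse import urlencode
from urllib.request import Request


def build_call(url, params, headers):
    # Sketch only: drop unset parameters, encode the rest, attach headers.
    query = urlencode({k: v for k, v in params.items() if v is not None})
    return Request(url + "?" + query, headers=headers)


req = build_call(
    "https://api.ebird.org/v2/data/obs/geo/recent/notable",
    {"lat": "36.17", "lng": "-115.14", "maxResults": None},
    {"X-eBirdApiToken": "0123456789ab"},  # placeholder token
)
print(req.full_url)
```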


def get_nearest_species(
    token,
    species,
    lat,
    lng,
    dist=25,
    back=14,
    max_results=None,
    locale="en",
    provisional=False,
    hotspot=False,
):
    """Get the nearest recent observations of a species.

    Get the recent observations (up to 30 days ago) of a species seen at
    the locations closest to a set of coordinates (latitude, longitude).

    IMPORTANT: As of 2019-05-27 the dist parameter does not appear to be
    respected, so this call will return records from anywhere the
    specified species is reported. Also, the English common name for the
    species is always returned regardless of which locale is specified.

    This maps to the endpoint in eBird API 2.0,
    https://documenter.getpostman.com/view/664302/S1ENwy59?version=latest#6bded97f-9997-477f-ab2f-94f254954ccb

    :param token: the token needed to access the API.
    :param species: the scientific name of the species.
    :param lat: the latitude, which will be rounded to 2 decimal places.
    :param lng: the longitude, which will be rounded to 2 decimal places.
    :param dist: include all sites within this distance, from 0 to 50km
        with a default of 25km.
    :param back: the number of days in the past to include, from 1 to 30
        with a default of 14 days.
    :param max_results: the maximum number of observations to return, from
        1 to 1000. The default is None, which returns all observations.
    :param locale: the language to use for the species common names. The
        default of 'en' uses species names from the eBird/Clements
        checklist. This can be any locale for which eBird has translations
        available. For a complete list see
        http://help.ebird.org/customer/portal/articles/1596582.
    :param provisional: include records which have not yet been reviewed.
        Either True or False; the default is False.
    :param hotspot: only return observations from hotspots, i.e. exclude
        private locations. The default is False, so observations are
        returned from all locations.
    :return: the list of observations in 'simple' format. See the eBird
        API documentation for a description of the fields.
    :raises ValueError: if any of the arguments fail the validation checks.
    :raises URLError: if there is an error with the connection to the
        eBird site.
    :raises HTTPError: if the eBird API returns an error.
    """
    url = NEAREST_SPECIES_URL % species
    params = {
        "lat": clean_lat(lat),
        "lng": clean_lng(lng),
        "dist": clean_dist(dist),
        "back": clean_back(back),
        "maxResults": clean_max_observations(max_results),
        "sppLocale": clean_locale(locale),
        "includeProvisional": clean_provisional(provisional),
        "hotspot": clean_hotspot(hotspot),
    }
    headers = {
        "X-eBirdApiToken": token,
    }
    return call(url, params, headers)
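`clean_categories` (used by several of the functions above) is assumed to validate each entry against the taxonomy categories listed in the docstrings — a sketch, not the package's actual implementation:

```python
VALID_CATEGORIES = {"domestic", "form", "hybrid", "intergrade",
                    "issf", "slash", "species", "spuh"}


def clean_categories(value):
    # Sketch: accept a comma-separated string (or a list), strip the
    # whitespace and reject anything outside the known categories.
    names = value.split(",") if isinstance(value, str) else list(value)
    cleaned = [name.strip() for name in names]
    for name in cleaned:
        if name not in VALID_CATEGORIES:
            raise ValueError("unknown category: %r" % name)
    return cleaned


print(",".join(clean_categories("species, hybrid")))  # -> species,hybrid
```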


def get_historic_observations(
    token,
    area,
    date,
    max_results=None,
    locale="en",
    provisional=False,
    hotspot=False,
    detail="simple",
    category=None,
):
    """Get the observations for a region on a specific date.

    Get all the observations for a country, subnational1 region,
    subnational2 region or location on a specific date.

    This maps to the endpoint in eBird API 2.0,
    https://documenter.getpostman.com/view/664302/S1ENwy59?version=latest#2d8c6ee8-c435-4e91-9f66-6d3eeb09edd2

    :param token: the token needed to access the API.
    :param area: a country, subnational1, subnational2 or location code,
        or a list of up to 10 codes. All codes must be of the same type.
    :param date: the date to get the observations for; any date since
        Jan 1st 1800.
    :param max_results: the maximum number of observations to return, from
        1 to 10000. The default is None, which returns all observations.
    :param locale: the language to use for the species common names. The
        default of 'en' uses species names from the eBird/Clements
        checklist. This can be any locale for which eBird has translations
        available. For a complete list see
        http://help.ebird.org/customer/portal/articles/1596582.
    :param provisional: include records which have not yet been reviewed.
        Either True or False; the default is False.
    :param hotspot: return records only from hotspots (True) or from both
        hotspots and private locations (False, the default).
    :param detail: return records in 'simple' or 'full' format. See the
        eBird API documentation for a description of the fields.
    :param category: one or more categories of species to return:
        'domestic', 'form', 'hybrid', 'intergrade', 'issf', 'slash',
        'species' or 'spuh'. More than one value can be given in a
        comma-separated string. The default is None, which returns records
        from all categories.
    :return: the list of observations in simple format.
    :raises ValueError: if any of the arguments fail the validation checks.
    :raises URLError: if there is an error with the connection to the
        eBird site.
    :raises HTTPError: if the eBird API returns an error.
    """
    cleaned = clean_areas(area)
    url = HISTORIC_OBSERVATIONS_URL % (cleaned[0], date.strftime("%Y/%m/%d"))
    params = {
        "rank": "mrec",
        "detail": clean_detail(detail),
        "sppLocale": clean_locale(locale),
        "includeProvisional": clean_provisional(provisional),
        "hotspot": clean_hotspot(hotspot),
        "maxResults": clean_max_observations(max_results),
    }
    if len(cleaned) > 1:
        params["r"] = ",".join(cleaned)
    if category is not None:
        params["cat"] = ",".join(clean_categories(category))
    headers = {
        "X-eBirdApiToken": token,
    }
    return call(url, params, headers)
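The date is embedded in the URL path as year/month/day segments, matching the `strftime("%Y/%m/%d")` call above:

```python
from datetime import date

# The historic endpoint expects /yyyy/mm/dd path segments.
when = date(2019, 5, 27)
print(when.strftime("%Y/%m/%d"))  # -> 2019/05/27
```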


# test/test_download.py | yop-platform/yop-python-sdk | Apache-2.0

# -*- coding: utf-8 -*-
import os
import sys

sys.path.append("./")

import test.assertion as assertion
from auth.v3signer.credentials import YopCredentials

# Test RSA private key shared by all the download test cases below.
TEST_PRI_KEY = 'MIIEvQIBADANBgkqhkiG9w0BAQEFAASCBKcwggSjAgEAAoIBAQCcKXb/SKitY30pWKON9ra84xcgiRu8kN4z+jvxqyyKpIx6gnNLQE4+6wp5C9LviF3o7rqPFXDSifHLz94AVVKCT7a2mvx7lGk+y56p3GJzrvahjyxbaBMRyDI76tdOvl1T1qFRWFv4OXagDawUcUCKSrxlRej7uALSdeyKaoKnqC0eR2fpuFNbz6NxIqRREIUZiSqmehjUPgBidJuOPJ1/HGkzZhmWBB46QxvermojqoxHEsLi0NakfX/rW3GQC1I1KY58e9ukTnZ2lo8j8gZrbPuY/1WM2Q3pzPGd5OPdtcZGtaHKIrzHupf+Et6EDU8IbZFIDJ/1qQ0JXYpBODVNAgMBAAECggEAOY40zBkhBirNdiAzw76DEnIWU4kFHoY8R2b6kfM3avAD0KFk0f7k996UERIRD/SwPApE2ziZSRfLdQVreq73xoyPuJS96uRDt/+/Pja6WI3LW7dTr2rX4G1rSlcfPOf/qMdJ1Jve5cl0FcCERFKLaYzrC95s5N2ouJ367PcdqaHIHsOUetqHoOH6Z9VCmpHPpX/+RfdXkFS1XfenAPW1x90e1u9e7jWbPfYVhwzjegYyp1KzJsPavs3BwQvu2J4tFq7THKtjA31BelX133kuv3oFeq2J5dFnFfpe3s/p9HtHRGLSKt9Sf/Zy13uFw4kjZCwGnVFZr68LaifPsyJMpwKBgQC6D1yjPNfb7QMLBQZlWEx0ATqYLXbUes6ZdM88Qd/nd1HrweLtAdptofHVXdE4FKIQ75yt7pKKYj0sURhzhM13wchwjXUhBzZYUjOQ9YcUd1f9eaMheZoE149P+pwKB3Yck/hBIJk6W20fMr6vucroA4vbvnvaAmwrR3nNdRKX8wKBgQDW3QBoXnALNTC0iAe5B8z49CYcFqyGFDwtflXulSvRYlxpl4mRAOA+P2oQtaaElZre8LSlC6e93J4VxfvLDeVzFer+g8JuQNJDhvPsoYmT29s+A1kqicE6SU+KAFoMoubuHAClM0elle202IJqdWt/pt/Yl+Z44MBvFzsrvLiNvwKBgQCJIql053Nydc64YIvGRr6TAhTd9SSQl7OPB7l3AFa3lAqdadqINcV46NQGH5AFda++K92flSgNNzs/XsZW3ptSmVHTI3AhV9+GWZAIV++n9g60lOLX2Xjb+MV4fY5lFfrINYfU+OH3UUusowpJGvei6no7DLrchMyVWak89f0uYQKBgHNXTfm5AHKzygKPp32nd1wJTE/1yAVt5WQSlrSttUkAgVVZuMpzau1fg2OW793qpamaE48p85ETVnWfw2wceJjQIkcgmgYvm/AOCPF1QfJyqn3etEYGjwjoA9+0EqMH6+nUdHA6V/LGykUzmMbnY56yCSYvXNR06jh4gxYWiAfnAoGAXh30ObSnOrf3befSF6qAHtEWBAf3oAXnpVKdqNaAy+Py/myJ6fvjENY3ZfzROkZqu5BSyuqiUw+V50WFM6hDgbEXJoRXdm41M9S8JwFBl5qAe1e3BZdbxbUK7G/qM4PQuTaArkvuz0wbJiZ2soFzi6S2ktDraafk+ErRgJx+q1k='


class Test(object):

    # def test_download_hbird(self, client):
    #     api = '/rest/v2.0/hbird/magic-cube/material-download'
    #     params = {
    #         'fileKey': '392',
    #         'merchantNo': '10033934316',
    #         'dayString': '2020-11-30',
    #         'dataType': 'merchant',
    #     }
    #     file_path = os.environ['HOME']
    #     res = client.download(api, params, file_path=file_path)
    #     assert 0 == res

    def test_download(self, client):
        """Download an electronic receipt and check the call succeeds."""
        if 'sm' == client.cert_type:
            return
        api = '/yos/v1.0/balance/yop-simple-remit/download-electronic-receipt'
        params = {
            'batchNo': '000000005499580',
            'orderId': 'YB654db6376fd04045a6abd82f055f6e04',
        }
        credentials = YopCredentials(appKey='OPR:10012413438',
                                     cert_type='RSA2048',
                                     priKey=TEST_PRI_KEY)
        file_path = os.environ['HOME']
        res = client.download(api, params, credentials, file_path=file_path)
        assert 0 == res

    def test_download_with_credentials(self, client):
        """Download an electronic receipt with explicitly built credentials."""
        if 'sm' == client.cert_type:
            return
        api = '/yos/v1.0/balance/yop-simple-remit/download-electronic-receipt'
        params = {
            'batchNo': '000000005499580',
            'orderId': 'YB654db6376fd04045a6abd82f055f6e04',
        }
        credentials = YopCredentials(appKey='OPR:10012413438',
                                     cert_type='RSA2048',
                                     priKey=TEST_PRI_KEY)
        file_path = os.environ['HOME']
        res = client.download(api, params, credentials, file_path=file_path)
        assert 0 == res

    def test_download_failed(self, client):
        """A request for a non-existent order should fail with error 40044."""
        if 'sm' == client.cert_type:
            return
        api = '/yos/v1.0/balance/yop-simple-remit/download-electronic-receipt'
        params = {
            'batchNo': '000000005499580',
            'orderId': 'YB654db6376fd04045a6abd82f055f6e042',
        }
        credentials = YopCredentials(appKey='OPR:10012413438',
                                     cert_type='RSA2048',
                                     priKey=TEST_PRI_KEY)
        file_path = os.environ['HOME']
        res = client.download(api, params, credentials, file_path=file_path)
        assertion.failure(res, '40044')
        assertion.failure(res, 'isp.code.data-not-fund', 'subCode')


# code/DeepDA/loss_funcs/__init__.py | chenkang121/transferlearning | MIT

from loss_funcs.mmd import *
from loss_funcs.coral import *
from loss_funcs.adv import *
from loss_funcs.lmmd import *


# core/admin.py | HemangNakarani/Insta-API | MIT

from django.contrib import admin
from core import models

admin.site.register(models.User)
admin.site.register(models.Post)
admin.site.register(models.Comment)
admin.site.register(models.Story)


# addons/dissertation_admission_app/wizards/__init__.py | Teixeira992/LEI | MIT

from . import publish_dissertation_wizard
from . import sign_wizard
from . import upload_work_plan_wizard
from . import make_review_wizard


# pynos/versions/ver_7/ver_7_0_0/yang/brocade_tunnels.py | bdeetz/pynos | Apache-2.0

#!/usr/bin/env python
import xml.etree.ElementTree as ET


class brocade_tunnels(object):
    """Auto generated class.
    """

    def __init__(self, **kwargs):
        self._callback = kwargs.pop('callback')

    def nsx_controller_name(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        nsx_controller = ET.SubElement(config, "nsx-controller", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
        name = ET.SubElement(nsx_controller, "name")
        name.text = kwargs.pop('name')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)
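Each method builds an ElementTree document and hands it to the callback; serialising the tree built by `nsx_controller_name` shows the payload shape (the controller name here is a made-up example):

```python
import xml.etree.ElementTree as ET

config = ET.Element("config")
nsx = ET.SubElement(config, "nsx-controller",
                    xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name = ET.SubElement(nsx, "name")
name.text = "controller-1"
print(ET.tostring(config).decode())
```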
    def nsx_controller_connection_addr_address(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        nsx_controller = ET.SubElement(config, "nsx-controller", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
        name_key = ET.SubElement(nsx_controller, "name")
        name_key.text = kwargs.pop('name')
        connection_addr = ET.SubElement(nsx_controller, "connection-addr")
        address = ET.SubElement(connection_addr, "address")
        address.text = kwargs.pop('address')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def nsx_controller_connection_addr_port(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        nsx_controller = ET.SubElement(config, "nsx-controller", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
        name_key = ET.SubElement(nsx_controller, "name")
        name_key.text = kwargs.pop('name')
        connection_addr = ET.SubElement(nsx_controller, "connection-addr")
        port = ET.SubElement(connection_addr, "port")
        port.text = kwargs.pop('port')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def nsx_controller_connection_addr_method(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        nsx_controller = ET.SubElement(config, "nsx-controller", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
        name_key = ET.SubElement(nsx_controller, "name")
        name_key.text = kwargs.pop('name')
        connection_addr = ET.SubElement(nsx_controller, "connection-addr")
        method = ET.SubElement(connection_addr, "method")
        method.text = kwargs.pop('method')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def nsx_controller_reconnect_interval(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        nsx_controller = ET.SubElement(config, "nsx-controller", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
        name_key = ET.SubElement(nsx_controller, "name")
        name_key.text = kwargs.pop('name')
        reconnect_interval = ET.SubElement(nsx_controller, "reconnect-interval")
        reconnect_interval.text = kwargs.pop('reconnect_interval')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def nsx_controller_activate(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        nsx_controller = ET.SubElement(config, "nsx-controller", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(nsx_controller, "name")
name_key.text = kwargs.pop('name')
activate = ET.SubElement(nsx_controller, "activate")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name = ET.SubElement(overlay_gateway, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_gw_type(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
gw_type = ET.SubElement(overlay_gateway, "gw-type")
gw_type.text = kwargs.pop('gw_type')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_ip_interface_ve_ve_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
ip = ET.SubElement(overlay_gateway, "ip")
interface = ET.SubElement(ip, "interface")
ve = ET.SubElement(interface, "ve")
ve_id = ET.SubElement(ve, "ve-id")
ve_id.text = kwargs.pop('ve_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_ip_interface_ve_vrrp_extended_group(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
ip = ET.SubElement(overlay_gateway, "ip")
interface = ET.SubElement(ip, "interface")
ve = ET.SubElement(interface, "ve")
vrrp_extended_group = ET.SubElement(ve, "vrrp-extended-group")
vrrp_extended_group.text = kwargs.pop('vrrp_extended_group')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_ip_interface_loopback_loopback_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
ip = ET.SubElement(overlay_gateway, "ip")
interface = ET.SubElement(ip, "interface")
loopback = ET.SubElement(interface, "loopback")
loopback_id = ET.SubElement(loopback, "loopback-id")
loopback_id.text = kwargs.pop('loopback_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_attach_rbridge_id_rb_add(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
attach = ET.SubElement(overlay_gateway, "attach")
rbridge_id = ET.SubElement(attach, "rbridge-id")
rb_add = ET.SubElement(rbridge_id, "rb-add")
rb_add.text = kwargs.pop('rb_add')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_attach_rbridge_id_rb_remove(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
attach = ET.SubElement(overlay_gateway, "attach")
rbridge_id = ET.SubElement(attach, "rbridge-id")
rb_remove = ET.SubElement(rbridge_id, "rb-remove")
rb_remove.text = kwargs.pop('rb_remove')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_attach_vlan_vid(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
attach = ET.SubElement(overlay_gateway, "attach")
vlan = ET.SubElement(attach, "vlan")
mac_key = ET.SubElement(vlan, "mac")
mac_key.text = kwargs.pop('mac')
vid = ET.SubElement(vlan, "vid")
vid.text = kwargs.pop('vid')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_attach_vlan_mac(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
attach = ET.SubElement(overlay_gateway, "attach")
vlan = ET.SubElement(attach, "vlan")
vid_key = ET.SubElement(vlan, "vid")
vid_key.text = kwargs.pop('vid')
mac = ET.SubElement(vlan, "mac")
mac.text = kwargs.pop('mac')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_map_vlan_vni_mapping_vid(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
map = ET.SubElement(overlay_gateway, "map")
vlan_vni_mapping = ET.SubElement(map, "vlan-vni-mapping")
vid = ET.SubElement(vlan_vni_mapping, "vid")
vid.text = kwargs.pop('vid')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_map_vlan_vni_mapping_vni(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
map = ET.SubElement(overlay_gateway, "map")
vlan_vni_mapping = ET.SubElement(map, "vlan-vni-mapping")
vid_key = ET.SubElement(vlan_vni_mapping, "vid")
vid_key.text = kwargs.pop('vid')
vni = ET.SubElement(vlan_vni_mapping, "vni")
vni.text = kwargs.pop('vni')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_map_vlan_vni_auto(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
map = ET.SubElement(overlay_gateway, "map")
vlan = ET.SubElement(map, "vlan")
vni = ET.SubElement(vlan, "vni")
auto = ET.SubElement(vni, "auto")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name = ET.SubElement(site, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_tunnel_dst_address(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('name')
tunnel_dst = ET.SubElement(site, "tunnel-dst")
address = ET.SubElement(tunnel_dst, "address")
address.text = kwargs.pop('address')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_extend_vlan_add(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('name')
extend = ET.SubElement(site, "extend")
vlan = ET.SubElement(extend, "vlan")
add = ET.SubElement(vlan, "add")
add.text = kwargs.pop('add')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_extend_vlan_remove(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('name')
extend = ET.SubElement(site, "extend")
vlan = ET.SubElement(extend, "vlan")
remove = ET.SubElement(vlan, "remove")
remove.text = kwargs.pop('remove')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_mac_learning_protocol(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('name')
mac_learning = ET.SubElement(site, "mac-learning")
protocol = ET.SubElement(mac_learning, "protocol")
protocol.text = kwargs.pop('protocol')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_bfd_enable(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('name')
bfd_enable = ET.SubElement(site, "bfd-enable")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_bfd_params_interval_min_tx(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('name')
bfd = ET.SubElement(site, "bfd")
params = ET.SubElement(bfd, "params")
interval = ET.SubElement(params, "interval")
min_tx = ET.SubElement(interval, "min-tx")
min_tx.text = kwargs.pop('min_tx')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_bfd_params_interval_min_rx(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('name')
bfd = ET.SubElement(site, "bfd")
params = ET.SubElement(bfd, "params")
interval = ET.SubElement(params, "interval")
min_rx = ET.SubElement(interval, "min-rx")
min_rx.text = kwargs.pop('min_rx')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_bfd_params_interval_multiplier(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('name')
bfd = ET.SubElement(site, "bfd")
params = ET.SubElement(bfd, "params")
interval = ET.SubElement(params, "interval")
multiplier = ET.SubElement(interval, "multiplier")
multiplier.text = kwargs.pop('multiplier')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_bfd_params_bfd_shutdown(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('name')
bfd = ET.SubElement(site, "bfd")
params = ET.SubElement(bfd, "params")
bfd_shutdown = ET.SubElement(params, "bfd-shutdown")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_shutdown(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('name')
shutdown = ET.SubElement(site, "shutdown")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_enable_statistics_stats_direction(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
enable = ET.SubElement(overlay_gateway, "enable")
statistics = ET.SubElement(enable, "statistics")
stats_direction = ET.SubElement(statistics, "stats-direction")
stats_direction.text = kwargs.pop('stats_direction')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_enable_statistics_vlan_action(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
enable = ET.SubElement(overlay_gateway, "enable")
statistics = ET.SubElement(enable, "statistics")
vlan_action = ET.SubElement(statistics, "vlan-action")
vlan_action.text = kwargs.pop('vlan_action')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_enable_statistics_vlan_list(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
enable = ET.SubElement(overlay_gateway, "enable")
statistics = ET.SubElement(enable, "statistics")
vlan_list = ET.SubElement(statistics, "vlan-list")
vlan_list.text = kwargs.pop('vlan_list')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_monitor_session(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
monitor = ET.SubElement(overlay_gateway, "monitor")
session = ET.SubElement(monitor, "session")
session.text = kwargs.pop('session')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_monitor_direction(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
monitor = ET.SubElement(overlay_gateway, "monitor")
session_key = ET.SubElement(monitor, "session")
session_key.text = kwargs.pop('session')
direction = ET.SubElement(monitor, "direction")
direction.text = kwargs.pop('direction')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_monitor_remote_endpoint(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
monitor = ET.SubElement(overlay_gateway, "monitor")
session_key = ET.SubElement(monitor, "session")
session_key.text = kwargs.pop('session')
remote_endpoint = ET.SubElement(monitor, "remote-endpoint")
remote_endpoint.text = kwargs.pop('remote_endpoint')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_monitor_vlan_leaf(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
monitor = ET.SubElement(overlay_gateway, "monitor")
session_key = ET.SubElement(monitor, "session")
session_key.text = kwargs.pop('session')
vlan_leaf = ET.SubElement(monitor, "vlan-leaf")
vlan_leaf.text = kwargs.pop('vlan_leaf')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_monitor_vlan_add_remove(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
monitor = ET.SubElement(overlay_gateway, "monitor")
session_key = ET.SubElement(monitor, "session")
session_key.text = kwargs.pop('session')
vlan_add_remove = ET.SubElement(monitor, "vlan-add-remove")
vlan_add_remove.text = kwargs.pop('vlan_add_remove')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_monitor_vlan_range(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
monitor = ET.SubElement(overlay_gateway, "monitor")
session_key = ET.SubElement(monitor, "session")
session_key.text = kwargs.pop('session')
vlan_range = ET.SubElement(monitor, "vlan-range")
vlan_range.text = kwargs.pop('vlan_range')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_sflow_sflow_profile_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
sflow = ET.SubElement(overlay_gateway, "sflow")
sflow_profile_name = ET.SubElement(sflow, "sflow-profile-name")
sflow_profile_name.text = kwargs.pop('sflow_profile_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_sflow_sflow_remote_endpoint(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
sflow = ET.SubElement(overlay_gateway, "sflow")
sflow_profile_name_key = ET.SubElement(sflow, "sflow-profile-name")
sflow_profile_name_key.text = kwargs.pop('sflow_profile_name')
sflow_remote_endpoint = ET.SubElement(sflow, "sflow-remote-endpoint")
sflow_remote_endpoint.text = kwargs.pop('sflow_remote_endpoint')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_sflow_sflow_vlan_action(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
sflow = ET.SubElement(overlay_gateway, "sflow")
sflow_profile_name_key = ET.SubElement(sflow, "sflow-profile-name")
sflow_profile_name_key.text = kwargs.pop('sflow_profile_name')
sflow_vlan_action = ET.SubElement(sflow, "sflow-vlan-action")
sflow_vlan_action.text = kwargs.pop('sflow_vlan_action')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_sflow_sflow_vlan_range(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
sflow = ET.SubElement(overlay_gateway, "sflow")
sflow_profile_name_key = ET.SubElement(sflow, "sflow-profile-name")
sflow_profile_name_key.text = kwargs.pop('sflow_profile_name')
sflow_vlan_range = ET.SubElement(sflow, "sflow-vlan-range")
sflow_vlan_range.text = kwargs.pop('sflow_vlan_range')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_mac_in_cg_mac_acl_in_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
mac = ET.SubElement(access_lists, "mac")
in_cg = ET.SubElement(mac, "in")
mac_acl_in_name = ET.SubElement(in_cg, "mac-acl-in-name")
mac_acl_in_name.text = kwargs.pop('mac_acl_in_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_mac_in_cg_mac_acl_in_dir(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
mac = ET.SubElement(access_lists, "mac")
in_cg = ET.SubElement(mac, "in")
mac_acl_in_dir = ET.SubElement(in_cg, "mac-acl-in-dir")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_mac_out_mac_acl_out_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
mac = ET.SubElement(access_lists, "mac")
out = ET.SubElement(mac, "out")
mac_acl_out_name = ET.SubElement(out, "mac-acl-out-name")
mac_acl_out_name.text = kwargs.pop('mac_acl_out_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_mac_out_mac_acl_out_dir(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
mac = ET.SubElement(access_lists, "mac")
out = ET.SubElement(mac, "out")
mac_acl_out_dir = ET.SubElement(out, "mac-acl-out-dir")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_ipv4_in_cg_ipv4_acl_in_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
ipv4 = ET.SubElement(access_lists, "ipv4")
in_cg = ET.SubElement(ipv4, "in")
ipv4_acl_in_name = ET.SubElement(in_cg, "ipv4-acl-in-name")
ipv4_acl_in_name.text = kwargs.pop('ipv4_acl_in_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_ipv4_in_cg_ipv4_acl_in_dir(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
ipv4 = ET.SubElement(access_lists, "ipv4")
in_cg = ET.SubElement(ipv4, "in")
ipv4_acl_in_dir = ET.SubElement(in_cg, "ipv4-acl-in-dir")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_ipv4_out_ipv4_acl_out_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
ipv4 = ET.SubElement(access_lists, "ipv4")
out = ET.SubElement(ipv4, "out")
ipv4_acl_out_name = ET.SubElement(out, "ipv4-acl-out-name")
ipv4_acl_out_name.text = kwargs.pop('ipv4_acl_out_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_ipv4_out_ipv4_acl_out_dir(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
ipv4 = ET.SubElement(access_lists, "ipv4")
out = ET.SubElement(ipv4, "out")
ipv4_acl_out_dir = ET.SubElement(out, "ipv4-acl-out-dir")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_ipv6_in_cg_ipv6_acl_in_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
ipv6 = ET.SubElement(access_lists, "ipv6")
in_cg = ET.SubElement(ipv6, "in")
ipv6_acl_in_name = ET.SubElement(in_cg, "ipv6-acl-in-name")
ipv6_acl_in_name.text = kwargs.pop('ipv6_acl_in_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_ipv6_in_cg_ipv6_acl_in_dir(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
ipv6 = ET.SubElement(access_lists, "ipv6")
in_cg = ET.SubElement(ipv6, "in")
ipv6_acl_in_dir = ET.SubElement(in_cg, "ipv6-acl-in-dir")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_ipv6_out_ipv6_acl_out_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
ipv6 = ET.SubElement(access_lists, "ipv6")
out = ET.SubElement(ipv6, "out")
ipv6_acl_out_name = ET.SubElement(out, "ipv6-acl-out-name")
ipv6_acl_out_name.text = kwargs.pop('ipv6_acl_out_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_ipv6_out_ipv6_acl_out_dir(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
ipv6 = ET.SubElement(access_lists, "ipv6")
out = ET.SubElement(ipv6, "out")
ipv6_acl_out_dir = ET.SubElement(out, "ipv6-acl-out-dir")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_activate(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
activate = ET.SubElement(overlay_gateway, "activate")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def ovsdb_server_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
ovsdb_server = ET.SubElement(config, "ovsdb-server", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name = ET.SubElement(ovsdb_server, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def ovsdb_server_port(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
ovsdb_server = ET.SubElement(config, "ovsdb-server", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(ovsdb_server, "name")
name_key.text = kwargs.pop('name')
port = ET.SubElement(ovsdb_server, "port")
port.text = kwargs.pop('port')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def ovsdb_server_activate(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
ovsdb_server = ET.SubElement(config, "ovsdb-server", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(ovsdb_server, "name")
name_key.text = kwargs.pop('name')
activate = ET.SubElement(ovsdb_server, "activate")
callback = kwargs.pop('callback', self._callback)
return callback(config)
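The generated methods above all follow one pattern: build a NETCONF `<config>` subtree with ElementTree, then hand it to a pluggable callback (defaulting to `self._callback`, which normally sends it to the device). A minimal stand-alone sketch of that pattern, using a hypothetical `_Tunnels` stand-in class and `ET.tostring` as the callback so the payload can be inspected without a device:

```python
import xml.etree.ElementTree as ET

class _Tunnels(object):
    """Minimal illustrative stand-in for the generated class above."""
    def __init__(self, callback=ET.tostring):
        # In the real module the default callback submits the config to a
        # device; here we just serialise it so the XML can be examined.
        self._callback = callback

    def ovsdb_server_port(self, **kwargs):
        # Same shape as the generated method: root <config>, namespaced
        # container, key leaf, value leaf, then dispatch to the callback.
        config = ET.Element("config")
        ovsdb_server = ET.SubElement(
            config, "ovsdb-server",
            xmlns="urn:brocade.com:mgmt:brocade-tunnels")
        name_key = ET.SubElement(ovsdb_server, "name")
        name_key.text = kwargs.pop('name')
        port = ET.SubElement(ovsdb_server, "port")
        port.text = kwargs.pop('port')
        callback = kwargs.pop('callback', self._callback)
        return callback(config)

payload = _Tunnels().ovsdb_server_port(name='ovsdb1', port='6632')
```

With the serialising callback, `payload` is the raw bytes of the `<config>` document, which is convenient for unit-testing the builders in isolation.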
def nsx_controller_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
nsx_controller = ET.SubElement(config, "nsx-controller", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name = ET.SubElement(nsx_controller, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def nsx_controller_connection_addr_address(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
nsx_controller = ET.SubElement(config, "nsx-controller", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(nsx_controller, "name")
name_key.text = kwargs.pop('name')
connection_addr = ET.SubElement(nsx_controller, "connection-addr")
address = ET.SubElement(connection_addr, "address")
address.text = kwargs.pop('address')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def nsx_controller_connection_addr_port(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
nsx_controller = ET.SubElement(config, "nsx-controller", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(nsx_controller, "name")
name_key.text = kwargs.pop('name')
connection_addr = ET.SubElement(nsx_controller, "connection-addr")
port = ET.SubElement(connection_addr, "port")
port.text = kwargs.pop('port')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def nsx_controller_connection_addr_method(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
nsx_controller = ET.SubElement(config, "nsx-controller", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(nsx_controller, "name")
name_key.text = kwargs.pop('name')
connection_addr = ET.SubElement(nsx_controller, "connection-addr")
method = ET.SubElement(connection_addr, "method")
method.text = kwargs.pop('method')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def nsx_controller_reconnect_interval(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
nsx_controller = ET.SubElement(config, "nsx-controller", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(nsx_controller, "name")
name_key.text = kwargs.pop('name')
reconnect_interval = ET.SubElement(nsx_controller, "reconnect-interval")
reconnect_interval.text = kwargs.pop('reconnect_interval')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def nsx_controller_activate(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
nsx_controller = ET.SubElement(config, "nsx-controller", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(nsx_controller, "name")
name_key.text = kwargs.pop('name')
activate = ET.SubElement(nsx_controller, "activate")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name = ET.SubElement(overlay_gateway, "name")
name.text = kwargs.pop('name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_gw_type(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
gw_type = ET.SubElement(overlay_gateway, "gw-type")
gw_type.text = kwargs.pop('gw_type')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_ip_interface_ve_ve_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
ip = ET.SubElement(overlay_gateway, "ip")
interface = ET.SubElement(ip, "interface")
ve = ET.SubElement(interface, "ve")
ve_id = ET.SubElement(ve, "ve-id")
ve_id.text = kwargs.pop('ve_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_ip_interface_ve_vrrp_extended_group(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
ip = ET.SubElement(overlay_gateway, "ip")
interface = ET.SubElement(ip, "interface")
ve = ET.SubElement(interface, "ve")
vrrp_extended_group = ET.SubElement(ve, "vrrp-extended-group")
vrrp_extended_group.text = kwargs.pop('vrrp_extended_group')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_ip_interface_loopback_loopback_id(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
ip = ET.SubElement(overlay_gateway, "ip")
interface = ET.SubElement(ip, "interface")
loopback = ET.SubElement(interface, "loopback")
loopback_id = ET.SubElement(loopback, "loopback-id")
loopback_id.text = kwargs.pop('loopback_id')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_attach_rbridge_id_rb_add(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
attach = ET.SubElement(overlay_gateway, "attach")
rbridge_id = ET.SubElement(attach, "rbridge-id")
rb_add = ET.SubElement(rbridge_id, "rb-add")
rb_add.text = kwargs.pop('rb_add')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_attach_rbridge_id_rb_remove(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
attach = ET.SubElement(overlay_gateway, "attach")
rbridge_id = ET.SubElement(attach, "rbridge-id")
rb_remove = ET.SubElement(rbridge_id, "rb-remove")
rb_remove.text = kwargs.pop('rb_remove')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_attach_vlan_vid(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
attach = ET.SubElement(overlay_gateway, "attach")
vlan = ET.SubElement(attach, "vlan")
mac_key = ET.SubElement(vlan, "mac")
mac_key.text = kwargs.pop('mac')
vid = ET.SubElement(vlan, "vid")
vid.text = kwargs.pop('vid')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_attach_vlan_mac(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
attach = ET.SubElement(overlay_gateway, "attach")
vlan = ET.SubElement(attach, "vlan")
vid_key = ET.SubElement(vlan, "vid")
vid_key.text = kwargs.pop('vid')
mac = ET.SubElement(vlan, "mac")
mac.text = kwargs.pop('mac')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_map_vlan_vni_mapping_vid(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
map = ET.SubElement(overlay_gateway, "map")
vlan_vni_mapping = ET.SubElement(map, "vlan-vni-mapping")
vid = ET.SubElement(vlan_vni_mapping, "vid")
vid.text = kwargs.pop('vid')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_map_vlan_vni_mapping_vni(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
map = ET.SubElement(overlay_gateway, "map")
vlan_vni_mapping = ET.SubElement(map, "vlan-vni-mapping")
vid_key = ET.SubElement(vlan_vni_mapping, "vid")
vid_key.text = kwargs.pop('vid')
vni = ET.SubElement(vlan_vni_mapping, "vni")
vni.text = kwargs.pop('vni')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_map_vlan_vni_auto(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
map = ET.SubElement(overlay_gateway, "map")
vlan = ET.SubElement(map, "vlan")
vni = ET.SubElement(vlan, "vni")
auto = ET.SubElement(vni, "auto")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name = ET.SubElement(site, "name")
name.text = kwargs.pop('site_name')  # site's own name; 'name' was already popped for the gateway key above
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_tunnel_dst_address(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('site_name')  # 'name' was already popped for the gateway key above
tunnel_dst = ET.SubElement(site, "tunnel-dst")
address = ET.SubElement(tunnel_dst, "address")
address.text = kwargs.pop('address')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_extend_vlan_add(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('site_name')  # 'name' was already popped for the gateway key above
extend = ET.SubElement(site, "extend")
vlan = ET.SubElement(extend, "vlan")
add = ET.SubElement(vlan, "add")
add.text = kwargs.pop('add')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_extend_vlan_remove(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('site_name')  # 'name' was already popped for the gateway key above
extend = ET.SubElement(site, "extend")
vlan = ET.SubElement(extend, "vlan")
remove = ET.SubElement(vlan, "remove")
remove.text = kwargs.pop('remove')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_mac_learning_protocol(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('site_name')  # 'name' was already popped for the gateway key above
mac_learning = ET.SubElement(site, "mac-learning")
protocol = ET.SubElement(mac_learning, "protocol")
protocol.text = kwargs.pop('protocol')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_bfd_enable(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('site_name')  # 'name' was already popped for the gateway key above
bfd_enable = ET.SubElement(site, "bfd-enable")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_bfd_params_interval_min_tx(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('site_name')  # 'name' was already popped for the gateway key above
bfd = ET.SubElement(site, "bfd")
params = ET.SubElement(bfd, "params")
interval = ET.SubElement(params, "interval")
min_tx = ET.SubElement(interval, "min-tx")
min_tx.text = kwargs.pop('min_tx')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_bfd_params_interval_min_rx(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('site_name')  # 'name' was already popped for the gateway key above
bfd = ET.SubElement(site, "bfd")
params = ET.SubElement(bfd, "params")
interval = ET.SubElement(params, "interval")
min_rx = ET.SubElement(interval, "min-rx")
min_rx.text = kwargs.pop('min_rx')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_bfd_params_interval_multiplier(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('site_name')  # 'name' was already popped for the gateway key above
bfd = ET.SubElement(site, "bfd")
params = ET.SubElement(bfd, "params")
interval = ET.SubElement(params, "interval")
multiplier = ET.SubElement(interval, "multiplier")
multiplier.text = kwargs.pop('multiplier')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_bfd_params_bfd_shutdown(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('site_name')  # 'name' was already popped for the gateway key above
bfd = ET.SubElement(site, "bfd")
params = ET.SubElement(bfd, "params")
bfd_shutdown = ET.SubElement(params, "bfd-shutdown")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_site_shutdown(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
site = ET.SubElement(overlay_gateway, "site")
name_key = ET.SubElement(site, "name")
name_key.text = kwargs.pop('site_name')  # 'name' was already popped for the gateway key above
shutdown = ET.SubElement(site, "shutdown")
callback = kwargs.pop('callback', self._callback)
return callback(config)
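The `site`-scoped methods above need two keys: the gateway name and the site name. `dict.pop` without a default raises `KeyError` once the key is gone, so calling `kwargs.pop('name')` a second time for the site leaf can never succeed. A minimal repro of the pitfall and the workaround (the distinct `site_name` keyword is an illustrative choice, not part of the original generated API):

```python
# Double-pop pitfall: the second pop of the same key always fails,
# because the first pop removed it from the dict.
kwargs = {'name': 'gw1', 'site_name': 'siteA'}
gateway_name = kwargs.pop('name')      # first pop succeeds
try:
    site = kwargs.pop('name')          # second pop of the same key
except KeyError:
    site = None                        # ...always lands here
site = site or kwargs.pop('site_name') # a distinct key works
```

This is why the site leaf should be driven by its own keyword argument rather than reusing `'name'`.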
def overlay_gateway_enable_statistics_stats_direction(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
enable = ET.SubElement(overlay_gateway, "enable")
statistics = ET.SubElement(enable, "statistics")
stats_direction = ET.SubElement(statistics, "stats-direction")
stats_direction.text = kwargs.pop('stats_direction')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_enable_statistics_vlan_action(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
enable = ET.SubElement(overlay_gateway, "enable")
statistics = ET.SubElement(enable, "statistics")
vlan_action = ET.SubElement(statistics, "vlan-action")
vlan_action.text = kwargs.pop('vlan_action')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_enable_statistics_vlan_list(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
enable = ET.SubElement(overlay_gateway, "enable")
statistics = ET.SubElement(enable, "statistics")
vlan_list = ET.SubElement(statistics, "vlan-list")
vlan_list.text = kwargs.pop('vlan_list')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_monitor_session(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
monitor = ET.SubElement(overlay_gateway, "monitor")
session = ET.SubElement(monitor, "session")
session.text = kwargs.pop('session')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_monitor_direction(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
monitor = ET.SubElement(overlay_gateway, "monitor")
session_key = ET.SubElement(monitor, "session")
session_key.text = kwargs.pop('session')
direction = ET.SubElement(monitor, "direction")
direction.text = kwargs.pop('direction')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_monitor_remote_endpoint(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
monitor = ET.SubElement(overlay_gateway, "monitor")
session_key = ET.SubElement(monitor, "session")
session_key.text = kwargs.pop('session')
remote_endpoint = ET.SubElement(monitor, "remote-endpoint")
remote_endpoint.text = kwargs.pop('remote_endpoint')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_monitor_vlan_leaf(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
monitor = ET.SubElement(overlay_gateway, "monitor")
session_key = ET.SubElement(monitor, "session")
session_key.text = kwargs.pop('session')
vlan_leaf = ET.SubElement(monitor, "vlan-leaf")
vlan_leaf.text = kwargs.pop('vlan_leaf')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_monitor_vlan_add_remove(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
monitor = ET.SubElement(overlay_gateway, "monitor")
session_key = ET.SubElement(monitor, "session")
session_key.text = kwargs.pop('session')
vlan_add_remove = ET.SubElement(monitor, "vlan-add-remove")
vlan_add_remove.text = kwargs.pop('vlan_add_remove')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_monitor_vlan_range(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
monitor = ET.SubElement(overlay_gateway, "monitor")
session_key = ET.SubElement(monitor, "session")
session_key.text = kwargs.pop('session')
vlan_range = ET.SubElement(monitor, "vlan-range")
vlan_range.text = kwargs.pop('vlan_range')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_sflow_sflow_profile_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
sflow = ET.SubElement(overlay_gateway, "sflow")
sflow_profile_name = ET.SubElement(sflow, "sflow-profile-name")
sflow_profile_name.text = kwargs.pop('sflow_profile_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_sflow_sflow_remote_endpoint(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
sflow = ET.SubElement(overlay_gateway, "sflow")
sflow_profile_name_key = ET.SubElement(sflow, "sflow-profile-name")
sflow_profile_name_key.text = kwargs.pop('sflow_profile_name')
sflow_remote_endpoint = ET.SubElement(sflow, "sflow-remote-endpoint")
sflow_remote_endpoint.text = kwargs.pop('sflow_remote_endpoint')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_sflow_sflow_vlan_action(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
sflow = ET.SubElement(overlay_gateway, "sflow")
sflow_profile_name_key = ET.SubElement(sflow, "sflow-profile-name")
sflow_profile_name_key.text = kwargs.pop('sflow_profile_name')
sflow_vlan_action = ET.SubElement(sflow, "sflow-vlan-action")
sflow_vlan_action.text = kwargs.pop('sflow_vlan_action')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_sflow_sflow_vlan_range(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
sflow = ET.SubElement(overlay_gateway, "sflow")
sflow_profile_name_key = ET.SubElement(sflow, "sflow-profile-name")
sflow_profile_name_key.text = kwargs.pop('sflow_profile_name')
sflow_vlan_range = ET.SubElement(sflow, "sflow-vlan-range")
sflow_vlan_range.text = kwargs.pop('sflow_vlan_range')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_mac_in_cg_mac_acl_in_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
mac = ET.SubElement(access_lists, "mac")
in_cg = ET.SubElement(mac, "in")
mac_acl_in_name = ET.SubElement(in_cg, "mac-acl-in-name")
mac_acl_in_name.text = kwargs.pop('mac_acl_in_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_mac_in_cg_mac_acl_in_dir(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
mac = ET.SubElement(access_lists, "mac")
in_cg = ET.SubElement(mac, "in")
mac_acl_in_dir = ET.SubElement(in_cg, "mac-acl-in-dir")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_mac_out_mac_acl_out_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
mac = ET.SubElement(access_lists, "mac")
out = ET.SubElement(mac, "out")
mac_acl_out_name = ET.SubElement(out, "mac-acl-out-name")
mac_acl_out_name.text = kwargs.pop('mac_acl_out_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_mac_out_mac_acl_out_dir(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
mac = ET.SubElement(access_lists, "mac")
out = ET.SubElement(mac, "out")
mac_acl_out_dir = ET.SubElement(out, "mac-acl-out-dir")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_ipv4_in_cg_ipv4_acl_in_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
ipv4 = ET.SubElement(access_lists, "ipv4")
in_cg = ET.SubElement(ipv4, "in")
ipv4_acl_in_name = ET.SubElement(in_cg, "ipv4-acl-in-name")
ipv4_acl_in_name.text = kwargs.pop('ipv4_acl_in_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_ipv4_in_cg_ipv4_acl_in_dir(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
ipv4 = ET.SubElement(access_lists, "ipv4")
in_cg = ET.SubElement(ipv4, "in")
ipv4_acl_in_dir = ET.SubElement(in_cg, "ipv4-acl-in-dir")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_ipv4_out_ipv4_acl_out_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
ipv4 = ET.SubElement(access_lists, "ipv4")
out = ET.SubElement(ipv4, "out")
ipv4_acl_out_name = ET.SubElement(out, "ipv4-acl-out-name")
ipv4_acl_out_name.text = kwargs.pop('ipv4_acl_out_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_ipv4_out_ipv4_acl_out_dir(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
ipv4 = ET.SubElement(access_lists, "ipv4")
out = ET.SubElement(ipv4, "out")
ipv4_acl_out_dir = ET.SubElement(out, "ipv4-acl-out-dir")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_ipv6_in_cg_ipv6_acl_in_name(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
ipv6 = ET.SubElement(access_lists, "ipv6")
in_cg = ET.SubElement(ipv6, "in")
ipv6_acl_in_name = ET.SubElement(in_cg, "ipv6-acl-in-name")
ipv6_acl_in_name.text = kwargs.pop('ipv6_acl_in_name')
callback = kwargs.pop('callback', self._callback)
return callback(config)
def overlay_gateway_access_lists_ipv6_in_cg_ipv6_acl_in_dir(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
overlay_gateway = ET.SubElement(config, "overlay-gateway", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(overlay_gateway, "name")
name_key.text = kwargs.pop('name')
access_lists = ET.SubElement(overlay_gateway, "access-lists")
ipv6 = ET.SubElement(access_lists, "ipv6")
in_cg = ET.SubElement(ipv6, "in")
ipv6_acl_in_dir = ET.SubElement(in_cg, "ipv6-acl-in-dir")
callback = kwargs.pop('callback', self._callback)
return callback(config)
def ovsdb_server_activate(self, **kwargs):
"""Auto Generated Code
"""
config = ET.Element("config")
ovsdb_server = ET.SubElement(config, "ovsdb-server", xmlns="urn:brocade.com:mgmt:brocade-tunnels")
name_key = ET.SubElement(ovsdb_server, "name")
name_key.text = kwargs.pop('name')
activate = ET.SubElement(ovsdb_server, "activate")
callback = kwargs.pop('callback', self._callback)
return callback(config)
# File: preprocessing-src/preprocess.py (repo: sominw/cs585-q2q, license: RSA-MD)
import json
import re
import nltk
from nltk.corpus import stopwords
from nltk.stem import WordNetLemmatizer
import numpy as np
import copy
import functools
nltk.download('wordnet')
nltk.download('stopwords')  # the stopwords corpus is also used below
def lemmatize(qs):
'''
Lowercase each question, strip '?', drop English stopwords, and
WordNet-lemmatize the remaining tokens.
:param qs: List of NL questions
:return: List of space-joined lemmatized query strings
'''
lem = WordNetLemmatizer()
queries = []
for q in qs:
processed = q.lower()
processed = re.sub('[?]', '', processed) # Can I just remove last char?
tokens = processed.split()
tokens = [token for token in tokens if token not in stopwords.words('english')]
lem_tokens = [lem.lemmatize(token) for token in tokens]
processed = ' '.join(lem_tokens)
queries.append(processed)
return queries
def lemmatize_w_qwords(qs):
'''
Same as lemmatize(), but question words (what, which, who, ...) are
kept rather than removed along with the other stopwords.
:param qs: List of NL questions
:return: List of space-joined lemmatized query strings
'''
swords = stopwords.words('english')
swords = set(swords)
qwords = {'what', 'which', 'who', 'whom', 'when', 'where', 'why', 'how'}
swords = swords - qwords
lem = WordNetLemmatizer()
queries = []
for q in qs:
processed = q.lower()
processed = re.sub('[?]', '', processed) # Can I just remove last char?
tokens = processed.split()
tokens = [token for token in tokens if token not in swords]
lem_tokens = [lem.lemmatize(token) for token in tokens]
processed = ' '.join(lem_tokens)
queries.append(processed)
return queries
def lemmatize_single(q):
lem = WordNetLemmatizer()
processed = q.lower()
processed = re.sub('[?]', '', processed) # Can I just remove last char?
tokens = processed.split()
tokens = [token for token in tokens if token not in stopwords.words('english')]
lem_tokens = [lem.lemmatize(token) for token in tokens]
processed = ' '.join(lem_tokens)
return processed
def lemmatize_single_w_qwords(q):
swords = stopwords.words('english')
swords = set(swords)
qwords = {'what', 'which', 'who', 'whom', 'when', 'where', 'why', 'how'}
swords = swords - qwords
lem = WordNetLemmatizer()
processed = q.lower()
processed = re.sub('[?]', '', processed) # Can I just remove last char?
tokens = processed.split()
tokens = [token for token in tokens if token not in swords]
lem_tokens = [lem.lemmatize(token) for token in tokens]
processed = ' '.join(lem_tokens)
return processed
def lemmatize_google(qs):
'''
Drop English stopwords and WordNet-lemmatize each question; unlike
lemmatize(), the input is not lowercased and '?' is not stripped.
:param qs: List of NL questions
:return: List of space-joined lemmatized query strings
'''
lem = WordNetLemmatizer()
queries = []
for q in qs:
tokens = q.split()
tokens = [token for token in tokens if token not in stopwords.words('english')]
lem_tokens = [lem.lemmatize(token) for token in tokens]
processed = ' '.join(lem_tokens)
queries.append(processed)
return queries
def output_google_queries():
with open('v1.0-simplified_simplified-nq-train.jsonl', 'r') as json_file:
json_list = list(json_file)
qs = []
for json_str in json_list:
result = json.loads(json_str)
qs.append(result['question_text'])
queries = lemmatize_google(qs)
both = zip(queries, qs)
with open('google_queries.txt', 'w') as text_file:
for query, q in both:
text_file.write(query + '\n' + q + '\n\n')
def output_squad_queries():
json_file = open('train-v2.0.json')
json_str = json_file.read()
json_data = json.loads(json_str)
qs = [q['question'] for i in json_data['data'] for d in i['paragraphs'] for q in d['qas']]
'''
# Check if all end in ?. At least one does not.
qs = [q['question'][-1] == '?' for i in json_data['data'] for d in i['paragraphs'] for q in d['qas']]
print(all(qs))
'''
queries = lemmatize(qs)
#queries = lemmatize_w_qwords(qs)
both = zip(queries, qs)
with open('squad_queries.txt', 'w') as text_file:
for query, q in both:
text_file.write(query + '\n' + q + '\n\n')
def output_squad_queries_qwords():
json_file = open('train-v2.0.json')
json_str = json_file.read()
json_data = json.loads(json_str)
qs = [q['question'] for i in json_data['data'] for d in i['paragraphs'] for q in d['qas']]
'''
# Check if all end in ?. At least one does not.
qs = [q['question'][-1] == '?' for i in json_data['data'] for d in i['paragraphs'] for q in d['qas']]
print(all(qs))
'''
queries = lemmatize(qs)
queries2 = lemmatize_w_qwords(qs)
both = zip(queries2, queries, qs)
with open('squad_queries_3.txt', 'w') as text_file:
for query2, query, q in both:
text_file.write(query2 + '\n' + query + '\n' + q + '\n\n')
def output_squad_queries_somin():
json_file = open('train-v2.0.json')
json_str = json_file.read()
json_data = json.loads(json_str)
qs = [q['question'] for i in json_data['data'] for d in i['paragraphs'] for q in d['qas']]
processed = [re.sub('[?]', '', q) for q in qs]
'''
# Check if all end in ?. At least one does not.
qs = [q['question'][-1] == '?' for i in json_data['data'] for d in i['paragraphs'] for q in d['qas']]
print(all(qs))
'''
queries = lemmatize(processed)
both = zip(queries, processed)
with open('train_somin_2.txt', 'w') as text_file:
for query, q in both:
text_file.write(query + '.\t' + q + '.\n')
json_file = open('dev-v2.0.json')
json_str = json_file.read()
json_data = json.loads(json_str)
qs = [q['question'] for i in json_data['data'] for d in i['paragraphs'] for q in d['qas']]
'''
# Check if all end in ?. At least one does not.
qs = [q['question'][-1] == '?' for i in json_data['data'] for d in i['paragraphs'] for q in d['qas']]
print(all(qs))
'''
processed = [re.sub('[?]', '', q) for q in qs]  # recompute for the dev questions (previously reused the train list)
queries = lemmatize(processed)
both = zip(queries, processed)
with open('dev_somin_2.txt', 'w') as text_file:
for query, q in both:
text_file.write(query + '.\t' + q + '.\n')
def output_squad_queries_original():
json_file = open('train-v2.0.json')
json_str = json_file.read()
json_data = json.loads(json_str)
qs = [q['question'] for i in json_data['data'] for d in i['paragraphs'] for q in d['qas']]
'''
# Check if all end in ?. At least one does not.
qs = [q['question'][-1] == '?' for i in json_data['data'] for d in i['paragraphs'] for q in d['qas']]
print(all(qs))
'''
queries = lemmatize(qs)
both = zip(queries, qs)
with open('queries_train.txt', 'w') as text_file:
with open('questions_train.txt', 'w') as text_file2:
for query, q in both:
text_file.write(query + '\n')
text_file2.write(q + '\n')
json_file = open('dev-v2.0.json')
json_str = json_file.read()
json_data = json.loads(json_str)
qs = [q['question'] for i in json_data['data'] for d in i['paragraphs'] for q in d['qas']]
'''
# Check if all end in ?. At least one does not.
qs = [q['question'][-1] == '?' for i in json_data['data'] for d in i['paragraphs'] for q in d['qas']]
print(all(qs))
'''
queries = lemmatize(qs)
both = zip(queries, qs)
with open('queries_dev.txt', 'w') as text_file:
with open('questions_dev.txt', 'w') as text_file2:
for query, q in both:
text_file.write(query + '\n')
text_file2.write(q + '\n')
def output_squad_queries_nparrays():
json_file = open('train-v2.0.json')
json_str = json_file.read()
json_data = json.loads(json_str)
qs = [q['question'] for i in json_data['data'] for d in i['paragraphs'] for q in d['qas']]
'''
# Check if all end in ?. At least one does not.
qs = [q['question'][-1] == '?' for i in json_data['data'] for d in i['paragraphs'] for q in d['qas']]
print(all(qs))
'''
queries = lemmatize(qs)
both = zip(queries, qs)
big_array = []
for query, q in both:
big_array.append([query, q])
big_array = np.array(big_array)
print(big_array.shape)
#big_array = np.savetxt('keras_input_train.txt', big_array, fmt="%s", delimiter='/t')
big_array = np.save('keras_input_train.npy', big_array)
# json_file = open('dev-v2.0.json')
# json_str = json_file.read()
# json_data = json.loads(json_str)
# qs = [q['question'] for i in json_data['data'] for d in i['paragraphs'] for q in d['qas']]
#
# '''
# # Check if all end in ?. At least one does not.
# qs = [q['question'][-1] == '?' for i in json_data['data'] for d in i['paragraphs'] for q in d['qas']]
# print(all(qs))
# '''
#
# queries = lemmatize(qs)
# both = zip(queries, qs)
#
# with open('queries_dev.txt', 'w') as text_file:
# with open('questions_dev.txt', 'w') as text_file2:
# for query, q in both:
# text_file.write(query + '\n')
# text_file2.write(q + '\n')
def squad_stats():
json_file = open('train-v2.0.json')
json_str = json_file.read()
json_data = json.loads(json_str)
qs = [q['question'] for i in json_data['data'] for d in i['paragraphs'] for q in d['qas']]
json_file_dev = open('dev-v2.0.json')
json_str_dev = json_file_dev.read()
json_data_dev = json.loads(json_str_dev)
qs_dev = [q['question'] for i in json_data_dev['data'] for d in i['paragraphs'] for q in d['qas']]
print('number of docs: ', len(json_data['data']) + len(json_data_dev['data']))
# for i in json_data['data']:
# for d in i['paragraphs']:
# len(d['qas'])
num_qs_per_para = np.array([len(d['qas']) for i in json_data['data'] for d in i['paragraphs']])
print('qs per para, train: ', np.mean(num_qs_per_para))
# Number of questions
print(len(qs)) # 130319.
# Number of dev questions
print(len(qs_dev))
qs.extend(qs_dev)
print('train + dev: ', len(qs))
# Number of unanswerable questions.
unanswer_qs = [q['question'] for i in json_data['data'] for d in i['paragraphs'] for q in d['qas'] if q['is_impossible']]
print(len(unanswer_qs)) # 43498
unanswer_qs_dev = [q['question'] for i in json_data_dev['data'] for d in i['paragraphs'] for q in d['qas'] if
q['is_impossible']]
print(len(unanswer_qs_dev))
print('train + dev unanswerable: ', len(unanswer_qs)+len(unanswer_qs_dev))
#context_lens = np.array([len(d['context'].split()) for i in json_data['data'] for d in i['paragraphs']])
context_lens = [len(d['context'].split()) for i in json_data['data'] for d in i['paragraphs']]
context_lens.extend([len(d['context'].split()) for i in json_data_dev['data'] for d in i['paragraphs']])
context_lens = np.array(context_lens)
avg_len_context = np.mean(context_lens)
max_len_context = np.max(context_lens)
min_len_context = np.min(context_lens)
print('avg context len: ', avg_len_context) # 116.58550039401104
print('max context len: ', max_len_context) # 653
print('min context len: ', min_len_context) # 20
len_qs = np.array([len(q.split()) for q in qs])
avg_len_qs = np.mean(len_qs)
max_len_qs = np.max(len_qs)
min_len_qs = np.min(len_qs)
queries = lemmatize(qs)
len_queries = np.array([len(q.split()) for q in queries])
avg_len_queries = np.mean(len_queries)
max_len_queries = np.max(len_queries)
min_len_queries = np.min(len_queries)
print('Questions:')
print('avg len: ', avg_len_qs) # 9.893822082735442
print('max len: ', max_len_qs) # 40
print('min len: ', min_len_qs) # 1
print('Queries:')
print('avg len: ', avg_len_queries) # 5.3351545054827
print('max len: ', max_len_queries) # 31
print('min len: ', min_len_queries) # 0
def google_stats():
with open('v1.0-simplified_simplified-nq-train.jsonl', 'r') as json_file:
json_list = list(json_file)
qs = []
for json_str in json_list:
result = json.loads(json_str)
qs.append(result['question_text'])
#print(result['question_text'])
#print(result['document_text'])
print(len(qs))
len_qs = np.array([len(q.split()) for q in qs])
avg_len_qs = np.mean(len_qs)
max_len_qs = np.max(len_qs)
min_len_qs = np.min(len_qs)
queries = lemmatize(qs)
len_queries = np.array([len(q.split()) for q in queries])
avg_len_queries = np.mean(len_queries)
max_len_queries = np.max(len_queries)
min_len_queries = np.min(len_queries)
print('Questions:')
print('avg len: ', avg_len_qs)
print('max len: ', max_len_qs)
print('min len: ', min_len_qs)
print('Queries:')
print('avg len: ', avg_len_queries)
print('max len: ', max_len_queries)
print('min len: ', min_len_queries)
def hotpot_stats():
json_file = open('hotpot_train_v1.1.json')
json_str = json_file.read()
json_data = json.loads(json_str)
qs = [item['question'] for item in json_data]
json_file_dev = open('hotpot_dev_distractor_v1.json')
json_str_dev = json_file_dev.read()
json_data_dev = json.loads(json_str_dev)
qs_dev = [item['question'] for item in json_data_dev]
qs.extend(qs_dev)
context_lens = []
for item in json_data:
context_lens.append(np.sum(np.array([functools.reduce(lambda x, y: x + len(y.split()), para[1], 0) for para in item['context']])))
for item in json_data_dev:
context_lens.append(np.sum(
np.array([functools.reduce(lambda x, y: x + len(y.split()), para[1], 0) for para in item['context']])))
context_lens = np.array(context_lens)
# Number of questions (qs already includes the dev questions after the extend above)
print(len(qs))  # train + dev
print(len(qs_dev))
print('Train + dev: ', len(qs))
avg_len_context = np.mean(context_lens)
max_len_context = np.max(context_lens)
min_len_context = np.min(context_lens)
print('avg context len: ', avg_len_context) #
print('max context len: ', max_len_context) #
print('min context len: ', min_len_context) #
len_qs = np.array([len(q.split()) for q in qs])
avg_len_qs = np.mean(len_qs)
max_len_qs = np.max(len_qs)
min_len_qs = np.min(len_qs)
queries = lemmatize(qs)
len_queries = np.array([len(q.split()) for q in queries])
avg_len_queries = np.mean(len_queries)
max_len_queries = np.max(len_queries)
min_len_queries = np.min(len_queries)
print('Questions:')
print('avg len: ', avg_len_qs) #
print('max len: ', max_len_qs) #
print('min len: ', min_len_qs) #
print('Queries:')
print('avg len: ', avg_len_queries) #
print('max len: ', max_len_queries) #
print('min len: ', min_len_queries) #
def squad_query_json():
json_file = open('train-v2.0.json')
#json_file = open('dev-v2.0.json')
json_str = json_file.read()
json_data = json.loads(json_str)
json_query = copy.deepcopy(json_data)
for i in json_query['data']:
for d in i['paragraphs']:
for q in d['qas']:
#q['question'] = lemmatize_single(q['question'])
q['question'] = lemmatize_single_w_qwords(q['question'])
json_query_str = json.dumps(json_query)
with open('squad_query_qwords_train.json', 'w') as f:
# with open('squad_query_qwords_dev.json', 'w') as f:
f.write(json_query_str)
def output_hotpot_queries():
json_file = open('hotpot_train_v1.1.json')
json_str = json_file.read()
json_data = json.loads(json_str)
qs = [item['question'] for item in json_data]
queries = lemmatize(qs)
both = zip(queries, qs)
with open('hotpot_queries.txt', 'w') as text_file:
for query, q in both:
text_file.write(query + '\n' + q + '\n\n')
if __name__ == "__main__":
#output_hotpot_queries()
#output_squad_queries()
#squad_stats()
#squad_query_json()
#output_google_queries()
#print(stopwords.words('english'))
#google_stats()
#hotpot_stats()
# with open('v1.0-simplified_simplified-nq-train.jsonl', 'r') as json_file:
# json_list = list(json_file)
#
# qs = []
# for json_str in json_list:
# result = json.loads(json_str)
# print(result.keys())
# break
#output_squad_queries_nparrays()
# f = np.loadtxt('keras_input_train.txt', dtype='str', delimiter='/t')
#
# print(f.shape)
#squad_query_json()
# f = np.load('keras_input_train.npy')
# print(f.shape)
# l = [['a b c', 'd e f'], ['g h i', 'j k i']]
# f = np.array(l)
# print(f.shape)
output_squad_queries_somin() # He asked for a certain format....

# File: what/models/detection/utils/__init__.py (repo: wuhanstudio/whitebox-adversarial-toolbox, license: MIT)
from what.models.detection.utils import array_utils
from what.models.detection.utils import box_utils
from what.models.detection.utils import time_utils
# File: tests/native/test_dyn.py (repo: dmgress/manticore, license: Apache-2.0)
import unittest
from manticore.native.cpu.abstractcpu import ConcretizeRegister
from manticore.native.cpu.x86 import AMD64Cpu
from manticore.native.memory import *
from manticore.core.smtlib.solver import Z3Solver
solver = Z3Solver.instance()
class CPUTest(unittest.TestCase):
_multiprocess_can_split_ = True
class ROOperand:
""" Mocking class for operand ronly """
def __init__(self, size, value):
self.size = size
self.value = value
def read(self):
return self.value & ((1 << self.size) - 1)
class RWOperand(ROOperand):
""" Mocking class for operand rw """
def write(self, value):
self.value = value & ((1 << self.size) - 1)
return self.value
def test_MOVHPD_1(self):
""" Instruction MOVHPD_1
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7A249D1, "IVATE\x00\x00\x00")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
cpu.XMM1 = 0xFFFFFFFFFFFF00FF52505F4342494C47
cpu.RDI = 0x7FFFF7A249C9
cpu.RIP = 0x7FFFF7DF294E
cpu.execute()
self.assertEqual(
mem[0x7FFFF7A249D1:0x7FFFF7A249D9],
[b"I", b"V", b"A", b"T", b"E", b"\x00", b"\x00", b"\x00"],
)
self.assertEqual(
mem[0x7FFFF7DF294E:0x7FFFF7DF2953], [b"f", b"\x0f", b"\x16", b"O", b"\x08"]
)
self.assertEqual(cpu.XMM1, 5492818941963568420245782219847)
self.assertEqual(cpu.RDI, 140737347996105)
self.assertEqual(cpu.RIP, 140737351985491)
def test_MOVHPD_10(self):
""" Instruction MOVHPD_10
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7A248D6, "2.5\x00GLIB")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
cpu.XMM1 = 0xFFFFFFFF00FFFFFF2E325F4342494C47
cpu.RDI = 0x7FFFF7A248CE
cpu.RIP = 0x7FFFF7DF294E
cpu.execute()
self.assertEqual(
mem[0x7FFFF7A248D6:0x7FFFF7A248DE], [b"2", b".", b"5", b"\x00", b"G", b"L", b"I", b"B"]
)
self.assertEqual(
mem[0x7FFFF7DF294E:0x7FFFF7DF2953], [b"f", b"\x0f", b"\x16", b"O", b"\x08"]
)
self.assertEqual(cpu.XMM1, 88109632480871197291218000195730623559)
self.assertEqual(cpu.RDI, 140737347995854)
self.assertEqual(cpu.RIP, 140737351985491)
def test_MOVHPD_11(self):
""" Instruction MOVHPD_11
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7A248D6, "2.5\x00GLIB")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
cpu.XMM2 = 0x42494C4700352E322E325F4342494C47
cpu.RSI = 0x7FFFF7A248CE
cpu.RIP = 0x7FFFF7DF2953
cpu.execute()
self.assertEqual(
mem[0x7FFFF7A248D6:0x7FFFF7A248DE], [b"2", b".", b"5", b"\x00", b"G", b"L", b"I", b"B"]
)
self.assertEqual(
mem[0x7FFFF7DF2953:0x7FFFF7DF2958], [b"f", b"\x0f", b"\x16", b"V", b"\x08"]
)
self.assertEqual(cpu.XMM2, 88109632480871197291218000195730623559)
self.assertEqual(cpu.RSI, 140737347995854)
self.assertEqual(cpu.RIP, 140737351985496)
def test_MOVHPD_12(self):
""" Instruction MOVHPD_12
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7A248D6, "2.5\x00GLIB")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
cpu.XMM1 = 0xFFFFFFFF00FFFFFF2E325F4342494C47
cpu.RDI = 0x7FFFF7A248CE
cpu.RIP = 0x7FFFF7DF294E
cpu.execute()
self.assertEqual(
mem[0x7FFFF7A248D6:0x7FFFF7A248DE], [b"2", b".", b"5", b"\x00", b"G", b"L", b"I", b"B"]
)
self.assertEqual(
mem[0x7FFFF7DF294E:0x7FFFF7DF2953], [b"f", b"\x0f", b"\x16", b"O", b"\x08"]
)
self.assertEqual(cpu.XMM1, 88109632480871197291218000195730623559)
self.assertEqual(cpu.RDI, 140737347995854)
self.assertEqual(cpu.RIP, 140737351985491)
def test_MOVHPD_13(self):
""" Instruction MOVHPD_13
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A21000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7A218DA, "tart_mai")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
cpu.XMM1 = 0x735F6362696C5F5F
cpu.RDI = 0x7FFFF7A218D2
cpu.RIP = 0x7FFFF7DF294E
cpu.execute()
self.assertEqual(
mem[0x7FFFF7A218DA:0x7FFFF7A218E2], [b"t", b"a", b"r", b"t", b"_", b"m", b"a", b"i"]
)
self.assertEqual(
mem[0x7FFFF7DF294E:0x7FFFF7DF2953], [b"f", b"\x0f", b"\x16", b"O", b"\x08"]
)
self.assertEqual(cpu.XMM1, 140074810698054820722452200425796689759)
self.assertEqual(cpu.RDI, 140737347983570)
self.assertEqual(cpu.RIP, 140737351985491)
def test_MOVHPD_14(self):
""" Instruction MOVHPD_14
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A20000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7A20A9B, "\x00acct\x00_n")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
cpu.XMM2 = 0x36766772615F6C645F
cpu.RSI = 0x7FFFF7A20A93
cpu.RIP = 0x7FFFF7DF2953
cpu.execute()
self.assertEqual(
mem[0x7FFFF7A20A9B:0x7FFFF7A20AA3],
[b"\x00", b"a", b"c", b"c", b"t", b"\x00", b"_", b"n"],
)
self.assertEqual(
mem[0x7FFFF7DF2953:0x7FFFF7DF2958], [b"f", b"\x0f", b"\x16", b"V", b"\x08"]
)
self.assertEqual(cpu.XMM2, 146708356959127564005328096862462043231)
self.assertEqual(cpu.RSI, 140737347979923)
self.assertEqual(cpu.RIP, 140737351985496)
def test_MOVHPD_15(self):
""" Instruction MOVHPD_15
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A23000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7A232EE, "nable_se")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
cpu.XMM2 = 0x36655F6362696C5F5F
cpu.RSI = 0x7FFFF7A232E6
cpu.RIP = 0x7FFFF7DF2953
cpu.execute()
self.assertEqual(
mem[0x7FFFF7A232EE:0x7FFFF7A232F6], [b"n", b"a", b"b", b"l", b"e", b"_", b"s", b"e"]
)
self.assertEqual(
mem[0x7FFFF7DF2953:0x7FFFF7DF2958], [b"f", b"\x0f", b"\x16", b"V", b"\x08"]
)
self.assertEqual(cpu.XMM2, 134851076577508085086976746042965122911)
self.assertEqual(cpu.RSI, 140737347990246)
self.assertEqual(cpu.RIP, 140737351985496)
def test_MOVHPD_16(self):
""" Instruction MOVHPD_16
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7A248D6, "2.5\x00GLIB")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
cpu.XMM2 = 0x42494C4700352E322E325F4342494C47
cpu.RSI = 0x7FFFF7A248CE
cpu.RIP = 0x7FFFF7DF2953
cpu.execute()
self.assertEqual(
mem[0x7FFFF7A248D6:0x7FFFF7A248DE], [b"2", b".", b"5", b"\x00", b"G", b"L", b"I", b"B"]
)
self.assertEqual(
mem[0x7FFFF7DF2953:0x7FFFF7DF2958], [b"f", b"\x0f", b"\x16", b"V", b"\x08"]
)
self.assertEqual(cpu.XMM2, 88109632480871197291218000195730623559)
self.assertEqual(cpu.RSI, 140737347995854)
self.assertEqual(cpu.RIP, 140737351985496)
def test_MOVHPD_17(self):
""" Instruction MOVHPD_17
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DD7000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DD7671, "_dso_for")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
cpu.XMM1 = 0x646E69665F6C645F
cpu.RDI = 0x7FFFF7DD7669
cpu.RIP = 0x7FFFF7DF294E
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DD7671:0x7FFFF7DD7679], [b"_", b"d", b"s", b"o", b"_", b"f", b"o", b"r"]
)
self.assertEqual(
mem[0x7FFFF7DF294E:0x7FFFF7DF2953], [b"f", b"\x0f", b"\x16", b"O", b"\x08"]
)
self.assertEqual(cpu.XMM1, 152110412837725123259047000460919333983)
self.assertEqual(cpu.RDI, 140737351874153)
self.assertEqual(cpu.RIP, 140737351985491)
def test_MOVHPD_18(self):
""" Instruction MOVHPD_18
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7A248D6, "2.5\x00GLIB")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
cpu.XMM2 = 0x42494C4700352E322E325F4342494C47
cpu.RSI = 0x7FFFF7A248CE
cpu.RIP = 0x7FFFF7DF2953
cpu.execute()
self.assertEqual(
mem[0x7FFFF7A248D6:0x7FFFF7A248DE], [b"2", b".", b"5", b"\x00", b"G", b"L", b"I", b"B"]
)
self.assertEqual(
mem[0x7FFFF7DF2953:0x7FFFF7DF2958], [b"f", b"\x0f", b"\x16", b"V", b"\x08"]
)
self.assertEqual(cpu.XMM2, 88109632480871197291218000195730623559)
self.assertEqual(cpu.RSI, 140737347995854)
self.assertEqual(cpu.RIP, 140737351985496)
def test_MOVHPD_19(self):
""" Instruction MOVHPD_19
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DD7000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DD7750, "obal_ro\x00")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
cpu.XMM1 = 0x6C675F646C74725F
cpu.RDI = 0x7FFFF7DD7748
cpu.RIP = 0x7FFFF7DF294E
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DD7750:0x7FFFF7DD7758], [b"o", b"b", b"a", b"l", b"_", b"r", b"o", b"\x00"]
)
self.assertEqual(
mem[0x7FFFF7DF294E:0x7FFFF7DF2953], [b"f", b"\x0f", b"\x16", b"O", b"\x08"]
)
self.assertEqual(cpu.XMM1, 578664706209732724830403288697696863)
self.assertEqual(cpu.RDI, 140737351874376)
self.assertEqual(cpu.RIP, 140737351985491)
def test_MOVHPD_2(self):
""" Instruction MOVHPD_2
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7A248D6, "2.5\x00GLIB")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
cpu.XMM1 = 0xFFFFFFFF00FFFFFF2E325F4342494C47
cpu.RDI = 0x7FFFF7A248CE
cpu.RIP = 0x7FFFF7DF294E
cpu.execute()
self.assertEqual(
mem[0x7FFFF7A248D6:0x7FFFF7A248DE], [b"2", b".", b"5", b"\x00", b"G", b"L", b"I", b"B"]
)
self.assertEqual(
mem[0x7FFFF7DF294E:0x7FFFF7DF2953], [b"f", b"\x0f", b"\x16", b"O", b"\x08"]
)
self.assertEqual(cpu.XMM1, 88109632480871197291218000195730623559)
self.assertEqual(cpu.RDI, 140737347995854)
self.assertEqual(cpu.RIP, 140737351985491)
def test_MOVHPD_20(self):
""" Instruction MOVHPD_20
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7A248B7, "-x86-64.")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
cpu.XMM1 = 0x78756E696C2D646C
cpu.RDI = 0x7FFFF7A248AF
cpu.RIP = 0x7FFFF7DF294E
cpu.execute()
self.assertEqual(
mem[0x7FFFF7A248B7:0x7FFFF7A248BF], [b"-", b"x", b"8", b"6", b"-", b"6", b"4", b"."]
)
self.assertEqual(
mem[0x7FFFF7DF294E:0x7FFFF7DF2953], [b"f", b"\x0f", b"\x16", b"O", b"\x08"]
)
self.assertEqual(cpu.XMM1, 61415586074916309421369241318231729260)
self.assertEqual(cpu.RDI, 140737347995823)
self.assertEqual(cpu.RIP, 140737351985491)
def test_MOVHPD_21(self):
""" Instruction MOVHPD_21
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7B99000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7B99A30, "6\x00__vdso")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
cpu.XMM2 = 0x64765F5F00656D692E325F58554E494C
cpu.RSI = 0x7FFFF7B99A28
cpu.RIP = 0x7FFFF7DF2953
cpu.execute()
self.assertEqual(
mem[0x7FFFF7B99A30:0x7FFFF7B99A38], [b"6", b"\x00", b"_", b"_", b"v", b"d", b"s", b"o"]
)
self.assertEqual(
mem[0x7FFFF7DF2953:0x7FFFF7DF2958], [b"f", b"\x0f", b"\x16", b"V", b"\x08"]
)
self.assertEqual(cpu.XMM2, 148143459290256633805182000720633547084)
self.assertEqual(cpu.RSI, 140737349524008)
self.assertEqual(cpu.RIP, 140737351985496)
def test_MOVHPD_3(self):
""" Instruction MOVHPD_3
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7A248D6, "2.5\x00GLIB")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
cpu.XMM1 = 0xFFFFFFFF00FFFFFF2E325F4342494C47
cpu.RDI = 0x7FFFF7A248CE
cpu.RIP = 0x7FFFF7DF294E
cpu.execute()
self.assertEqual(
mem[0x7FFFF7A248D6:0x7FFFF7A248DE], [b"2", b".", b"5", b"\x00", b"G", b"L", b"I", b"B"]
)
self.assertEqual(
mem[0x7FFFF7DF294E:0x7FFFF7DF2953], [b"f", b"\x0f", b"\x16", b"O", b"\x08"]
)
self.assertEqual(cpu.XMM1, 88109632480871197291218000195730623559)
self.assertEqual(cpu.RDI, 140737347995854)
self.assertEqual(cpu.RIP, 140737351985491)
def test_MOVHPD_4(self):
""" Instruction MOVHPD_4
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7A248D6, "2.5\x00GLIB")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
cpu.XMM2 = 0x42494C4700352E322E325F4342494C47
cpu.RSI = 0x7FFFF7A248CE
cpu.RIP = 0x7FFFF7DF2953
cpu.execute()
self.assertEqual(
mem[0x7FFFF7A248D6:0x7FFFF7A248DE], [b"2", b".", b"5", b"\x00", b"G", b"L", b"I", b"B"]
)
self.assertEqual(
mem[0x7FFFF7DF2953:0x7FFFF7DF2958], [b"f", b"\x0f", b"\x16", b"V", b"\x08"]
)
self.assertEqual(cpu.XMM2, 88109632480871197291218000195730623559)
self.assertEqual(cpu.RSI, 140737347995854)
self.assertEqual(cpu.RIP, 140737351985496)
def test_MOVHPD_5(self):
""" Instruction MOVHPD_5
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.mmap(0x7FFFF7FFA000, 0x1000, "rwx")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
mem.write(0x7FFFF7FFA30C, "6\x00\x00\x00\x00\x00\x02\x00")
cpu.XMM1 = 0xFFFFFFFF00FFFFFF2E325F58554E494C
cpu.RDI = 0x7FFFF7FFA304
cpu.RIP = 0x7FFFF7DF294E
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF294E:0x7FFFF7DF2953], [b"f", b"\x0f", b"\x16", b"O", b"\x08"]
)
self.assertEqual(
mem[0x7FFFF7FFA30C:0x7FFFF7FFA314],
[b"6", b"\x00", b"\x00", b"\x00", b"\x00", b"\x00", b"\x02", b"\x00"],
)
self.assertEqual(cpu.XMM1, 10384593717070654710068880547400012)
self.assertEqual(cpu.RDI, 140737354113796)
self.assertEqual(cpu.RIP, 140737351985491)
def test_MOVHPD_6(self):
""" Instruction MOVHPD_6
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7A248D6, "2.5\x00GLIB")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
cpu.XMM2 = 0x42494C4700352E322E325F4342494C47
cpu.RSI = 0x7FFFF7A248CE
cpu.RIP = 0x7FFFF7DF2953
cpu.execute()
self.assertEqual(
mem[0x7FFFF7A248D6:0x7FFFF7A248DE], [b"2", b".", b"5", b"\x00", b"G", b"L", b"I", b"B"]
)
self.assertEqual(
mem[0x7FFFF7DF2953:0x7FFFF7DF2958], [b"f", b"\x0f", b"\x16", b"V", b"\x08"]
)
self.assertEqual(cpu.XMM2, 88109632480871197291218000195730623559)
self.assertEqual(cpu.RSI, 140737347995854)
self.assertEqual(cpu.RIP, 140737351985496)
def test_MOVHPD_7(self):
""" Instruction MOVHPD_7
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7A248D6, "2.5\x00GLIB")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
cpu.XMM2 = 0x42494C4700352E322E325F4342494C47
cpu.RSI = 0x7FFFF7A248CE
cpu.RIP = 0x7FFFF7DF2953
cpu.execute()
self.assertEqual(
mem[0x7FFFF7A248D6:0x7FFFF7A248DE], [b"2", b".", b"5", b"\x00", b"G", b"L", b"I", b"B"]
)
self.assertEqual(
mem[0x7FFFF7DF2953:0x7FFFF7DF2958], [b"f", b"\x0f", b"\x16", b"V", b"\x08"]
)
self.assertEqual(cpu.XMM2, 88109632480871197291218000195730623559)
self.assertEqual(cpu.RSI, 140737347995854)
self.assertEqual(cpu.RIP, 140737351985496)
def test_MOVHPD_8(self):
""" Instruction MOVHPD_8
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.mmap(0x7FFFF7FF7000, 0x1000, "rwx")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
mem.write(0x7FFFF7FF74A8, "_64-linu")
cpu.XMM2 = 0x3638782F62696C2F
cpu.RSI = 0x7FFFF7FF74A0
cpu.RIP = 0x7FFFF7DF2953
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF2953:0x7FFFF7DF2958], [b"f", b"\x0f", b"\x16", b"V", b"\x08"]
)
self.assertEqual(
mem[0x7FFFF7FF74A8:0x7FFFF7FF74B0], [b"_", b"6", b"4", b"-", b"l", b"i", b"n", b"u"]
)
self.assertEqual(cpu.XMM2, 156092966384913869483545010807748783151)
self.assertEqual(cpu.RSI, 140737354101920)
self.assertEqual(cpu.RIP, 140737351985496)
def test_MOVHPD_9(self):
""" Instruction MOVHPD_9
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A21000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7A21315, "emalign\x00")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
cpu.XMM1 = 0xFFFFFFFF00FFFFFF6D5F6362696C5F5F
cpu.RDI = 0x7FFFF7A2130D
cpu.RIP = 0x7FFFF7DF294E
cpu.execute()
self.assertEqual(
mem[0x7FFFF7A21315:0x7FFFF7A2131D], [b"e", b"m", b"a", b"l", b"i", b"g", b"n", b"\x00"]
)
self.assertEqual(
mem[0x7FFFF7DF294E:0x7FFFF7DF2953], [b"f", b"\x0f", b"\x16", b"O", b"\x08"]
)
self.assertEqual(cpu.XMM1, 573250095127234633104266320675626847)
self.assertEqual(cpu.RDI, 140737347982093)
self.assertEqual(cpu.RIP, 140737351985491)
def test_PSLLDQ_1(self):
""" Instruction PSLLDQ_1
Groups: sse2
0x7ffff7df3470: pslldq xmm2, 7
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
cpu.XMM2 = 0x1
cpu.RIP = 0x7FFFF7DF3470
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF3470:0x7FFFF7DF3475], [b"f", b"\x0f", b"s", b"\xfa", b"\x07"]
)
self.assertEqual(cpu.XMM2, 72057594037927936)
self.assertEqual(cpu.RIP, 140737351988341)
def test_PSLLDQ_10(self):
""" Instruction PSLLDQ_10
Groups: sse2
0x7ffff7df3470: pslldq xmm2, 7
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
cpu.XMM2 = 0x6972705F5F00362E6F732E6362696C00
cpu.RIP = 0x7FFFF7DF3470
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF3470:0x7FFFF7DF3475], [b"f", b"\x0f", b"s", b"\xfa", b"\x07"]
)
self.assertEqual(cpu.XMM2, 61723168909761380161964749838612430848)
self.assertEqual(cpu.RIP, 140737351988341)
def test_PSLLDQ_11(self):
""" Instruction PSLLDQ_11
Groups: sse2
0x7ffff7df3470: pslldq xmm2, 7
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
cpu.XMM2 = 0x6972705F5F00362E6F732E6362696C00
cpu.RIP = 0x7FFFF7DF3470
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF3470:0x7FFFF7DF3475], [b"f", b"\x0f", b"s", b"\xfa", b"\x07"]
)
self.assertEqual(cpu.XMM2, 61723168909761380161964749838612430848)
self.assertEqual(cpu.RIP, 140737351988341)
def test_PSLLDQ_12(self):
""" Instruction PSLLDQ_12
Groups: sse2
0x7ffff7df3470: pslldq xmm2, 7
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
cpu.XMM2 = 0x6972705F5F00362E6F732E6362696C00
cpu.RIP = 0x7FFFF7DF3470
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF3470:0x7FFFF7DF3475], [b"f", b"\x0f", b"s", b"\xfa", b"\x07"]
)
self.assertEqual(cpu.XMM2, 61723168909761380161964749838612430848)
self.assertEqual(cpu.RIP, 140737351988341)
def test_PSLLDQ_13(self):
""" Instruction PSLLDQ_13
Groups: sse2
0x7ffff7df3470: pslldq xmm2, 7
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
cpu.XMM2 = 0x1
cpu.RIP = 0x7FFFF7DF3470
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF3470:0x7FFFF7DF3475], [b"f", b"\x0f", b"s", b"\xfa", b"\x07"]
)
self.assertEqual(cpu.XMM2, 72057594037927936)
self.assertEqual(cpu.RIP, 140737351988341)
def test_PSLLDQ_14(self):
""" Instruction PSLLDQ_14
Groups: sse2
0x7ffff7df3470: pslldq xmm2, 7
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
cpu.XMM2 = 0x6972705F5F00362E6F732E6362696C00
cpu.RIP = 0x7FFFF7DF3470
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF3470:0x7FFFF7DF3475], [b"f", b"\x0f", b"s", b"\xfa", b"\x07"]
)
self.assertEqual(cpu.XMM2, 61723168909761380161964749838612430848)
self.assertEqual(cpu.RIP, 140737351988341)
def test_PSLLDQ_15(self):
""" Instruction PSLLDQ_15
Groups: sse2
0x7ffff7df389d: pslldq xmm2, 4
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF389D, "f\x0fs\xfa\x04")
cpu.XMM2 = 0x3000000020002000000352E322E32
cpu.RIP = 0x7FFFF7DF389D
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF389D:0x7FFFF7DF38A2], [b"f", b"\x0f", b"s", b"\xfa", b"\x04"]
)
self.assertEqual(cpu.XMM2, 10384752173395664791945953216036864)
self.assertEqual(cpu.RIP, 140737351989410)
def test_PSLLDQ_16(self):
""" Instruction PSLLDQ_16
Groups: sse2
0x7ffff7df3470: pslldq xmm2, 7
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
cpu.XMM2 = 0x6972705F5F00362E6F732E6362696C00
cpu.RIP = 0x7FFFF7DF3470
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF3470:0x7FFFF7DF3475], [b"f", b"\x0f", b"s", b"\xfa", b"\x07"]
)
self.assertEqual(cpu.XMM2, 61723168909761380161964749838612430848)
self.assertEqual(cpu.RIP, 140737351988341)
def test_PSLLDQ_17(self):
""" Instruction PSLLDQ_17
Groups: sse2
0x7ffff7df39dd: pslldq xmm2, 3
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF39DD, "f\x0fs\xfa\x03")
cpu.XMM2 = 0x494C4700352E322E325F4342494C4700
cpu.RIP = 0x7FFFF7DF39DD
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF39DD:0x7FFFF7DF39E2], [b"f", b"\x0f", b"s", b"\xfa", b"\x03"]
)
self.assertEqual(cpu.XMM2, 276128700049446162655260478745346048)
self.assertEqual(cpu.RIP, 140737351989730)
def test_PSLLDQ_18(self):
""" Instruction PSLLDQ_18
Groups: sse2
0x7ffff7df389d: pslldq xmm2, 4
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF389D, "f\x0fs\xfa\x04")
cpu.XMM2 = 0x665F4F495F006F6C6C657466006B6863
cpu.RIP = 0x7FFFF7DF389D
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF389D:0x7FFFF7DF38A2], [b"f", b"\x0f", b"s", b"\xfa", b"\x04"]
)
self.assertEqual(cpu.XMM2, 126278919537221597046423674937331941376)
self.assertEqual(cpu.RIP, 140737351989410)
def test_PSLLDQ_19(self):
""" Instruction PSLLDQ_19
Groups: sse2
0x7ffff7df3470: pslldq xmm2, 7
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
cpu.XMM2 = 0x1
cpu.RIP = 0x7FFFF7DF3470
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF3470:0x7FFFF7DF3475], [b"f", b"\x0f", b"s", b"\xfa", b"\x07"]
)
self.assertEqual(cpu.XMM2, 72057594037927936)
self.assertEqual(cpu.RIP, 140737351988341)
def test_PSLLDQ_2(self):
""" Instruction PSLLDQ_2
Groups: sse2
0x7ffff7df2f70: pslldq xmm2, 0xb
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF2F70, "f\x0fs\xfa\x0b")
cpu.XMM2 = 0x6972705F5F00362E6F732E6362696C00
cpu.RIP = 0x7FFFF7DF2F70
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF2F70:0x7FFFF7DF2F75], [b"f", b"\x0f", b"s", b"\xfa", b"\x0b"]
)
self.assertEqual(cpu.XMM2, 132104554884493019491015862172149350400)
self.assertEqual(cpu.RIP, 140737351987061)
def test_PSLLDQ_20(self):
""" Instruction PSLLDQ_20
Groups: sse2
0x7ffff7df3970: pslldq xmm2, 3
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3970, "f\x0fs\xfa\x03")
cpu.XMM2 = 0x322E6F732E34362D3638782D78756E69
cpu.RIP = 0x7FFFF7DF3970
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF3970:0x7FFFF7DF3975], [b"f", b"\x0f", b"s", b"\xfa", b"\x03"]
)
self.assertEqual(cpu.XMM2, 153101124148370467217615035531131879424)
self.assertEqual(cpu.RIP, 140737351989621)
def test_PSLLDQ_21(self):
""" Instruction PSLLDQ_21
Groups: sse2
0x7ffff7df3830: pslldq xmm2, 4
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3830, "f\x0fs\xfa\x04")
cpu.XMM2 = 0x5F4342494C4700342E332E325F434249
cpu.RIP = 0x7FFFF7DF3830
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF3830:0x7FFFF7DF3835], [b"f", b"\x0f", b"s", b"\xfa", b"\x04"]
)
self.assertEqual(cpu.XMM2, 101389984890772213670702594761716400128)
self.assertEqual(cpu.RIP, 140737351989301)
def test_PSLLDQ_3(self):
""" Instruction PSLLDQ_3
Groups: sse2
0x7ffff7df3ab0: pslldq xmm2, 2
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3AB0, "f\x0fs\xfa\x02")
cpu.XMM2 = 0x63007463656A626F5F726F665F6F7364
cpu.RIP = 0x7FFFF7DF3AB0
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF3AB0:0x7FFFF7DF3AB5], [b"f", b"\x0f", b"s", b"\xfa", b"\x02"]
)
self.assertEqual(cpu.XMM2, 154706541852064556987039687627872927744)
self.assertEqual(cpu.RIP, 140737351989941)
def test_PSLLDQ_4(self):
""" Instruction PSLLDQ_4
Groups: sse2
0x7ffff7df3470: pslldq xmm2, 7
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
cpu.XMM2 = 0x6972705F5F00362E6F732E6362696C00
cpu.RIP = 0x7FFFF7DF3470
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF3470:0x7FFFF7DF3475], [b"f", b"\x0f", b"s", b"\xfa", b"\x07"]
)
self.assertEqual(cpu.XMM2, 61723168909761380161964749838612430848)
self.assertEqual(cpu.RIP, 140737351988341)
def test_PSLLDQ_5(self):
""" Instruction PSLLDQ_5
Groups: sse2
0x7ffff7df3470: pslldq xmm2, 7
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
cpu.XMM2 = 0x6972705F5F00362E6F732E6362696C00
cpu.RIP = 0x7FFFF7DF3470
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF3470:0x7FFFF7DF3475], [b"f", b"\x0f", b"s", b"\xfa", b"\x07"]
)
self.assertEqual(cpu.XMM2, 61723168909761380161964749838612430848)
self.assertEqual(cpu.RIP, 140737351988341)
def test_PSLLDQ_6(self):
""" Instruction PSLLDQ_6
Groups: sse2
0x7ffff7df389d: pslldq xmm2, 4
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF389D, "f\x0fs\xfa\x04")
cpu.XMM2 = 0x3000000020002000000352E322E32
cpu.RIP = 0x7FFFF7DF389D
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF389D:0x7FFFF7DF38A2], [b"f", b"\x0f", b"s", b"\xfa", b"\x04"]
)
self.assertEqual(cpu.XMM2, 10384752173395664791945953216036864)
self.assertEqual(cpu.RIP, 140737351989410)
def test_PSLLDQ_7(self):
""" Instruction PSLLDQ_7
Groups: sse2
0x7ffff7df3470: pslldq xmm2, 7
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
cpu.XMM2 = 0x1
cpu.RIP = 0x7FFFF7DF3470
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF3470:0x7FFFF7DF3475], [b"f", b"\x0f", b"s", b"\xfa", b"\x07"]
)
self.assertEqual(cpu.XMM2, 72057594037927936)
self.assertEqual(cpu.RIP, 140737351988341)
def test_PSLLDQ_8(self):
""" Instruction PSLLDQ_8
Groups: sse2
0x7ffff7df39dd: pslldq xmm2, 3
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF39DD, "f\x0fs\xfa\x03")
cpu.XMM2 = 0x7461636F6C6C6165645F6C645F00636F
cpu.RIP = 0x7FFFF7DF39DD
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF39DD:0x7FFFF7DF39E2], [b"f", b"\x0f", b"s", b"\xfa", b"\x03"]
)
self.assertEqual(cpu.XMM2, 148107273809595710738464457560820809728)
self.assertEqual(cpu.RIP, 140737351989730)
def test_PSLLDQ_9(self):
""" Instruction PSLLDQ_9
Groups: sse2
0x7ffff7df3c5d: pslldq xmm2, 1
"""
mem = Memory64()
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3C5D, "f\x0fs\xfa\x01")
cpu.XMM2 = 0x68252E7568254D00796164666F656D69
cpu.RIP = 0x7FFFF7DF3C5D
cpu.execute()
self.assertEqual(
mem[0x7FFFF7DF3C5D:0x7FFFF7DF3C62], [b"f", b"\x0f", b"s", b"\xfa", b"\x01"]
)
self.assertEqual(cpu.XMM2, 49422662792731052987857949274592340224)
self.assertEqual(cpu.RIP, 140737351990370)
def test_MOVHPD_1_symbolic(self):
""" Instruction MOVHPD_1
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF294E, "f\x0f\x16")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A249D1)
value = cs.new_bitvec(8)
cs.add(value == 0x49)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A249D2)
value = cs.new_bitvec(8)
cs.add(value == 0x56)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A249D3)
value = cs.new_bitvec(8)
cs.add(value == 0x41)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A249D4)
value = cs.new_bitvec(8)
cs.add(value == 0x54)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A249D5)
value = cs.new_bitvec(8)
cs.add(value == 0x45)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A249D6)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A249D7)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A249D8)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
mem.write(0x7FFFF7DF2951, "O\x08")
cpu.XMM1 = cs.new_bitvec(128)
cs.add(cpu.XMM1 == 0xFFFFFFFFFFFF00FF52505F4342494C47)
cpu.RDI = cs.new_bitvec(64)
cs.add(cpu.RDI == 0x7FFFF7A249C9)
cpu.RIP = 0x7FFFF7DF294E
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294E, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294F, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2950, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2951, 8) == ord("O"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2952, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A249D3, 8) == ord("A"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A249D4, 8) == ord("T"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A249D5, 8) == ord("E"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A249D6, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A249D7, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A249D8, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A249D1, 8) == ord("I"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A249D2, 8) == ord("V"))
condition = Operators.AND(condition, cpu.XMM1 == 0x455441564952505F4342494C47)
condition = Operators.AND(condition, cpu.RDI == 0x7FFFF7A249C9)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2953)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_10_symbolic(self):
""" Instruction MOVHPD_10
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D6)
value = cs.new_bitvec(8)
cs.add(value == 0x32)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D7)
value = cs.new_bitvec(8)
cs.add(value == 0x2E)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D8)
value = cs.new_bitvec(8)
cs.add(value == 0x35)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D9)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DA)
value = cs.new_bitvec(8)
cs.add(value == 0x47)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DB)
value = cs.new_bitvec(8)
cs.add(value == 0x4C)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DC)
value = cs.new_bitvec(8)
cs.add(value == 0x49)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DD)
value = cs.new_bitvec(8)
cs.add(value == 0x42)
mem[addr] = value
cpu.XMM1 = cs.new_bitvec(128)
cs.add(cpu.XMM1 == 0xFFFFFFFF00FFFFFF2E325F4342494C47)
cpu.RDI = cs.new_bitvec(64)
cs.add(cpu.RDI == 0x7FFFF7A248CE)
cpu.RIP = 0x7FFFF7DF294E
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294E, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294F, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2950, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2951, 8) == ord("O"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2952, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D6, 8) == ord("2"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D7, 8) == ord("."))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D8, 8) == ord("5"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D9, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DA, 8) == ord("G"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DB, 8) == ord("L"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DC, 8) == ord("I"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DD, 8) == ord("B"))
condition = Operators.AND(condition, cpu.XMM1 == 0x42494C4700352E322E325F4342494C47)
condition = Operators.AND(condition, cpu.RDI == 0x7FFFF7A248CE)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2953)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_11_symbolic(self):
""" Instruction MOVHPD_11
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D6)
value = cs.new_bitvec(8)
cs.add(value == 0x32)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D7)
value = cs.new_bitvec(8)
cs.add(value == 0x2E)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D8)
value = cs.new_bitvec(8)
cs.add(value == 0x35)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D9)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DA)
value = cs.new_bitvec(8)
cs.add(value == 0x47)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DB)
value = cs.new_bitvec(8)
cs.add(value == 0x4C)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DC)
value = cs.new_bitvec(8)
cs.add(value == 0x49)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DD)
value = cs.new_bitvec(8)
cs.add(value == 0x42)
mem[addr] = value
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x42494C4700352E322E325F4342494C47)
cpu.RSI = cs.new_bitvec(64)
cs.add(cpu.RSI == 0x7FFFF7A248CE)
cpu.RIP = 0x7FFFF7DF2953
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2956, 8) == ord("V"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D7, 8) == ord("."))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2953, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2954, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2955, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D6, 8) == ord("2"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2957, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D8, 8) == ord("5"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D9, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DA, 8) == ord("G"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DB, 8) == ord("L"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DC, 8) == ord("I"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DD, 8) == ord("B"))
condition = Operators.AND(condition, cpu.XMM2 == 0x42494C4700352E322E325F4342494C47)
condition = Operators.AND(condition, cpu.RSI == 0x7FFFF7A248CE)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2958)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_12_symbolic(self):
""" Instruction MOVHPD_12
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D6)
value = cs.new_bitvec(8)
cs.add(value == 0x32)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D7)
value = cs.new_bitvec(8)
cs.add(value == 0x2E)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D8)
value = cs.new_bitvec(8)
cs.add(value == 0x35)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D9)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DA)
value = cs.new_bitvec(8)
cs.add(value == 0x47)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DB)
value = cs.new_bitvec(8)
cs.add(value == 0x4C)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DC)
value = cs.new_bitvec(8)
cs.add(value == 0x49)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DD)
value = cs.new_bitvec(8)
cs.add(value == 0x42)
mem[addr] = value
cpu.XMM1 = cs.new_bitvec(128)
cs.add(cpu.XMM1 == 0xFFFFFFFF00FFFFFF2E325F4342494C47)
cpu.RDI = cs.new_bitvec(64)
cs.add(cpu.RDI == 0x7FFFF7A248CE)
cpu.RIP = 0x7FFFF7DF294E
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294E, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294F, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2950, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2951, 8) == ord("O"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2952, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D6, 8) == ord("2"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D7, 8) == ord("."))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D8, 8) == ord("5"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D9, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DA, 8) == ord("G"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DB, 8) == ord("L"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DC, 8) == ord("I"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DD, 8) == ord("B"))
condition = Operators.AND(condition, cpu.XMM1 == 0x42494C4700352E322E325F4342494C47)
condition = Operators.AND(condition, cpu.RDI == 0x7FFFF7A248CE)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2953)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_13_symbolic(self):
""" Instruction MOVHPD_13
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A21000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A218DA)
value = cs.new_bitvec(8)
cs.add(value == 0x74)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A218DB)
value = cs.new_bitvec(8)
cs.add(value == 0x61)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A218DC)
value = cs.new_bitvec(8)
cs.add(value == 0x72)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A218DD)
value = cs.new_bitvec(8)
cs.add(value == 0x74)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A218DE)
value = cs.new_bitvec(8)
cs.add(value == 0x5F)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A218DF)
value = cs.new_bitvec(8)
cs.add(value == 0x6D)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A218E0)
value = cs.new_bitvec(8)
cs.add(value == 0x61)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A218E1)
value = cs.new_bitvec(8)
cs.add(value == 0x69)
mem[addr] = value
cpu.XMM1 = cs.new_bitvec(128)
cs.add(cpu.XMM1 == 0x735F6362696C5F5F)
cpu.RDI = cs.new_bitvec(64)
cs.add(cpu.RDI == 0x7FFFF7A218D2)
cpu.RIP = 0x7FFFF7DF294E
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294E, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294F, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2950, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2951, 8) == ord("O"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2952, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A218DA, 8) == ord("t"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A218DB, 8) == ord("a"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A218DC, 8) == ord("r"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A218DD, 8) == ord("t"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A218DE, 8) == ord("_"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A218DF, 8) == ord("m"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A218E0, 8) == ord("a"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A218E1, 8) == ord("i"))
condition = Operators.AND(condition, cpu.XMM1 == 0x69616D5F74726174735F6362696C5F5F)
condition = Operators.AND(condition, cpu.RDI == 0x7FFFF7A218D2)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2953)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_14_symbolic(self):
""" Instruction MOVHPD_14
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A20000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A20A9B)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A20A9C)
value = cs.new_bitvec(8)
cs.add(value == 0x61)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A20A9D)
value = cs.new_bitvec(8)
cs.add(value == 0x63)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A20A9E)
value = cs.new_bitvec(8)
cs.add(value == 0x63)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A20A9F)
value = cs.new_bitvec(8)
cs.add(value == 0x74)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A20AA0)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A20AA1)
value = cs.new_bitvec(8)
cs.add(value == 0x5F)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A20AA2)
value = cs.new_bitvec(8)
cs.add(value == 0x6E)
mem[addr] = value
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x36766772615F6C645F)
cpu.RSI = cs.new_bitvec(64)
cs.add(cpu.RSI == 0x7FFFF7A20A93)
cpu.RIP = 0x7FFFF7DF2953
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2953, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2954, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2955, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2956, 8) == ord("V"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2957, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A20A9B, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A20A9C, 8) == ord("a"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A20A9D, 8) == ord("c"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A20A9E, 8) == ord("c"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A20A9F, 8) == ord("t"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A20AA0, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A20AA1, 8) == ord("_"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A20AA2, 8) == ord("n"))
condition = Operators.AND(condition, cpu.XMM2 == 0x6E5F007463636100766772615F6C645F)
condition = Operators.AND(condition, cpu.RSI == 0x7FFFF7A20A93)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2958)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_15_symbolic(self):
""" Instruction MOVHPD_15
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A23000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A232EE)
value = cs.new_bitvec(8)
cs.add(value == 0x6E)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A232EF)
value = cs.new_bitvec(8)
cs.add(value == 0x61)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A232F0)
value = cs.new_bitvec(8)
cs.add(value == 0x62)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A232F1)
value = cs.new_bitvec(8)
cs.add(value == 0x6C)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A232F2)
value = cs.new_bitvec(8)
cs.add(value == 0x65)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A232F3)
value = cs.new_bitvec(8)
cs.add(value == 0x5F)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A232F4)
value = cs.new_bitvec(8)
cs.add(value == 0x73)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A232F5)
value = cs.new_bitvec(8)
cs.add(value == 0x65)
mem[addr] = value
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x36655F6362696C5F5F)
cpu.RSI = cs.new_bitvec(64)
cs.add(cpu.RSI == 0x7FFFF7A232E6)
cpu.RIP = 0x7FFFF7DF2953
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2953, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2954, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2955, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2956, 8) == ord("V"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2957, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A232EE, 8) == ord("n"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A232EF, 8) == ord("a"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A232F0, 8) == ord("b"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A232F1, 8) == ord("l"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A232F2, 8) == ord("e"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A232F3, 8) == ord("_"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A232F4, 8) == ord("s"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A232F5, 8) == ord("e"))
condition = Operators.AND(condition, cpu.XMM2 == 0x65735F656C62616E655F6362696C5F5F)
condition = Operators.AND(condition, cpu.RSI == 0x7FFFF7A232E6)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2958)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_16_symbolic(self):
""" Instruction MOVHPD_16
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D6)
value = cs.new_bitvec(8)
cs.add(value == 0x32)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D7)
value = cs.new_bitvec(8)
cs.add(value == 0x2E)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D8)
value = cs.new_bitvec(8)
cs.add(value == 0x35)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D9)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DA)
value = cs.new_bitvec(8)
cs.add(value == 0x47)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DB)
value = cs.new_bitvec(8)
cs.add(value == 0x4C)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DC)
value = cs.new_bitvec(8)
cs.add(value == 0x49)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DD)
value = cs.new_bitvec(8)
cs.add(value == 0x42)
mem[addr] = value
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x42494C4700352E322E325F4342494C47)
cpu.RSI = cs.new_bitvec(64)
cs.add(cpu.RSI == 0x7FFFF7A248CE)
cpu.RIP = 0x7FFFF7DF2953
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2956, 8) == ord("V"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D7, 8) == ord("."))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2953, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2954, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2955, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D6, 8) == ord("2"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2957, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D8, 8) == ord("5"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D9, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DA, 8) == ord("G"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DB, 8) == ord("L"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DC, 8) == ord("I"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DD, 8) == ord("B"))
condition = Operators.AND(condition, cpu.XMM2 == 0x42494C4700352E322E325F4342494C47)
condition = Operators.AND(condition, cpu.RSI == 0x7FFFF7A248CE)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2958)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_17_symbolic(self):
""" Instruction MOVHPD_17
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DD7000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7DD7671)
value = cs.new_bitvec(8)
cs.add(value == 0x5F)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7DD7672)
value = cs.new_bitvec(8)
cs.add(value == 0x64)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7DD7673)
value = cs.new_bitvec(8)
cs.add(value == 0x73)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7DD7674)
value = cs.new_bitvec(8)
cs.add(value == 0x6F)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7DD7675)
value = cs.new_bitvec(8)
cs.add(value == 0x5F)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7DD7676)
value = cs.new_bitvec(8)
cs.add(value == 0x66)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7DD7677)
value = cs.new_bitvec(8)
cs.add(value == 0x6F)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7DD7678)
value = cs.new_bitvec(8)
cs.add(value == 0x72)
mem[addr] = value
cpu.XMM1 = cs.new_bitvec(128)
cs.add(cpu.XMM1 == 0x646E69665F6C645F)
cpu.RDI = cs.new_bitvec(64)
cs.add(cpu.RDI == 0x7FFFF7DD7669)
cpu.RIP = 0x7FFFF7DF294E
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294E, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294F, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2950, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2951, 8) == ord("O"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2952, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DD7671, 8) == ord("_"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DD7672, 8) == ord("d"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DD7673, 8) == ord("s"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DD7674, 8) == ord("o"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DD7675, 8) == ord("_"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DD7676, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DD7677, 8) == ord("o"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DD7678, 8) == ord("r"))
condition = Operators.AND(condition, cpu.XMM1 == 0x726F665F6F73645F646E69665F6C645F)
condition = Operators.AND(condition, cpu.RDI == 0x7FFFF7DD7669)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2953)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_18_symbolic(self):
""" Instruction MOVHPD_18
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D6)
value = cs.new_bitvec(8)
cs.add(value == 0x32)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D7)
value = cs.new_bitvec(8)
cs.add(value == 0x2E)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D8)
value = cs.new_bitvec(8)
cs.add(value == 0x35)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D9)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DA)
value = cs.new_bitvec(8)
cs.add(value == 0x47)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DB)
value = cs.new_bitvec(8)
cs.add(value == 0x4C)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DC)
value = cs.new_bitvec(8)
cs.add(value == 0x49)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DD)
value = cs.new_bitvec(8)
cs.add(value == 0x42)
mem[addr] = value
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x42494C4700352E322E325F4342494C47)
cpu.RSI = cs.new_bitvec(64)
cs.add(cpu.RSI == 0x7FFFF7A248CE)
cpu.RIP = 0x7FFFF7DF2953
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2956, 8) == ord("V"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D7, 8) == ord("."))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2953, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2954, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2955, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D6, 8) == ord("2"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2957, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D8, 8) == ord("5"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D9, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DA, 8) == ord("G"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DB, 8) == ord("L"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DC, 8) == ord("I"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DD, 8) == ord("B"))
condition = Operators.AND(condition, cpu.XMM2 == 0x42494C4700352E322E325F4342494C47)
condition = Operators.AND(condition, cpu.RSI == 0x7FFFF7A248CE)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2958)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_19_symbolic(self):
""" Instruction MOVHPD_19
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DD7000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF294E, "f\x0f")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7DD7750)
value = cs.new_bitvec(8)
cs.add(value == 0x6F)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7DD7751)
value = cs.new_bitvec(8)
cs.add(value == 0x62)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7DD7752)
value = cs.new_bitvec(8)
cs.add(value == 0x61)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7DD7753)
value = cs.new_bitvec(8)
cs.add(value == 0x6C)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7DD7754)
value = cs.new_bitvec(8)
cs.add(value == 0x5F)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7DD7755)
value = cs.new_bitvec(8)
cs.add(value == 0x72)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7DD7756)
value = cs.new_bitvec(8)
cs.add(value == 0x6F)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7DD7757)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
mem.write(0x7FFFF7DF2950, "\x16O\x08")
cpu.XMM1 = cs.new_bitvec(128)
cs.add(cpu.XMM1 == 0x6C675F646C74725F)
cpu.RDI = cs.new_bitvec(64)
cs.add(cpu.RDI == 0x7FFFF7DD7748)
cpu.RIP = 0x7FFFF7DF294E
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294E, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294F, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2950, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2951, 8) == ord("O"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2952, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DD7753, 8) == ord("l"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DD7754, 8) == ord("_"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DD7755, 8) == ord("r"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DD7756, 8) == ord("o"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DD7757, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DD7750, 8) == ord("o"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DD7751, 8) == ord("b"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DD7752, 8) == ord("a"))
condition = Operators.AND(condition, cpu.XMM1 == 0x6F725F6C61626F6C675F646C74725F)
condition = Operators.AND(condition, cpu.RDI == 0x7FFFF7DD7748)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2953)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
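# --- Illustration (hypothetical helper, not part of the generated test
# harness): the eight single-byte constraints in each test encode one
# little-endian qword at the effective address [reg + 8]; a sketch of how
# those bytes fold into the 64-bit value that MOVHPD loads:
def _qword_le(byte_values):
    """Assemble a little-endian 64-bit integer from 8 byte values."""
    assert len(byte_values) == 8
    value = 0
    for i, b in enumerate(byte_values):
        # Byte i contributes to bits [8*i, 8*i + 8).
        value |= (b & 0xFF) << (8 * i)
    return value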
def test_MOVHPD_2_symbolic(self):
""" Instruction MOVHPD_2
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D6)
value = cs.new_bitvec(8)
cs.add(value == 0x32)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D7)
value = cs.new_bitvec(8)
cs.add(value == 0x2E)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D8)
value = cs.new_bitvec(8)
cs.add(value == 0x35)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D9)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DA)
value = cs.new_bitvec(8)
cs.add(value == 0x47)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DB)
value = cs.new_bitvec(8)
cs.add(value == 0x4C)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DC)
value = cs.new_bitvec(8)
cs.add(value == 0x49)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DD)
value = cs.new_bitvec(8)
cs.add(value == 0x42)
mem[addr] = value
cpu.XMM1 = cs.new_bitvec(128)
cs.add(cpu.XMM1 == 0xFFFFFFFF00FFFFFF2E325F4342494C47)
cpu.RDI = cs.new_bitvec(64)
cs.add(cpu.RDI == 0x7FFFF7A248CE)
cpu.RIP = 0x7FFFF7DF294E
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294E, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294F, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2950, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2951, 8) == ord("O"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2952, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D6, 8) == ord("2"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D7, 8) == ord("."))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D8, 8) == ord("5"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D9, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DA, 8) == ord("G"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DB, 8) == ord("L"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DC, 8) == ord("I"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DD, 8) == ord("B"))
condition = Operators.AND(condition, cpu.XMM1 == 0x42494C4700352E322E325F4342494C47)
condition = Operators.AND(condition, cpu.RDI == 0x7FFFF7A248CE)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2953)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_20_symbolic(self):
""" Instruction MOVHPD_20
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248B7)
value = cs.new_bitvec(8)
cs.add(value == 0x2D)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248B8)
value = cs.new_bitvec(8)
cs.add(value == 0x78)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248B9)
value = cs.new_bitvec(8)
cs.add(value == 0x38)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248BA)
value = cs.new_bitvec(8)
cs.add(value == 0x36)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248BB)
value = cs.new_bitvec(8)
cs.add(value == 0x2D)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248BC)
value = cs.new_bitvec(8)
cs.add(value == 0x36)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248BD)
value = cs.new_bitvec(8)
cs.add(value == 0x34)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248BE)
value = cs.new_bitvec(8)
cs.add(value == 0x2E)
mem[addr] = value
cpu.XMM1 = cs.new_bitvec(128)
cs.add(cpu.XMM1 == 0x78756E696C2D646C)
cpu.RDI = cs.new_bitvec(64)
cs.add(cpu.RDI == 0x7FFFF7A248AF)
cpu.RIP = 0x7FFFF7DF294E
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294E, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294F, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2950, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2951, 8) == ord("O"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2952, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248B7, 8) == ord("-"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248B8, 8) == ord("x"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248B9, 8) == ord("8"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248BA, 8) == ord("6"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248BB, 8) == ord("-"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248BC, 8) == ord("6"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248BD, 8) == ord("4"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248BE, 8) == ord("."))
condition = Operators.AND(condition, cpu.XMM1 == 0x2E34362D3638782D78756E696C2D646C)
condition = Operators.AND(condition, cpu.RDI == 0x7FFFF7A248AF)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2953)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_21_symbolic(self):
""" Instruction MOVHPD_21
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7B99000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7B99A30)
value = cs.new_bitvec(8)
cs.add(value == 0x36)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7B99A31)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7B99A32)
value = cs.new_bitvec(8)
cs.add(value == 0x5F)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7B99A33)
value = cs.new_bitvec(8)
cs.add(value == 0x5F)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7B99A34)
value = cs.new_bitvec(8)
cs.add(value == 0x76)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7B99A35)
value = cs.new_bitvec(8)
cs.add(value == 0x64)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7B99A36)
value = cs.new_bitvec(8)
cs.add(value == 0x73)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7B99A37)
value = cs.new_bitvec(8)
cs.add(value == 0x6F)
mem[addr] = value
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x64765F5F00656D692E325F58554E494C)
cpu.RSI = cs.new_bitvec(64)
cs.add(cpu.RSI == 0x7FFFF7B99A28)
cpu.RIP = 0x7FFFF7DF2953
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2953, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2954, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2955, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2956, 8) == ord("V"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2957, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7B99A30, 8) == ord("6"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7B99A31, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7B99A32, 8) == ord("_"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7B99A33, 8) == ord("_"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7B99A34, 8) == ord("v"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7B99A35, 8) == ord("d"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7B99A36, 8) == ord("s"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7B99A37, 8) == ord("o"))
condition = Operators.AND(condition, cpu.XMM2 == 0x6F7364765F5F00362E325F58554E494C)
condition = Operators.AND(condition, cpu.RSI == 0x7FFFF7B99A28)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2958)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_3_symbolic(self):
""" Instruction MOVHPD_3
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D6)
value = cs.new_bitvec(8)
cs.add(value == 0x32)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D7)
value = cs.new_bitvec(8)
cs.add(value == 0x2E)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D8)
value = cs.new_bitvec(8)
cs.add(value == 0x35)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D9)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DA)
value = cs.new_bitvec(8)
cs.add(value == 0x47)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DB)
value = cs.new_bitvec(8)
cs.add(value == 0x4C)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DC)
value = cs.new_bitvec(8)
cs.add(value == 0x49)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DD)
value = cs.new_bitvec(8)
cs.add(value == 0x42)
mem[addr] = value
cpu.XMM1 = cs.new_bitvec(128)
cs.add(cpu.XMM1 == 0xFFFFFFFF00FFFFFF2E325F4342494C47)
cpu.RDI = cs.new_bitvec(64)
cs.add(cpu.RDI == 0x7FFFF7A248CE)
cpu.RIP = 0x7FFFF7DF294E
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294E, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294F, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2950, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2951, 8) == ord("O"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2952, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D6, 8) == ord("2"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D7, 8) == ord("."))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D8, 8) == ord("5"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D9, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DA, 8) == ord("G"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DB, 8) == ord("L"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DC, 8) == ord("I"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DD, 8) == ord("B"))
condition = Operators.AND(condition, cpu.XMM1 == 0x42494C4700352E322E325F4342494C47)
condition = Operators.AND(condition, cpu.RDI == 0x7FFFF7A248CE)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2953)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_4_symbolic(self):
""" Instruction MOVHPD_4
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D6)
value = cs.new_bitvec(8)
cs.add(value == 0x32)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D7)
value = cs.new_bitvec(8)
cs.add(value == 0x2E)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D8)
value = cs.new_bitvec(8)
cs.add(value == 0x35)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D9)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DA)
value = cs.new_bitvec(8)
cs.add(value == 0x47)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DB)
value = cs.new_bitvec(8)
cs.add(value == 0x4C)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DC)
value = cs.new_bitvec(8)
cs.add(value == 0x49)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DD)
value = cs.new_bitvec(8)
cs.add(value == 0x42)
mem[addr] = value
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x42494C4700352E322E325F4342494C47)
cpu.RSI = cs.new_bitvec(64)
cs.add(cpu.RSI == 0x7FFFF7A248CE)
cpu.RIP = 0x7FFFF7DF2953
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2956, 8) == ord("V"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D7, 8) == ord("."))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2953, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2954, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2955, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D6, 8) == ord("2"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2957, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D8, 8) == ord("5"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D9, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DA, 8) == ord("G"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DB, 8) == ord("L"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DC, 8) == ord("I"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DD, 8) == ord("B"))
condition = Operators.AND(condition, cpu.XMM2 == 0x42494C4700352E322E325F4342494C47)
condition = Operators.AND(condition, cpu.RSI == 0x7FFFF7A248CE)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2958)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_5_symbolic(self):
""" Instruction MOVHPD_5
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.mmap(0x7FFFF7FFA000, 0x1000, "rwx")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7FFA30C)
value = cs.new_bitvec(8)
cs.add(value == 0x36)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7FFA30D)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7FFA30E)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
mem.write(0x7FFFF7DF294F, "\x0f")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7FFA310)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7FFA311)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
mem.write(0x7FFFF7DF2952, "\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7FFA313)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
mem.write(0x7FFFF7DF294E, "f")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7FFA30F)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
mem.write(0x7FFFF7DF2950, "\x16O")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7FFA312)
value = cs.new_bitvec(8)
cs.add(value == 0x2)
mem[addr] = value
cpu.XMM1 = cs.new_bitvec(128)
cs.add(cpu.XMM1 == 0xFFFFFFFF00FFFFFF2E325F58554E494C)
cpu.RDI = cs.new_bitvec(64)
cs.add(cpu.RDI == 0x7FFFF7FFA304)
cpu.RIP = 0x7FFFF7DF294E
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7FFA30C, 8) == ord("6"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7FFA30D, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7FFA30E, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294F, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2950, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2951, 8) == ord("O"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2952, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7FFA313, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294E, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7FFA30F, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7FFA310, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7FFA311, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7FFA312, 8) == ord("\x02"))
condition = Operators.AND(condition, cpu.XMM1 == 0x20000000000362E325F58554E494C)
condition = Operators.AND(condition, cpu.RDI == 0x7FFFF7FFA304)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2953)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_6_symbolic(self):
""" Instruction MOVHPD_6
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D6)
value = cs.new_bitvec(8)
cs.add(value == 0x32)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D7)
value = cs.new_bitvec(8)
cs.add(value == 0x2E)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D8)
value = cs.new_bitvec(8)
cs.add(value == 0x35)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D9)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DA)
value = cs.new_bitvec(8)
cs.add(value == 0x47)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DB)
value = cs.new_bitvec(8)
cs.add(value == 0x4C)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DC)
value = cs.new_bitvec(8)
cs.add(value == 0x49)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DD)
value = cs.new_bitvec(8)
cs.add(value == 0x42)
mem[addr] = value
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x42494C4700352E322E325F4342494C47)
cpu.RSI = cs.new_bitvec(64)
cs.add(cpu.RSI == 0x7FFFF7A248CE)
cpu.RIP = 0x7FFFF7DF2953
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2956, 8) == ord("V"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D7, 8) == ord("."))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2953, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2954, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2955, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D6, 8) == ord("2"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2957, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D8, 8) == ord("5"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D9, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DA, 8) == ord("G"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DB, 8) == ord("L"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DC, 8) == ord("I"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DD, 8) == ord("B"))
condition = Operators.AND(condition, cpu.XMM2 == 0x42494C4700352E322E325F4342494C47)
condition = Operators.AND(condition, cpu.RSI == 0x7FFFF7A248CE)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2958)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_7_symbolic(self):
""" Instruction MOVHPD_7
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A24000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D6)
value = cs.new_bitvec(8)
cs.add(value == 0x32)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D7)
value = cs.new_bitvec(8)
cs.add(value == 0x2E)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D8)
value = cs.new_bitvec(8)
cs.add(value == 0x35)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248D9)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DA)
value = cs.new_bitvec(8)
cs.add(value == 0x47)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DB)
value = cs.new_bitvec(8)
cs.add(value == 0x4C)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DC)
value = cs.new_bitvec(8)
cs.add(value == 0x49)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A248DD)
value = cs.new_bitvec(8)
cs.add(value == 0x42)
mem[addr] = value
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x42494C4700352E322E325F4342494C47)
cpu.RSI = cs.new_bitvec(64)
cs.add(cpu.RSI == 0x7FFFF7A248CE)
cpu.RIP = 0x7FFFF7DF2953
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2956, 8) == ord("V"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D7, 8) == ord("."))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2953, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2954, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2955, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D6, 8) == ord("2"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2957, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D8, 8) == ord("5"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248D9, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DA, 8) == ord("G"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DB, 8) == ord("L"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DC, 8) == ord("I"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A248DD, 8) == ord("B"))
condition = Operators.AND(condition, cpu.XMM2 == 0x42494C4700352E322E325F4342494C47)
condition = Operators.AND(condition, cpu.RSI == 0x7FFFF7A248CE)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2958)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_8_symbolic(self):
""" Instruction MOVHPD_8
Groups: sse2
0x7ffff7df2953: movhpd xmm2, qword ptr [rsi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.mmap(0x7FFFF7FF7000, 0x1000, "rwx")
mem.write(0x7FFFF7DF2953, "f\x0f\x16V\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7FF74A8)
value = cs.new_bitvec(8)
cs.add(value == 0x5F)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7FF74A9)
value = cs.new_bitvec(8)
cs.add(value == 0x36)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7FF74AA)
value = cs.new_bitvec(8)
cs.add(value == 0x34)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7FF74AB)
value = cs.new_bitvec(8)
cs.add(value == 0x2D)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7FF74AC)
value = cs.new_bitvec(8)
cs.add(value == 0x6C)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7FF74AD)
value = cs.new_bitvec(8)
cs.add(value == 0x69)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7FF74AE)
value = cs.new_bitvec(8)
cs.add(value == 0x6E)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7FF74AF)
value = cs.new_bitvec(8)
cs.add(value == 0x75)
mem[addr] = value
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x3638782F62696C2F)
cpu.RSI = cs.new_bitvec(64)
cs.add(cpu.RSI == 0x7FFFF7FF74A0)
cpu.RIP = 0x7FFFF7DF2953
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2953, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2954, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2955, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2956, 8) == ord("V"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2957, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7FF74A8, 8) == ord("_"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7FF74A9, 8) == ord("6"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7FF74AA, 8) == ord("4"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7FF74AB, 8) == ord("-"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7FF74AC, 8) == ord("l"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7FF74AD, 8) == ord("i"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7FF74AE, 8) == ord("n"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7FF74AF, 8) == ord("u"))
condition = Operators.AND(condition, cpu.XMM2 == 0x756E696C2D34365F3638782F62696C2F)
condition = Operators.AND(condition, cpu.RSI == 0x7FFFF7FF74A0)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2958)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_MOVHPD_9_symbolic(self):
""" Instruction MOVHPD_9
Groups: sse2
0x7ffff7df294e: movhpd xmm1, qword ptr [rdi + 8]
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7A21000, 0x1000, "rwx")
mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
mem.write(0x7FFFF7DF294E, "f\x0f\x16O\x08")
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A21315)
value = cs.new_bitvec(8)
cs.add(value == 0x65)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A21316)
value = cs.new_bitvec(8)
cs.add(value == 0x6D)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A21317)
value = cs.new_bitvec(8)
cs.add(value == 0x61)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A21318)
value = cs.new_bitvec(8)
cs.add(value == 0x6C)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A21319)
value = cs.new_bitvec(8)
cs.add(value == 0x69)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A2131A)
value = cs.new_bitvec(8)
cs.add(value == 0x67)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A2131B)
value = cs.new_bitvec(8)
cs.add(value == 0x6E)
mem[addr] = value
addr = cs.new_bitvec(64)
cs.add(addr == 0x7FFFF7A2131C)
value = cs.new_bitvec(8)
cs.add(value == 0x0)
mem[addr] = value
cpu.XMM1 = cs.new_bitvec(128)
cs.add(cpu.XMM1 == 0xFFFFFFFF00FFFFFF6D5F6362696C5F5F)
cpu.RDI = cs.new_bitvec(64)
cs.add(cpu.RDI == 0x7FFFF7A2130D)
cpu.RIP = 0x7FFFF7DF294E
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294E, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF294F, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2950, 8) == ord("\x16"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2951, 8) == ord("O"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2952, 8) == ord("\x08"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A21315, 8) == ord("e"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A21316, 8) == ord("m"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A21317, 8) == ord("a"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A21318, 8) == ord("l"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A21319, 8) == ord("i"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A2131A, 8) == ord("g"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A2131B, 8) == ord("n"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7A2131C, 8) == ord("\x00"))
condition = Operators.AND(condition, cpu.XMM1 == 0x6E67696C616D656D5F6362696C5F5F)
condition = Operators.AND(condition, cpu.RDI == 0x7FFFF7A2130D)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2953)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_PSLLDQ_1_symbolic(self):
""" Instruction PSLLDQ_1
Groups: sse2
0x7ffff7df3470: pslldq xmm2, 7
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x1)
cpu.RIP = 0x7FFFF7DF3470
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3470, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3471, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3472, 8) == ord("s"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3473, 8) == ord("\xfa"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3474, 8) == ord("\x07"))
condition = Operators.AND(condition, cpu.XMM2 == 0x100000000000000)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF3475)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_PSLLDQ_10_symbolic(self):
""" Instruction PSLLDQ_10
Groups: sse2
0x7ffff7df3470: pslldq xmm2, 7
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x6972705F5F00362E6F732E6362696C00)
cpu.RIP = 0x7FFFF7DF3470
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3470, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3471, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3472, 8) == ord("s"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3473, 8) == ord("\xfa"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3474, 8) == ord("\x07"))
condition = Operators.AND(condition, cpu.XMM2 == 0x2E6F732E6362696C0000000000000000)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF3475)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_PSLLDQ_11_symbolic(self):
""" Instruction PSLLDQ_11
Groups: sse2
0x7ffff7df3470: pslldq xmm2, 7
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x6972705F5F00362E6F732E6362696C00)
cpu.RIP = 0x7FFFF7DF3470
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3470, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3471, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3472, 8) == ord("s"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3473, 8) == ord("\xfa"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3474, 8) == ord("\x07"))
condition = Operators.AND(condition, cpu.XMM2 == 0x2E6F732E6362696C0000000000000000)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF3475)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_PSLLDQ_12_symbolic(self):
""" Instruction PSLLDQ_12
Groups: sse2
0x7ffff7df3470: pslldq xmm2, 7
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x6972705F5F00362E6F732E6362696C00)
cpu.RIP = 0x7FFFF7DF3470
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3470, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3471, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3472, 8) == ord("s"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3473, 8) == ord("\xfa"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3474, 8) == ord("\x07"))
condition = Operators.AND(condition, cpu.XMM2 == 0x2E6F732E6362696C0000000000000000)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF3475)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_PSLLDQ_13_symbolic(self):
""" Instruction PSLLDQ_13
Groups: sse2
0x7ffff7df3470: pslldq xmm2, 7
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x1)
cpu.RIP = 0x7FFFF7DF3470
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3470, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3471, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3472, 8) == ord("s"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3473, 8) == ord("\xfa"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3474, 8) == ord("\x07"))
condition = Operators.AND(condition, cpu.XMM2 == 0x100000000000000)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF3475)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_PSLLDQ_14_symbolic(self):
""" Instruction PSLLDQ_14
Groups: sse2
0x7ffff7df3470: pslldq xmm2, 7
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x6972705F5F00362E6F732E6362696C00)
cpu.RIP = 0x7FFFF7DF3470
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3470, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3471, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3472, 8) == ord("s"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3473, 8) == ord("\xfa"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3474, 8) == ord("\x07"))
condition = Operators.AND(condition, cpu.XMM2 == 0x2E6F732E6362696C0000000000000000)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF3475)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_PSLLDQ_15_symbolic(self):
""" Instruction PSLLDQ_15
Groups: sse2
0x7ffff7df389d: pslldq xmm2, 4
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF389D, "f\x0fs\xfa\x04")
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x3000000020002000000352E322E32)
cpu.RIP = 0x7FFFF7DF389D
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF38A0, 8) == ord("\xfa"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF38A1, 8) == ord("\x04"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF389D, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF389E, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF389F, 8) == ord("s"))
condition = Operators.AND(condition, cpu.XMM2 == 0x20002000000352E322E3200000000)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF38A2)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_PSLLDQ_16_symbolic(self):
""" Instruction PSLLDQ_16
Groups: sse2
0x7ffff7df3470: pslldq xmm2, 7
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x6972705F5F00362E6F732E6362696C00)
cpu.RIP = 0x7FFFF7DF3470
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3470, 8) == ord("f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3471, 8) == ord("\x0f"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3472, 8) == ord("s"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3473, 8) == ord("\xfa"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3474, 8) == ord("\x07"))
condition = Operators.AND(condition, cpu.XMM2 == 0x2E6F732E6362696C0000000000000000)
condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF3475)
with cs as temp_cs:
temp_cs.add(condition)
self.assertTrue(solver.check(temp_cs))
with cs as temp_cs:
temp_cs.add(condition == False)
self.assertFalse(solver.check(temp_cs))
def test_PSLLDQ_17_symbolic(self):
""" Instruction PSLLDQ_17
Groups: sse2
0x7ffff7df39dd: pslldq xmm2, 3
"""
cs = ConstraintSet()
mem = SMemory64(cs)
cpu = AMD64Cpu(mem)
mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
mem.write(0x7FFFF7DF39DD, "f\x0fs\xfa\x03")
cpu.XMM2 = cs.new_bitvec(128)
cs.add(cpu.XMM2 == 0x494C4700352E322E325F4342494C4700)
cpu.RIP = 0x7FFFF7DF39DD
done = False
while not done:
try:
cpu.execute()
done = True
except ConcretizeRegister as e:
symbol = getattr(cpu, e.reg_name)
values = solver.get_all_values(cs, symbol)
self.assertEqual(len(values), 1)
setattr(cpu, e.reg_name, values[0])
condition = True
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF39E0, 8) == ord("\xfa"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF39E1, 8) == ord("\x03"))
condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF39DD, 8) == ord("f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF39DE, 8) == ord("\x0f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF39DF, 8) == ord("s"))
        condition = Operators.AND(condition, cpu.XMM2 == 0x352E322E325F4342494C4700000000)
        condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF39E2)
        with cs as temp_cs:
            temp_cs.add(condition)
            self.assertTrue(solver.check(temp_cs))
        with cs as temp_cs:
            temp_cs.add(condition == False)
            self.assertFalse(solver.check(temp_cs))

    def test_PSLLDQ_18_symbolic(self):
        """ Instruction PSLLDQ_18
        Groups: sse2
        0x7ffff7df389d: pslldq xmm2, 4
        """
        cs = ConstraintSet()
        mem = SMemory64(cs)
        cpu = AMD64Cpu(mem)
        mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
        mem.write(0x7FFFF7DF389D, "f\x0fs\xfa\x04")
        cpu.XMM2 = cs.new_bitvec(128)
        cs.add(cpu.XMM2 == 0x665F4F495F006F6C6C657466006B6863)
        cpu.RIP = 0x7FFFF7DF389D
        done = False
        while not done:
            try:
                cpu.execute()
                done = True
            except ConcretizeRegister as e:
                symbol = getattr(cpu, e.reg_name)
                values = solver.get_all_values(cs, symbol)
                self.assertEqual(len(values), 1)
                setattr(cpu, e.reg_name, values[0])
        condition = True
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF38A0, 8) == ord("\xfa"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF38A1, 8) == ord("\x04"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF389D, 8) == ord("f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF389E, 8) == ord("\x0f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF389F, 8) == ord("s"))
        condition = Operators.AND(condition, cpu.XMM2 == 0x5F006F6C6C657466006B686300000000)
        condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF38A2)
        with cs as temp_cs:
            temp_cs.add(condition)
            self.assertTrue(solver.check(temp_cs))
        with cs as temp_cs:
            temp_cs.add(condition == False)
            self.assertFalse(solver.check(temp_cs))

    def test_PSLLDQ_19_symbolic(self):
        """ Instruction PSLLDQ_19
        Groups: sse2
        0x7ffff7df3470: pslldq xmm2, 7
        """
        cs = ConstraintSet()
        mem = SMemory64(cs)
        cpu = AMD64Cpu(mem)
        mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
        mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
        cpu.XMM2 = cs.new_bitvec(128)
        cs.add(cpu.XMM2 == 0x1)
        cpu.RIP = 0x7FFFF7DF3470
        done = False
        while not done:
            try:
                cpu.execute()
                done = True
            except ConcretizeRegister as e:
                symbol = getattr(cpu, e.reg_name)
                values = solver.get_all_values(cs, symbol)
                self.assertEqual(len(values), 1)
                setattr(cpu, e.reg_name, values[0])
        condition = True
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3470, 8) == ord("f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3471, 8) == ord("\x0f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3472, 8) == ord("s"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3473, 8) == ord("\xfa"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3474, 8) == ord("\x07"))
        condition = Operators.AND(condition, cpu.XMM2 == 0x100000000000000)
        condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF3475)
        with cs as temp_cs:
            temp_cs.add(condition)
            self.assertTrue(solver.check(temp_cs))
        with cs as temp_cs:
            temp_cs.add(condition == False)
            self.assertFalse(solver.check(temp_cs))

    def test_PSLLDQ_2_symbolic(self):
        """ Instruction PSLLDQ_2
        Groups: sse2
        0x7ffff7df2f70: pslldq xmm2, 0xb
        """
        cs = ConstraintSet()
        mem = SMemory64(cs)
        cpu = AMD64Cpu(mem)
        mem.mmap(0x7FFFF7DF2000, 0x1000, "rwx")
        mem.write(0x7FFFF7DF2F70, "f\x0fs\xfa\x0b")
        cpu.XMM2 = cs.new_bitvec(128)
        cs.add(cpu.XMM2 == 0x6972705F5F00362E6F732E6362696C00)
        cpu.RIP = 0x7FFFF7DF2F70
        done = False
        while not done:
            try:
                cpu.execute()
                done = True
            except ConcretizeRegister as e:
                symbol = getattr(cpu, e.reg_name)
                values = solver.get_all_values(cs, symbol)
                self.assertEqual(len(values), 1)
                setattr(cpu, e.reg_name, values[0])
        condition = True
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2F70, 8) == ord("f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2F71, 8) == ord("\x0f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2F72, 8) == ord("s"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2F73, 8) == ord("\xfa"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF2F74, 8) == ord("\x0b"))
        condition = Operators.AND(condition, cpu.XMM2 == 0x6362696C000000000000000000000000)
        condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF2F75)
        with cs as temp_cs:
            temp_cs.add(condition)
            self.assertTrue(solver.check(temp_cs))
        with cs as temp_cs:
            temp_cs.add(condition == False)
            self.assertFalse(solver.check(temp_cs))

    def test_PSLLDQ_20_symbolic(self):
        """ Instruction PSLLDQ_20
        Groups: sse2
        0x7ffff7df3970: pslldq xmm2, 3
        """
        cs = ConstraintSet()
        mem = SMemory64(cs)
        cpu = AMD64Cpu(mem)
        mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
        mem.write(0x7FFFF7DF3970, "f\x0fs\xfa\x03")
        cpu.XMM2 = cs.new_bitvec(128)
        cs.add(cpu.XMM2 == 0x322E6F732E34362D3638782D78756E69)
        cpu.RIP = 0x7FFFF7DF3970
        done = False
        while not done:
            try:
                cpu.execute()
                done = True
            except ConcretizeRegister as e:
                symbol = getattr(cpu, e.reg_name)
                values = solver.get_all_values(cs, symbol)
                self.assertEqual(len(values), 1)
                setattr(cpu, e.reg_name, values[0])
        condition = True
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3970, 8) == ord("f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3971, 8) == ord("\x0f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3972, 8) == ord("s"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3973, 8) == ord("\xfa"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3974, 8) == ord("\x03"))
        condition = Operators.AND(condition, cpu.XMM2 == 0x732E34362D3638782D78756E69000000)
        condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF3975)
        with cs as temp_cs:
            temp_cs.add(condition)
            self.assertTrue(solver.check(temp_cs))
        with cs as temp_cs:
            temp_cs.add(condition == False)
            self.assertFalse(solver.check(temp_cs))

    def test_PSLLDQ_21_symbolic(self):
        """ Instruction PSLLDQ_21
        Groups: sse2
        0x7ffff7df3830: pslldq xmm2, 4
        """
        cs = ConstraintSet()
        mem = SMemory64(cs)
        cpu = AMD64Cpu(mem)
        mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
        mem.write(0x7FFFF7DF3830, "f\x0fs\xfa\x04")
        cpu.XMM2 = cs.new_bitvec(128)
        cs.add(cpu.XMM2 == 0x5F4342494C4700342E332E325F434249)
        cpu.RIP = 0x7FFFF7DF3830
        done = False
        while not done:
            try:
                cpu.execute()
                done = True
            except ConcretizeRegister as e:
                symbol = getattr(cpu, e.reg_name)
                values = solver.get_all_values(cs, symbol)
                self.assertEqual(len(values), 1)
                setattr(cpu, e.reg_name, values[0])
        condition = True
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3830, 8) == ord("f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3831, 8) == ord("\x0f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3832, 8) == ord("s"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3833, 8) == ord("\xfa"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3834, 8) == ord("\x04"))
        condition = Operators.AND(condition, cpu.XMM2 == 0x4C4700342E332E325F43424900000000)
        condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF3835)
        with cs as temp_cs:
            temp_cs.add(condition)
            self.assertTrue(solver.check(temp_cs))
        with cs as temp_cs:
            temp_cs.add(condition == False)
            self.assertFalse(solver.check(temp_cs))

    def test_PSLLDQ_3_symbolic(self):
        """ Instruction PSLLDQ_3
        Groups: sse2
        0x7ffff7df3ab0: pslldq xmm2, 2
        """
        cs = ConstraintSet()
        mem = SMemory64(cs)
        cpu = AMD64Cpu(mem)
        mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
        mem.write(0x7FFFF7DF3AB0, "f\x0fs\xfa\x02")
        cpu.XMM2 = cs.new_bitvec(128)
        cs.add(cpu.XMM2 == 0x63007463656A626F5F726F665F6F7364)
        cpu.RIP = 0x7FFFF7DF3AB0
        done = False
        while not done:
            try:
                cpu.execute()
                done = True
            except ConcretizeRegister as e:
                symbol = getattr(cpu, e.reg_name)
                values = solver.get_all_values(cs, symbol)
                self.assertEqual(len(values), 1)
                setattr(cpu, e.reg_name, values[0])
        condition = True
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3AB0, 8) == ord("f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3AB1, 8) == ord("\x0f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3AB2, 8) == ord("s"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3AB3, 8) == ord("\xfa"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3AB4, 8) == ord("\x02"))
        condition = Operators.AND(condition, cpu.XMM2 == 0x7463656A626F5F726F665F6F73640000)
        condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF3AB5)
        with cs as temp_cs:
            temp_cs.add(condition)
            self.assertTrue(solver.check(temp_cs))
        with cs as temp_cs:
            temp_cs.add(condition == False)
            self.assertFalse(solver.check(temp_cs))

    def test_PSLLDQ_4_symbolic(self):
        """ Instruction PSLLDQ_4
        Groups: sse2
        0x7ffff7df3470: pslldq xmm2, 7
        """
        cs = ConstraintSet()
        mem = SMemory64(cs)
        cpu = AMD64Cpu(mem)
        mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
        mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
        cpu.XMM2 = cs.new_bitvec(128)
        cs.add(cpu.XMM2 == 0x6972705F5F00362E6F732E6362696C00)
        cpu.RIP = 0x7FFFF7DF3470
        done = False
        while not done:
            try:
                cpu.execute()
                done = True
            except ConcretizeRegister as e:
                symbol = getattr(cpu, e.reg_name)
                values = solver.get_all_values(cs, symbol)
                self.assertEqual(len(values), 1)
                setattr(cpu, e.reg_name, values[0])
        condition = True
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3470, 8) == ord("f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3471, 8) == ord("\x0f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3472, 8) == ord("s"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3473, 8) == ord("\xfa"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3474, 8) == ord("\x07"))
        condition = Operators.AND(condition, cpu.XMM2 == 0x2E6F732E6362696C0000000000000000)
        condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF3475)
        with cs as temp_cs:
            temp_cs.add(condition)
            self.assertTrue(solver.check(temp_cs))
        with cs as temp_cs:
            temp_cs.add(condition == False)
            self.assertFalse(solver.check(temp_cs))

    def test_PSLLDQ_5_symbolic(self):
        """ Instruction PSLLDQ_5
        Groups: sse2
        0x7ffff7df3470: pslldq xmm2, 7
        """
        cs = ConstraintSet()
        mem = SMemory64(cs)
        cpu = AMD64Cpu(mem)
        mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
        mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
        cpu.XMM2 = cs.new_bitvec(128)
        cs.add(cpu.XMM2 == 0x6972705F5F00362E6F732E6362696C00)
        cpu.RIP = 0x7FFFF7DF3470
        done = False
        while not done:
            try:
                cpu.execute()
                done = True
            except ConcretizeRegister as e:
                symbol = getattr(cpu, e.reg_name)
                values = solver.get_all_values(cs, symbol)
                self.assertEqual(len(values), 1)
                setattr(cpu, e.reg_name, values[0])
        condition = True
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3470, 8) == ord("f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3471, 8) == ord("\x0f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3472, 8) == ord("s"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3473, 8) == ord("\xfa"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3474, 8) == ord("\x07"))
        condition = Operators.AND(condition, cpu.XMM2 == 0x2E6F732E6362696C0000000000000000)
        condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF3475)
        with cs as temp_cs:
            temp_cs.add(condition)
            self.assertTrue(solver.check(temp_cs))
        with cs as temp_cs:
            temp_cs.add(condition == False)
            self.assertFalse(solver.check(temp_cs))

    def test_PSLLDQ_6_symbolic(self):
        """ Instruction PSLLDQ_6
        Groups: sse2
        0x7ffff7df389d: pslldq xmm2, 4
        """
        cs = ConstraintSet()
        mem = SMemory64(cs)
        cpu = AMD64Cpu(mem)
        mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
        mem.write(0x7FFFF7DF389D, "f\x0fs\xfa\x04")
        cpu.XMM2 = cs.new_bitvec(128)
        cs.add(cpu.XMM2 == 0x3000000020002000000352E322E32)
        cpu.RIP = 0x7FFFF7DF389D
        done = False
        while not done:
            try:
                cpu.execute()
                done = True
            except ConcretizeRegister as e:
                symbol = getattr(cpu, e.reg_name)
                values = solver.get_all_values(cs, symbol)
                self.assertEqual(len(values), 1)
                setattr(cpu, e.reg_name, values[0])
        condition = True
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF38A0, 8) == ord("\xfa"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF38A1, 8) == ord("\x04"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF389D, 8) == ord("f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF389E, 8) == ord("\x0f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF389F, 8) == ord("s"))
        condition = Operators.AND(condition, cpu.XMM2 == 0x20002000000352E322E3200000000)
        condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF38A2)
        with cs as temp_cs:
            temp_cs.add(condition)
            self.assertTrue(solver.check(temp_cs))
        with cs as temp_cs:
            temp_cs.add(condition == False)
            self.assertFalse(solver.check(temp_cs))

    def test_PSLLDQ_7_symbolic(self):
        """ Instruction PSLLDQ_7
        Groups: sse2
        0x7ffff7df3470: pslldq xmm2, 7
        """
        cs = ConstraintSet()
        mem = SMemory64(cs)
        cpu = AMD64Cpu(mem)
        mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
        mem.write(0x7FFFF7DF3470, "f\x0fs\xfa\x07")
        cpu.XMM2 = cs.new_bitvec(128)
        cs.add(cpu.XMM2 == 0x1)
        cpu.RIP = 0x7FFFF7DF3470
        done = False
        while not done:
            try:
                cpu.execute()
                done = True
            except ConcretizeRegister as e:
                symbol = getattr(cpu, e.reg_name)
                values = solver.get_all_values(cs, symbol)
                self.assertEqual(len(values), 1)
                setattr(cpu, e.reg_name, values[0])
        condition = True
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3470, 8) == ord("f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3471, 8) == ord("\x0f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3472, 8) == ord("s"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3473, 8) == ord("\xfa"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3474, 8) == ord("\x07"))
        condition = Operators.AND(condition, cpu.XMM2 == 0x100000000000000)
        condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF3475)
        with cs as temp_cs:
            temp_cs.add(condition)
            self.assertTrue(solver.check(temp_cs))
        with cs as temp_cs:
            temp_cs.add(condition == False)
            self.assertFalse(solver.check(temp_cs))

    def test_PSLLDQ_8_symbolic(self):
        """ Instruction PSLLDQ_8
        Groups: sse2
        0x7ffff7df39dd: pslldq xmm2, 3
        """
        cs = ConstraintSet()
        mem = SMemory64(cs)
        cpu = AMD64Cpu(mem)
        mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
        mem.write(0x7FFFF7DF39DD, "f\x0fs\xfa\x03")
        cpu.XMM2 = cs.new_bitvec(128)
        cs.add(cpu.XMM2 == 0x7461636F6C6C6165645F6C645F00636F)
        cpu.RIP = 0x7FFFF7DF39DD
        done = False
        while not done:
            try:
                cpu.execute()
                done = True
            except ConcretizeRegister as e:
                symbol = getattr(cpu, e.reg_name)
                values = solver.get_all_values(cs, symbol)
                self.assertEqual(len(values), 1)
                setattr(cpu, e.reg_name, values[0])
        condition = True
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF39E0, 8) == ord("\xfa"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF39E1, 8) == ord("\x03"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF39DD, 8) == ord("f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF39DE, 8) == ord("\x0f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF39DF, 8) == ord("s"))
        condition = Operators.AND(condition, cpu.XMM2 == 0x6F6C6C6165645F6C645F00636F000000)
        condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF39E2)
        with cs as temp_cs:
            temp_cs.add(condition)
            self.assertTrue(solver.check(temp_cs))
        with cs as temp_cs:
            temp_cs.add(condition == False)
            self.assertFalse(solver.check(temp_cs))

    def test_PSLLDQ_9_symbolic(self):
        """ Instruction PSLLDQ_9
        Groups: sse2
        0x7ffff7df3c5d: pslldq xmm2, 1
        """
        cs = ConstraintSet()
        mem = SMemory64(cs)
        cpu = AMD64Cpu(mem)
        mem.mmap(0x7FFFF7DF3000, 0x1000, "rwx")
        mem.write(0x7FFFF7DF3C5D, "f\x0fs\xfa\x01")
        cpu.XMM2 = cs.new_bitvec(128)
        cs.add(cpu.XMM2 == 0x68252E7568254D00796164666F656D69)
        cpu.RIP = 0x7FFFF7DF3C5D
        done = False
        while not done:
            try:
                cpu.execute()
                done = True
            except ConcretizeRegister as e:
                symbol = getattr(cpu, e.reg_name)
                values = solver.get_all_values(cs, symbol)
                self.assertEqual(len(values), 1)
                setattr(cpu, e.reg_name, values[0])
        condition = True
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3C60, 8) == ord("\xfa"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3C61, 8) == ord("\x01"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3C5D, 8) == ord("f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3C5E, 8) == ord("\x0f"))
        condition = Operators.AND(condition, cpu.read_int(0x7FFFF7DF3C5F, 8) == ord("s"))
        condition = Operators.AND(condition, cpu.XMM2 == 0x252E7568254D00796164666F656D6900)
        condition = Operators.AND(condition, cpu.RIP == 0x7FFFF7DF3C62)
        with cs as temp_cs:
            temp_cs.add(condition)
            self.assertTrue(solver.check(temp_cs))
        with cs as temp_cs:
            temp_cs.add(condition == False)
            self.assertFalse(solver.check(temp_cs))


if __name__ == "__main__":
    unittest.main()
# tests/test_operator_commands.py (repo: chandleroking/MantaTail, license: MIT)
import pytest
import socket
import time


def test_channel_topics(user_alice, user_bob, user_charlie, helpers):
    user_alice.sendall(b"JOIN #foo\r\n")
    time.sleep(0.1)
    user_bob.sendall(b"JOIN #foo\r\n")
    while True:
        received = helpers.receive_line(user_alice)
        assert b"332" not in received
        assert b"333" not in received
        if received == b":Bob!BobUsr@127.0.0.1 JOIN #foo\r\n":
            break
    while helpers.receive_line(user_bob) != b":mantatail 366 Bob #foo :End of /NAMES list.\r\n":
        pass
    user_alice.sendall(b"TOPIC\r\n")
    assert helpers.receive_line(user_alice) == b":mantatail 461 Alice TOPIC :Not enough parameters\r\n"
    user_alice.sendall(b"TOPIC #foo\r\n")
    assert helpers.receive_line(user_alice) == b":mantatail 331 Alice #foo :No topic is set.\r\n"
    user_alice.sendall(b"TOPIC #foo :This is a topic\r\n")
    assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 TOPIC #foo :This is a topic\r\n"
    assert helpers.receive_line(user_bob) == b":Alice!AliceUsr@127.0.0.1 TOPIC #foo :This is a topic\r\n"
    time.sleep(0.1)
    user_charlie.sendall(b"JOIN #foo\r\n")
    helpers.receive_line(user_charlie)
    assert helpers.receive_line(user_charlie) == b":mantatail 332 Charlie #foo :This is a topic\r\n"
    assert helpers.receive_line(user_charlie) == b":mantatail 333 Charlie #foo :Alice\r\n"
    user_alice.sendall(b"TOPIC #foo\r\n")
    helpers.receive_line(user_alice)  # Charlie's JOIN message
    assert helpers.receive_line(user_alice) == b":mantatail 332 Alice #foo :This is a topic\r\n"
    assert helpers.receive_line(user_alice) == b":mantatail 333 Alice #foo :Alice\r\n"
    user_bob.sendall(b"TOPIC #foo\r\n")
    helpers.receive_line(user_bob)  # Charlie's JOIN message
    assert helpers.receive_line(user_bob) == b":mantatail 332 Bob #foo :This is a topic\r\n"
    assert helpers.receive_line(user_bob) == b":mantatail 333 Bob #foo :Alice\r\n"
    user_bob.sendall(b"TOPIC #foo :Bob is setting a topic\r\n")
    assert helpers.receive_line(user_bob) == b":mantatail 482 Bob #foo :You're not channel operator\r\n"
    user_bob.sendall(b"TOPIC #foo :\r\n")
    assert helpers.receive_line(user_bob) == b":mantatail 482 Bob #foo :You're not channel operator\r\n"
    user_alice.sendall(b"TOPIC #foo :\r\n")
    assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 TOPIC #foo :\r\n"
    assert helpers.receive_line(user_bob) == b":Alice!AliceUsr@127.0.0.1 TOPIC #foo :\r\n"
    user_alice.sendall(b"TOPIC #foo\r\n")
    assert helpers.receive_line(user_alice) == b":mantatail 331 Alice #foo :No topic is set.\r\n"
    user_bob.sendall(b"TOPIC #foo\r\n")
    assert helpers.receive_line(user_bob) == b":mantatail 331 Bob #foo :No topic is set.\r\n"


def test_mode_several_flags(user_alice, user_bob, user_charlie, helpers):
    user_alice.sendall(b"JOIN #foo\r\n")
    time.sleep(0.1)
    user_bob.sendall(b"JOIN #foo\r\n")
    time.sleep(0.1)
    user_charlie.sendall(b"JOIN #foo\r\n")
    while helpers.receive_line(user_alice) != b":Charlie!CharlieUsr@127.0.0.1 JOIN #foo\r\n":
        pass
    while helpers.receive_line(user_bob) != b":Charlie!CharlieUsr@127.0.0.1 JOIN #foo\r\n":
        pass
    while helpers.receive_line(user_charlie) != b":mantatail 366 Charlie #foo :End of /NAMES list.\r\n":
        pass
    user_alice.sendall(b"MODE #foo +ob Bob\r\n")
    assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 MODE #foo +o Bob\r\n"
    assert helpers.receive_line(user_alice) == b":mantatail 368 Alice #foo :End of Channel Ban List\r\n"
    user_alice.sendall(b"MODE #foo -o Bob\r\n")
    helpers.receive_line(user_alice)
    user_alice.sendall(b"MODE #foo +ob Bob Charlie\r\n")
    assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 MODE #foo +o Bob\r\n"
    assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 MODE #foo +b Charlie!*@*\r\n"
    user_alice.sendall(b"MODE #foo -o Bob\r\n")
    user_alice.sendall(b"MODE #foo -b Charlie\r\n")
    helpers.receive_line(user_alice)
    helpers.receive_line(user_alice)
    user_alice.sendall(b"MODE #foo +bo Bob\r\n")
    assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 MODE #foo +b Bob!*@*\r\n"
    assert helpers.receive_line(user_alice) == b":mantatail 461 Alice MODE :Not enough parameters\r\n"


def test_repeated_mode_messages(user_alice, user_bob, helpers):
    user_alice.sendall(b"JOIN #foo\r\n")
    time.sleep(0.1)
    user_bob.sendall(b"JOIN #foo\r\n")
    while helpers.receive_line(user_alice) != b":Bob!BobUsr@127.0.0.1 JOIN #foo\r\n":
        pass
    while helpers.receive_line(user_bob) != b":mantatail 366 Bob #foo :End of /NAMES list.\r\n":
        pass
    user_alice.sendall(b"MODE #foo +o Bob\r\n")
    assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 MODE #foo +o Bob\r\n"
    assert helpers.receive_line(user_bob) == b":Alice!AliceUsr@127.0.0.1 MODE #foo +o Bob\r\n"
    user_alice.sendall(b"MODE #foo +o Bob\r\n")
    with pytest.raises(socket.timeout):
        helpers.receive_line(user_alice)
    with pytest.raises(socket.timeout):
        helpers.receive_line(user_bob)
    user_alice.sendall(b"MODE #foo +b Bob\r\n")
    assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 MODE #foo +b Bob!*@*\r\n"
    assert helpers.receive_line(user_bob) == b":Alice!AliceUsr@127.0.0.1 MODE #foo +b Bob!*@*\r\n"
    user_alice.sendall(b"MODE #foo +b Bob\r\n")
    with pytest.raises(socket.timeout):
        helpers.receive_line(user_alice)
    with pytest.raises(socket.timeout):
        helpers.receive_line(user_bob)
    user_alice.sendall(b"MODE #foo -b Bob\r\n")
    assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 MODE #foo -b Bob!*@*\r\n"
    assert helpers.receive_line(user_bob) == b":Alice!AliceUsr@127.0.0.1 MODE #foo -b Bob!*@*\r\n"
    user_alice.sendall(b"MODE #foo +b *!*@*\r\n")
    assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 MODE #foo +b *!*@*\r\n"
    assert helpers.receive_line(user_bob) == b":Alice!AliceUsr@127.0.0.1 MODE #foo +b *!*@*\r\n"
    user_alice.sendall(b"MODE #foo +b Bob\r\n")
    with pytest.raises(socket.timeout):
        helpers.receive_line(user_alice)
    with pytest.raises(socket.timeout):
        helpers.receive_line(user_bob)


def test_mode_errors(user_alice, user_bob, helpers):
    user_alice.sendall(b"JOIN #foo\r\n")
    while helpers.receive_line(user_alice) != b":mantatail 366 Alice #foo :End of /NAMES list.\r\n":
        pass
    user_alice.sendall(b"MODE #foo ^g Bob\r\n")
    assert helpers.receive_line(user_alice) == b":mantatail 472 Alice ^ :is an unknown mode char to me\r\n"
    user_alice.sendall(b"MODE #foo +g Bob\r\n")
    assert helpers.receive_line(user_alice) == b":mantatail 472 Alice g :is an unknown mode char to me\r\n"
    user_bob.sendall(b"JOIN #foo\r\n")
    time.sleep(0.1)
    user_alice.sendall(b"MODE +o #foo Bob\r\n")
    while helpers.receive_line(user_alice) != b":mantatail 403 Alice +o :No such channel\r\n":
        pass
    user_alice.sendall(b"MODE Bob #foo +o\r\n")
    # TODO: The actual IRC error for this should be "502 Can't change mode for other users".
    # This will be implemented when MODE becomes more widely supported.
    assert helpers.receive_line(user_alice) == b":mantatail 403 Alice Bob :No such channel\r\n"


def test_op_deop_user(user_alice, user_bob, helpers):
    user_alice.sendall(b"JOIN #foo\r\n")
    time.sleep(0.1)
    user_bob.sendall(b"JOIN #foo\r\n")
    while helpers.receive_line(user_alice) != b":Bob!BobUsr@127.0.0.1 JOIN #foo\r\n":
        pass
    while helpers.receive_line(user_bob) != b":mantatail 366 Bob #foo :End of /NAMES list.\r\n":
        pass
    user_alice.sendall(b"MODE #foo +o Bob\r\n")
    assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 MODE #foo +o Bob\r\n"
    assert helpers.receive_line(user_bob) == b":Alice!AliceUsr@127.0.0.1 MODE #foo +o Bob\r\n"
    user_alice.sendall(b"MODE #foo -o Bob\r\n")
    assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 MODE #foo -o Bob\r\n"
    assert helpers.receive_line(user_bob) == b":Alice!AliceUsr@127.0.0.1 MODE #foo -o Bob\r\n"


def test_channel_owner(user_alice, user_bob, helpers):
    user_alice.sendall(b"JOIN #foo\r\n")
    time.sleep(0.1)
    user_bob.sendall(b"JOIN #foo\r\n")
    while helpers.receive_line(user_alice) != b":mantatail 366 Alice #foo :End of /NAMES list.\r\n":
        pass
    while True:
        received = helpers.receive_line(user_bob)
        if b"353" in received:
            assert helpers.compare_if_word_match_in_any_order(received, b":mantatail 353 Bob = #foo :Bob @Alice\r\n")
            break
    user_alice.sendall(b"PART #foo\r\n")
    user_bob.sendall(b"PART #foo\r\n")
    time.sleep(0.1)
    user_bob.sendall(b"JOIN #foo\r\n")
    time.sleep(0.1)
    user_alice.sendall(b"JOIN #foo\r\n")
    while True:
        received = helpers.receive_line(user_alice)
        if b"353" in received:
            assert helpers.compare_if_word_match_in_any_order(received, b":mantatail 353 Alice = #foo :Alice @Bob\r\n")
            break


def test_operator_prefix(user_alice, user_bob, user_charlie, helpers):
    user_alice.sendall(b"JOIN #foo\r\n")
    helpers.receive_line(user_alice)  # JOIN message from server
    assert helpers.receive_line(user_alice) == b":mantatail 353 Alice = #foo :@Alice\r\n"
    user_bob.sendall(b"JOIN #foo\r\n")
    time.sleep(0.1)
    user_alice.sendall(b"MODE #foo +o Bob\r\n")
    time.sleep(0.1)
    user_charlie.sendall(b"JOIN #foo\r\n")
    while True:
        received = helpers.receive_line(user_charlie)
        if b"353" in received:
            assert helpers.compare_if_word_match_in_any_order(
                received, b":mantatail 353 Charlie = #foo :Charlie @Alice @Bob\r\n"
            )
            break
    user_charlie.sendall(b"PART #foo\r\n")
    user_alice.sendall(b"MODE #foo -o Bob\r\n")
    time.sleep(0.1)
    user_charlie.sendall(b"JOIN #foo\r\n")
    while True:
        received = helpers.receive_line(user_charlie)
        if b"353" in received:
            assert helpers.compare_if_word_match_in_any_order(
                received, b":mantatail 353 Charlie = #foo :Charlie @Alice Bob\r\n"
            )
            break
    user_charlie.sendall(b"PART #foo\r\n")
    user_alice.sendall(b"MODE #foo +o Bob\r\n")
    time.sleep(0.1)
    user_charlie.sendall(b"JOIN #foo\r\n")
    while True:
        received = helpers.receive_line(user_charlie)
        if b"353" in received:
            assert helpers.compare_if_word_match_in_any_order(
                received, b":mantatail 353 Charlie = #foo :Charlie @Alice @Bob\r\n"
            )
            break
def test_operator_nickchange_then_kick(user_alice, user_bob, helpers):
    user_alice.sendall(b"JOIN #foo\r\n")
    time.sleep(0.1)
    user_bob.sendall(b"JOIN #foo\r\n")
    while helpers.receive_line(user_alice) != b":Bob!BobUsr@127.0.0.1 JOIN #foo\r\n":
        pass
    while helpers.receive_line(user_bob) != b":mantatail 366 Bob #foo :End of /NAMES list.\r\n":
        pass
    user_alice.sendall(b"NICK :NewNick\r\n")
    helpers.receive_line(user_bob)
    user_alice.sendall(b"KICK #foo Bob\r\n")
    assert helpers.receive_line(user_bob) == b":NewNick!AliceUsr@127.0.0.1 KICK #foo Bob :Bob\r\n"
    user_bob.sendall(b"PRIVMSG #foo :Foo\r\n")
    assert helpers.receive_line(user_bob) == b":mantatail 442 Bob #foo :You're not on that channel\r\n"
def test_operator_no_such_channel(user_alice, helpers):
user_alice.sendall(b"MODE #foo +o Bob\r\n")
assert helpers.receive_line(user_alice) == b":mantatail 403 Alice #foo :No such channel\r\n"
def test_operator_no_privileges(user_alice, user_bob, helpers):
user_alice.sendall(b"JOIN #foo\r\n")
time.sleep(0.1)
user_bob.sendall(b"JOIN #foo\r\n")
while helpers.receive_line(user_alice) != b":Bob!BobUsr@127.0.0.1 JOIN #foo\r\n":
pass
while helpers.receive_line(user_bob) != b":mantatail 366 Bob #foo :End of /NAMES list.\r\n":
pass
user_bob.sendall(b"MODE #foo +o Alice\r\n")
assert helpers.receive_line(user_bob) == b":mantatail 482 Bob #foo :You're not channel operator\r\n"
def test_operator_user_not_in_channel(user_alice, user_bob, helpers):
user_alice.sendall(b"JOIN #foo\r\n")
while helpers.receive_line(user_alice) != b":mantatail 366 Alice #foo :End of /NAMES list.\r\n":
pass
user_alice.sendall(b"MODE #foo +o Bob\r\n")
assert helpers.receive_line(user_alice) == b":mantatail 441 Alice Bob #foo :They aren't on that channel\r\n"
def test_kick_user(user_alice, user_bob, helpers):
user_alice.sendall(b"JOIN #foo\r\n")
time.sleep(0.1)
user_bob.sendall(b"JOIN #foo\r\n")
while helpers.receive_line(user_alice) != b":Bob!BobUsr@127.0.0.1 JOIN #foo\r\n":
pass
while helpers.receive_line(user_bob) != b":mantatail 366 Bob #foo :End of /NAMES list.\r\n":
pass
user_alice.sendall(b"KICK #foo Bob\r\n")
assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 KICK #foo Bob :Bob\r\n"
assert helpers.receive_line(user_bob) == b":Alice!AliceUsr@127.0.0.1 KICK #foo Bob :Bob\r\n"
user_bob.sendall(b"PRIVMSG #foo :Foo\r\n")
assert helpers.receive_line(user_bob) == b":mantatail 442 Bob #foo :You're not on that channel\r\n"
user_bob.sendall(b"JOIN #foo\r\n")
while helpers.receive_line(user_alice) != b":Bob!BobUsr@127.0.0.1 JOIN #foo\r\n":
pass
while helpers.receive_line(user_bob) != b":mantatail 366 Bob #foo :End of /NAMES list.\r\n":
pass
user_alice.sendall(b"KICK #foo Bob Bye bye\r\n")
assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 KICK #foo Bob :Bye\r\n"
assert helpers.receive_line(user_bob) == b":Alice!AliceUsr@127.0.0.1 KICK #foo Bob :Bye\r\n"
user_bob.sendall(b"JOIN #foo\r\n")
while helpers.receive_line(user_alice) != b":Bob!BobUsr@127.0.0.1 JOIN #foo\r\n":
pass
while helpers.receive_line(user_bob) != b":mantatail 366 Bob #foo :End of /NAMES list.\r\n":
pass
user_alice.sendall(b"KICK #foo Bob :Reason with many words\r\n")
assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 KICK #foo Bob :Reason with many words\r\n"
assert helpers.receive_line(user_bob) == b":Alice!AliceUsr@127.0.0.1 KICK #foo Bob :Reason with many words\r\n"
user_alice.sendall(b"KICK #foo Alice\r\n")
assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 KICK #foo Alice :Alice\r\n"
user_alice.sendall(b"PRIVMSG #foo :Foo\r\n")
while helpers.receive_line(user_alice) != b":mantatail 403 Alice #foo :No such channel\r\n":
pass
def test_kick_operator(user_alice, user_bob, helpers):
user_alice.sendall(b"JOIN #foo\r\n")
time.sleep(0.1)
user_bob.sendall(b"JOIN #foo\r\n")
time.sleep(0.1)
user_alice.sendall(b"MODE #foo +o Bob\r\n")
while helpers.receive_line(user_alice) != b":Alice!AliceUsr@127.0.0.1 MODE #foo +o Bob\r\n":
pass
while helpers.receive_line(user_bob) != b":Alice!AliceUsr@127.0.0.1 MODE #foo +o Bob\r\n":
pass
user_alice.sendall(b"KICK #foo Bob\r\n")
assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 KICK #foo Bob :Bob\r\n"
assert helpers.receive_line(user_bob) == b":Alice!AliceUsr@127.0.0.1 KICK #foo Bob :Bob\r\n"
user_bob.sendall(b"PRIVMSG #foo :Foo\r\n")
while helpers.receive_line(user_bob) != b":mantatail 442 Bob #foo :You're not on that channel\r\n":
pass
user_bob.sendall(b"JOIN #foo\r\n")
while helpers.receive_line(user_alice) != b":Bob!BobUsr@127.0.0.1 JOIN #foo\r\n":
pass
while helpers.receive_line(user_bob) != b":mantatail 366 Bob #foo :End of /NAMES list.\r\n":
pass
user_bob.sendall(b"KICK #foo Alice\r\n")
assert helpers.receive_line(user_bob) == b":mantatail 482 Bob #foo :You're not channel operator\r\n"

def test_ban_functionality(user_alice, user_bob, user_charlie, helpers):
    user_alice.sendall(b"JOIN #foo\r\n")
    time.sleep(0.1)
    user_bob.sendall(b"JOIN #foo\r\n")
    while helpers.receive_line(user_alice) != b":Bob!BobUsr@127.0.0.1 JOIN #foo\r\n":
        pass
    while helpers.receive_line(user_bob) != b":mantatail 366 Bob #foo :End of /NAMES list.\r\n":
        pass
    user_alice.sendall(b"MODE #foo +b Bob\r\n")
    assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 MODE #foo +b Bob!*@*\r\n"
    assert helpers.receive_line(user_bob) == b":Alice!AliceUsr@127.0.0.1 MODE #foo +b Bob!*@*\r\n"
    user_bob.sendall(b"PRIVMSG #foo :This is a message\r\n")
    assert helpers.receive_line(user_bob) == b":mantatail 404 Bob #foo :Cannot send to nick/channel\r\n"
    user_bob.sendall(b"PART #foo\r\n")
    assert helpers.receive_line(user_bob) == b":Bob!BobUsr@127.0.0.1 PART #foo\r\n"
    assert helpers.receive_line(user_alice) == b":Bob!BobUsr@127.0.0.1 PART #foo\r\n"
    user_bob.sendall(b"JOIN #foo\r\n")
    assert helpers.receive_line(user_bob) == b":mantatail 474 Bob #foo :Cannot join channel (+b) - you are banned\r\n"
    time.sleep(0.1)
    user_alice.sendall(b"MODE #foo +b\r\n")
    assert helpers.receive_line(user_alice) == b":mantatail 367 Alice #foo Bob!*@* Alice!AliceUsr@127.0.0.1\r\n"
    assert helpers.receive_line(user_alice) == b":mantatail 368 Alice #foo :End of Channel Ban List\r\n"
    user_alice.sendall(b"MODE #foo -b Bob\r\n")
    assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 MODE #foo -b Bob!*@*\r\n"
    user_bob.sendall(b"JOIN #foo\r\n")
    while helpers.receive_line(user_alice) != b":Bob!BobUsr@127.0.0.1 JOIN #foo\r\n":
        pass
    while helpers.receive_line(user_bob) != b":mantatail 366 Bob #foo :End of /NAMES list.\r\n":
        pass
    user_bob.sendall(b"MODE #foo +b Alice\r\n")
    assert helpers.receive_line(user_bob) == b":mantatail 482 Bob #foo :You're not channel operator\r\n"
    user_alice.sendall(b"MODE #foo +b BobUsr@\r\n")
    assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 MODE #foo +b *!BobUsr@*\r\n"
    assert helpers.receive_line(user_bob) == b":Alice!AliceUsr@127.0.0.1 MODE #foo +b *!BobUsr@*\r\n"
    user_bob.sendall(b"PRIVMSG #foo :This is a message\r\n")
    assert helpers.receive_line(user_bob) == b":mantatail 404 Bob #foo :Cannot send to nick/channel\r\n"
    user_alice.sendall(b"MODE #foo +b @127.0.0.1\r\n")
    assert helpers.receive_line(user_alice) == b":Alice!AliceUsr@127.0.0.1 MODE #foo +b *!*@127.0.0.1\r\n"
    assert helpers.receive_line(user_bob) == b":Alice!AliceUsr@127.0.0.1 MODE #foo +b *!*@127.0.0.1\r\n"
    user_charlie.sendall(b"JOIN #foo\r\n")
    assert (
        helpers.receive_line(user_charlie)
        == b":mantatail 474 Charlie #foo :Cannot join channel (+b) - you are banned\r\n"
    )
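The three `MODE +b` forms exercised above imply a mask-normalization rule: a bare nick expands to `nick!*@*`, a `user@` fragment to `*!user@*`, and an `@host` fragment to `*!*@host`. A minimal sketch of that expansion (a hypothetical helper, not part of mantatail's actual code) matching the expansions the assertions check:

```python
def normalize_ban_mask(mask: str) -> str:
    """Expand a partial IRC ban mask into full nick!user@host form.

    Mirrors the expansions asserted in the tests above:
    "Bob" -> "Bob!*@*", "BobUsr@" -> "*!BobUsr@*", "@127.0.0.1" -> "*!*@127.0.0.1".
    """
    if "!" not in mask and "@" not in mask:
        # Bare nick: wildcard both user and host.
        return f"{mask}!*@*"
    if "!" not in mask:
        # user@host fragment: wildcard the nick and any missing part.
        user, _, host = mask.partition("@")
        return f"*!{user or '*'}@{host or '*'}"
    # Full (or partially full) nick!user@host mask: fill blanks with wildcards.
    nick, _, rest = mask.partition("!")
    user, _, host = rest.partition("@")
    return f"{nick or '*'}!{user or '*'}@{host or '*'}"
```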

### Netcat tests


def test_channel_owner_kick_self(run_server, helpers):
    """Checks that a channel is properly removed when a channel's last user (operator) kicks themselves."""
    with socket.socket() as nc:
        nc.connect(("localhost", 6667))
        nc.sendall(b"NICK nc\n")
        nc.sendall(b"USER nc 0 * :netcat\n")
        nc.sendall(b"JOIN #foo\n")
        while helpers.receive_line(nc) != b":mantatail 366 nc #foo :End of /NAMES list.\r\n":
            pass
        nc.sendall(b"KICK #foo nc\n")
        assert helpers.receive_line(nc) == b":nc!nc@127.0.0.1 KICK #foo nc :nc\r\n"
        nc.sendall(b"QUIT\n")

    with socket.socket() as nc:
        nc.connect(("localhost", 6667))
        nc.sendall(b"NICK nc\n")
        nc.sendall(b"USER nc 0 * :netcat\n")
        while helpers.receive_line(nc) != b":mantatail 376 nc :End of /MOTD command\r\n":
            pass
        nc.sendall(b"PART #foo\n")
        assert helpers.receive_line(nc) == b":mantatail 403 nc #foo :No such channel\r\n"
        nc.sendall(b"JOIN #foo\n")
        while helpers.receive_line(nc) != b":mantatail 366 nc #foo :End of /NAMES list.\r\n":
            pass
        nc.sendall(b"KICK #foo nc\n")
        assert helpers.receive_line(nc) == b":nc!nc@127.0.0.1 KICK #foo nc :nc\r\n"
        nc.sendall(b"QUIT\n")
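The `helpers.receive_line` fixture these tests lean on is not shown in this excerpt. A plausible stand-in (an assumption about its behavior, not the project's actual implementation) buffers raw socket bytes per connection and hands back one newline-terminated IRC line at a time:

```python
def receive_line(sock, _buffers={}):
    """Read one newline-terminated IRC line from a socket-like object.

    Hypothetical sketch of the helpers.receive_line fixture used above.
    Leftover bytes after the first newline are kept per socket between
    calls (the module-level default dict is a deliberate shortcut here).
    """
    buf = _buffers.setdefault(id(sock), bytearray())
    while b"\n" not in buf:
        chunk = sock.recv(4096)
        if not chunk:
            raise ConnectionError("socket closed before a full line arrived")
        buf += chunk
    line, _, rest = bytes(buf).partition(b"\n")
    _buffers[id(sock)] = bytearray(rest)
    # Return the line including its terminator, matching the b"...\r\n" asserts.
    return line + b"\n"
```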
| 41.219512 | 119 | 0.670118 | 3,545 | 20,280 | 3.705783 | 0.04598 | 0.032275 | 0.175383 | 0.204308 | 0.934536 | 0.929208 | 0.91954 | 0.897617 | 0.880186 | 0.865494 | 0 | 0.036654 | 0.179389 | 20,280 | 491 | 120 | 41.303462 | 0.752734 | 0.01642 | 0 | 0.691011 | 0 | 0.117978 | 0.369921 | 0.067021 | 0 | 0 | 0 | 0.002037 | 0.224719 | 1 | 0.042135 | false | 0.098315 | 0.008427 | 0 | 0.050562 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
e02aafd2f2fd2f6cd54002b571721a867452f82b | 23,020 | py | Python | pyEX/premium/wallstreethorizon/__init__.py | andrescevp/pyEX | 4c8daa411b01133a292d341a78f6e1b80cc2be99 | [
"Apache-2.0"
] | null | null | null | pyEX/premium/wallstreethorizon/__init__.py | andrescevp/pyEX | 4c8daa411b01133a292d341a78f6e1b80cc2be99 | [
"Apache-2.0"
] | null | null | null | pyEX/premium/wallstreethorizon/__init__.py | andrescevp/pyEX | 4c8daa411b01133a292d341a78f6e1b80cc2be99 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
from functools import wraps

from ...stocks import timeSeries, timeSeriesDF
from ...common import _interval


@_interval(hours=4)
def _base(id, symbol="", **kwargs):
    """internal"""
    kwargs["id"] = id
    kwargs["key"] = symbol or kwargs.pop("key", "")
    return timeSeries(**kwargs)


@_interval(hours=4)
def _baseDF(id, symbol="", **kwargs):
    """internal"""
    kwargs["id"] = id
    kwargs["key"] = symbol or kwargs.pop("key", "")
    return timeSeriesDF(**kwargs)
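Every public wrapper below just pins the time-series `id` and maps `symbol` onto the `key` filter before delegating. The pattern can be illustrated in isolation with a stub in place of `timeSeries` (the stub, `make_base` factory, and the `last=5` passthrough kwarg are illustrative assumptions, not pyEX API):

```python
def time_series_stub(**kwargs):
    # Stand-in for the real pyEX timeSeries call: just echo the request kwargs.
    return kwargs


def make_base(time_series):
    # Same shape as _base above: pin the dataset id, map symbol -> key, forward the rest.
    def _base(id, symbol="", **kwargs):
        kwargs["id"] = id
        kwargs["key"] = symbol or kwargs.pop("key", "")
        return time_series(**kwargs)
    return _base


base = make_base(time_series_stub)
req = base("PREMIUM_WALLSTREETHORIZON_ANALYST_DAY", symbol="AAPL", last=5)
# req == {"id": "PREMIUM_WALLSTREETHORIZON_ANALYST_DAY", "key": "AAPL", "last": 5}
```

When `symbol` is empty, an explicit `key=` keyword argument is used instead, which is why each wrapper accepts `symbol=""` as its default.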

@wraps(timeSeries)
def analystDays(symbol="", **kwargs):
    """This is a meeting where company executives provide information about the company’s performance and its future prospects.

    https://iexcloud.io/docs/api/#analyst-days

    Args:
        symbol (str): symbol to use
    """
    return _base(id="PREMIUM_WALLSTREETHORIZON_ANALYST_DAY", symbol=symbol, **kwargs)


@wraps(timeSeries)
def analystDaysDF(symbol="", **kwargs):
    """This is a meeting where company executives provide information about the company’s performance and its future prospects.

    https://iexcloud.io/docs/api/#analyst-days

    Args:
        symbol (str): symbol to use
    """
    return _baseDF(id="PREMIUM_WALLSTREETHORIZON_ANALYST_DAY", symbol=symbol, **kwargs)

@wraps(timeSeries)
def boardOfDirectorsMeeting(symbol="", **kwargs):
    """This is an end-point for getting information about a formal meeting of a company’s board of directors to establish corporate management related policies and to make decisions on major company issues.

    https://iexcloud.io/docs/api/#board-of-directors-meeting

    Args:
        symbol (str): symbol to use
    """
    return _base(
        id="PREMIUM_WALLSTREETHORIZON_BOARD_OF_DIRECTORS_MEETING",
        symbol=symbol,
        **kwargs
    )


@wraps(timeSeries)
def boardOfDirectorsMeetingDF(symbol="", **kwargs):
    """This is an end-point for getting information about a formal meeting of a company’s board of directors to establish corporate management related policies and to make decisions on major company issues.

    https://iexcloud.io/docs/api/#board-of-directors-meeting

    Args:
        symbol (str): symbol to use
    """
    return _baseDF(
        id="PREMIUM_WALLSTREETHORIZON_BOARD_OF_DIRECTORS_MEETING",
        symbol=symbol,
        **kwargs
    )

@wraps(timeSeries)
def businessUpdates(symbol="", **kwargs):
    """This is a meeting or conference call in which company information is reviewed by one or more company executives.

    https://iexcloud.io/docs/api/#business-updates

    Args:
        symbol (str): symbol to use
    """
    return _base(
        id="PREMIUM_WALLSTREETHORIZON_BUSINESS_UPDATE", symbol=symbol, **kwargs
    )


@wraps(timeSeries)
def businessUpdatesDF(symbol="", **kwargs):
    """This is a meeting or conference call in which company information is reviewed by one or more company executives.

    https://iexcloud.io/docs/api/#business-updates

    Args:
        symbol (str): symbol to use
    """
    return _baseDF(
        id="PREMIUM_WALLSTREETHORIZON_BUSINESS_UPDATE", symbol=symbol, **kwargs
    )
@wraps(timeSeries)
def buybacks(symbol="", **kwargs):
"""The repurchase of outstanding shares by a company to reduce the number of shares on the market.
https://iexcloud.io/docs/api/#buybacks
Args:
symbol (str): symbol to use
"""
return _base(id="PREMIUM_WALLSTREETHORIZON_BUYBACK", symbol=symbol, **kwargs)
@wraps(timeSeries)
def buybacksDF(symbol="", **kwargs):
"""The repurchase of outstanding shares by a company to reduce the number of shares on the market.
https://iexcloud.io/docs/api/#buybacks
Args:
symbol (str): symbol to use
"""
return _baseDF(id="PREMIUM_WALLSTREETHORIZON_BUYBACK", symbol=symbol, **kwargs)
@wraps(timeSeries)
def capitalMarketsDay(symbol="", **kwargs):
"""This is a meeting where company executives provide information about the company’s performance and its future prospects.
https://iexcloud.io/docs/api/#capital-markets-day
Args:
symbol (str): symbol to use
"""
return _base(
id="PREMIUM_WALLSTREETHORIZON_CAPITAL_MARKETS_DAY", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def capitalMarketsDayDF(symbol="", **kwargs):
"""This is a meeting where company executives provide information about the company’s performance and its future prospects.
https://iexcloud.io/docs/api/#capital-markets-day
Args:
symbol (str): symbol to use
"""
return _baseDF(
id="PREMIUM_WALLSTREETHORIZON_CAPITAL_MARKETS_DAY", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def companyTravel(symbol="", **kwargs):
"""This is a roadshow or bus tour event in which one or more company executives speaks to interested investors and analysts.
https://iexcloud.io/docs/api/#company-travel
Args:
symbol (str): symbol to use
"""
return _base(id="PREMIUM_WALLSTREETHORIZON_COMPANY_TRAVEL", symbol=symbol, **kwargs)
@wraps(timeSeries)
def companyTravelDF(symbol="", **kwargs):
"""This is a roadshow or bus tour event in which one or more company executives speaks to interested investors and analysts.
https://iexcloud.io/docs/api/#company-travel
Args:
symbol (str): symbol to use
"""
return _baseDF(
id="PREMIUM_WALLSTREETHORIZON_COMPANY_TRAVEL", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def filingDueDates(symbol="", **kwargs):
"""This is an estimated date, based on historical trends for this company in which a company must file the appropriate Form for the quarter/year or file for an extension.
https://iexcloud.io/docs/api/#filing-due-dates
Args:
symbol (str): symbol to use
"""
return _base(
id="PREMIUM_WALLSTREETHORIZON_FILING_DUE_DATE", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def filingDueDatesDF(symbol="", **kwargs):
"""This is an estimated date, based on historical trends for this company in which a company must file the appropriate Form for the quarter/year or file for an extension.
https://iexcloud.io/docs/api/#filing-due-dates
Args:
symbol (str): symbol to use
"""
return _baseDF(
id="PREMIUM_WALLSTREETHORIZON_FILING_DUE_DATE", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def fiscalQuarterEnd(symbol="", **kwargs):
"""This is a forecasted quarterly ending announcement date for a company. This may or may not correspond to a calendar quarter.
https://iexcloud.io/docs/api/#fiscal-quarter-end
Args:
symbol (str): symbol to use
"""
return _base(
id="PREMIUM_WALLSTREETHORIZON_FISCAL_QUARTER_END_DATE", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def fiscalQuarterEndDF(symbol="", **kwargs):
"""This is a forecasted quarterly ending announcement date for a company. This may or may not correspond to a calendar quarter.
https://iexcloud.io/docs/api/#fiscal-quarter-end
Args:
symbol (str): symbol to use
"""
return _baseDF(
id="PREMIUM_WALLSTREETHORIZON_FISCAL_QUARTER_END_DATE", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def forum(symbol="", **kwargs):
"""This is a meeting where ideas and views of a business nature can be exchanged.
https://iexcloud.io/docs/api/#forum
Args:
symbol (str): symbol to use
"""
return _base(id="PREMIUM_WALLSTREETHORIZON_FORUM", symbol=symbol, **kwargs)
@wraps(timeSeries)
def forumDF(symbol="", **kwargs):
"""This is a meeting where ideas and views of a business nature can be exchanged.
https://iexcloud.io/docs/api/#forum
Args:
symbol (str): symbol to use
"""
return _baseDF(id="PREMIUM_WALLSTREETHORIZON_FORUM", symbol=symbol, **kwargs)
@wraps(timeSeries)
def generalConference(symbol="", **kwargs):
"""This is a formal meeting in which representatives of many companies gather to discuss ideas or issues related to a particular topic or business, usually held for several days. This item indicates at least one representative from the company will be presenting at the conference on the specified date and time. Note: Conference details include full Conference dates.
https://iexcloud.io/docs/api/#general-conference
Args:
symbol (str): symbol to use
"""
return _base(
id="PREMIUM_WALLSTREETHORIZON_GENERAL_CONFERENCE", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def generalConferenceDF(symbol="", **kwargs):
"""This is a formal meeting in which representatives of many companies gather to discuss ideas or issues related to a particular topic or business, usually held for several days. This item indicates at least one representative from the company will be presenting at the conference on the specified date and time. Note: Conference details include full Conference dates.
https://iexcloud.io/docs/api/#general-conference
Args:
symbol (str): symbol to use
"""
return _baseDF(
id="PREMIUM_WALLSTREETHORIZON_GENERAL_CONFERENCE", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def fdaAdvisoryCommitteeMeetings(symbol="", **kwargs):
"""The FDA uses 50 committees and panels to obtain independent expert advice on scientific, technical, and policy matters
https://iexcloud.io/docs/api/#fda-advisory-committee-meetings
Args:
symbol (str): symbol to use
"""
return _base(
id="PREMIUM_WALLSTREETHORIZON_STOCK_SPECIFIC_FDA_ADVISORY_COMMITTEE_MEETING",
symbol=symbol,
**kwargs
)
@wraps(timeSeries)
def fdaAdvisoryCommitteeMeetingsDF(symbol="", **kwargs):
"""The FDA uses 50 committees and panels to obtain independent expert advice on scientific, technical, and policy matters
https://iexcloud.io/docs/api/#fda-advisory-committee-meetings
Args:
symbol (str): symbol to use
"""
return _baseDF(
id="PREMIUM_WALLSTREETHORIZON_STOCK_SPECIFIC_FDA_ADVISORY_COMMITTEE_MEETING",
symbol=symbol,
**kwargs
)
@wraps(timeSeries)
def holidaysWSH(symbol="", **kwargs):
"""This returns a list of market holidays.
https://iexcloud.io/docs/api/#holidays
Args:
symbol (str): symbol to use
"""
return _base(id="PREMIUM_WALLSTREETHORIZON_HOLIDAYS", symbol=symbol, **kwargs)
@wraps(timeSeries)
def holidaysWSHDF(symbol="", **kwargs):
"""This returns a list of market holidays.
https://iexcloud.io/docs/api/#holidays
Args:
symbol (str): symbol to use
"""
return _baseDF(id="PREMIUM_WALLSTREETHORIZON_HOLIDAYS", symbol=symbol, **kwargs)
@wraps(timeSeries)
def indexChanges(symbol="", **kwargs):
"""This shows additions and removals from various indexes for particular stocks.
https://iexcloud.io/docs/api/#index-changes
Args:
symbol (str): symbol to use
"""
return _base(id="PREMIUM_WALLSTREETHORIZON_INDEX_CHANGE", symbol=symbol, **kwargs)
@wraps(timeSeries)
def indexChangesDF(symbol="", **kwargs):
"""This shows additions and removals from various indexes for particular stocks.
https://iexcloud.io/docs/api/#index-changes
Args:
symbol (str): symbol to use
"""
return _baseDF(id="PREMIUM_WALLSTREETHORIZON_INDEX_CHANGE", symbol=symbol, **kwargs)
@wraps(timeSeries)
def iposWSH(symbol="", **kwargs):
"""Get a list of upcoming IPOs.
https://iexcloud.io/docs/api/#ipos
Args:
symbol (str): symbol to use
"""
return _base(
id="PREMIUM_WALLSTREETHORIZON_INITIAL_PUBLIC_OFFERING", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def iposWSHDF(symbol="", **kwargs):
"""Get a list of upcoming IPOs.
https://iexcloud.io/docs/api/#ipos
Args:
symbol (str): symbol to use
"""
return _baseDF(
id="PREMIUM_WALLSTREETHORIZON_INITIAL_PUBLIC_OFFERING", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def legalActions(symbol="", **kwargs):
"""These are legal actions where an individual represents a group in a court claim. The judgment from the suit is for all the members of the group or class.
https://iexcloud.io/docs/api/#legal-actions
Args:
symbol (str): symbol to use
"""
return _base(id="PREMIUM_WALLSTREETHORIZON_LEGAL_ACTIONS", symbol=symbol, **kwargs)
@wraps(timeSeries)
def legalActionsDF(symbol="", **kwargs):
"""These are legal actions where an individual represents a group in a court claim. The judgment from the suit is for all the members of the group or class.
https://iexcloud.io/docs/api/#legal-actions
Args:
symbol (str): symbol to use
"""
return _baseDF(
id="PREMIUM_WALLSTREETHORIZON_LEGAL_ACTIONS", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def mergersAndAcquisitions(symbol="", **kwargs):
"""These are a type of corporate action in which two companies combine to form a single company, or one company is taken over by another.
https://iexcloud.io/docs/api/#mergers-acquisitions
Args:
symbol (str): symbol to use
"""
return _base(
id="PREMIUM_WALLSTREETHORIZON_MERGER_ACQUISITIONS", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def mergersAndAcquisitionsDF(symbol="", **kwargs):
"""These are a type of corporate action in which two companies combine to form a single company, or one company is taken over by another.
https://iexcloud.io/docs/api/#mergers-acquisitions
Args:
symbol (str): symbol to use
"""
return _baseDF(
id="PREMIUM_WALLSTREETHORIZON_MERGER_ACQUISITIONS", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def productEvents(symbol="", **kwargs):
"""Represents movie and video releases. This is the date on which a movie distributor plans to release a movie to theaters
https://iexcloud.io/docs/api/#product-events
Args:
symbol (str): symbol to use
"""
return _base(id="PREMIUM_WALLSTREETHORIZON_PRODUCT_EVENTS", symbol=symbol, **kwargs)
@wraps(timeSeries)
def productEventsDF(symbol="", **kwargs):
"""Represents movie and video releases. This is the date on which a movie distributor plans to release a movie to theaters
https://iexcloud.io/docs/api/#product-events
Args:
symbol (str): symbol to use
"""
return _baseDF(
id="PREMIUM_WALLSTREETHORIZON_PRODUCT_EVENTS", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def researchAndDevelopmentDays(symbol="", **kwargs):
"""This is a day in which investors and analysts can meet with a company’s R&D representatives to learn more about new or improved products and services.
https://iexcloud.io/docs/api/#research-and-development-days
Args:
symbol (str): symbol to use
"""
return _base(id="PREMIUM_WALLSTREETHORIZON_RD_DAY", symbol=symbol, **kwargs)
@wraps(timeSeries)
def researchAndDevelopmentDaysDF(symbol="", **kwargs):
"""This is a day in which investors and analysts can meet with a company’s R&D representatives to learn more about new or improved products and services.
https://iexcloud.io/docs/api/#research-and-development-days
Args:
symbol (str): symbol to use
"""
return _baseDF(id="PREMIUM_WALLSTREETHORIZON_RD_DAY", symbol=symbol, **kwargs)
@wraps(timeSeries)
def sameStoreSales(symbol="", **kwargs):
"""Same-store sales, also referred to as comparable-store sales, SSS or identical-store sales, is a financial metric that companies in the retail industry use to evaluate the total dollar amount of sales in the company’s stores that have been operating for a year or more.
https://iexcloud.io/docs/api/#same-store-sales
Args:
symbol (str): symbol to use
"""
return _base(
id="PREMIUM_WALLSTREETHORIZON_SAME_STORE_SALES", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def sameStoreSalesDF(symbol="", **kwargs):
"""Same-store sales, also referred to as comparable-store sales, SSS or identical-store sales, is a financial metric that companies in the retail industry use to evaluate the total dollar amount of sales in the company’s stores that have been operating for a year or more.
https://iexcloud.io/docs/api/#same-store-sales
Args:
symbol (str): symbol to use
"""
return _baseDF(
id="PREMIUM_WALLSTREETHORIZON_SAME_STORE_SALES", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def secondaryOfferings(symbol="", **kwargs):
"""Secondary Offerings are the issuance of new stock for public sale from a company that has already made its initial public offering (IPO).
Usually, these kinds of public offerings are made by companies wishing to refinance, or raise capital for growth.
Money raised from these kinds of secondary offerings goes to the company, through the investment bank that underwrites the offering.
Investment banks are issued an allotment, and possibly an overallotment which they may choose to exercise if there is a strong possibility of making money on the spread between the allotment price and the selling price of the securities. Short Selling is prohibited during the period of the secondary offering.
https://iexcloud.io/docs/api/#secondary-offerings
Args:
symbol (str): symbol to use
"""
return _base(
id="PREMIUM_WALLSTREETHORIZON_SECONDARY_OFFERING", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def secondaryOfferingsDF(symbol="", **kwargs):
"""Secondary Offerings are the issuance of new stock for public sale from a company that has already made its initial public offering (IPO).
Usually, these kinds of public offerings are made by companies wishing to refinance, or raise capital for growth.
Money raised from these kinds of secondary offerings goes to the company, through the investment bank that underwrites the offering.
Investment banks are issued an allotment, and possibly an overallotment which they may choose to exercise if there is a strong possibility of making money on the spread between the allotment price and the selling price of the securities. Short Selling is prohibited during the period of the secondary offering.
https://iexcloud.io/docs/api/#secondary-offerings
Args:
symbol (str): symbol to use
"""
return _baseDF(
id="PREMIUM_WALLSTREETHORIZON_SECONDARY_OFFERING", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def seminars(symbol="", **kwargs):
"""This is an educational event that features one or more subject matter experts delivering information via lecture and discussion.
https://iexcloud.io/docs/api/#seminars
Args:
symbol (str): symbol to use
"""
return _base(id="PREMIUM_WALLSTREETHORIZON_SEMINAR", symbol=symbol, **kwargs)
@wraps(timeSeries)
def seminarsDF(symbol="", **kwargs):
"""This is an educational event that features one or more subject matter experts delivering information via lecture and discussion.
https://iexcloud.io/docs/api/#seminars
Args:
symbol (str): symbol to use
"""
return _baseDF(id="PREMIUM_WALLSTREETHORIZON_SEMINAR", symbol=symbol, **kwargs)
@wraps(timeSeries)
def shareholderMeetings(symbol="", **kwargs):
"""This is a meeting, held at least annually, to elect members to the board of directors and hear reports on the business’ financial situation as well as new policy initiatives from the corporation’s management.
https://iexcloud.io/docs/api/#shareholder-meetings
Args:
symbol (str): symbol to use
"""
return _base(
id="PREMIUM_WALLSTREETHORIZON_SHAREHOLDER_MEETING", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def shareholderMeetingsDF(symbol="", **kwargs):
"""This is a meeting, held at least annually, to elect members to the board of directors and hear reports on the business’ financial situation as well as new policy initiatives from the corporation’s management.
https://iexcloud.io/docs/api/#shareholder-meetings
Args:
symbol (str): symbol to use
"""
return _baseDF(
id="PREMIUM_WALLSTREETHORIZON_SHAREHOLDER_MEETING", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def summitMeetings(symbol="", **kwargs):
"""This is a gathering of people who are interested in the same business subject or topic.
https://iexcloud.io/docs/api/#summit-meetings
Args:
symbol (str): symbol to use
"""
return _base(id="PREMIUM_WALLSTREETHORIZON_SUMMIT_MEETING", symbol=symbol, **kwargs)
@wraps(timeSeries)
def summitMeetingsDF(symbol="", **kwargs):
"""This is a gathering of people who are interested in the same business subject or topic.
https://iexcloud.io/docs/api/#summit-meetings
Args:
symbol (str): symbol to use
"""
return _baseDF(
id="PREMIUM_WALLSTREETHORIZON_SUMMIT_MEETING", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def tradeShows(symbol="", **kwargs):
"""This is a large gathering in which different companies in a particular field or industry show their products to possible customers.
https://iexcloud.io/docs/api/#trade-shows
Args:
symbol (str): symbol to use
"""
return _base(id="PREMIUM_WALLSTREETHORIZON_TRADE_SHOW", symbol=symbol, **kwargs)
@wraps(timeSeries)
def tradeShowsDF(symbol="", **kwargs):
"""This is a large gathering in which different companies in a particular field or industry show their products to possible customers.
https://iexcloud.io/docs/api/#trade-shows
Args:
symbol (str): symbol to use
"""
return _baseDF(id="PREMIUM_WALLSTREETHORIZON_TRADE_SHOW", symbol=symbol, **kwargs)
@wraps(timeSeries)
def witchingHours(symbol="", **kwargs):
"""This is when option contracts and futures contracts expire on the exact same day.
https://iexcloud.io/docs/api/#witching-hours
Args:
symbol (str): symbol to use
"""
return _base(id="PREMIUM_WALLSTREETHORIZON_WITCHING_HOURS", symbol=symbol, **kwargs)
@wraps(timeSeries)
def witchingHoursDF(symbol="", **kwargs):
"""This is when option contracts and futures contracts expire on the exact same day.
https://iexcloud.io/docs/api/#witching-hours
Args:
symbol (str): symbol to use
"""
return _baseDF(
id="PREMIUM_WALLSTREETHORIZON_WITCHING_HOURS", symbol=symbol, **kwargs
)
@wraps(timeSeries)
def workshops(symbol="", **kwargs):
"""This is a meeting or series of meetings at which a group of people engage in discussion and activity on a particular subject, product or service to gain hands-on experience.
https://iexcloud.io/docs/api/#workshops
Args:
symbol (str): symbol to use
"""
return _base(id="PREMIUM_WALLSTREETHORIZON_WORKSHOP", symbol=symbol, **kwargs)
@wraps(timeSeries)
def workshopsDF(symbol="", **kwargs):
"""This is a meeting or series of meetings at which a group of people engage in discussion and activity on a particular subject, product or service to gain hands-on experience.
https://iexcloud.io/docs/api/#workshops
Args:
symbol (str): symbol to use
"""
return _baseDF(id="PREMIUM_WALLSTREETHORIZON_WORKSHOP", symbol=symbol, **kwargs)
| 34.512744 | 372 | 0.707515 | 2,951 | 23,020 | 5.442223 | 0.136225 | 0.079203 | 0.067995 | 0.077709 | 0.926837 | 0.925592 | 0.924471 | 0.921233 | 0.921233 | 0.921233 | 0 | 0.000375 | 0.189661 | 23,020 | 666 | 373 | 34.564565 | 0.860566 | 0.547915 | 0 | 0.506383 | 0 | 0 | 0.23472 | 0.232987 | 0 | 0 | 0 | 0 | 0 | 1 | 0.229787 | false | 0 | 0.012766 | 0 | 0.47234 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e0640057adabf4cf70a95b0b4792a4bdf3213e39 | 5,444 | py | Python | server/migrations/versions/644fed244721_.py | uptownnickbrown/metaseq | fc6853640921ca4853b3d4ed3d3354855983db11 | [
"MIT"
] | 7 | 2017-03-27T09:57:55.000Z | 2018-06-09T17:44:31.000Z | server/migrations/versions/644fed244721_.py | uptownnickbrown/metaseq | fc6853640921ca4853b3d4ed3d3354855983db11 | [
"MIT"
] | 4 | 2019-06-05T15:07:49.000Z | 2021-12-13T19:46:40.000Z | server/migrations/versions/644fed244721_.py | uptownnickbrown/metaseq | fc6853640921ca4853b3d4ed3d3354855983db11 | [
"MIT"
] | 1 | 2019-01-28T07:02:25.000Z | 2019-01-28T07:02:25.000Z | """empty message
Revision ID: 644fed244721
Revises: 23f9adef79ea
Create Date: 2017-07-02 17:47:39.031208
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql
# revision identifiers, used by Alembic.
revision = '644fed244721'
down_revision = '23f9adef79ea'
branch_labels = None
depends_on = None
def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.alter_column('dataset', 'library_construction_method',
                    existing_type=mysql.TEXT(charset=u'utf8', collation=u'utf8_bin'),
                    type_=sa.String(length=20),
                    existing_nullable=True)
    op.alter_column('dataset', 'library_screening_strategy',
                    existing_type=mysql.TEXT(charset=u'utf8', collation=u'utf8_bin'),
                    type_=sa.String(length=80),
                    existing_nullable=True)
    op.alter_column('dataset', 'library_source',
                    existing_type=mysql.TEXT(),
                    type_=sa.String(length=50),
                    existing_nullable=True)
    op.alter_column('dataset', 'library_strategy',
                    existing_type=mysql.TEXT(charset=u'utf8', collation=u'utf8_bin'),
                    type_=sa.String(length=50),
                    existing_nullable=True)
    op.alter_column('dataset', 'sequencing_method',
                    existing_type=mysql.TEXT(charset=u'utf8', collation=u'utf8_bin'),
                    type_=sa.String(length=50),
                    existing_nullable=True)
    op.alter_column('dataset', 'env_biome',
                    existing_type=mysql.TEXT(),
                    type_=sa.String(length=100),
                    existing_nullable=True)
    op.alter_column('dataset', 'env_feature',
                    existing_type=mysql.TEXT(),
                    type_=sa.String(length=200),
                    existing_nullable=True)
    op.alter_column('dataset', 'env_material',
                    existing_type=mysql.TEXT(),
                    type_=sa.String(length=150),
                    existing_nullable=True)
    op.alter_column('dataset', 'env_package',
                    existing_type=mysql.TEXT(),
                    type_=sa.String(length=100),
                    existing_nullable=True)
    op.alter_column('dataset', 'geo_loc_name',
                    existing_type=mysql.TEXT(),
                    type_=sa.String(length=100),
                    existing_nullable=True)
    op.alter_column('dataset', 'instrument_model',
                    existing_type=mysql.TEXT(),
                    type_=sa.String(length=50),
                    existing_nullable=True)
    op.alter_column('dataset', 'investigation_type',
                    existing_type=mysql.TEXT(),
                    type_=sa.String(length=80),
                    existing_nullable=True)
    op.alter_column('dataset', 'study_type',
                    existing_type=mysql.TEXT(),
                    type_=sa.String(length=50),
                    existing_nullable=True)
    # ### end Alembic commands ###
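The thirteen hand-written `alter_column` calls above differ only in column name, target length, and whether the source TEXT column used the binary utf8 collation. A table-driven equivalent (a sketch, not part of the generated migration; `op`, `sa`, and `mysql` are passed as parameters here purely so the loop can be exercised in isolation) keeps `upgrade` and `downgrade` symmetric by construction:

```python
# (column, target_length, uses_utf8_bin_collation) rows driving both directions.
COLUMNS = [
    ("library_construction_method", 20, True),
    ("library_screening_strategy", 80, True),
    ("library_source", 50, False),
    ("library_strategy", 50, True),
    ("sequencing_method", 50, True),
    ("env_biome", 100, False),
    ("env_feature", 200, False),
    ("env_material", 150, False),
    ("env_package", 100, False),
    ("geo_loc_name", 100, False),
    ("instrument_model", 50, False),
    ("investigation_type", 80, False),
    ("study_type", 50, False),
]


def alter_all(op, sa, mysql, to_string=True):
    """Issue one alter_column per row; to_string=False reverses the direction."""
    for name, length, binary in COLUMNS:
        text_type = (mysql.TEXT(charset=u'utf8', collation=u'utf8_bin')
                     if binary else mysql.TEXT())
        string_type = sa.String(length=length)
        op.alter_column(
            'dataset', name,
            existing_type=text_type if to_string else string_type,
            type_=string_type if to_string else text_type,
            existing_nullable=True)
```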

def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.alter_column('dataset', 'sequencing_method',
                    existing_type=sa.String(length=50),
                    type_=mysql.TEXT(charset=u'utf8', collation=u'utf8_bin'),
                    existing_nullable=True)
    op.alter_column('dataset', 'library_strategy',
                    existing_type=sa.String(length=50),
                    type_=mysql.TEXT(charset=u'utf8', collation=u'utf8_bin'),
                    existing_nullable=True)
    op.alter_column('dataset', 'library_source',
                    existing_type=sa.String(length=50),
                    type_=mysql.TEXT(),
                    existing_nullable=True)
    op.alter_column('dataset', 'library_screening_strategy',
                    existing_type=sa.String(length=80),
                    type_=mysql.TEXT(charset=u'utf8', collation=u'utf8_bin'),
                    existing_nullable=True)
    op.alter_column('dataset', 'library_construction_method',
                    existing_type=sa.String(length=20),
                    type_=mysql.TEXT(charset=u'utf8', collation=u'utf8_bin'),
                    existing_nullable=True)
    op.alter_column('dataset', 'study_type',
                    existing_type=sa.String(length=50),
                    type_=mysql.TEXT(),
                    existing_nullable=True)
    op.alter_column('dataset', 'investigation_type',
                    existing_type=sa.String(length=80),
                    type_=mysql.TEXT(),
                    existing_nullable=True)
    op.alter_column('dataset', 'instrument_model',
                    existing_type=sa.String(length=50),
                    type_=mysql.TEXT(),
                    existing_nullable=True)
    op.alter_column('dataset', 'geo_loc_name',
                    existing_type=sa.String(length=100),
                    type_=mysql.TEXT(),
                    existing_nullable=True)
    op.alter_column('dataset', 'env_package',
                    existing_type=sa.String(length=100),
                    type_=mysql.TEXT(),
                    existing_nullable=True)
    op.alter_column('dataset', 'env_material',
                    existing_type=sa.String(length=150),
                    type_=mysql.TEXT(),
                    existing_nullable=True)
    op.alter_column('dataset', 'env_feature',
                    existing_type=sa.String(length=200),
                    type_=mysql.TEXT(),
                    existing_nullable=True)
    op.alter_column('dataset', 'env_biome',
                    existing_type=sa.String(length=100),
                    type_=mysql.TEXT(),
                    existing_nullable=True)
    # ### end Alembic commands ###
| 41.557252 | 80 | 0.599375 | 599 | 5,444 | 5.202003 | 0.138564 | 0.058408 | 0.108472 | 0.166881 | 0.909499 | 0.895379 | 0.879332 | 0.879332 | 0.853659 | 0.824775 | 0 | 0.032069 | 0.278288 | 5,444 | 130 | 81 | 41.876923 | 0.761008 | 0.054188 | 0 | 0.867257 | 0 | 0 | 0.136933 | 0.020736 | 0 | 0 | 0 | 0 | 0 | 1 | 0.017699 | false | 0 | 0.026549 | 0 | 0.044248 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
16275a0989f7dbc620038f855ae3563cca21ceaa | 71,688 | py | Python | cfg/config.py | a1247418/MT18_LH_human-sleep-classification | c4a40571390aaa14b1cc8a458100e21252fe05d2 | [
"MIT"
] | null | null | null | cfg/config.py | a1247418/MT18_LH_human-sleep-classification | c4a40571390aaa14b1cc8a458100e21252fe05d2 | [
"MIT"
] | null | null | null | cfg/config.py | a1247418/MT18_LH_human-sleep-classification | c4a40571390aaa14b1cc8a458100e21252fe05d2 | [
"MIT"
] | null | null | null | import os
import torch
import sys
from sacred import Experiment
from sacred.observers import MongoObserver, FileStorageObserver
root_dir = os.path.abspath(os.path.join(os.path.dirname('__file__'), '..'))
sys.path.insert(0, root_dir)
basedir = os.path.join(root_dir, 'sleeplearning', 'lib')
ex = Experiment(base_dir=basedir)
mongo_url = 'mongodb://toor:y0qXDe3qumoawG0rPfnS@cab-e81-31/admin?authMechanism' \
'=SCRAM-SHA-1'
MONGO_OBSERVER = MongoObserver.create(url=mongo_url, db_name='sacred')
#ex.observers.append(MONGO_OBSERVER)
LOGDIR = "/cluster/scratch/llorenz/logs"#'../logs'
ex.observers.append(FileStorageObserver.create(LOGDIR))
@ex.config
def cfg():
    cmt = ''  # comment for this run
    cuda = torch.cuda.is_available()
    seed = 42  # for reproducibility
    log_dir = LOGDIR
    save_model = False
    save_best_only = False
    early_stop = True
    unsupervized = False

    # default dataset settings
    ds = {
        'channels': None,
        'data_dir': os.path.join('../data/sleepedf'),
        'train_csv': os.path.join('../cfg/sleepedf/cv_train.csv'),
        'val_csv': os.path.join('../cfg/sleepedf/cv_val.csv'),
        'tune_csv': None,
        'batch_size_train': 32,
        'batch_size_val': 128,
        'loader': 'Sleepedf',  # 'Physionet18'
        'nbrs': 8,
        'osnbrs': False,  # one-sided neighbours: only consider neighbours to the left
        'fold': None,  # only specify for CV
        'nfolds': None,
        'oversample': False,
        'transforms': None,
        'nclasses': 5,
    }
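The named configs that follow override individual keys of the default `ds`/`ms` dicts from `cfg()`; Sacred layers them over the base config rather than replacing whole dicts. That merge behaves like a shallow per-key dict update, sketched here in plain Python (an illustration of the semantics, not Sacred's actual implementation):

```python
def apply_named_config(base, override):
    """Merge a named-config dict onto a base config, one nesting level deep."""
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            # Nested dicts (e.g. ds, ms) keep unmentioned keys from the base.
            merged[key] = {**merged[key], **value}
        else:
            merged[key] = value
    return merged


defaults = {"ds": {"nbrs": 8, "transforms": None}, "save_model": False}
with_dropout = apply_named_config(
    defaults, {"ds": {"transforms": ["SensorDropout((.1,.1,.1))"]}})
# with_dropout["ds"] == {"nbrs": 8, "transforms": ["SensorDropout((.1,.1,.1))"]}
```

So selecting e.g. `ChannelDropout3` only swaps `ds['transforms']` while `nbrs`, batch sizes, and the rest of `ds` keep their defaults.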

@ex.named_config
def ChannelDropout10():
    ds = {
        'transforms': ['SensorDropout((.1,.1,.1,.1,.1,.1,.1,.1,.1,.1))']
    }


@ex.named_config
def ChannelDropout7():
    ds = {
        'transforms': ['SensorDropout((.1,.1,.1,.1,.1,.1,.1))']
    }


@ex.named_config
def ChannelDropout10_2():
    ds = {
        'transforms': ['SensorDropout((.2,.2,.2,.2,.2,.2,.2,.2,.2,.2))']
    }


@ex.named_config
def ChannelDropout3():
    ds = {
        'transforms': ['SensorDropout((.1,.1,.1))']
    }
@ex.named_config
def MediumAdaptive():
arch = 'MediumAdaptive'
ms = {
'epochs': 100,
'dropout': .5,
'optim': 'adam,lr=0.00005',
'weighted_loss': True
}
@ex.named_config
def paris():
arch = 'Paris'
ms = {
'epochs': 100,
'dropout': .5,
'optim': 'adam,lr=0.001',
'weighted_loss': False,
}
@ex.named_config
def paris2d():
arch = 'Paris2d'
ms = {
'epochs': 100,
'dropout': .5,
'optim': 'adam,lr=0.00005',
'weighted_loss': True
}
@ex.named_config
def multvarnet():
arch = 'MultivariateNet'
ms = {
'epochs': 100,
'dropout': .5,
'optim': 'adam,lr=0.00005',
        'fc_d': [[4096, 0], [2048, 0]],
'weighted_loss': True
}
@ex.named_config
def Mode():
arch = 'Mode'
ms = {
'epochs': 100,
'dropout': .5,
'optim': 'adam,lr=0.00001',
'expert_models': ['../models/?'],
'train_emb': True,
'attention': '',
'weighted_loss': True
}
@ex.named_config
def Amoe_rs40_0():
arch = 'Amoe'
ms = {
'epochs': 100,
'dropout': .5,
'optim': 'adam,lr=0.00001',
'expert_models':
[os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0',
'2208-F3M2-rs40_0.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0',
'2235-O2M1-rs40_0.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0',
'2239-E1M2-rs40_0.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0',
'2241-C3M2-rs40_0.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0',
'2246-C4M1-rs40-0.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0',
'2247-F4M1-rs40_0.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0',
'2251-O1M2-rs40_0.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0',
'2665-CHIN-rs40_0.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0',
'2666-ABD-rs40_0.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0',
'2667-CHEST-rs40_0.pth.tar'),
],
'train_emb': False,
'weighted_loss': True
}
@ex.named_config
def Amoe_rs40_0_part1():
arch = 'Amoe'
ms = {
'epochs': 100,
'dropout': .5,
'optim': 'adam,lr=0.00001',
'expert_models':
[os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0_part1',
'2682-F3M2-rs40_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0_part1',
'2685-O2M1-rs40_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0_part1',
'2679-E1M2-rs40_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0_part1',
'2680-C3M2-rs40_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0_part1',
'2681-C4M1-rs40_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0_part1',
'2683-F4M1-rs40_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0_part1',
'2684-O1M2-rs40_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0_part1',
'2686-CHIN-rs40_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0_part1',
'2688-ABD-rs40_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs40_0_part1',
'2687-CHEST-rs40_0_part1.pth.tar'),
],
'train_emb': False,
'weighted_loss': True
}
@ex.named_config
def Amoe_rs320_0_part1_3C():
arch = 'Amoe'
ms = {
'epochs': 100,
'dropout': .5,
'optim': 'adam,lr=0.00001',
'expert_models':
[os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2711-F3M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2707-E1M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2708-C4M1-rs160_0_part1.pth.tar'),
],
'train_emb': False,
'weighted_loss': True
}
@ex.named_config
def AttentionNet_rs320_0_part1_3C():
arch = 'AttentionNet'
ms = {
'epochs': 100,
'dropout': .5,
'optim': 'adam,lr=0.00001',
'attention': True,
'normalize_context': False,
'expert_models':
[os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2711-F3M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2707-E1M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2708-C4M1-rs160_0_part1.pth.tar'),
],
'train_emb': False,
'weighted_loss': True
}
@ex.named_config
def ConvAmoe():
arch = 'ConvAmoe'
ms = {
'epochs': 100,
'dropout': .5,
'optim': 'adam,lr=0.00001',
'attention': True,
'weighted_loss': True
}
@ex.named_config
def AttentionNetConv_rs320_0_part1_3C():
arch = 'AttentionNetConv'
ms = {
'epochs': 100,
'dropout': .5,
'optim': 'adam,lr=0.00001',
'attention': True,
'expert_models':
[os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2711-F3M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2707-E1M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2708-C4M1-rs160_0_part1.pth.tar'),
],
'train_emb': False,
'weighted_loss': True
}
@ex.named_config
def AttentionNet_RS160():
arch = 'AttentionNet'
ms = {
'epochs': 100,
'dropout': .2,
'optim': 'adam,lr=0.00001',
'attention': True,
'normalize_context': False,
'context': True,
'expert_models':
[os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2711-F3M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2713-O2M1-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2707-E1M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2710-C3M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2708-C4M1-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2709-F4M1-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2712-O1M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2714-CHIN-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2716-ABD-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2715-CHEST-rs160_0_part1.pth.tar'),
],
'train_emb': True,
'weighted_loss': True
}
@ex.named_config
def AttentionNetConv_rs320_0_part1():
arch = 'AttentionNetConv'
ms = {
'epochs': 100,
'dropout': .5,
'optim': 'adam,lr=0.00001',
'attention': True,
'expert_models':
[os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2711-F3M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2713-O2M1-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2707-E1M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2710-C3M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2708-C4M1-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2709-F4M1-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2712-O1M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2714-CHIN-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2716-ABD-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2715-CHEST-rs160_0_part1.pth.tar'),
],
'train_emb': False,
'weighted_loss': True
}
@ex.named_config
def AMOE_RS160():
arch = 'Amoe'
ms = {
'epochs': 100,
'dropout': .1,
'optim': 'adam,lr=0.00001',
'expert_models':
[os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2711-F3M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2713-O2M1-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2707-E1M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2710-C3M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2708-C4M1-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2709-F4M1-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2712-O1M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2714-CHIN-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2716-ABD-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2715-CHEST-rs160_0_part1.pth.tar'),
],
'train_emb': True,
'weighted_loss': True
}
@ex.named_config
def Amoe_FzPz():
arch = 'Amoe'
ms = {
'epochs': 100,
'dropout': .5,
'optim': 'adam,lr=0.00001',
'expert_models':
[os.path.join('..', 'models',
'cv_sleepedf_Fz_2D_singlechanexp2_6464962FC_MP'),
os.path.join('..', 'models',
'cv_sleepedf_Pz_2D_singlechanexp2_6464962FC_MP'),
],
'train_emb': False,
'weighted_loss': True
}
@ex.named_config
def trainedExpAtt():
arch = 'TrainedExpertsAtt'
ms = {
'epochs': 100,
'dropout': .5,
'optim': 'adam,lr=0.000005',
        # 'fc_d': [[512, .5], [256, .3]],
        'expert_ids': list(range(1242, 1249)),
        # 'input_dim': None,  # will be set automatically
'weighted_loss': True
}
@ex.named_config
def sleepstage():
arch = 'SleepStage'
ms = {
'epochs': 30,
'dropout': .5,
'optim': 'adam,lr=0.000005',
'weighted_loss': True
}
@ex.named_config
def EarlyFusion():
arch = 'EarlyFusion'
ms = {
'epochs': 25,
'dropout': .5,
'optim': 'adam,lr=0.000005',
'weighted_loss': True
}
@ex.named_config
def LateFusion():
arch = 'LateFusion'
ms = {
'epochs': 50,
'dropout': .5,
'train_emb': True,
'optim': 'adam,lr=0.00001',
'weighted_loss': True,
'expert_models':
[os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2711-F3M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2713-O2M1-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2707-E1M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2710-C3M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2708-C4M1-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2709-F4M1-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2712-O1M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2714-CHIN-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2716-ABD-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2715-CHEST-rs160_0_part1.pth.tar'),
],
}
@ex.named_config
def LateFusion_3C():
arch = 'LateFusion'
ms = {
'epochs': 50,
'dropout': .5,
'train_emb': True,
'optim': 'adam,lr=0.00001',
'weighted_loss': True,
'expert_models':
[os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2711-F3M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2707-E1M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2708-C4M1-rs160_0_part1.pth.tar'),
],
}
@ex.named_config
def LateFusion2d():
arch = 'LateFusion2d'
ms = {
'epochs': 50,
'dropout': .5,
'optim': 'adam,lr=0.00001',
'weighted_loss': True
}
@ex.named_config
def trainedExpAtt2():
arch = 'TrainedExpertsAtt2'
ms = {
'epochs': 15,
'dropout': .5,
'optim': 'adam,lr=0.000005',
'sum_exp': False,
        # 'xavier_init': True,
        'expert_ids': list(range(1242, 1249)),
        # 'input_dim': None,  # will be set automatically
'weighted_loss': True
}
@ex.named_config
def GrangerAmoe():
arch = 'GrangerAmoe'
ms = {
'epochs': 15,
'dropout': .5,
'optim': 'adam,lr=0.000005',
'sum_exp': False,
# 'xavier_init': True,
'expert_ids': list(range(1242, 1249)),
# 'input_dim': None, # will be set automatically
'weighted_loss': True,
'loss': 'granger'
}
@ex.named_config
def SimpleAmoe():
arch = 'SimpleAmoe'
ms = {
'epochs': 15,
'dropout': .5,
'optim': 'adam,lr=0.000005',
'sum_exp': False,
# 'xavier_init': True,
'expert_ids': list(range(1242, 1249)),
# 'input_dim': None, # will be set automatically
'weighted_loss': True,
}
@ex.named_config
def multvar2dnet():
arch = 'Multivariate2dNet'
ms = {
'epochs': 100,
'dropout': .5,
'optim': 'adam,lr=0.00005',
'fc_d': [],
'input_dim': None, # will be set automatically
'weighted_loss': True
}
@ex.named_config
def singlechanexp_bak():
arch = 'SingleChanExpert'
ms = {
'epochs': 25,
'dropout': .5,
'optim': 'adam,lr=0.000005',
        'fc_d': [[128, 0]],
'input_dim': None, # will be set automatically
'weighted_loss': True
}
@ex.named_config
def ResNet():
arch = 'Resnet'
ms = {
'epochs': 25,
'optim': 'adam,lr=0.000005',
'weighted_loss': True
}
@ex.named_config
def singlechanexp2_bak():
arch = 'SingleChanExpert2'
ms = {
'epochs': 50,
'dropout': .5,
'optim': 'adam,lr=0.00001',
'weighted_loss': True
}
@ex.named_config
def singlechanexp():
arch = 'SingleChanExpert'
ms = {
'epochs': 15,
'dropout': .5,
'optim': 'adam,lr=0.00001',
'weighted_loss': True,
'batch_norm': True
}
@ex.named_config
def multvarnet2d():
arch = 'MultivariateNet2d'
ms = {
'epochs': 100,
'attention': 'feature',
'dropout': .5,
'optim': 'adam,lr=0.00001',
'weighted_loss': True
}
@ex.named_config
def exp_avg():
arch = 'ExpertsAvg'
ms = {
'epochs': 1,
'optim': 'adam,lr=0.00001',
'dropout': 0,
'expert_models':
[os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2711-F3M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2713-O2M1-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2707-E1M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2710-C3M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2708-C4M1-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2709-F4M1-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2712-O1M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2714-CHIN-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2716-ABD-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2715-CHEST-rs160_0_part1.pth.tar'),
],
'train_emb': False,
'weighted_loss': True
}
@ex.named_config
def exp_avg_3C():
arch = 'ExpertsAvg'
ms = {
'epochs': 1,
'optim': 'adam,lr=0.00001',
'dropout': 0,
'expert_models':
[os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2711-F3M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2707-E1M2-rs160_0_part1.pth.tar'),
os.path.join('..', 'models', 'Mixture-Of-Experts-rs160_0_part1',
'2708-C4M1-rs160_0_part1.pth.tar'),
],
'train_emb': False,
'weighted_loss': True
}
@ex.named_config
def PHYSIONET_EEG_EOG_2D():
ds = {
'channels': [
('C3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('C4-M1', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('E1-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('F3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('F4-M1', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('O1-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('O2-M1', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
]
}
@ex.named_config
def PHYSIONET_EEG_EOG_EMG_2D():
ds = {
'channels': [
('ABD', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('C3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('C4-M1', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('CHEST', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('Chin1-Chin2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('E1-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('F3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('F4-M1', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('O1-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('O2-M1', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def C4E1F3_2D():
ds = {
'channels': [
('C4-M1', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('E1-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('F3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
]
}
@ex.named_config
def ALL_ALL_CHAN_2DF():
ds = {
'channels': [
('ABD', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=12, highpass=.3)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('C3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('C4-M1', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('CHEST', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=12, highpass=.3)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('Chin1-Chin2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=12, highpass=.3)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('E1-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('F3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('F4-M1', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('O1-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('O2-M1', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_C4M1_E1M2_Chin_2D():
ds = {
'channels': [('F3-M2', [
'Resample(epoch_len=30, fs=100)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('C4-M1', [
'Resample(epoch_len=30, fs=100)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('E1-M2', [
'Resample(epoch_len=30, fs=100)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('Chin1-Chin2', [
'Resample(epoch_len=30, fs=100)',
'BandPass(fs=100, lowpass=12, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_C4M1_E1M2_Chin():
ds = {
'channels': [('F3-M2', [
'BandPass(fs=200, lowpass=30, highpass=.5)',
'Resample(epoch_len=30, fs=128)',
'OneDScaler()'
]
),
('C4-M1', [
'BandPass(fs=200, lowpass=30, highpass=.5)',
'Resample(epoch_len=30, fs=128)',
'OneDScaler()'
]
),
('E1-M2', [
'BandPass(fs=200, lowpass=30, highpass=.5)',
'Resample(epoch_len=30, fs=128)',
'OneDScaler()'
]
),
('Chin1-Chin2', [
'BandPass(fs=200, lowpass=12, highpass=.5)',
'Resample(epoch_len=30, fs=128)',
'OneDScaler()'
]
)
]
}
@ex.named_config
def F3M2():
ds = {
'channels': [
('F3-M2', ['BandPass(fs=200, lowpass=45, highpass=.5)',
'Resample(epoch_len=30, fs=100)',
'OneDScaler()']) # 'ConvToInt16()'
]
}
@ex.named_config
def F3M2_2D():
ds = {
'channels': [
('F3-M2', [
'Resample(epoch_len=30, fs=100)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_2DP():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_2DQQ():
ds = {
'channels': [
('F3-M2', [
'QuantileNormalization("F3M2")',
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
]
)
]
}
@ex.named_config
def F3M2_2DP2():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly2(epoch_len=30, fs=200, target_fs=100)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D_U():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly2(epoch_len=30, fs=200, target_fs=100)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
]
)
]
}
@ex.named_config
def F3M2_2D_NF():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly2(epoch_len=30, fs=200, target_fs=100)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D_NT():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly2(epoch_len=30, fs=200, target_fs=100)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDTimeSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D_N():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly2(epoch_len=30, fs=200, target_fs=100)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFrequencyTimeSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D_NFE():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly2(epoch_len=30, fs=200, target_fs=100)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqEpochScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D_NTE():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly2(epoch_len=30, fs=200, target_fs=100)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDTimeEpochScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D_NE():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly2(epoch_len=30, fs=200, target_fs=100)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDScaler()'
]
)
]
}
@ex.named_config
def F3CHIN_2DP9():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('Chin1-Chin2', [
'BandPass(fs=200, lowpass=30, highpass=.3)',
'ResamplePoly(epoch_len=30, fs=200)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDScaler()'
]
),
]
}
@ex.named_config
def EEG1EMG3_2D1():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('Chin1-Chin2', [
'BandPass(fs=200, lowpass=30, highpass=.3)',
'ResamplePoly(epoch_len=30, fs=200)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDScaler()'
]
),
('ABD', [
'BandPass(fs=200, lowpass=30, highpass=.3)',
'ResamplePoly(epoch_len=30, fs=200)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDScaler()'
]
),
('CHEST', [
'BandPass(fs=200, lowpass=30, highpass=.3)',
'ResamplePoly(epoch_len=30, fs=200)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDScaler()'
]
),
]
}
@ex.named_config
def F3CHIN_2DP10():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
('Chin1-Chin2', [
'BandPass(fs=200, lowpass=30, highpass=.3)',
'ResamplePoly(epoch_len=30, fs=200)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
),
]
}
@ex.named_config
def F3CHIN_2DP11():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDScaler()'
]
),
('Chin1-Chin2', [
'BandPass(fs=200, lowpass=30, highpass=.3)',
'ResamplePoly(epoch_len=30, fs=200)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDScaler()'
]
),
]
}
@ex.named_config
def F3M2_2DEPOCH():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDScaler()'
]
)
]
}
@ex.named_config
def F3M2_2DEPOCH01():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'ZeroOneScaler()',
'TwoDScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D01():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'ZeroOneScaler()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D02():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'ZeroOneSubjectScaler()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_2DCUT():
ds = {
'channels': [
('F3-M2', [
'Spectrogram(fs=200, window=300, stride=200)',
'CutFrequencies(fs=200, window=300, lower=0, upper=45)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def E1M2_2DP():
ds = {
'channels': [
('E1-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def O2M1_2DP():
ds = {
'channels': [
('O2-M1', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def C4M1_2DP():
ds = {
'channels': [
('C4-M1', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def C3M2_2DP():
ds = {
'channels': [
('C3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F4M1_2DP():
ds = {
'channels': [
('F4-M1', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def O1M2_2DP():
ds = {
'channels': [
('O1-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def CHIN_2DP():
ds = {
'channels': [
('Chin1-Chin2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def CHEST_2DP():
ds = {
'channels': [
('CHEST', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def ABD_2DP():
ds = {
'channels': [
('ABD', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_2DP_MULTIT():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'SpectrogramMultiTaper(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_2DPEP():
ds = {
'channels': [
('F3-M2', [
'ResamplePoly(epoch_len=30, fs=200)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D_30HZ():
ds = {
'channels': [
('F3-M2', [
'Resample(epoch_len=30, fs=100)',
'BandPass(fs=100, lowpass=30, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D_CUT():
ds = {
'channels': [
('F3-M2', [
'Resample(epoch_len=30, fs=100)',
'BandPass(fs=100, lowpass=30, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
            'CutFrequencies(fs=100, window=150, lower=0, upper=30)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D_CUTM():
ds = {
'channels': [
('F3-M2', [
'Resample(epoch_len=30, fs=100)',
'BandPass(fs=100, lowpass=30, highpass=.5)',
'SpectrogramM(fs=100, window=150, stride=100)',
                'CutFrequencies(fs=100, window=150, lower=0, upper=30)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D_CUT_LOG2():
ds = {
'channels': [
('F3-M2', [
'Resample(epoch_len=30, fs=100)',
'BandPass(fs=100, lowpass=30, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
                'CutFrequencies(fs=100, window=150, lower=0, upper=30)',
'LogTransform2()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D_CUT_EPN():
ds = {
'channels': [
('F3-M2', [
'Resample(epoch_len=30, fs=100)',
'BandPass(fs=100, lowpass=30, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
                'CutFrequencies(fs=100, window=150, lower=0, upper=30)',
'LogTransform()',
'TwoDScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D_MF():
ds = {
'channels': [
('F3-M2', [
'Spectrogram(fs=200, window=300, stride=200)',
                'CutFrequencies(fs=200, window=300, lower=0, upper=30)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D_CUT_NOLOG():
ds = {
'channels': [
('F3-M2', [
'Resample(epoch_len=30, fs=100)',
'BandPass(fs=100, lowpass=30, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
                'CutFrequencies(fs=100, window=150, lower=0, upper=30)',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D_CUT_NOLOG_EN():
ds = {
'channels': [
('F3-M2', [
'Resample(epoch_len=30, fs=100)',
'BandPass(fs=100, lowpass=30, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
                'CutFrequencies(fs=100, window=150, lower=0, upper=30)',
'TwoDScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D_NOSC():
ds = {
'channels': [
('F3-M2', [
'Resample(epoch_len=30, fs=100)',
'BandPass(fs=100, lowpass=30, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
                'CutFrequencies(fs=100, window=150, lower=0, upper=30)',
'LogTransform()',
]
)
]
}
@ex.named_config
def F3M2_2D_EPNORM():
ds = {
'channels': [
('F3-M2', [
'Resample(epoch_len=30, fs=100)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D_UNNORM():
ds = {
'channels': [
('F3-M2', [
'Resample(epoch_len=30, fs=100)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
]
)
]
}
@ex.named_config
def F3M2_2D_PER_SAMP_SCALE():
ds = {
'channels': [
('F3-M2', [
'Resample(epoch_len=30, fs=100)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'TwoDScaler()'
]
)
]
}
@ex.named_config
def F3M2_2D_NOLOG():
ds = {
'channels': [
('F3-M2', [
'Resample(epoch_len=30, fs=100)',
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_Z3():
ds = {
'channels': [
('F3-M2', [
'Z3()',
'TwoDFreqSubjScaler()'
]
)
]
}
@ex.named_config
def F3M2_SleepStage():
ds = {
'channels': [
('F3-M2', [
'Resample(epoch_len=30, fs=100)',
'SleepStage()',
]
)
]
}
@ex.named_config
def F3M2_200Hz_2D():
ds = {
'channels': [('F3-M2', [
'BandPass(fs=200, lowpass=45, highpass=.5)',
'Spectrogram(fs=200, window=300, stride=200)',
            'CutFrequencies(fs=200, window=300, lower=0, upper=45)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]
)]
}
@ex.named_config
def sleepedf():
    ds = {
        'loader': 'Sleepedf',
        'channels': [
            ('EEG-Fpz-Cz', []),
        ]
    }
@ex.named_config
def sleepedf_2D():
ds = {
'loader': 'Sleepedf',
'channels': [
('EEG-Fpz-Cz', [
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
])
]
}
@ex.named_config
def sleepedf_Fpz_2D():
ds = {
'loader': 'Sleepedf',
'channels': [
('EEG-Fpz-Cz', [
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
])
]
}
@ex.named_config
def sleepedf_FpzPz_2D():
ds = {
'loader': 'Sleepedf',
'channels': [
('EEG-Fpz-Cz', [
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]),
('EEG-Pz-Oz', [
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
])
]
}
@ex.named_config
def sleepedf_Fpz_2D_NE():
ds = {
'loader': 'Sleepedf',
'channels': [
('EEG-Fpz-Cz', [
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDScaler()'
])
]
}
@ex.named_config
def sleepedf_Fpz_1D():
ds = {
'loader': 'Sleepedf',
'channels': [
('EEG-Fpz-Cz', [
'BandPass(fs=100, lowpass=45, highpass=.5)',
'OneDScaler()'
])
]
}
@ex.named_config
def sleepedf_Pz_2D():
ds = {
'loader': 'Sleepedf',
'channels': [
('EEG-Pz-Oz', [
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
])
]
}
@ex.named_config
def sleepedf_EOG_2D():
ds = {
'loader': 'Sleepedf',
'channels': [
('EOG-horizontal', [
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
])
]
}
@ex.named_config
def sleepedf_Fpz_2D_CUT():
ds = {
'loader': 'Sleepedf',
'channels': [
('EEG-Fpz-Cz', [
'BandPass(fs=100, lowpass=30, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
            'CutFrequencies(fs=100, window=150, lower=0, upper=30)',
'LogTransform()',
'TwoDFreqSubjScaler()'
])
]
}
@ex.named_config
def sleepedf_2D_BP30():
ds = {
'loader': 'Sleepedf',
'channels': [
('EEG-Fpz-Cz', [
'BandPass(fs=100, lowpass=30, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
])
]
}
@ex.named_config
def DSSM_caro():
arch = 'DSSM'
ms = {
'epochs': 200,
'hidden_size': 128,
'filter_size': 128,
'sep_channels': False,
'dropout': .2, # for conv nets
'optim': 'adam,lr=0.005',
"theta_size": 50,
'use_theta': False,
        'normalize_context': False,  # TODO: check if needed
        'context': True,  # TODO: check if needed
        'train_emb': True,  # TODO: check if needed
'weighted_loss': True,
'label_nbrs': False,
'kl_annealing': False,
'kl_weight': 0,
'reconstruction_weight': 1,
'mmd_weight': 1
}
@ex.named_config
def Sleep_Classifier_caro():
arch = 'StateClassifier'
ms = {
'epochs': 100,
'optim': 'adam,lr=0.0001',
'expert_models': [os.path.join(LOGDIR, 'dssm_usv')],
'weighted_loss': True
}
@ex.named_config
def Ensembler_caro():
arch = 'Ensembler'
ms = {
'epochs': 100,
'optim': 'adam,lr=0.00001',
'weighted_loss': True,
'expert_models': [os.path.join(LOGDIR, 'dssm1_'),
os.path.join(LOGDIR, 'dssm2_'),
os.path.join(LOGDIR, 'dssm3_')],
}
@ex.named_config
def AttentionNet_RS160_caro():
arch = 'AttentionNet'
ms = {
'epochs': 100,
'dropout': .2,
'optim': 'adam,lr=0.00001',
'attention': True,
'normalize_context': False,
'context': True,
'expert_models':
[os.path.join(LOGDIR, 'EOGR'),
os.path.join(LOGDIR, 'EMG'),
os.path.join(LOGDIR, 'EOGL'),
os.path.join(LOGDIR, 'EEG_raw'),
os.path.join(LOGDIR, 'EEG')],
'train_emb': True,
'weighted_loss': True
}
@ex.named_config
def AttentionNet_RS160_edf():
arch = 'AttentionNet'
ms = {
'epochs': 100,
'dropout': .2,
'optim': 'adam,lr=0.00001',
'attention': True,
'normalize_context': False,
'context': True,
'expert_models':
[os.path.join("/".join(LOGDIR.split("/")[:-1]), 'models', 'debug_edf_1'),
os.path.join("/".join(LOGDIR.split("/")[:-1]), 'models', 'debug_edf_2'),
os.path.join("/".join(LOGDIR.split("/")[:-1]), 'models', 'debug_edf_3')],
'train_emb': True,
'weighted_loss': True
}
@ex.named_config
def sleepedf_all_2D():
ds = {
'loader': 'Sleepedf',
'channels': [
('EEG-Pz-Oz', [
'BandPass(fs=250, lowpass=45, highpass=.5)',
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]),
('EOG-horizontal', [
'BandPass(fs=250, lowpass=45, highpass=.5)',
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]),
('EEG-Fpz-Cz', [
'BandPass(fs=250, lowpass=45, highpass=.5)',
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
])
]
}
@ex.named_config
def caro_all_2D():
ds = {
'loader': 'Carofile',
'channels': [
('EEG', [
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]),
('EOGL', [
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]),
('EOGR', [
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]),
('EMG', [
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
])
]
}
@ex.named_config
def caro_all_2D_no_sweat():
ds = {
'loader': 'Carofile',
'channels': [
# To filter out sweating artefacts
('EEG', [
'BandPass(fs=250, lowpass=45, highpass=1)',
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]),
('EEG_raw', [
'BandPass(fs=250, lowpass=1, highpass=0.1)',
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]),
('EOGL', [
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]),
('EOGR', [
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]),
('EMG', [
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
])
]
}
@ex.named_config
def caro_all_2D_onesided():
ds = {
'loader': 'Carofile',
'osnbrs': True,
'channels': [
('EEG', [
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]),
('EOGL', [
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]),
('EOGR', [
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]),
('EMG', [
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
])
]
}
@ex.named_config
def caro_all_2D_onesided_raw():
ds = {
'loader': 'Carofile',
'osnbrs': True,
'channels': [
('EEG', [
'TwoDFreqSubjScaler()'
]),
('EOGL', [
'TwoDFreqSubjScaler()'
]),
('EOGR', [
'TwoDFreqSubjScaler()'
]),
('EMG', [
'TwoDFreqSubjScaler()'
])
]
}
@ex.named_config
def caro_all_2D_onesided_no_sweat():
ds = {
'loader': 'Carofile',
'osnbrs': True,
'channels': [
# To filter out sweating artefacts
('EEG', [
'BandPass(fs=250, lowpass=45, highpass=1)',
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]),
('EEG_raw', [
'BandPass(fs=250, lowpass=1, highpass=0.1)',
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]),
('EOGL', [
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]),
('EOGR', [
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
]),
('EMG', [
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
])
]
}
@ex.named_config
def caro_EMG_2D():
ds = {
'loader': 'Carofile',
'channels': [
('EMG', [
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
])
]
}
@ex.named_config
def caro_EOGL_2D():
ds = {
'loader': 'Carofile',
'channels': [
('EOGL', [
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
])
]
}
@ex.named_config
def caro_EOGR_2D():
ds = {
'loader': 'Carofile',
'channels': [
('EOGR', [
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
])
]
}
@ex.named_config
def caro_EEG_2D():
ds = {
'loader': 'Carofile',
'channels': [
('EEG', [
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
])
]
}
@ex.named_config
def caro_EEG_2D_no_sweat():
ds = {
'loader': 'Carofile',
'channels': [
('EEG', [
'BandPass(fs=250, lowpass=1, highpass=0.1)',
'Spectrogram(fs=250, window=150, stride=100)',
'LogTransform()',
'TwoDFreqSubjScaler()'
])
]
}
@ex.named_config
def sleepedf_2D_BAK():
ds = {
'loader': 'Sleepedf',
'channels': [
('EEG-Fpz-Cz', [
'BandPass(fs=100, lowpass=45, highpass=.5)',
'Spectrogram(fs=100, window=150, stride=100)',
'TwoDScaler()'
])
]
}
@ex.named_config
def three_channels_noscale():
ds = {
'channels': [
('F3-M2', []),
('C4-M1', []),
('E1-M2', []),
]
}
@ex.named_config
def three_channels():
ds = {
'channels': [
('F3-M2', ['OneDScaler()']),
('C4-M1', ['OneDScaler()']),
('E1-M2', ['OneDScaler()']),
]
}
@ex.named_config
def three_channels_int16():
ds = {
'channels': [
('F3-M2', ['Resample(epoch_len=30, fs=100)', 'ConvToInt16()']),
('C4-M1', ['Resample(epoch_len=30, fs=100)', 'ConvToInt16()']),
('E1-M2', ['Resample(epoch_len=30, fs=100)', 'ConvToInt16()']),
]
}
@ex.named_config
def three_channels_filt():
ds = {
'channels': [
('F3-M2',
['BandPass(fs=100, lowpass=45, highpass=.5)', 'ConvToInt16()']),
('C4-M1',
['BandPass(fs=100, lowpass=45, highpass=.5)', 'ConvToInt16()']),
('E1-M2',
['BandPass(fs=100, lowpass=45, highpass=.5)', 'ConvToInt16()']),
]
}
@ex.named_config
def seven_channels():
ds = {
'channels': [
('F3-M2', ['Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
('C4-M1', ['Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
('C3-M2', ['Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
('E1-M2', ['Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
('F4-M1', ['Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
('O1-M2', ['Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
('O2-M1', ['Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
]
}
@ex.named_config
def ten_channels():
ds = {
'channels': [
('F3-M2', ['Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
('C4-M1', ['Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
('C3-M2', ['Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
('E1-M2', ['Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
('F4-M1', ['Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
('O1-M2', ['Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
('O2-M1', ['Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
('CHEST', ['Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
('Chin1-Chin2', ['Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
('ABD', ['Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
]
}
@ex.named_config
def ten_channels_30Hz():
ds = {
'channels': [
        ('F3-M2', ['BandPass(fs=200, lowpass=30, highpass=.5)',
                   'Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
        ('C4-M1', ['BandPass(fs=200, lowpass=30, highpass=.5)',
                   'Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
        ('C3-M2', ['BandPass(fs=200, lowpass=30, highpass=.5)',
                   'Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
        ('E1-M2', ['BandPass(fs=200, lowpass=30, highpass=.5)',
                   'Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
        ('F4-M1', ['BandPass(fs=200, lowpass=30, highpass=.5)',
                   'Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
        ('O1-M2', ['BandPass(fs=200, lowpass=30, highpass=.5)',
                   'Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
        ('O2-M1', ['BandPass(fs=200, lowpass=30, highpass=.5)',
                   'Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
        ('CHEST', ['BandPass(fs=200, lowpass=30, highpass=.5)',
                   'Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
        ('Chin1-Chin2', ['BandPass(fs=200, lowpass=30, highpass=.5)',
                         'Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
        ('ABD', ['BandPass(fs=200, lowpass=30, highpass=.5)',
                 'Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
]
}
@ex.named_config
def one_channel():
ds = {
'channels': [
('F3-M2', ['Resample(epoch_len=30, fs=100)', 'OneDScaler()']),
]
}
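The channel pipelines above are declared as plain strings such as `'BandPass(fs=100, lowpass=45, highpass=.5)'`. As an illustration only (the actual loader that consumes these configs is not shown in this file, and `parse_transform_spec` is a hypothetical helper), such a spec string can be split into a transform name plus keyword arguments with the standard `ast` module:

```python
# Hypothetical helper -- not part of this codebase -- showing how the
# transform spec strings used in the configs above could be parsed.
import ast


def parse_transform_spec(spec):
    """Parse a string like "BandPass(fs=100, lowpass=45)" into
    (transform_name, kwargs_dict)."""
    call = ast.parse(spec, mode='eval').body
    if not isinstance(call, ast.Call) or not isinstance(call.func, ast.Name):
        raise ValueError('expected a simple call expression: %r' % spec)
    # literal_eval accepts AST nodes, so each keyword value is evaluated safely
    kwargs = {kw.arg: ast.literal_eval(kw.value) for kw in call.keywords}
    return call.func.id, kwargs
```

For example, `parse_transform_spec('Spectrogram(fs=100, window=150, stride=100)')` returns `('Spectrogram', {'fs': 100, 'window': 150, 'stride': 100})`.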
# ---- qamlz/__init__.py (tcoulvert/qamlz) ----
from .train_env import TrainEnv
from .model import ModelConfig
from .model import Model
# ---- tests/tests/test_tools.py (liminspace/dju-image) ----
import hashlib
import os
import re
import shutil
from django.core.urlresolvers import reverse
from dju_image.image import image_get_format
from dju_image.tools import (get_relative_path_from_img_id, generate_img_id, get_profile_configs,
get_variant_label, save_file, get_files_by_img_id, HASH_SIZE,
remove_tmp_prefix_from_filename, remove_tmp_prefix_from_file_path,
make_permalink, is_img_id_exists, is_img_id_valid,
remove_all_files_of_img_id, media_path, upload_from_fs,
img_id_has_tmp_prefix, upload_from_fileobject)
from dju_image import settings as dju_settings
from tests.tests.tools import (get_img_file, create_test_image, clean_media_dir, ViewTestCase,
save_img_file, CleanTmpDirMixin)
class TestTools(ViewTestCase, CleanTmpDirMixin):
@classmethod
def setUpClass(cls):
super(TestTools, cls).setUpClass()
cls.upload_url = reverse('dju_image_upload')
cls._clean_tmp_dir()
@classmethod
def tearDownClass(cls):
        super(TestTools, cls).tearDownClass()
cls._clean_tmp_dir()
def setUp(self):
super(TestTools, self).setUp()
clean_media_dir()
def tearDown(self):
super(TestTools, self).tearDown()
clean_media_dir()
def test_generate_img_id(self):
for i in xrange(50):
self.assertIsNotNone(re.match(r'^simple1:[a-z0-9]{8,11}_[a-z0-9]{4}$',
generate_img_id('simple1')))
for i in xrange(50):
self.assertIsNotNone(re.match(r'^simple1:[a-z0-9]{8,11}_[a-z0-9]{4}\.png$',
generate_img_id('simple1', ext='png')))
for i in xrange(50):
self.assertIsNotNone(re.match(r'^simple1:[a-z0-9]{8,11}_[a-z0-9]{4}_tst$',
generate_img_id('simple1', label='tst')))
for i in xrange(50):
self.assertIsNotNone(re.match(r'^simple1:[a-z0-9]{8,11}_[a-z0-9]{4}_tst\.png$',
generate_img_id('simple1', ext='png', label='tst')))
for i in xrange(50):
self.assertIsNotNone(re.match(r'^simple1:__t_[a-z0-9]{8,11}_[a-z0-9]{4}$',
generate_img_id('simple1', tmp=True)))
for i in xrange(50):
self.assertIsNotNone(re.match(r'^simple1:__t_[a-z0-9]{8,11}_[a-z0-9]{4}\.png$',
generate_img_id('simple1', tmp=True, ext='png')))
for i in xrange(50):
self.assertIsNotNone(re.match(r'^simple1:__t_[a-z0-9]{8,11}_[a-z0-9]{4}_tst$',
generate_img_id('simple1', tmp=True, label='tst')))
for i in xrange(50):
self.assertIsNotNone(re.match(r'^simple1:__t_[a-z0-9]{8,11}_[a-z0-9]{4}_tst\.png$',
generate_img_id('simple1', tmp=True, label='tst', ext='png')))
self.assertIsNotNone(re.match(
r'^simple1:[a-z0-9]{8,11}_[a-z0-9]{4}_ts_ABC159\-q_qweyuoopzts_ABC159\-q_qweyuoopzts_ABC159\-q_qweyuo$',
generate_img_id('simple1', label=('ts_/ABC159-q___q(w..e#y%uo&op*?z' * 5))
))
def test_get_relative_path_from_img_id(self):
self.assertEqual(
get_relative_path_from_img_id('default:abcde1234_ab12.jpeg'),
'upload-img/common/34/abcde1234_ab12__m_93c87b2a66.jpeg'
)
self.assertEqual(
get_relative_path_from_img_id('default:abcde1234_ab12'),
'upload-img/common/34/abcde1234_ab12__m_93c87b2a66'
)
self.assertEqual(
get_relative_path_from_img_id('default:abcde1234_ab12.jpeg', ext='png'),
'upload-img/common/34/abcde1234_ab12__m_93c87b2a66.png'
)
self.assertEqual(
get_relative_path_from_img_id('default:abcde1234_ab12', ext='png'),
'upload-img/common/34/abcde1234_ab12__m_93c87b2a66.png'
)
self.assertEqual(
get_relative_path_from_img_id('default:abcde1234_ab12.jpeg', variant_label='test'),
'upload-img/common/34/abcde1234_ab12__v_b8a30d0227_test.jpeg'
)
self.assertEqual(
get_relative_path_from_img_id('default:abcde1234_ab12', variant_label='test'),
'upload-img/common/34/abcde1234_ab12__v_b8a30d0227_test'
)
self.assertEqual(
get_relative_path_from_img_id('default:abcde1234_ab12.jpeg', variant_label='test', ext='png'),
'upload-img/common/34/abcde1234_ab12__v_b8a30d0227_test.png'
)
self.assertEqual(
get_relative_path_from_img_id('default:abcde1234_ab12', variant_label='test', ext='png'),
'upload-img/common/34/abcde1234_ab12__v_b8a30d0227_test.png'
)
self.assertEqual(
get_relative_path_from_img_id('default:__t_abcde1234_ab12.jpeg'),
'upload-img/common/34/__t_abcde1234_ab12__m_93c87b2a66.jpeg'
)
self.assertEqual(
get_relative_path_from_img_id('default:__t_abcde1234_ab12.jpeg', ext='png'),
'upload-img/common/34/__t_abcde1234_ab12__m_93c87b2a66.png'
)
self.assertEqual(
get_relative_path_from_img_id('default:__t_abcde1234_ab12.jpeg', variant_label='test'),
'upload-img/common/34/__t_abcde1234_ab12__v_b8a30d0227_test.jpeg'
)
self.assertEqual(
get_relative_path_from_img_id('default:__t_abcde1234_ab12.jpeg', variant_label='test', ext='png'),
'upload-img/common/34/__t_abcde1234_ab12__v_b8a30d0227_test.png'
)
self.assertEqual(
get_relative_path_from_img_id('default:abcde1234_ab12_myname.jpeg'),
'upload-img/common/34/abcde1234_ab12_myname__m_93c87b2a66.jpeg'
)
self.assertEqual(
get_relative_path_from_img_id('default:abcde1234_ab12_myname'),
'upload-img/common/34/abcde1234_ab12_myname__m_93c87b2a66'
)
self.assertEqual(
get_relative_path_from_img_id('default:abcde1234_ab12_myname.jpeg', ext='png'),
'upload-img/common/34/abcde1234_ab12_myname__m_93c87b2a66.png'
)
self.assertEqual(
get_relative_path_from_img_id('default:abcde1234_ab12_myname.jpeg', variant_label='test'),
'upload-img/common/34/abcde1234_ab12_myname__v_b8a30d0227_test.jpeg'
)
self.assertEqual(
get_relative_path_from_img_id('default:abcde1234_ab12_myname.jpeg', variant_label='test', ext='png'),
'upload-img/common/34/abcde1234_ab12_myname__v_b8a30d0227_test.png'
)
self.assertEqual(
get_relative_path_from_img_id('default:__t_abcde1234_ab12_myname.jpeg'),
'upload-img/common/34/__t_abcde1234_ab12_myname__m_93c87b2a66.jpeg'
)
self.assertEqual(
get_relative_path_from_img_id('default:__t_abcde1234_ab12_myname.jpeg', ext='png'),
'upload-img/common/34/__t_abcde1234_ab12_myname__m_93c87b2a66.png'
)
self.assertEqual(
get_relative_path_from_img_id('default:__t_abcde1234_ab12_myname.jpeg', variant_label='test'),
'upload-img/common/34/__t_abcde1234_ab12_myname__v_b8a30d0227_test.jpeg'
)
self.assertEqual(
get_relative_path_from_img_id('default:__t_abcde1234_ab12_myname.jpeg', variant_label='test', ext='png'),
'upload-img/common/34/__t_abcde1234_ab12_myname__v_b8a30d0227_test.png'
)
self.assertEqual(
get_relative_path_from_img_id('simple0:__t_abcde1234_ab12.jpeg', variant_label='20x30'),
'upload-img/s0/34/__t_abcde1234_ab12__v_51c425ba08_20x30.png'
)
self.assertEqual(
get_relative_path_from_img_id('simple0:__t_abcde1234_ab12', variant_label='20x30'),
'upload-img/s0/34/__t_abcde1234_ab12__v_51c425ba08_20x30.png'
)
self.assertEqual(
get_relative_path_from_img_id('simple0:__t_abcde1234_ab12.jpeg', variant_label='w20'),
'upload-img/s0/34/__t_abcde1234_ab12__v_a4b31265b5_w20.gif'
)
self.assertEqual(
get_relative_path_from_img_id('simple0:__t_abcde1234_ab12.jpeg', variant_label='w20', ext='png'),
'upload-img/s0/34/__t_abcde1234_ab12__v_a4b31265b5_w20.png'
)
self.assertEqual(
get_relative_path_from_img_id('simple0:__t_abcde1234_ab12.png', variant_label='lab0'),
'upload-img/s0/34/__t_abcde1234_ab12__v_4f495406be_lab0.jpeg'
)
self.assertEqual(
get_relative_path_from_img_id('simple0:__t_abcde1234_ab12.png', variant_label='lab0', ext='gif'),
'upload-img/s0/34/__t_abcde1234_ab12__v_4f495406be_lab0.gif'
)
def test_get_relative_path_from_img_id_with_create_dirs(self):
self.assertEqual(
get_relative_path_from_img_id('default:__t_abcde1234_ab12_myname.jpeg', create_dirs=True),
'upload-img/common/34/__t_abcde1234_ab12_myname__m_93c87b2a66.jpeg'
)
self.assertTrue(os.path.isdir(media_path('upload-img/common/34')))
def test_get_profile_configs(self):
c = get_profile_configs('simple1')
rc = dju_settings.DJU_IMG_UPLOAD_PROFILES['simple1']
self.assertEqual(c['PATH'], rc['PATH'])
self.assertEqual(c['MAX_SIZE'], rc['MAX_SIZE'])
self.assertEqual(len(c['VARIANTS']), len(rc['VARIANTS']))
for i in xrange(len(rc['VARIANTS'])):
for k in rc['VARIANTS'][i]:
self.assertEqual(rc['VARIANTS'][i][k], c['VARIANTS'][i][k])
self.assertEqual(get_profile_configs('default'), dju_settings.DJU_IMG_UPLOAD_PROFILE_DEFAULT)
with self.assertRaises(ValueError):
get_profile_configs('none')
def test_get_variant_label(self):
self.assertEqual(get_variant_label({'MAX_SIZE': (10, 20)}), '10x20')
self.assertEqual(get_variant_label({'MAX_SIZE': (10, None)}), 'w10')
self.assertEqual(get_variant_label({'MAX_SIZE': (None, 20)}), 'h20')
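The three assertions above fully pin down the label format. A minimal sketch consistent with them (the real implementation lives in `dju_image.tools` and may differ in details) is:

```python
# Sketch only: reproduces the behaviour asserted in test_get_variant_label.
def get_variant_label_sketch(variant_conf):
    w, h = variant_conf['MAX_SIZE']
    if w is not None and h is not None:
        return '%dx%d' % (w, h)       # both bounds set -> "WxH"
    if w is not None:
        return 'w%d' % w              # width-only bound -> "wW"
    return 'h%d' % h                  # height-only bound -> "hH"
```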
def test_save_file(self):
f = get_img_file(create_test_image(1000, 1000))
img_id = generate_img_id('simple1', ext=image_get_format(f), label='test-save-file')
relative_path = get_relative_path_from_img_id(img_id)
full_path = media_path(relative_path)
save_file(f, full_path)
file_exists = os.path.exists(full_path)
self.assertTrue(file_exists)
if file_exists:
f.seek(0)
h1 = hashlib.md5(f.read()).hexdigest()
            with open(full_path, 'rb') as fp:
                h2 = hashlib.md5(fp.read()).hexdigest()
self.assertEqual(h1, h2)
def test_get_files_by_img_id(self):
r = self.client.post(self.upload_url, {
'images[]': [
get_img_file(create_test_image(1000, 1000)),
get_img_file(create_test_image(900, 900)),
get_img_file(create_test_image(800, 800)),
get_img_file(create_test_image(700, 700)),
],
'profile': 'simple1',
})
self.assertEqual(r.status_code, 200)
d = self.get_json(r)
self.assertEqual(len(d['uploaded']), dju_settings.DJU_IMG_UPLOAD_MAX_FILES)
self.assertEqual(len(d['errors']), 0)
self.assertUploadedFilesExist(d)
for item in d['uploaded']:
r = get_files_by_img_id(item['img_id'])
self.assertEqual(r['main'], item['rel_url'])
self.assertEqual(len(item['variants']), len(r['variants']))
for var_label, var_data in item['variants'].iteritems():
self.assertEqual(r['variants'][var_label], var_data['rel_url'])
r = self.client.post(self.upload_url, {
'images[]': [
get_img_file(create_test_image(1000, 1000)),
get_img_file(create_test_image(900, 900)),
get_img_file(create_test_image(800, 800)),
get_img_file(create_test_image(700, 700)),
],
'profile': 'simple1',
'label': 'world1',
})
self.assertEqual(r.status_code, 200)
d = self.get_json(r)
self.assertEqual(len(d['uploaded']), dju_settings.DJU_IMG_UPLOAD_MAX_FILES)
self.assertEqual(len(d['errors']), 0)
self.assertUploadedFilesExist(d)
for item in d['uploaded']:
r = get_files_by_img_id(item['img_id'])
self.assertEqual(r['main'], item['rel_url'])
self.assertEqual(len(item['variants']), len(r['variants']))
for var_label, var_data in item['variants'].iteritems():
self.assertEqual(r['variants'][var_label], var_data['rel_url'])
def test_get_files_by_img_id_removed_variants_ext(self):
r = self.client.post(self.upload_url, {
'images[]': [
get_img_file(create_test_image(1000, 1000)),
get_img_file(create_test_image(900, 900)),
get_img_file(create_test_image(800, 800)),
get_img_file(create_test_image(700, 700)),
],
'profile': 'simple0',
'label': 'world0',
})
self.assertEqual(r.status_code, 200)
d = self.get_json(r)
self.assertEqual(len(d['uploaded']), dju_settings.DJU_IMG_UPLOAD_MAX_FILES)
self.assertEqual(len(d['errors']), 0)
self.assertUploadedFilesExist(d)
for item in d['uploaded']:
for var_data in item['variants'].values():
# remove ext for all variants files
rel_url = var_data['rel_url']
rel_url_new = os.path.splitext(rel_url)[0]
os.rename(media_path(rel_url), media_path(rel_url_new))
var_data['rel_url'] = rel_url_new
r = get_files_by_img_id(item['img_id'])
self.assertEqual(r['main'], item['rel_url'])
self.assertEqual(len(item['variants']), len(r['variants']))
for var_label, var_data in item['variants'].iteritems():
self.assertEqual(r['variants'][var_label], var_data['rel_url'])
def test_get_files_by_img_id_with_invalid_hash_and_filename_pattern(self):
r = self.client.post(self.upload_url, {
'images[]': [
get_img_file(create_test_image(1000, 1000)),
get_img_file(create_test_image(900, 900)),
get_img_file(create_test_image(800, 800)),
get_img_file(create_test_image(700, 700)),
],
'profile': 'simple0',
'label': 'world0',
})
self.assertEqual(r.status_code, 200)
d = self.get_json(r)
self.assertEqual(len(d['uploaded']), dju_settings.DJU_IMG_UPLOAD_MAX_FILES)
self.assertEqual(len(d['errors']), 0)
self.assertUploadedFilesExist(d)
for item in d['uploaded']:
for var_data in item['variants'].values():
# add file with invalid hash
rel_url = var_data['rel_url']
                rel_url_new = re.sub(
                    r'(%s)[a-z0-9]{%d}(_.+)' % (dju_settings.DJU_IMG_UPLOAD_VARIANT_SUFFIX, HASH_SIZE),
                    r'\1%s\2' % ('z' * HASH_SIZE),
                    rel_url
                )
shutil.copy(media_path(rel_url), media_path(rel_url_new))
# add file with invalid filename pattern
rel_url = var_data['rel_url']
                rel_url_new = re.sub(
                    r'(%s)[a-z0-9]{%d}(_.+)' % (dju_settings.DJU_IMG_UPLOAD_VARIANT_SUFFIX, HASH_SIZE),
                    r'\1%s\2' % ('z' * (HASH_SIZE + 1)),
                    rel_url
                )
shutil.copy(media_path(rel_url), media_path(rel_url_new))
r = get_files_by_img_id(item['img_id'])
self.assertEqual(r['main'], item['rel_url'])
self.assertEqual(len(item['variants']), len(r['variants']))
for var_label, var_data in item['variants'].iteritems():
self.assertEqual(r['variants'][var_label], var_data['rel_url'])
def test_get_files_by_img_id_with_invalid_hash_and_ignore_check_hash(self):
r = self.client.post(self.upload_url, {
'images[]': [
get_img_file(create_test_image(1000, 1000)),
get_img_file(create_test_image(900, 900)),
get_img_file(create_test_image(800, 800)),
get_img_file(create_test_image(700, 700)),
],
'profile': 'simple0',
'label': 'world0',
})
self.assertEqual(r.status_code, 200)
d = self.get_json(r)
self.assertEqual(len(d['uploaded']), dju_settings.DJU_IMG_UPLOAD_MAX_FILES)
self.assertEqual(len(d['errors']), 0)
self.assertUploadedFilesExist(d)
for item in d['uploaded']:
for var_data in item['variants'].values():
# add file with invalid hash
rel_url = var_data['rel_url']
                rel_url_new = re.sub(
                    r'(%s)[a-z0-9]{%d}(_.+)' % (dju_settings.DJU_IMG_UPLOAD_VARIANT_SUFFIX, HASH_SIZE),
                    r'\1%s\2' % ('z' * HASH_SIZE),
                    rel_url
                )
os.rename(media_path(rel_url), media_path(rel_url_new))
var_data['rel_url'] = rel_url_new
r = get_files_by_img_id(item['img_id'], check_hash=False)
self.assertEqual(r['main'], item['rel_url'])
self.assertEqual(len(item['variants']), len(r['variants']))
for var_label, var_data in item['variants'].iteritems():
self.assertEqual(r['variants'][var_label], var_data['rel_url'])
def test_get_files_by_img_id_file_is_not_exists(self):
r = get_files_by_img_id(generate_img_id('simple0'))
self.assertIsNone(r)
def test_remove_tmp_prefix_from_filename(self):
fn = 'test_file_name.jpeg'
fn_tmp = dju_settings.DJU_IMG_UPLOAD_TMP_PREFIX + fn
self.assertEqual(remove_tmp_prefix_from_filename(fn_tmp), fn)
with self.assertRaises(RuntimeError):
remove_tmp_prefix_from_filename(fn)
def test_remove_tmp_prefix_from_file_path(self):
fn = 'test_file_name.jpeg'
fn_tmp = dju_settings.DJU_IMG_UPLOAD_TMP_PREFIX + fn
path = '/some/path/'
file_path = path + fn
file_path_tmp = path + fn_tmp
self.assertEqual(remove_tmp_prefix_from_file_path(file_path_tmp), file_path)
with self.assertRaises(RuntimeError):
remove_tmp_prefix_from_file_path(file_path)
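Temporary img_ids carry the `__t_` prefix, as the `generate_img_id` regex assertions earlier in this file show. Under the assumption that `DJU_IMG_UPLOAD_TMP_PREFIX == '__t_'` (taken from those patterns, not verified against settings), the two prefix-stripping helpers exercised above can be sketched as:

```python
import os

TMP_PREFIX = '__t_'  # assumed value of dju_settings.DJU_IMG_UPLOAD_TMP_PREFIX


def strip_tmp_prefix(filename):
    # Sketch of remove_tmp_prefix_from_filename: raise if the prefix is absent.
    if not filename.startswith(TMP_PREFIX):
        raise RuntimeError('filename has no tmp prefix: %s' % filename)
    return filename[len(TMP_PREFIX):]


def strip_tmp_prefix_from_path(file_path):
    # Sketch of remove_tmp_prefix_from_file_path: strip only the basename part.
    head, tail = os.path.split(file_path)
    return os.path.join(head, strip_tmp_prefix(tail))
```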
def test_make_permalink(self):
r = self.client.post(self.upload_url, {
'images[]': [
get_img_file(create_test_image(1000, 1000)),
get_img_file(create_test_image(900, 900)),
get_img_file(create_test_image(800, 800)),
get_img_file(create_test_image(700, 700)),
],
'profile': 'simple0',
'label': 'world0',
})
self.assertEqual(r.status_code, 200)
d = self.get_json(r)
self.assertEqual(len(d['uploaded']), dju_settings.DJU_IMG_UPLOAD_MAX_FILES)
self.assertEqual(len(d['errors']), 0)
self.assertUploadedFilesExist(d)
for item in d['uploaded']:
new_img_id = make_permalink(item['img_id'])
files = get_files_by_img_id(new_img_id)
self.assertEqual(files['main'], remove_tmp_prefix_from_file_path(item['rel_url']))
new_item = {
'rel_url': files['main'],
'variants': {},
}
for var_label, var_data in item['variants'].iteritems():
self.assertEqual(
files['variants'][var_label],
remove_tmp_prefix_from_file_path(var_data['rel_url'])
)
new_item['variants'][var_label] = {'rel_url': files['variants'][var_label]}
self.assertUploadedFilesExist({'uploaded': [new_item]})
def test_is_img_id_exists(self):
self.assertFalse(is_img_id_exists('default:abcde1234_ab12_myname.jpeg'))
r = self.client.post(self.upload_url, {
'images[]': [
get_img_file(create_test_image(1000, 1000)),
get_img_file(create_test_image(900, 900)),
get_img_file(create_test_image(800, 800)),
get_img_file(create_test_image(700, 700)),
],
'profile': 'simple0',
'label': 'world0',
})
self.assertEqual(r.status_code, 200)
d = self.get_json(r)
self.assertEqual(len(d['errors']), 0)
self.assertUploadedFilesExist(d)
for item in d['uploaded']:
            self.assertTrue(is_img_id_exists(item['img_id']))
def test_is_img_id_valid(self):
self.assertTrue(is_img_id_valid('default:abcde1234_ab12_myname.jpeg'))
self.assertTrue(is_img_id_valid('default:abcde1234_ab12.jpeg'))
self.assertTrue(is_img_id_valid('default:abcde1234_ab12'))
self.assertTrue(is_img_id_valid('default:__t_abcde1234_ab12_myname.jpeg'))
self.assertTrue(is_img_id_valid('default:__t_abcde1234_ab12.jpeg'))
self.assertTrue(is_img_id_valid('default:__t_abcde1234_ab12'))
self.assertFalse(is_img_id_valid('none:abcde1234_ab12_myname.jpeg'))
self.assertFalse(is_img_id_valid('default::abcde1234_ab12_myname.jpeg'))
self.assertFalse(is_img_id_valid('defaultabcde1234_ab12.jpeg'))
self.assertFalse(is_img_id_valid(':default:abcde1234_ab12.jpeg'))
self.assertFalse(is_img_id_valid(':defaultabcde1234_ab12.jpeg'))
self.assertFalse(is_img_id_valid('defaultabcde1234_ab12.jpeg:'))
self.assertFalse(is_img_id_valid('default:abcde1234_ab12..jpeg'))
self.assertFalse(is_img_id_valid('default:abcd/e1234_ab12.jpeg'))
self.assertFalse(is_img_id_valid('default:../../../abcde1234_ab12..jpeg'))
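The cases above pin down the id grammar: `profile:hash_hash[_label][.ext]` with an optional `__t_` tmp marker, exactly one colon, no path characters, no double dots, and a profile that actually exists. One way to sketch a validator that satisfies every assertion listed (the character classes and the `PROFILES` set are assumptions for illustration, not the library's real rules):

```python
import re

PROFILES = {'default'}  # hypothetical configured profiles; 'none' is not one of them

IMG_ID_RE = re.compile(
    r'^(?P<profile>[a-z0-9_]+):'   # profile name followed by the single colon
    r'(?:__t_)?'                   # optional tmp-prefix marker
    r'[a-z0-9]+_[a-z0-9]+'         # main hash and size hash
    r'(?:_[A-Za-z0-9-]+)?'         # optional label
    r'(?:\.[A-Za-z0-9]+)?$'        # optional single extension (blocks '..' and '/')
)

def is_img_id_valid(img_id):
    m = IMG_ID_RE.match(img_id)
    return bool(m) and m.group('profile') in PROFILES

print(is_img_id_valid('default:abcde1234_ab12_myname.jpeg'))  # True
print(is_img_id_valid('default:abcd/e1234_ab12.jpeg'))        # False
```

Anchoring with `^...$` and excluding `/` and `.` from every class except the single extension is what rejects the traversal-style ids in the last assertions.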
def test_remove_all_files_of_img_id(self):
r1 = self.client.post(self.upload_url, {
'images[]': [
get_img_file(create_test_image(800, 800)),
],
'profile': 'simple0',
'label': 'world0',
})
self.assertEqual(r1.status_code, 200)
d1 = self.get_json(r1)
self.assertEqual(len(d1['errors']), 0)
self.assertUploadedFilesExist(d1)
r2 = self.client.post(self.upload_url, {
'images[]': [
get_img_file(create_test_image(700, 700)),
],
'profile': 'simple0',
'label': 'world0',
})
self.assertEqual(r2.status_code, 200)
d2 = self.get_json(r2)
self.assertEqual(len(d2['errors']), 0)
self.assertUploadedFilesExist(d2)
remove_all_files_of_img_id(d1['uploaded'][0]['img_id'])
self.assertUploadedFilesExist(d2)
self.assertUploadedFilesNotExist(d1)
remove_all_files_of_img_id(d2['uploaded'][0]['img_id'])
self.assertUploadedFilesNotExist(d2)
def test_upload_from_fs(self):
fn = save_img_file('t1.jpeg', create_test_image(600, 600))
try:
img_id = upload_from_fs(fn)
except (ValueError, RuntimeError), e:
raise self.failureException(e)
else:
self.assertTrue(is_img_id_valid(img_id))
self.assertTrue(is_img_id_exists(img_id))
with self.assertRaises(ValueError):
upload_from_fs('none')
fn = save_img_file('t2.jpeg', create_test_image(500, 500))
with self.assertRaises(RuntimeError):
upload_from_fs(fn, profile='simple2')
fn = save_img_file('t3.jpeg', create_test_image(400, 400))
try:
img_id = upload_from_fs(fn, profile='simple1', label='ttt')
except (ValueError, RuntimeError), e:
raise self.failureException(e)
else:
self.assertTrue(is_img_id_valid(img_id))
self.assertTrue(is_img_id_exists(img_id))
self.assertTrue(get_files_by_img_id(img_id)['variants'])
def test_upload_from_fileobject(self):
f = get_img_file(create_test_image(600, 600))
try:
img_id = upload_from_fileobject(f)
except (ValueError, RuntimeError), e:
raise self.failureException(e)
else:
self.assertTrue(is_img_id_valid(img_id))
self.assertTrue(is_img_id_exists(img_id))
with self.assertRaises(ValueError):
upload_from_fs('none')
f = get_img_file(create_test_image(500, 500))
with self.assertRaises(RuntimeError):
upload_from_fileobject(f, profile='simple2')
f = get_img_file(create_test_image(400, 400))
try:
img_id = upload_from_fileobject(f, profile='simple1', label='ttt')
except (ValueError, RuntimeError), e:
raise self.failureException(e)
else:
self.assertTrue(is_img_id_valid(img_id))
self.assertTrue(is_img_id_exists(img_id))
self.assertTrue(get_files_by_img_id(img_id)['variants'])
def test_img_id_has_tmp_prefix(self):
img_id = generate_img_id('none', ext='png', label='zz', tmp=False)
img_id_tmp = generate_img_id('none', ext='png', label='zz', tmp=True)
self.assertTrue(img_id_has_tmp_prefix(img_id_tmp))
self.assertFalse(img_id_has_tmp_prefix(img_id))
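If `generate_img_id(tmp=True)` differs from `tmp=False` only by a marker in front of the name part, then the check under test reduces to a prefix test after the profile colon. A toy version under that assumption (the `'__t_'` value is hypothetical):

```python
TMP_PREFIX = '__t_'  # hypothetical stand-in for the configured tmp marker

def img_id_has_tmp_prefix(img_id):
    # Look only at the name part after 'profile:' so a profile that happens
    # to start with the marker cannot confuse the check.
    _profile, _sep, name = img_id.partition(':')
    return name.startswith(TMP_PREFIX)

print(img_id_has_tmp_prefix('none:__t_abcde1234_ab12_zz.png'))  # True
print(img_id_has_tmp_prefix('none:abcde1234_ab12_zz.png'))      # False
```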
| 45.833333 | 117 | 0.60999 | 3,308 | 26,125 | 4.423821 | 0.068017 | 0.044075 | 0.03895 | 0.038267 | 0.826773 | 0.796433 | 0.765204 | 0.751674 | 0.722906 | 0.709922 | 0 | 0.062748 | 0.266144 | 26,125 | 569 | 118 | 45.913884 | 0.700553 | 0.006354 | 0 | 0.532039 | 0 | 0.017476 | 0.194814 | 0.136824 | 0 | 0 | 0 | 0 | 0.273786 | 0 | null | null | 0 | 0.017476 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
168cc2bcab5344c864286efdd46fc9bffa24875c | 9,935 | py | Python | test/programytest/nlp/sentiment/test_extension.py | cdoebler1/AIML2 | ee692ec5ea3794cd1bc4cc8ec2a6b5e5c20a0d6a | [
"MIT"
] | 345 | 2016-11-23T22:37:04.000Z | 2022-03-30T20:44:44.000Z | test/programytest/nlp/sentiment/test_extension.py | MikeyBeez/program-y | 00d7a0c7d50062f18f0ab6f4a041068e119ef7f0 | [
"MIT"
] | 275 | 2016-12-07T10:30:28.000Z | 2022-02-08T21:28:33.000Z | test/programytest/nlp/sentiment/test_extension.py | VProgramMist/modified-program-y | f32efcafafd773683b3fe30054d5485fe9002b7d | [
"MIT"
] | 159 | 2016-11-28T18:59:30.000Z | 2022-03-20T18:02:44.000Z | import unittest
from programy.bot import Bot
from programy.config.bot.bot import BotConfiguration
from programy.nlp.sentiment.extension import SentimentExtension
from programy.dialog.question import Question
from programytest.client import TestClient
class SentimentExtensionTests(unittest.TestCase):
def setUp(self):
self._client = TestClient()
config = BotConfiguration()
config.sentiment_analyser._classname = "programy.nlp.sentiment.textblob_sentiment.TextBlobSentimentAnalyser"
config.sentiment_analyser._scores = "programy.nlp.sentiment.scores.SentimentScores"
self.client_context = self._client.create_client_context("testuser")
self.client_context._bot = Bot(config=config, client=self._client)
self.client_context._bot.initiate_sentiment_analyser()
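The `_classname` settings above hold dotted import paths; `initiate_sentiment_analyser()` has to turn such a string into a class before it can instantiate it. A generic sketch of that resolution step, demonstrated with a stdlib class rather than the programy one:

```python
import importlib

def load_class(dotted_path):
    # Split 'package.module.ClassName' at the last dot, import the module,
    # then pull the class attribute off it.
    module_path, _, class_name = dotted_path.rpartition('.')
    module = importlib.import_module(module_path)
    return getattr(module, class_name)

decoder_cls = load_class('json.JSONDecoder')
print(decoder_cls.__name__)  # JSONDecoder
```

Configuration-driven class loading like this is what lets the test swap in `TextBlobSentimentAnalyser` without the bot hard-coding any particular analyser.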
def test_invalid_command(self):
extension = SentimentExtension()
self.assertIsNotNone(extension)
result = extension.execute(self.client_context, "")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT INVALID COMMAND", result)
result = extension.execute(self.client_context, "XXX")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT INVALID COMMAND", result)
result = extension.execute(self.client_context, "SENTIMENT")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT INVALID COMMAND", result)
result = extension.execute(self.client_context, "SENTIMENT SCOREX")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT INVALID COMMAND", result)
result = extension.execute(self.client_context, "SENTIMENT FEELING")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT INVALID COMMAND", result)
result = extension.execute(self.client_context, "SENTIMENT FEELING X")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT INVALID COMMAND", result)
result = extension.execute(self.client_context, "SENTIMENT FEELING LAST")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT INVALID COMMAND", result)
result = extension.execute(self.client_context, "SENTIMENT FEELING LAST ONE")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT INVALID COMMAND", result)
result = extension.execute(self.client_context, "SENTIMENT FEELING OTHER ")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT INVALID COMMAND", result)
result = extension.execute(self.client_context, "SENTIMENT SCORES")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT INVALID COMMAND", result)
result = extension.execute(self.client_context, "SENTIMENT CURRENT")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT INVALID COMMAND", result)
def test_sentiment_enabled(self):
extension = SentimentExtension()
self.assertIsNotNone(extension)
result = extension.execute(self.client_context, "SENTIMENT ENABLED")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT ENABLED", result)
def test_sentiment_disabled(self):
extension = SentimentExtension()
self.assertIsNotNone(extension)
self.client_context.bot._sentiment_analyser = None
result = extension.execute(self.client_context, "SENTIMENT ENABLED")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT DISABLED", result)
def test_sentiment_feeling_current_numeric(self):
extension = SentimentExtension()
self.assertIsNotNone(extension)
conversation = self.client_context.bot.get_conversation(self.client_context)
self.assertIsNotNone(conversation)
conversation.properties['positivity'] = 0.00
conversation.properties['subjectivity'] = 0.00
result = extension.execute(self.client_context, "SENTIMENT CURRENT NUMERIC")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT SCORES POSITIVITY 0.0 SUBJECTIVITY 0.0", result)
def test_sentiment_feeling_current_text(self):
extension = SentimentExtension()
self.assertIsNotNone(extension)
conversation = self.client_context.bot.get_conversation(self.client_context)
self.assertIsNotNone(conversation)
conversation.properties['positivity'] = 0.00
conversation.properties['subjectivity'] = 0.00
result = extension.execute(self.client_context, "SENTIMENT CURRENT TEXT")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT SCORES POSITIVITY NEUTRAL SUBJECTIVITY COMPLETELY OBJECTIVE", result)
def test_sentiment_feeling_current_other(self):
extension = SentimentExtension()
self.assertIsNotNone(extension)
conversation = self.client_context.bot.get_conversation(self.client_context)
self.assertIsNotNone(conversation)
conversation.properties['positivity'] = 0.00
conversation.properties['subjectivity'] = 0.00
result = extension.execute(self.client_context, "SENTIMENT CURRENT OTHER")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT INVALID COMMAND", result)
def test_sentiment_score(self):
extension = SentimentExtension()
self.assertIsNotNone(extension)
result = extension.execute(self.client_context, "SENTIMENT SCORE I LIKE PEAS")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT SCORES POSITIVITY NEUTRAL SUBJECTIVITY COMPLETELY OBJECTIVE", result)
def test_sentiment_score_no_analyser(self):
extension = SentimentExtension()
self.assertIsNotNone(extension)
self.client_context.bot._sentiment_analyser = None
result = extension.execute(self.client_context, "SENTIMENT SCORE I LIKE PEAS")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT DISABLED", result)
def test_sentiment_feeling_last(self):
extension = SentimentExtension()
self.assertIsNotNone(extension)
# Need to create a conversation first
conversation = self.client_context.bot.get_conversation(self.client_context)
question1 = Question.create_from_text(self.client_context, "Hello", self.client_context.bot.sentence_splitter)
conversation.record_dialog(question1)
question2 = Question.create_from_text(self.client_context, "Hello", self.client_context.bot.sentence_splitter)
conversation.record_dialog(question2)
conversation.recalculate_sentiment_score(self.client_context)
result = extension.execute(self.client_context, "SENTIMENT FEELING LAST 1")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT FEELING NEUTRAL AND COMPLETELY OBJECTIVE", result)
def test_sentiment_feeling_last_10(self):
extension = SentimentExtension()
self.assertIsNotNone(extension)
# Need to create a conversation first
conversation = self.client_context.bot.get_conversation(self.client_context)
self.assertIsNotNone(conversation)
question = Question.create_from_text(self.client_context, "Hello", self.client_context.bot.sentence_splitter)
conversation.record_dialog(question)
result = extension.execute(self.client_context, "SENTIMENT FEELING LAST 10")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT FEELING NEUTRAL AND NEUTRAL", result)
def test_sentiment_feeling_overall(self):
extension = SentimentExtension()
self.assertIsNotNone(extension)
result = extension.execute(self.client_context, "SENTIMENT FEELING OVERALL")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT FEELING NEUTRAL AND NEUTRAL", result)
def test_sentiment_feeling_no_scores(self):
extension = SentimentExtension()
self.assertIsNotNone(extension)
self.client_context.bot._sentiment_scores = None
result = extension.execute(self.client_context, "SENTIMENT FEELING OVERALL")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT FEELING NEUTRAL AND NEUTRAL", result)
def test_sentiment_feeling_disabled(self):
extension = SentimentExtension()
self.assertIsNotNone(extension)
self.client_context.bot._sentiment_analyser = None
result = extension.execute(self.client_context, "SENTIMENT FEELING LAST 1")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT DISABLED", result)
def test_sentiment_score_disabled(self):
extension = SentimentExtension()
self.assertIsNotNone(extension)
self.client_context.bot._sentiment_scores = None
result = extension.execute(self.client_context, "SENTIMENT SCORE I LIKE YOU")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT SCORES POSITIVITY UNKNOWN SUBJECTIVITY UNKNOWN", result)
def test_sentiment_sentiment_score(self):
extension = SentimentExtension()
self.assertIsNotNone(extension)
result = extension.execute(self.client_context, "SENTIMENT SCORE I LIKE YOU")
self.assertIsNotNone(result)
self.assertEqual("SENTIMENT SCORES POSITIVITY NEUTRAL SUBJECTIVITY COMPLETELY OBJECTIVE", result)
def test_sentiment_sentiment_positivity(self):
extension = SentimentExtension()
self.assertIsNotNone(extension)
result = extension.execute(self.client_context, "SENTIMENT POSITIVITY 0.1")
self.assertIsNotNone(result)
self.assertEqual("NEUTRAL", result)
    def test_sentiment_sentiment_subjectivity(self):
extension = SentimentExtension()
self.assertIsNotNone(extension)
result = extension.execute(self.client_context, "SENTIMENT SUBJECTIVITY 0.5")
self.assertIsNotNone(result)
self.assertEqual("NEUTRAL", result)
| 40.884774 | 118 | 0.724308 | 989 | 9,935 | 7.12639 | 0.085945 | 0.078036 | 0.125426 | 0.099603 | 0.867764 | 0.862656 | 0.854285 | 0.832293 | 0.822077 | 0.811294 | 0 | 0.00448 | 0.191142 | 9,935 | 242 | 119 | 41.053719 | 0.872573 | 0.007146 | 0 | 0.666667 | 0 | 0 | 0.16359 | 0.011359 | 0 | 0 | 0 | 0 | 0.431034 | 1 | 0.103448 | false | 0 | 0.034483 | 0 | 0.143678 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
16cf9ede9ecd5a62b0bffd86d1ce58cfb671e75a | 1,868 | py | Python | masker.py | cjfro/ivolo | 78ecf19918c87989da64201536807b35ef7a6238 | [
"MIT"
] | null | null | null | masker.py | cjfro/ivolo | 78ecf19918c87989da64201536807b35ef7a6238 | [
"MIT"
] | null | null | null | masker.py | cjfro/ivolo | 78ecf19918c87989da64201536807b35ef7a6238 | [
"MIT"
] | null | null | null | import numpy as np
from config import *
def bottom_up(m):
'''
Creates a mask of which leds to turn on given an amplitude.
This mask lights a strip of LEDs starting at the bottom and reaching higher
with increasing amplitude
Arguments:
m (float): The amplitude in 0 to 1
Returns:
A [LED_1_COUNT x 3] array of zeros and ones
'''
num_leds_on = m * LED_1_COUNT
return np.tile(np.arange(LED_1_COUNT) < num_leds_on, (3,1)).T
def top_down(m):
'''
Creates a mask of which leds to turn on given an amplitude.
This mask lights a strip of LEDs starting at the top and reaching lower
with increasing amplitude
Arguments:
m (float): The amplitude in 0 to 1
Returns:
A [LED_1_COUNT x 3] array of zeros and ones
'''
num_leds_on = m * LED_1_COUNT
return np.tile(LED_1_COUNT - np.arange(LED_1_COUNT) < num_leds_on, (3,1)).T
def middle_out(m):
'''
Creates a mask of which leds to turn on given an amplitude.
This mask lights a strip of LEDs starting at the middle and reaching out
with increasing amplitude
Arguments:
m (float): The amplitude in 0 to 1
Returns:
A [LED_1_COUNT x 3] array of zeros and ones
'''
num_leds_on = m * LED_1_COUNT
return np.tile(np.abs(LED_1_COUNT/2.0 - np.arange(LED_1_COUNT)) < num_leds_on/2., (3,1)).T
def clamp(m):
'''
Creates a mask of which leds to turn on given an amplitude.
This mask lights a strip of LEDs starting at the top and bottom and reaching towards
the middle with increasing amplitude
Arguments:
m (float): The amplitude in 0 to 1
Returns:
A [LED_1_COUNT x 3] array of zeros and ones
'''
num_leds_on = (1. - m) * LED_1_COUNT
return 1 - np.tile(np.abs(LED_1_COUNT/2.0 - np.arange(LED_1_COUNT)) < num_leds_on/2., (3,1)).T
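Each mask above is a `(LED_1_COUNT, 3)` boolean array: one row per LED, with the same on/off value repeated across the three colour channels by `np.tile(..., (3,1)).T`. A dependency-free rerun of the same arithmetic with a small hypothetical strip length, to show how amplitude maps to lit LEDs:

```python
LED_1_COUNT = 8  # tiny strip for the demo; the real value comes from config

def bottom_up(m):
    # Same comparison as the NumPy version: LED i is on while i < m * count.
    threshold = m * LED_1_COUNT
    return [[1, 1, 1] if i < threshold else [0, 0, 0] for i in range(LED_1_COUNT)]

def middle_out(m):
    # LED i is on while its distance from the strip centre is under half
    # the lit span, so the band grows outward from the middle.
    half_span = m * LED_1_COUNT / 2.0
    centre = LED_1_COUNT / 2.0
    return [[1, 1, 1] if abs(centre - i) < half_span else [0, 0, 0]
            for i in range(LED_1_COUNT)]

print(sum(row[0] for row in bottom_up(0.5)))   # 4
print(sum(row[0] for row in middle_out(0.5)))  # 3
```

Note that `middle_out` lights one fewer LED at the same amplitude here because the strict `<` comparison excludes both edges of the band on an even-length strip.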
| 29.650794 | 98 | 0.658994 | 331 | 1,868 | 3.570997 | 0.181269 | 0.050761 | 0.114213 | 0.043993 | 0.85956 | 0.846024 | 0.846024 | 0.846024 | 0.846024 | 0.846024 | 0 | 0.031205 | 0.262313 | 1,868 | 62 | 99 | 30.129032 | 0.82656 | 0.586724 | 0 | 0.214286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.142857 | 0 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
bcda6baa17e500f10937b83f3ab48190e22eae4b | 10,516 | py | Python | payfast/south_migrations/0001_initial.py | reinbach/django-payfast-convert | 20267f76252cc61eba582d06694ce8e24ed413f0 | [
"MIT"
] | 2 | 2017-11-16T16:47:55.000Z | 2018-01-02T17:00:09.000Z | payfast/south_migrations/0001_initial.py | reinbach/django-payfast-convert | 20267f76252cc61eba582d06694ce8e24ed413f0 | [
"MIT"
] | null | null | null | payfast/south_migrations/0001_initial.py | reinbach/django-payfast-convert | 20267f76252cc61eba582d06694ce8e24ed413f0 | [
"MIT"
] | 2 | 2019-08-06T11:57:35.000Z | 2020-09-01T15:50:51.000Z | # encoding: utf-8
import datetime
from south.db import db
from south.v2 import SchemaMigration
from django.db import models
class Migration(SchemaMigration):
def forwards(self, orm):
# Adding model 'PayFastOrder'
db.create_table('payfast_payfastorder', (
('m_payment_id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
('pf_payment_id', self.gf('django.db.models.fields.CharField')(max_length=40, unique=True, null=True, blank=True)),
('payment_status', self.gf('django.db.models.fields.CharField')(max_length=20, null=True, blank=True)),
('item_name', self.gf('django.db.models.fields.CharField')(max_length=100)),
('item_description', self.gf('django.db.models.fields.CharField')(max_length=255, null=True, blank=True)),
('amount_gross', self.gf('django.db.models.fields.DecimalField')(null=True, max_digits=15, decimal_places=2, blank=True)),
('amount_fee', self.gf('django.db.models.fields.DecimalField')(null=True, max_digits=15, decimal_places=2, blank=True)),
('amount_net', self.gf('django.db.models.fields.DecimalField')(null=True, max_digits=15, decimal_places=2, blank=True)),
('custom_str1', self.gf('django.db.models.fields.CharField')(max_length=255, null=True, blank=True)),
('custom_str2', self.gf('django.db.models.fields.CharField')(max_length=255, null=True, blank=True)),
('custom_str3', self.gf('django.db.models.fields.CharField')(max_length=255, null=True, blank=True)),
('custom_str4', self.gf('django.db.models.fields.CharField')(max_length=255, null=True, blank=True)),
('custom_str5', self.gf('django.db.models.fields.CharField')(max_length=255, null=True, blank=True)),
('custom_int1', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
('custom_int2', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
('custom_int3', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
('custom_int4', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
('custom_int5', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
('name_first', self.gf('django.db.models.fields.CharField')(max_length=100, null=True, blank=True)),
('name_last', self.gf('django.db.models.fields.CharField')(max_length=100, null=True, blank=True)),
('email_address', self.gf('django.db.models.fields.CharField')(max_length=100, null=True, blank=True)),
('merchant_id', self.gf('django.db.models.fields.CharField')(max_length=15)),
('signature', self.gf('django.db.models.fields.CharField')(max_length=32, null=True, blank=True)),
('created_at', self.gf('django.db.models.fields.DateTimeField')(default=datetime.datetime.now)),
('updated_at', self.gf('django.db.models.fields.DateTimeField')(default=datetime.datetime.now)),
('request_ip', self.gf('django.db.models.fields.IPAddressField')(max_length=15, null=True, blank=True)),
('debug_info', self.gf('django.db.models.fields.CharField')(max_length=255, null=True, blank=True)),
('trusted', self.gf('django.db.models.fields.NullBooleanField')(default=None, null=True, blank=True)),
('user', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['auth.User'], null=True, blank=True)),
))
db.send_create_signal('payfast', ['PayFastOrder'])
def backwards(self, orm):
# Deleting model 'PayFastOrder'
db.delete_table('payfast_payfastorder')
models = {
'auth.group': {
'Meta': {'object_name': 'Group'},
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '80'}),
'permissions': ('django.db.models.fields.related.ManyToManyField', [], {'to': "orm['auth.Permission']", 'symmetrical': 'False', 'blank': 'True'})
},
'auth.permission': {
'Meta': {'ordering': "('content_type__app_label', 'content_type__model', 'codename')", 'unique_together': "(('content_type', 'codename'),)", 'object_name': 'Permission'},
'codename': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
'content_type': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['contenttypes.ContentType']"}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '50'})
},
'auth.user': {
'Meta': {'object_name': 'User'},
'date_joined': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime.now'}),
'email': ('django.db.models.fields.EmailField', [], {'max_length': '75', 'blank': 'True'}),
'first_name': ('django.db.models.fields.CharField', [], {'max_length': '30', 'blank': 'True'}),
'groups': ('django.db.models.fields.related.ManyToManyField', [], {'to': "orm['auth.Group']", 'symmetrical': 'False', 'blank': 'True'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'is_active': ('django.db.models.fields.BooleanField', [], {'default': 'True'}),
'is_staff': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
'is_superuser': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
'last_login': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime.now'}),
'last_name': ('django.db.models.fields.CharField', [], {'max_length': '30', 'blank': 'True'}),
'password': ('django.db.models.fields.CharField', [], {'max_length': '128'}),
'user_permissions': ('django.db.models.fields.related.ManyToManyField', [], {'to': "orm['auth.Permission']", 'symmetrical': 'False', 'blank': 'True'}),
'username': ('django.db.models.fields.CharField', [], {'unique': 'True', 'max_length': '30'})
},
'contenttypes.contenttype': {
'Meta': {'ordering': "('name',)", 'unique_together': "(('app_label', 'model'),)", 'object_name': 'ContentType', 'db_table': "'django_content_type'"},
'app_label': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'model': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '100'})
},
'payfast.payfastorder': {
'Meta': {'object_name': 'PayFastOrder'},
'amount_fee': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '15', 'decimal_places': '2', 'blank': 'True'}),
'amount_gross': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '15', 'decimal_places': '2', 'blank': 'True'}),
'amount_net': ('django.db.models.fields.DecimalField', [], {'null': 'True', 'max_digits': '15', 'decimal_places': '2', 'blank': 'True'}),
'created_at': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime.now'}),
'custom_int1': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'custom_int2': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'custom_int3': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'custom_int4': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'custom_int5': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'custom_str1': ('django.db.models.fields.CharField', [], {'max_length': '255', 'null': 'True', 'blank': 'True'}),
'custom_str2': ('django.db.models.fields.CharField', [], {'max_length': '255', 'null': 'True', 'blank': 'True'}),
'custom_str3': ('django.db.models.fields.CharField', [], {'max_length': '255', 'null': 'True', 'blank': 'True'}),
'custom_str4': ('django.db.models.fields.CharField', [], {'max_length': '255', 'null': 'True', 'blank': 'True'}),
'custom_str5': ('django.db.models.fields.CharField', [], {'max_length': '255', 'null': 'True', 'blank': 'True'}),
'debug_info': ('django.db.models.fields.CharField', [], {'max_length': '255', 'null': 'True', 'blank': 'True'}),
'email_address': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True', 'blank': 'True'}),
'item_description': ('django.db.models.fields.CharField', [], {'max_length': '255', 'null': 'True', 'blank': 'True'}),
'item_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
'm_payment_id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'merchant_id': ('django.db.models.fields.CharField', [], {'max_length': '15'}),
'name_first': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True', 'blank': 'True'}),
'name_last': ('django.db.models.fields.CharField', [], {'max_length': '100', 'null': 'True', 'blank': 'True'}),
'payment_status': ('django.db.models.fields.CharField', [], {'max_length': '20', 'null': 'True', 'blank': 'True'}),
'pf_payment_id': ('django.db.models.fields.CharField', [], {'max_length': '40', 'unique': 'True', 'null': 'True', 'blank': 'True'}),
'request_ip': ('django.db.models.fields.IPAddressField', [], {'max_length': '15', 'null': 'True', 'blank': 'True'}),
'signature': ('django.db.models.fields.CharField', [], {'max_length': '32', 'null': 'True', 'blank': 'True'}),
'trusted': ('django.db.models.fields.NullBooleanField', [], {'default': 'None', 'null': 'True', 'blank': 'True'}),
'updated_at': ('django.db.models.fields.DateTimeField', [], {'default': 'datetime.datetime.now'}),
'user': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['auth.User']", 'null': 'True', 'blank': 'True'})
}
}
complete_apps = ['payfast']
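The `models` dict above is South's "frozen" snapshot of the schema: each field maps to a `(dotted field class, args, kwargs)` triple. A small helper that reads such a spec back into something human-scannable, with sample entries copied from the snapshot above:

```python
frozen_payfastorder = {
    'item_name': ('django.db.models.fields.CharField', [], {'max_length': '100'}),
    'amount_gross': ('django.db.models.fields.DecimalField', [],
                     {'null': 'True', 'max_digits': '15',
                      'decimal_places': '2', 'blank': 'True'}),
    'trusted': ('django.db.models.fields.NullBooleanField', [],
                {'default': 'None', 'null': 'True', 'blank': 'True'}),
}

def field_summary(spec):
    # Keep just the short class name per field; args and kwargs are ignored.
    return {name: path.rsplit('.', 1)[-1]
            for name, (path, _args, _kwargs) in spec.items()}

print(field_summary(frozen_payfastorder))
```

This is the same triple shape South itself rebuilds field objects from when it replays the frozen ORM for a migration.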
| 84.806452 | 182 | 0.597851 | 1,194 | 10,516 | 5.137353 | 0.11474 | 0.108249 | 0.187154 | 0.267362 | 0.821324 | 0.817737 | 0.79312 | 0.772579 | 0.726117 | 0.649006 | 0 | 0.01754 | 0.170502 | 10,516 | 123 | 183 | 85.495935 | 0.685659 | 0.006942 | 0 | 0.036036 | 0 | 0 | 0.523039 | 0.30022 | 0 | 0 | 0 | 0 | 0 | 1 | 0.018018 | false | 0.009009 | 0.036036 | 0 | 0.081081 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
4c0b3ab53bd0e0a437983858877aa1c6af1824d1 | 2,882 | py | Python | tests/keras/applications/applications_test.py | mikezsx/dlstudy | 6572934f9a7c4ba498300186c2d297994c43900d | [
"MIT"
] | 3 | 2018-01-27T06:15:26.000Z | 2019-12-27T16:51:54.000Z | tests/keras/applications/applications_test.py | candleinwindsteve/keras | 9eb7ecd3e525c9cff31ebd59a96794f212ca5e1e | [
"MIT"
] | null | null | null | tests/keras/applications/applications_test.py | candleinwindsteve/keras | 9eb7ecd3e525c9cff31ebd59a96794f212ca5e1e | [
"MIT"
] | 3 | 2020-02-24T15:16:05.000Z | 2020-05-09T05:29:53.000Z | import pytest
from keras.utils.test_utils import keras_test
from keras import applications
from keras import backend as K
@keras_test
def test_resnet50():
model = applications.ResNet50(weights=None)
assert model.output_shape == (None, 1000)
@keras_test
def test_resnet50_notop():
model = applications.ResNet50(weights=None, include_top=False)
assert model.output_shape == (None, None, None, 2048)
@keras_test
def test_resnet50_pooling():
model = applications.ResNet50(weights=None,
include_top=False,
pooling='avg')
assert model.output_shape == (None, 2048)
@keras_test
def test_vgg16():
model = applications.VGG16(weights=None)
assert model.output_shape == (None, 1000)
@keras_test
def test_vgg16_notop():
model = applications.VGG16(weights=None, include_top=False)
assert model.output_shape == (None, None, None, 512)
@keras_test
def test_vgg16_pooling():
model = applications.VGG16(weights=None, include_top=False, pooling='avg')
assert model.output_shape == (None, 512)
@keras_test
def test_vgg19():
model = applications.VGG19(weights=None)
assert model.output_shape == (None, 1000)
@keras_test
def test_vgg19_notop():
model = applications.VGG16(weights=None, include_top=False)
assert model.output_shape == (None, None, None, 512)
@keras_test
def test_vgg19_pooling():
model = applications.VGG16(weights=None, include_top=False, pooling='avg')
assert model.output_shape == (None, 512)
@keras_test
@pytest.mark.skipif((K.backend() != 'tensorflow'),
reason='Requires tensorflow backend')
def test_xception():
model = applications.Xception(weights=None)
assert model.output_shape == (None, 1000)
@keras_test
@pytest.mark.skipif((K.backend() != 'tensorflow'),
reason='Requires tensorflow backend')
def test_xception_notop():
model = applications.Xception(weights=None, include_top=False)
assert model.output_shape == (None, None, None, 2048)
@keras_test
@pytest.mark.skipif((K.backend() != 'tensorflow'),
reason='Requires tensorflow backend')
def test_xception_pooling():
model = applications.Xception(weights=None, include_top=False, pooling='avg')
assert model.output_shape == (None, 2048)
@keras_test
def test_inceptionv3():
model = applications.InceptionV3(weights=None)
assert model.output_shape == (None, 1000)
@keras_test
def test_inceptionv3_notop():
model = applications.InceptionV3(weights=None, include_top=False)
assert model.output_shape == (None, None, None, 2048)
@keras_test
def test_inceptionv3_pooling():
model = applications.InceptionV3(weights=None, include_top=False, pooling='avg')
assert model.output_shape == (None, 2048)
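The expected shapes across all these tests follow one rule: a classifier top yields `(None, classes)`, `include_top=False` keeps the 4-D feature map with unknown spatial dims, and global pooling collapses those dims to `(None, channels)`. Encoded as a hypothetical helper, with per-architecture channel counts taken from the assertions above:

```python
def expected_output_shape(channels, classes=1000, include_top=True, pooling=None):
    if include_top:
        return (None, classes)           # classifier head: batch x classes
    if pooling in ('avg', 'max'):
        return (None, channels)          # global pooling removes H and W
    return (None, None, None, channels)  # raw feature map, H/W unknown

print(expected_output_shape(2048))                                # ResNet50 with top
print(expected_output_shape(512, include_top=False))              # VGG16 feature map
print(expected_output_shape(2048, include_top=False, pooling='avg'))
```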
if __name__ == '__main__':
pytest.main([__file__])
# Source: tests/EVM/test_EVMBYTE.py (mroll/manticore, Apache-2.0)
import unittest

from manticore.platforms import evm
from manticore.core.smtlib import ConstraintSet


class EVMTest_BYTE(unittest.TestCase):
    _multiprocess_can_split_ = True
    maxDiff = None

    def _execute(self, new_vm):
        # Run a single instruction and map Manticore's EVM exceptions to
        # the result names the assertions below compare against.
        last_returned = None
        last_exception = None
        try:
            new_vm.execute()
        except evm.Stop:
            last_exception = "STOP"
        except evm.NotEnoughGas:
            last_exception = "OOG"
        except evm.StackUnderflow:
            last_exception = "INSUFICIENT STACK"
        except evm.InvalidOpcode:
            last_exception = "INVALID"
        except evm.SelfDestruct:
            last_exception = "SUICIDED"
        except evm.Return as e:
            last_exception = "RETURN"
            last_returned = e.data
        except evm.Revert:
            last_exception = "REVERT"
        return last_exception, last_returned
def test_BYTE_1(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935L)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_2(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935L)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [255L])
def test_BYTE_3(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935L)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [255L])
def test_BYTE_4(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935L)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_5(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935L)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_6(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935L)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [255L])
def test_BYTE_7(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935L)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_8(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935L)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_9(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935L)
new_vm._push(6089590155545428825848686802984512581899718912L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_10(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_11(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(0)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_12(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(0)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_13(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_14(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_15(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(0)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_16(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(0)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_17(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(0)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_18(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_19(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_20(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(1)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_21(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(1)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_22(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_23(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_24(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(1)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_25(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(1)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_26(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(1)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_27(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_28(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952L)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_29(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952L)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [127L])
def test_BYTE_30(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952L)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [255L])
def test_BYTE_31(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952L)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_32(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952L)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_33(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952L)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [255L])
def test_BYTE_34(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952L)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_35(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952L)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_36(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952L)
new_vm._push(6089590155545428825848686802984512581899718912L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_37(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263L)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_38(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263L)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [8L])
def test_BYTE_39(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263L)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_40(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263L)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_41(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263L)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_42(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263L)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_43(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263L)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_44(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263L)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_45(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263L)
new_vm._push(6089590155545428825848686802984512581899718912L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_46(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_47(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(16)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_48(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(16)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_49(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_50(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_51(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(16)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_52(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(16)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_53(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(16)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_54(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_55(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_56(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(32)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_57(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(32)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_58(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_59(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_60(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(32)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_61(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(32)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_62(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(32)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_63(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_64(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_65(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(48)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_66(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(48)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_67(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_68(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_69(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(48)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_70(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(48)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_71(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(48)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_72(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_73(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(6089590155545428825848686802984512581899718912L)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_74(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(6089590155545428825848686802984512581899718912L)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_75(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(6089590155545428825848686802984512581899718912L)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0L])
def test_BYTE_76(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(6089590155545428825848686802984512581899718912L)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_77(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(6089590155545428825848686802984512581899718912L)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_78(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(6089590155545428825848686802984512581899718912L)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17L])
def test_BYTE_79(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(6089590155545428825848686802984512581899718912L)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_80(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(6089590155545428825848686802984512581899718912L)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_BYTE_81(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x1a'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, origin, price, data, caller, value, bytecode, header, gas=gas, global_storage=world.storage)
new_vm._push(6089590155545428825848686802984512581899718912L)
new_vm._push(6089590155545428825848686802984512581899718912L)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
if __name__ == '__main__':
unittest.main()
# coding: utf-8
"""
Tradenity API
Tradenity eCommerce Rest API
Contact: support@tradenity.com
"""
from __future__ import absolute_import
import re
import pprint
# python 2 and python 3 compatibility library
import six
from tradenity.api_client import ApiClient
class Collection(object):
swagger_types = {
'id': 'str',
'meta': 'InstanceMeta',
'name': 'str',
'slug': 'str',
'status': 'str',
'description': 'str',
'products': 'list[Product]'
}
attribute_map = {
'id': 'id',
'meta': '__meta',
'name': 'name',
'slug': 'slug',
'status': 'status',
'description': 'description',
'products': 'products'
}
api_client = None
def __init__(self, id=None, meta=None, name=None, slug=None, status=None, description=None, products=None):
"""Collection - a model defined in Swagger"""
self._id = id
self._meta = None
self._name = None
self._slug = None
self._status = None
self._description = None
self._products = None
self.discriminator = None
if meta is not None:
self.meta = meta
self.name = name
self.slug = slug
self.status = status
if description is not None:
self.description = description
if products is not None:
self.products = products
@property
def id(self):
if self._id:
return self._id
elif self.meta is None:
return None
else:
self._id = self.meta.href.split("/")[-1]
return self._id
@id.setter
def id(self, new_id):
self._id = new_id
@property
def meta(self):
"""Gets the meta of this Collection.
:return: The meta of this Collection.
:rtype: InstanceMeta
"""
return self._meta
@meta.setter
def meta(self, meta):
"""Sets the meta of this Collection.
:param meta: The meta of this Collection.
:type: InstanceMeta
"""
self._meta = meta
@property
def name(self):
"""Gets the name of this Collection.
:return: The name of this Collection.
:rtype: str
"""
return self._name
@name.setter
def name(self, name):
"""Sets the name of this Collection.
:param name: The name of this Collection.
:type: str
"""
self._name = name
@property
def slug(self):
"""Gets the slug of this Collection.
:return: The slug of this Collection.
:rtype: str
"""
return self._slug
@slug.setter
def slug(self, slug):
"""Sets the slug of this Collection.
:param slug: The slug of this Collection.
:type: str
"""
self._slug = slug
@property
def status(self):
"""Gets the status of this Collection.
:return: The status of this Collection.
:rtype: str
"""
return self._status
@status.setter
def status(self, status):
"""Sets the status of this Collection.
:param status: The status of this Collection.
:type: str
"""
allowed_values = ["enabled", "disabled"]
if status is not None and status not in allowed_values:
raise ValueError(
"Invalid value for `status` ({0}), must be one of {1}"
.format(status, allowed_values)
)
self._status = status
@property
def description(self):
"""Gets the description of this Collection.
:return: The description of this Collection.
:rtype: str
"""
return self._description
@description.setter
def description(self, description):
"""Sets the description of this Collection.
:param description: The description of this Collection.
:type: str
"""
self._description = description
@property
def products(self):
"""Gets the products of this Collection.
:return: The products of this Collection.
:rtype: list[Product]
"""
return self._products
@products.setter
def products(self, products):
"""Sets the products of this Collection.
:param products: The products of this Collection.
:type: list[Product]
"""
self._products = products
def to_dict(self):
"""Returns the model properties as a dict"""
result = {}
for attr, _ in six.iteritems(self.swagger_types):
value = getattr(self, attr)
if isinstance(value, list):
result[attr] = list(map(
lambda x: x.to_dict() if hasattr(x, "to_dict") else x,
value
))
elif hasattr(value, "to_dict"):
result[attr] = value.to_dict()
elif isinstance(value, dict):
result[attr] = dict(map(
lambda item: (item[0], item[1].to_dict())
if hasattr(item[1], "to_dict") else item,
value.items()
))
else:
result[attr] = value
if issubclass(Collection, dict):
for key, value in self.items():
result[key] = value
return result
def to_str(self):
"""Returns the string representation of the model"""
return pprint.pformat(self.to_dict())
def __repr__(self):
"""For `print` and `pprint`"""
return self.to_str()
def __eq__(self, other):
"""Returns true if both objects are equal"""
if not isinstance(other, Collection):
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
"""Returns true if both objects are not equal"""
return not self == other
@classmethod
def get_api_client(cls):
if cls.api_client is None:
cls.api_client = ApiClient.instance()
return cls.api_client
@classmethod
def find_all(cls, **kwargs):
return cls.list_all_collections(**kwargs)
@classmethod
def find_all_by(cls, **kwargs):
return cls.list_all_collections(**kwargs)
@classmethod
def find_one_by(cls, **kwargs):
results = cls.list_all_collections(**kwargs)
if len(results) > 0:
return results[0]
@classmethod
def find_by_id(cls, id):
return cls.get_collection_by_id(id)
def create(self):
new_instance = self.create_collection(self)
self.id = new_instance.id
return self
def update(self):
return self.update_collection_by_id(self.id, self)
def delete(self):
return self.delete_collection_by_id(self.id)
@classmethod
def create_collection(cls, collection, **kwargs):
"""Create Collection
Create a new Collection
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.create_collection(collection, async=True)
>>> result = thread.get()
:param async bool
:param Collection collection: Attributes of collection to create (required)
:return: Collection
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return cls._create_collection_with_http_info(collection, **kwargs)
else:
(data) = cls._create_collection_with_http_info(collection, **kwargs)
return data
@classmethod
def _create_collection_with_http_info(cls, collection, **kwargs):
"""Create Collection
Create a new Collection
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.create_collection_with_http_info(collection, async=True)
>>> result = thread.get()
:param async bool
:param Collection collection: Attributes of collection to create (required)
:return: Collection
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['collection']
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
query_params = []
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
query_params.append((key, val))
params[key] = val
del params['kwargs']
# verify the required parameter 'collection' is set
if ('collection' not in params or
params['collection'] is None):
raise ValueError("Missing the required parameter `collection` when calling `create_collection`")
collection_formats = {}
path_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'collection' in params:
body_params = params['collection']
# HTTP header `Accept`
header_params['Accept'] = cls.get_api_client().select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = cls.get_api_client().select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = []
return cls.get_api_client().call_api(
'/collections', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Collection',
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
@classmethod
def delete_collection_by_id(cls, collection_id, **kwargs):
"""Delete Collection
Delete an instance of Collection by its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.delete_collection_by_id(collection_id, async=True)
>>> result = thread.get()
:param async bool
:param str collection_id: ID of collection to delete. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return cls._delete_collection_by_id_with_http_info(collection_id, **kwargs)
else:
(data) = cls._delete_collection_by_id_with_http_info(collection_id, **kwargs)
return data
@classmethod
def _delete_collection_by_id_with_http_info(cls, collection_id, **kwargs):
"""Delete Collection
Delete an instance of Collection by its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.delete_collection_by_id_with_http_info(collection_id, async=True)
>>> result = thread.get()
:param async bool
:param str collection_id: ID of collection to delete. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['collection_id']
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
query_params = []
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
query_params.append((key, val))
params[key] = val
del params['kwargs']
# verify the required parameter 'collection_id' is set
if ('collection_id' not in params or
params['collection_id'] is None):
raise ValueError("Missing the required parameter `collection_id` when calling `delete_collection_by_id`")
collection_formats = {}
path_params = {}
if 'collection_id' in params:
path_params['collectionId'] = params['collection_id']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = cls.get_api_client().select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = cls.get_api_client().select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = []
return cls.get_api_client().call_api(
'/collections/{collectionId}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
@classmethod
def get_collection_by_id(cls, collection_id, **kwargs):
"""Find Collection
Return single instance of Collection by its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_collection_by_id(collection_id, async=True)
>>> result = thread.get()
:param async bool
:param str collection_id: ID of collection to return (required)
:return: Collection
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return cls._get_collection_by_id_with_http_info(collection_id, **kwargs)
else:
(data) = cls._get_collection_by_id_with_http_info(collection_id, **kwargs)
return data
@classmethod
def _get_collection_by_id_with_http_info(cls, collection_id, **kwargs):
"""Find Collection
Return single instance of Collection by its ID.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.get_collection_by_id_with_http_info(collection_id, async=True)
>>> result = thread.get()
:param async bool
:param str collection_id: ID of collection to return (required)
:return: Collection
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['collection_id']
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
query_params = []
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
query_params.append((key, val))
params[key] = val
del params['kwargs']
# verify the required parameter 'collection_id' is set
if ('collection_id' not in params or
params['collection_id'] is None):
raise ValueError("Missing the required parameter `collection_id` when calling `get_collection_by_id`")
collection_formats = {}
path_params = {}
if 'collection_id' in params:
path_params['collectionId'] = params['collection_id']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = cls.get_api_client().select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = cls.get_api_client().select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = []
return cls.get_api_client().call_api(
'/collections/{collectionId}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Collection',
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
@classmethod
def list_all_collections(cls, **kwargs):
"""List Collections
Return a list of Collections
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.list_all_collections(async=True)
>>> result = thread.get()
:param async bool
:param int page: page number
:param int size: page size
:param str sort: page order
:return: page[Collection]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return cls._list_all_collections_with_http_info(**kwargs)
else:
(data) = cls._list_all_collections_with_http_info(**kwargs)
return data
@classmethod
def _list_all_collections_with_http_info(cls, **kwargs):
"""List Collections
Return a list of Collections
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.list_all_collections_with_http_info(async=True)
>>> result = thread.get()
:param async bool
:param int page: page number
:param int size: page size
:param str sort: page order
:return: page[Collection]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['page', 'size', 'sort']
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
query_params = []
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
query_params.append((key, val))
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
if 'page' in params:
query_params.append(('page', params['page']))
if 'size' in params:
query_params.append(('size', params['size']))
if 'sort' in params:
query_params.append(('sort', params['sort']))
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = cls.get_api_client().select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = cls.get_api_client().select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = []
return cls.get_api_client().call_api(
'/collections', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='page[Collection]',
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
@classmethod
def replace_collection_by_id(cls, collection_id, collection, **kwargs):
"""Replace Collection
Replace all attributes of Collection
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.replace_collection_by_id(collection_id, collection, async=True)
>>> result = thread.get()
:param async bool
:param str collection_id: ID of collection to replace (required)
:param Collection collection: Attributes of collection to replace (required)
:return: Collection
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return cls._replace_collection_by_id_with_http_info(collection_id, collection, **kwargs)
else:
(data) = cls._replace_collection_by_id_with_http_info(collection_id, collection, **kwargs)
return data
@classmethod
def _replace_collection_by_id_with_http_info(cls, collection_id, collection, **kwargs):
"""Replace Collection
Replace all attributes of Collection
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.replace_collection_by_id_with_http_info(collection_id, collection, async=True)
>>> result = thread.get()
:param async bool
:param str collection_id: ID of collection to replace (required)
:param Collection collection: Attributes of collection to replace (required)
:return: Collection
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['collection_id', 'collection']
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
query_params = []
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
query_params.append((key, val))
params[key] = val
del params['kwargs']
# verify the required parameter 'collection_id' is set
if ('collection_id' not in params or
params['collection_id'] is None):
raise ValueError("Missing the required parameter `collection_id` when calling `replace_collection_by_id`")
# verify the required parameter 'collection' is set
if ('collection' not in params or
params['collection'] is None):
raise ValueError("Missing the required parameter `collection` when calling `replace_collection_by_id`")
collection_formats = {}
path_params = {}
if 'collection_id' in params:
path_params['collectionId'] = params['collection_id']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'collection' in params:
body_params = params['collection']
# HTTP header `Accept`
header_params['Accept'] = cls.get_api_client().select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = cls.get_api_client().select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = []
return cls.get_api_client().call_api(
'/collections/{collectionId}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Collection',
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
@classmethod
def update_collection_by_id(cls, collection_id, collection, **kwargs):
"""Update Collection
Update attributes of Collection
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.update_collection_by_id(collection_id, collection, async=True)
>>> result = thread.get()
:param async bool
:param str collection_id: ID of collection to update. (required)
:param Collection collection: Attributes of collection to update. (required)
:return: Collection
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return cls._update_collection_by_id_with_http_info(collection_id, collection, **kwargs)
else:
(data) = cls._update_collection_by_id_with_http_info(collection_id, collection, **kwargs)
return data
@classmethod
def _update_collection_by_id_with_http_info(cls, collection_id, collection, **kwargs):
"""Update Collection
Update attributes of Collection
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.update_collection_by_id_with_http_info(collection_id, collection, async=True)
>>> result = thread.get()
:param async bool
:param str collection_id: ID of collection to update. (required)
:param Collection collection: Attributes of collection to update. (required)
:return: Collection
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['collection_id', 'collection']
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
query_params = []
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
query_params.append((key, val))
params[key] = val
del params['kwargs']
# verify the required parameter 'collection_id' is set
if ('collection_id' not in params or
params['collection_id'] is None):
raise ValueError("Missing the required parameter `collection_id` when calling `update_collection_by_id`")
# verify the required parameter 'collection' is set
if ('collection' not in params or
params['collection'] is None):
raise ValueError("Missing the required parameter `collection` when calling `update_collection_by_id`")
collection_formats = {}
path_params = {}
if 'collection_id' in params:
path_params['collectionId'] = params['collection_id']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'collection' in params:
body_params = params['collection']
# HTTP header `Accept`
header_params['Accept'] = cls.get_api_client().select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = cls.get_api_client().select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = []
return cls.get_api_client().call_api(
'/collections/{collectionId}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Collection',
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
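The `to_dict` serialization used by this generated model (recursing into nested models and lists of models) can be sketched standalone. `Tag` below is a hypothetical minimal model invented for illustration, not part of the Tradenity SDK; it follows the same `swagger_types` convention as `Collection`:

```python
class Tag(object):
    # hypothetical minimal model following the same swagger-model conventions
    swagger_types = {'name': 'str', 'children': 'list[Tag]'}

    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

    def to_dict(self):
        # mirror of the generated to_dict: recurse into lists and nested models
        result = {}
        for attr in self.swagger_types:
            value = getattr(self, attr)
            if isinstance(value, list):
                result[attr] = [x.to_dict() if hasattr(x, 'to_dict') else x
                                for x in value]
            elif hasattr(value, 'to_dict'):
                result[attr] = value.to_dict()
            else:
                result[attr] = value
        return result

tree = Tag('root', [Tag('leaf')]).to_dict()
```

The same traversal appears verbatim in every generated model class, which is why `Collection.to_dict` also handles the dict case that `Tag` omits here.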
from math import sqrt
from secao import SecaoGenerica
from material import Material
class PerfilDeAço(SecaoGenerica):
"""
Esta classe define um seção tranversal de barra de formato genérico
de acordo com suas propriedades geométricas e seu material.
Parameter
----------
A: 'float'
área total da seção transversal
Ix: 'float'
momento de inércia a flexão do perfil em relação ao eixo X (horizontal)
que passa pelo centroide da seção.
Iy: 'float'
momento de inércia a flexão da perfil em relação ao eixo Y (Vertical)
que passa pelo centroide da seção.
J: 'float'
constante de torção da seção em relação ao centróide da seção
material: 'Material', 'list', 'dict'
material que compõe a seção.
Wx: 'float'
módulo elástico do perfil em relação ao eixo X (horizontal)
Wy: 'float'
módulo elástico do perfil em relação ao eixo Y (Vertical)
Zx: 'float'
módulo plástico do perfil em relação ao eixo X (horizontal)
xo: 'float'
coordenada x do centro de corte da seção trasnversal em relação ao
centróide da seção
yo: 'float'
coordenada y do centro de corte da seção trasnversal em relação ao
centróide da seção
Cw: 'float'
constante de empenamento do pefil
simetria: 'list'
indica se o perfil apresenta eixos de simetria
"""
def __init__(self, A, Ix, Iy, J, Wx, Wy, Zx, Zy, xo, yo, Cw, material, simetria):
if isinstance(material, list):
material = Material(*material)
if isinstance(material, dict):
material = Material(**material)
super().__init__(A, Ix, Iy, J, material, Wx, Wy, xo, yo, Cw, simetria)
self.Zx = Zx
self.Zy = Zy
self.esb_alma = None
self.esb_mesa = None
self.raiz_E_fy = sqrt(self.material.E / self.material.fy)
self.raiz_fy_E = sqrt(self.material.fy / self.material.E)
# -------------------------------------------------------------------------------------
# -------------------------------Strength checks---------------------------------------
# -------------------------------------------------------------------------------------
# --------------------------------NBR8800/2008-----------------------------------------
# TENSION
# --------
def resist_esc_secao_bruta_NBR8800(self, gama_a1=1.1):
"""
Determines the yield strength of the gross cross-section of a steel profile.
The yield strength of the gross section is determined according to item a) of
section 5.2.2 of NBR 8800:2008: the product of the total area of the profile and
the yield stress, with the result divided by the safety factor gama_a1.
Parameter
---------
gama_a1: 'float' (default=1.1)
safety factor gama_a1
Return
------
'float'
design yield strength of the gross section (A * fy / gama_a1)
"""
return self.A * self.material.fy / gama_a1
# COMPRESSION
# -----------
def par_esbeltez_limites_AL_Ncrd(self):
"""
Returns the limiting slenderness parameters for plastification and for the onset of
yielding, respectively, of the unstiffened (free-edge) elements that make up the
profile, according to items F.2.a) to F.2.d) of annex F of NBR 8800:2008.
"""
# Since this factor depends on the type of section, this method must be implemented in each
# profile-specific subclass; if the profile being created has no unstiffened (free-edge)
# elements, this method does not need to be implemented.
raise NotImplementedError
def ind_esbeltez_reduzido(self, klx, kly, klz, Q=1):
"""
Determines the reduced slenderness index of a steel bar of a given profile.
The reduced slenderness index is determined according to item 5.3.3.2 of
NBR 8800:2008, which sets an upper limit of 200 for its value.
Parameter
---------
klx:'float'
effective length for flexural buckling about the x axis
kly:'float'
effective length for flexural buckling about the y axis
klz:'float'
effective length for torsional buckling about the longitudinal z axis
Q:'float' (default = 1)
total reduction factor associated with local buckling
Return
------
ier: 'float'
reduced slenderness index
"""
Ne = self.Ne(klx, kly, klz)
ier = sqrt(Q * self.A * self.material.fy / Ne)
return ier
def fator_reducao_compressao(self, ier):
    """
    Determines the compression strength reduction factor (the chi factor).
    The reduction factor is determined according to item 5.3.3 of
    NBR 8800:2008 as a function of the reduced slenderness ratio of the bar.
    Parameter
    ---------
    ier: 'float'
        reduced slenderness ratio
    Return
    ------
    frc: 'float'
        compression reduction factor
    """
    if ier <= 1.5:
        frc = 0.658 ** (ier ** 2)
    else:
        frc = 0.877 / ier ** 2
    return frc
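# Illustrative standalone sketch of the chi reduction factor computed above
# (item 5.3.3 of NBR 8800:2008); the function name is hypothetical and not
# part of the class interface.
def _exemplo_fator_chi(ier):
    # inelastic branch 0.658 ** ier**2 up to ier = 1.5, elastic branch beyond
    if ier <= 1.5:
        return 0.658 ** (ier ** 2)
    return 0.877 / ier ** 2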
def fator_Qs(self):
    """
    Determines the local-buckling reduction factor Qs for free-edge
    (unstiffened) elements under compression.
    The Qs factor is determined according to item F.2 of annex F of NBR 8800:2008.
    """
    # Since this factor depends on the cross-section type, this method must be
    # implemented in each profile-specific class; if the profile has no
    # free-edge elements, the implementation should simply return 1.
    raise NotImplementedError

def fator_Qa(self, frc):
    """
    Determines the local-buckling reduction factor Qa for stiffened
    (supported-supported) elements under compression.
    The Qa factor is determined according to item F.3 of annex F of NBR 8800:2008.
    """
    # Since this factor depends on the cross-section type, this method must be
    # implemented in each profile-specific class; if the profile has no
    # stiffened elements, the implementation should simply return 1.
    raise NotImplementedError
def fator_Q(self, frc):
    """
    Determines the total reduction factor Q associated with local buckling.
    The Q factor is determined according to annex F of NBR 8800:2008 as the
    product of the factors Qa and Qs, associated respectively with local
    buckling of stiffened (supported-supported, AA) elements, such as the webs
    of I-sections, and of free-edge (supported-free, AL) elements, such as the
    flanges of I-sections.
    Parameter
    ---------
    frc: 'float'
        compression reduction factor
    Return
    ------
    'float'
    """
    return self.fator_Qa(frc) * self.fator_Qs()
def Ncrd_NBR8800(self, klx, kly, klz, gama_a1=1.1):
    """
    Determines the design compression resistance of a steel bar according to
    NBR 8800:2008.
    Parameter
    ---------
    klx: 'float'
        effective flexural buckling length about the X axis
    kly: 'float'
        effective flexural buckling length about the Y axis
    klz: 'float'
        effective torsional buckling length about the longitudinal Z axis
    gama_a1: 'float' (default = 1.1)
        strength reduction (safety) factor
    Return
    ------
    Ncrd: 'float'
        design compression resistance of the profile
    """
    ier = self.ind_esbeltez_reduzido(klx, kly, klz)
    frc = self.fator_reducao_compressao(ier)
    Q = self.fator_Q(frc)
    ier = self.ind_esbeltez_reduzido(klx, kly, klz, Q)
    frc = self.fator_reducao_compressao(ier)
    Ncrd = frc * Q * self.A * self.material.fy / gama_a1
    return Ncrd
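# Numeric sketch of the final expression in Ncrd_NBR8800 above; the sample
# values (frc = 0.6, Q = 1.0, A = 1000 mm2, fy = 250 MPa) are assumptions used
# only for illustration.
def _exemplo_ncrd(frc, Q, A, fy, gama_a1=1.1):
    # Ncrd = chi * Q * A * fy / gama_a1
    return frc * Q * A * fy / gama_a1
# For the sample values this gives 150000 / 1.1 N, i.e. about 136.4 kN.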
# SHEAR
# -----------
@property
def Awx(self):
    """ Effective shear area in the X direction """
    # Since this depends on the cross-section type, this property must be
    # implemented in each profile-specific class.
    raise NotImplementedError

@property
def Awy(self):
    """ Effective shear area in the Y direction """
    # Since this depends on the cross-section type, this property must be
    # implemented in each profile-specific class.
    raise NotImplementedError
def par_esbeltez_limites_Vrd(self, kv):
    """
    Returns the slenderness limits for plastification and for onset of
    yielding, respectively, according to item 5.4.3.1.1 of NBR 8800:2008.
    Parameter
    ---------
    kv: 'float'
        kv coefficient of the profile
    Returns: 'float'
    """
    return 1.1 * sqrt(kv) * self.raiz_E_fy, 1.37 * sqrt(kv) * self.raiz_E_fy

def Vpl(self, Aw):
    """
    Determines the shear plastification force of the web.
    The value of Vpl is determined according to item 5.4.3.1.2 of NBR 8800:2008.
    Parameter
    ---------
    Aw: 'float'
        effective shear area
    Return
    ------
    vpl: 'float'
        shear plastification force
    """
    vpl = 0.60 * Aw * self.material.fy
    return vpl
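# Standalone sketch of the shear slenderness limits above (item 5.4.3.1.1 of
# NBR 8800:2008); E = 200000 MPa and fy = 250 MPa are assumed example values.
def _exemplo_limites_vrd(kv, E=200000.0, fy=250.0):
    from math import sqrt
    raiz_E_fy = sqrt(E / fy)
    # (lambda_p, lambda_r): plastification limit and onset-of-yielding limit
    return 1.1 * sqrt(kv) * raiz_E_fy, 1.37 * sqrt(kv) * raiz_E_fy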
# SHEAR IN X
# -------------
def kv_Vrdx(self, a=None):
    """
    Returns the kv coefficient of the profile used to determine the shear
    resistance in the X direction.
    This coefficient accounts for the presence of stiffeners and for the
    shape of the profile in the shear resistance.
    Parameter
    ---------
    a: 'float'
        distance between stiffener centers.
    Return
    ------
    kv: 'float'
        kv coefficient
    """
    raise NotImplementedError

def Vrdx_NBR8800(self, a=None, gama_a1=1.1):
    """
    Determines the design shear resistance in the X direction of the profile
    according to NBR 8800:2008.
    The shear capacity of the cross-section is determined following section
    5.4.3 of NBR 8800:2008.
    Parameter
    ---------
    a: 'float'
        distance between stiffener axes
    gama_a1: 'float' (default = 1.1)
        strength reduction (safety) factor.
    Return
    ------
    Vrdx: 'float'
        design shear resistance in the X direction.
    """
    kv = self.kv_Vrdx(a)
    # elp = slenderness limit for plastification
    # elr = slenderness limit for onset of yielding
    elp, elr = self.par_esbeltez_limites_Vrd(kv)
    if self.esb_mesa <= elp:
        return self.Vpl(self.Awx) / gama_a1
    elif elp < self.esb_mesa <= elr:
        return (elp / self.esb_mesa) * (self.Vpl(self.Awx) / gama_a1)
    else:
        return 1.24 * (elp / self.esb_mesa) ** 2 * (self.Vpl(self.Awx) / gama_a1)
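# Standalone sketch of the three-branch shear strength reduction used above
# (section 5.4.3 of NBR 8800:2008); the slenderness limits and Vpl are plain
# inputs here instead of profile attributes.
def _exemplo_vrd(esb, elp, elr, vpl, gama_a1=1.1):
    if esb <= elp:
        # plastic range: full Vpl
        return vpl / gama_a1
    elif esb <= elr:
        # inelastic range: linear reduction
        return (elp / esb) * (vpl / gama_a1)
    # elastic range: quadratic reduction
    return 1.24 * (elp / esb) ** 2 * (vpl / gama_a1)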
# SHEAR IN Y
# -------------
def kv_Vrdy(self, a=None):
    """
    Returns the kv coefficient of the profile used to determine the shear
    resistance in the Y direction.
    This coefficient accounts for the presence of stiffeners and for the
    shape of the profile in the shear resistance.
    Parameter
    ---------
    a: 'float'
        distance between stiffener centers.
    Return
    ------
    kv: 'float'
        kv coefficient
    """
    raise NotImplementedError

def Vrdy_NBR8800(self, a=None, gama_a1=1.1):
    """
    Determines the design shear resistance in the Y direction of the profile
    according to NBR 8800:2008.
    The shear capacity of the cross-section is determined following section
    5.4.3 of NBR 8800:2008.
    Parameter
    ---------
    a: 'float'
        distance between stiffener axes
    gama_a1: 'float' (default = 1.1)
        strength reduction (safety) factor.
    Return
    ------
    Vrdy: 'float'
        design shear resistance in the Y direction
    """
    kv = self.kv_Vrdy(a)
    # elp = slenderness limit for plastification
    # elr = slenderness limit for onset of yielding
    elp, elr = self.par_esbeltez_limites_Vrd(kv)
    if self.esb_alma <= elp:
        return self.Vpl(self.Awy) / gama_a1
    elif elp < self.esb_alma <= elr:
        return (elp / self.esb_alma) * (self.Vpl(self.Awy) / gama_a1)
    else:
        return 1.24 * (elp / self.esb_alma) ** 2 * (self.Vpl(self.Awy) / gama_a1)
# BENDING MOMENT ABOUT X
# ------------
@property
def Mplx(self):
    """ Plastification moment of the section about the X axis """
    return self.Zx * self.material.fy

# FLT (lateral-torsional buckling) limit state
def indice_esbeltez_X(self, Lb):
    """
    Returns the slenderness ratio, for bending about the X axis, of a bar of
    unbraced length Lb made of this profile.
    Parameter
    ---------
    Lb: 'float'
        unbraced length of the bar
    Return
    ------
    'float'
        slenderness ratio
    """
    return Lb / self.ry

def par_esbeltez_limite_Mrdx_FLT(self):
    """
    Returns the slenderness limits for plastification and for onset of
    yielding, respectively, for lateral-torsional buckling of bars bent about
    the X axis, according to section G2 of annex G of NBR 8800:2008.
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mrx_FLT(self):
    """
    Returns the bending moment about X corresponding to the onset of yielding
    of the section, for the lateral-torsional buckling limit state.
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mcrx_FLT(self, cb, lb):
    """
    Returns the bending moment about X corresponding to elastic buckling, for
    the lateral-torsional buckling limit state.
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mnx_FLT(self, Cb, Lb):
    """
    Determines the nominal bending resistance about X of a bar for the
    ultimate limit state of lateral-torsional buckling.
    Parameter
    ---------
    Cb: 'float'
        Cb coefficient determined according to item 5.4.2.3 of NBR 8800:2008
    Lb: 'float'
        unbraced length of the bar
    Return
    ------
    'float'
        nominal bending resistance about the X axis
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

# FLM (local flange buckling) limit state
def par_esbeltez_limite_Mrdx_FLM(self):
    """
    Returns the slenderness limits for plastification and for onset of
    yielding, respectively, for local flange buckling of bars bent about the
    X axis, according to section G2 of annex G of NBR 8800:2008.
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mrx_FLM(self):
    """
    Returns the bending moment about X corresponding to the onset of yielding
    of the section, for the local flange buckling limit state.
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mcrx_FLM(self):
    """
    Returns the bending moment about X corresponding to elastic buckling, for
    the local flange buckling limit state.
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mnx_FLM(self):
    """
    Determines the nominal bending resistance about X of a bar for the
    ultimate limit state of local flange buckling.
    Return
    ------
    'float'
        nominal bending resistance about the X axis
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

# FLA (local web buckling) limit state
def par_esbeltez_limite_Mrdx_FLA(self):
    """
    Returns the slenderness limits for plastification and for onset of
    yielding, respectively, for local web buckling of bars bent about the
    X axis, according to section G2 of annex G of NBR 8800:2008.
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mrx_FLA(self):
    """
    Returns the bending moment about X corresponding to the onset of yielding
    of the section, for the local web buckling limit state.
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mcrx_FLA(self):
    """
    Returns the bending moment about X corresponding to elastic buckling, for
    the local web buckling limit state.
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mnx_FLA(self):
    """
    Determines the nominal bending resistance about X of a bar for the
    ultimate limit state of local web buckling.
    Return
    ------
    'float'
        nominal bending resistance about the X axis
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mrdx_NBR8800(self, Lb, gama_a1=1.1, Cb=1):
    """
    Computes the design bending resistance about the X axis of the profile for
    a bar of unbraced length Lb, according to NBR 8800:2008.
    Parameter
    ---------
    Lb: 'float'
        unbraced length of the bar
    Cb: 'float'
        Cb coefficient determined according to item 5.4.2.3 of NBR 8800:2008
    gama_a1: 'float'
        strength reduction (safety) factor
    Return
    ------
    'float'
        design bending resistance about the X axis
    """
    return min(self.Mnx_FLA(), self.Mnx_FLM(), self.Mnx_FLT(Cb, Lb)) / gama_a1
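# Illustrative sketch of the governing-limit-state selection in Mrdx_NBR8800
# above: the design moment is the smallest of the nominal FLA, FLM and FLT
# moments, divided by gama_a1. The numbers used in the test are arbitrary.
def _exemplo_mrd(mn_fla, mn_flm, mn_flt, gama_a1=1.1):
    return min(mn_fla, mn_flm, mn_flt) / gama_a1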
# BENDING MOMENT ABOUT Y
# ------------
@property
def Mply(self):
    """ Plastification moment of the section about the Y axis """
    return self.Zy * self.material.fy

# FLT (lateral-torsional buckling) limit state
def indice_esbeltez_Y(self, Lb):
    """
    Returns the slenderness ratio, for bending about the Y axis, of a bar of
    unbraced length Lb made of this profile.
    Parameter
    ---------
    Lb: 'float'
        unbraced length of the bar
    Return
    ------
    'float'
        slenderness ratio
    """
    return Lb / self.rx

def par_esbeltez_limite_Mrdy_FLT(self):
    """
    Returns the slenderness limits for plastification and for onset of
    yielding, respectively, for lateral-torsional buckling of bars bent about
    the Y axis, according to section G2 of annex G of NBR 8800:2008.
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mry_FLT(self):
    """
    Returns the bending moment about Y corresponding to the onset of yielding
    of the section, for the lateral-torsional buckling limit state.
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mcry_FLT(self, Cb):
    """
    Returns the bending moment about Y corresponding to elastic buckling, for
    the lateral-torsional buckling limit state.
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mny_FLT(self, Cb, Lb):
    """
    Determines the nominal bending resistance about Y of a bar for the
    ultimate limit state of lateral-torsional buckling.
    Return
    ------
    'float'
        nominal bending resistance about the Y axis
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

# FLM (local flange buckling) limit state
def par_esbeltez_limite_Mrdy_FLM(self):
    """
    Returns the slenderness limits for plastification and for onset of
    yielding, respectively, for local flange buckling of bars bent about the
    Y axis, according to section G2 of annex G of NBR 8800:2008.
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mry_FLM(self):
    """
    Returns the bending moment about Y corresponding to the onset of yielding
    of the section, for the local flange buckling limit state.
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mcry_FLM(self):
    """
    Returns the bending moment about Y corresponding to elastic buckling, for
    the local flange buckling limit state.
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mny_FLM(self):
    """
    Determines the nominal bending resistance about Y of a bar for the
    ultimate limit state of local flange buckling.
    Return
    ------
    'float'
        nominal bending resistance about the Y axis
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

# FLA (local web buckling) limit state
def par_esbeltez_limite_Mrdy_FLA(self):
    """
    Returns the slenderness limits for plastification and for onset of
    yielding, respectively, for local web buckling of bars bent about the
    Y axis, according to section G2 of annex G of NBR 8800:2008.
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mry_FLA(self):
    """
    Returns the bending moment about Y corresponding to the onset of yielding
    of the section, for the local web buckling limit state.
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mcry_FLA(self):
    """
    Returns the bending moment about Y corresponding to elastic buckling, for
    the local web buckling limit state.
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mny_FLA(self):
    """
    Determines the nominal bending resistance about Y of a bar for the
    ultimate limit state of local web buckling.
    Return
    ------
    'float'
        nominal bending resistance about the Y axis
    """
    # Since this depends on the cross-section type, this method must be
    # implemented in each profile-specific class.
    raise NotImplementedError

def Mrdy_NBR8800(self, Lb, Cb=1, gama_a1=1.1):
    """
    Computes the design bending resistance about the Y axis of the profile for
    a bar of unbraced length Lb, according to NBR 8800:2008.
    Parameter
    ---------
    Lb: 'float'
        unbraced length of the bar
    Cb: 'float'
        Cb coefficient determined according to item 5.4.2.3 of NBR 8800:2008
    gama_a1: 'float'
        strength reduction (safety) factor
    Return
    ------
    'float'
        design bending resistance about the Y axis
    """
    return min(self.Mny_FLA(), self.Mny_FLM(), self.Mny_FLT(Cb, Lb)) / gama_a1
| 32.621687 | 118 | 0.611095 | 3,536 | 27,076 | 4.638575 | 0.083428 | 0.021949 | 0.027314 | 0.033167 | 0.834532 | 0.809779 | 0.77521 | 0.761614 | 0.72668 | 0.701012 | 0 | 0.018682 | 0.315962 | 27,076 | 829 | 119 | 32.661037 | 0.866908 | 0.642414 | 0 | 0.3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.049457 | 0 | 1 | 0.335714 | false | 0 | 0.021429 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
d5bc27e926fc88578be001718343e11fb07f106b | 313 | py | Python | tests/internal/ipv6/test_ipv6_false_auto.py | frolovv/aws.ec2.compare | 582805823492f833d65c0441c4a14dce697c12aa | [
"Apache-2.0"
] | null | null | null | tests/internal/ipv6/test_ipv6_false_auto.py | frolovv/aws.ec2.compare | 582805823492f833d65c0441c4a14dce697c12aa | [
"Apache-2.0"
] | null | null | null | tests/internal/ipv6/test_ipv6_false_auto.py | frolovv/aws.ec2.compare | 582805823492f833d65c0441c4a14dce697c12aa | [
"Apache-2.0"
] | 1 | 2021-12-15T11:58:22.000Z | 2021-12-15T11:58:22.000Z |
# Testing module ipv6.false
import pytest
import ec2_compare.internal.ipv6.false

def test_get_internal_data_ipv6_false_get_instances_list():
    assert len(ec2_compare.internal.ipv6.false.get_instances_list()) > 0

def test_get_internal_data_ipv6_false_get():
    assert len(ec2_compare.internal.ipv6.false.get) > 0
| 31.3 | 70 | 0.830671 | 50 | 313 | 4.82 | 0.36 | 0.224066 | 0.19917 | 0.273859 | 0.825726 | 0.605809 | 0.605809 | 0.605809 | 0 | 0 | 0 | 0.038328 | 0.083067 | 313 | 9 | 71 | 34.777778 | 0.801394 | 0.079872 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 9 |
912b6ca8bd1c87bb0172fff3f4a4252db87cc895 | 113 | py | Python | odapi/tests/toolbox/__init__.py | jlandercy/odapi | 781aa95ef346f8d5f1d727a19ae078687cc4cc36 | [
"BSD-3-Clause"
] | 1 | 2020-05-27T08:33:26.000Z | 2020-05-27T08:33:26.000Z | odapi/tests/toolbox/__init__.py | jlandercy/odapi | 781aa95ef346f8d5f1d727a19ae078687cc4cc36 | [
"BSD-3-Clause"
] | null | null | null | odapi/tests/toolbox/__init__.py | jlandercy/odapi | 781aa95ef346f8d5f1d727a19ae078687cc4cc36 | [
"BSD-3-Clause"
] | null | null | null | from odapi.tests.toolbox.test_toolbox_timeseries import *
from odapi.tests.toolbox.test_toolbox_weather import *
| 37.666667 | 57 | 0.858407 | 16 | 113 | 5.8125 | 0.5 | 0.193548 | 0.301075 | 0.451613 | 0.688172 | 0.688172 | 0 | 0 | 0 | 0 | 0 | 0 | 0.070796 | 113 | 2 | 58 | 56.5 | 0.885714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
9134f8cd5e34953c13bd7cf9104c4d8dd8260426 | 8,287 | py | Python | raiden/tests/integration/fixtures/raiden_network.py | LefterisJP/raiden | 1ee26cd3e4294cb122efb705160e03ba3e74ab9f | [
"MIT"
] | null | null | null | raiden/tests/integration/fixtures/raiden_network.py | LefterisJP/raiden | 1ee26cd3e4294cb122efb705160e03ba3e74ab9f | [
"MIT"
] | 1 | 2018-06-18T13:06:00.000Z | 2018-06-18T13:06:00.000Z | raiden/tests/integration/fixtures/raiden_network.py | LefterisJP/raiden | 1ee26cd3e4294cb122efb705160e03ba3e74ab9f | [
"MIT"
] | 1 | 2017-06-09T19:27:11.000Z | 2017-06-09T19:27:11.000Z | import os
import gevent
import pytest
from raiden.constants import GENESIS_BLOCK_NUMBER
from raiden.tests.utils.network import (
CHAIN,
create_all_channels_for_network,
create_apps,
create_network_channels,
create_sequential_channels,
parallel_start_apps,
wait_for_alarm_start,
wait_for_channels,
wait_for_token_networks,
)
from raiden.tests.utils.tests import shutdown_apps_and_cleanup_tasks
_ETH_LOGDIR = os.environ.get("RAIDEN_TESTS_ETH_LOGSDIR")
def timeout(blockchain_type: str):
    """As parity nodes are slower, we need to set a longer timeout when
    waiting for onchain events to complete."""
    return 120 if blockchain_type == "parity" else 30
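# Minimal standalone illustration of the helper above: parity backends get the
# longer 120 s window, any other backend gets 30 s. The function below is a
# self-contained restatement for demonstration, not part of the fixtures.
def _example_timeout(blockchain_type):
    return 120 if blockchain_type == "parity" else 30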
@pytest.fixture
def raiden_chain(
    token_addresses,
    token_network_registry_address,
    channels_per_node,
    deposit,
    settle_timeout,
    chain_id,
    blockchain_services,
    endpoint_discovery_services,
    raiden_udp_ports,
    reveal_timeout,
    retry_interval,
    retries_before_backoff,
    throttle_capacity,
    throttle_fill_rate,
    nat_invitation_timeout,
    nat_keepalive_retries,
    nat_keepalive_timeout,
    environment_type,
    unrecoverable_error_should_crash,
    local_matrix_servers,
    private_rooms,
    blockchain_type,
    contracts_path,
    user_deposit_address,
    tmpdir,
    request,
):
    if len(token_addresses) != 1:
        raise ValueError("raiden_chain only works with a single token")
    assert channels_per_node in (0, 1, 2, CHAIN), (
        "deployed_network uses create_sequential_network that can only work "
        "with 0, 1 or 2 channels"
    )
    if _ETH_LOGDIR:
        base_datadir = os.path.join(_ETH_LOGDIR, request.node.name, "raiden_nodes")
    else:
        base_datadir = os.path.join(tmpdir.strpath, "raiden_nodes")
    service_registry_address = None
    if blockchain_services.service_registry:
        service_registry_address = blockchain_services.service_registry.address
    raiden_apps = create_apps(
        chain_id=chain_id,
        blockchain_services=blockchain_services.blockchain_services,
        endpoint_discovery_services=endpoint_discovery_services,
        token_network_registry_address=token_network_registry_address,
        secret_registry_address=blockchain_services.secret_registry.address,
        service_registry_address=service_registry_address,
        user_deposit_address=user_deposit_address,
        raiden_udp_ports=raiden_udp_ports,
        reveal_timeout=reveal_timeout,
        settle_timeout=settle_timeout,
        database_basedir=base_datadir,
        retry_interval=retry_interval,
        retries_before_backoff=retries_before_backoff,
        throttle_capacity=throttle_capacity,
        throttle_fill_rate=throttle_fill_rate,
        nat_invitation_timeout=nat_invitation_timeout,
        nat_keepalive_retries=nat_keepalive_retries,
        nat_keepalive_timeout=nat_keepalive_timeout,
        environment_type=environment_type,
        unrecoverable_error_should_crash=unrecoverable_error_should_crash,
        local_matrix_url=local_matrix_servers[0],
        private_rooms=private_rooms,
        contracts_path=contracts_path,
    )
    confirmed_block = raiden_apps[0].raiden.confirmation_blocks + 1
    blockchain_services.deploy_service.wait_until_block(target_block_number=confirmed_block)
    parallel_start_apps(raiden_apps)
    from_block = GENESIS_BLOCK_NUMBER
    for app in raiden_apps:
        app.raiden.install_all_blockchain_filters(
            app.raiden.default_registry, app.raiden.default_secret_registry, from_block
        )
    exception = RuntimeError("`raiden_chain` fixture setup failed, token networks unavailable")
    with gevent.Timeout(seconds=timeout(blockchain_type), exception=exception):
        wait_for_token_networks(
            raiden_apps=raiden_apps,
            token_network_registry_address=token_network_registry_address,
            token_addresses=token_addresses,
        )
    app_channels = create_sequential_channels(raiden_apps, channels_per_node)
    create_all_channels_for_network(
        app_channels=app_channels,
        token_addresses=token_addresses,
        channel_individual_deposit=deposit,
        channel_settle_timeout=settle_timeout,
    )
    exception = RuntimeError("`raiden_chain` fixture setup failed, nodes are unreachable")
    with gevent.Timeout(seconds=timeout(blockchain_type), exception=exception):
        wait_for_channels(
            app_channels, blockchain_services.deploy_registry.address, token_addresses, deposit
        )
    yield raiden_apps
    shutdown_apps_and_cleanup_tasks(raiden_apps)
@pytest.fixture
def raiden_network(
    token_addresses,
    token_network_registry_address,
    channels_per_node,
    deposit,
    settle_timeout,
    chain_id,
    blockchain_services,
    endpoint_discovery_services,
    raiden_udp_ports,
    reveal_timeout,
    retry_interval,
    retries_before_backoff,
    throttle_capacity,
    throttle_fill_rate,
    nat_invitation_timeout,
    nat_keepalive_retries,
    nat_keepalive_timeout,
    environment_type,
    unrecoverable_error_should_crash,
    local_matrix_servers,
    private_rooms,
    blockchain_type,
    contracts_path,
    user_deposit_address,
    tmpdir,
    request,
):
    service_registry_address = None
    if blockchain_services.service_registry:
        service_registry_address = blockchain_services.service_registry.address
    if _ETH_LOGDIR:
        base_datadir = os.path.join(_ETH_LOGDIR, request.node.name, "raiden_nodes")
    else:
        base_datadir = os.path.join(tmpdir.strpath, "raiden_nodes")
    raiden_apps = create_apps(
        chain_id=chain_id,
        contracts_path=contracts_path,
        blockchain_services=blockchain_services.blockchain_services,
        endpoint_discovery_services=endpoint_discovery_services,
        token_network_registry_address=token_network_registry_address,
        secret_registry_address=blockchain_services.secret_registry.address,
        service_registry_address=service_registry_address,
        user_deposit_address=user_deposit_address,
        raiden_udp_ports=raiden_udp_ports,
        reveal_timeout=reveal_timeout,
        settle_timeout=settle_timeout,
        database_basedir=base_datadir,
        retry_interval=retry_interval,
        retries_before_backoff=retries_before_backoff,
        throttle_capacity=throttle_capacity,
        throttle_fill_rate=throttle_fill_rate,
        nat_invitation_timeout=nat_invitation_timeout,
        nat_keepalive_retries=nat_keepalive_retries,
        nat_keepalive_timeout=nat_keepalive_timeout,
        environment_type=environment_type,
        unrecoverable_error_should_crash=unrecoverable_error_should_crash,
        local_matrix_url=local_matrix_servers[0],
        private_rooms=private_rooms,
    )
    confirmed_block = raiden_apps[0].raiden.confirmation_blocks + 1
    blockchain_services.deploy_service.wait_until_block(target_block_number=confirmed_block)
    parallel_start_apps(raiden_apps)
    exception = RuntimeError("`raiden_network` fixture setup failed, token networks unavailable")
    with gevent.Timeout(seconds=timeout(blockchain_type), exception=exception):
        wait_for_token_networks(
            raiden_apps=raiden_apps,
            token_network_registry_address=token_network_registry_address,
            token_addresses=token_addresses,
        )
    app_channels = create_network_channels(raiden_apps, channels_per_node)
    create_all_channels_for_network(
        app_channels=app_channels,
        token_addresses=token_addresses,
        channel_individual_deposit=deposit,
        channel_settle_timeout=settle_timeout,
    )
    exception = RuntimeError("`raiden_network` fixture setup failed, nodes are unreachable")
    with gevent.Timeout(seconds=timeout(blockchain_type), exception=exception):
        wait_for_channels(
            app_channels, blockchain_services.deploy_registry.address, token_addresses, deposit
        )
    # Force blocknumber update
    exception = RuntimeError("Alarm failed to start and set up start_block correctly")
    with gevent.Timeout(seconds=5, exception=exception):
        wait_for_alarm_start(raiden_apps)
    yield raiden_apps
    shutdown_apps_and_cleanup_tasks(raiden_apps)
| 34.385892 | 95 | 0.755159 | 954 | 8,287 | 6.095388 | 0.165618 | 0.067068 | 0.034394 | 0.046432 | 0.816337 | 0.807051 | 0.807051 | 0.806191 | 0.794497 | 0.794497 | 0 | 0.002823 | 0.187885 | 8,287 | 240 | 96 | 34.529167 | 0.861218 | 0.015687 | 0 | 0.768473 | 0 | 0 | 0.062477 | 0.006014 | 0 | 0 | 0 | 0 | 0.004926 | 1 | 0.014778 | false | 0 | 0.029557 | 0 | 0.049261 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e6ce084189cefd75314c250bdff1ae2389020d9c | 23,063 | py | Python | flavio/physics/bdecays/angular.py | Felicia56/flavio | ea735bd8febbb961d249eddf338a4960c1fbee69 | [
"MIT"
] | 61 | 2016-03-09T16:19:39.000Z | 2022-03-30T00:55:51.000Z | flavio/physics/bdecays/angular.py | Felicia56/flavio | ea735bd8febbb961d249eddf338a4960c1fbee69 | [
"MIT"
] | 167 | 2016-03-15T15:25:57.000Z | 2022-02-27T22:19:22.000Z | flavio/physics/bdecays/angular.py | Felicia56/flavio | ea735bd8febbb961d249eddf338a4960c1fbee69 | [
"MIT"
] | 57 | 2016-03-15T14:24:23.000Z | 2022-01-14T01:00:03.000Z | r"""Generic $B\to V \ell_1 \bar \ell_2$ helicity amplitudes and angular
distribution. Can be used for $B\to V\ell^+\ell^-$, $B\to V\ell\nu$, and
lepton flavour violating decays."""
from flavio.physics.bdecays.common import lambda_K
from math import sqrt, pi
import cmath
def transversity_to_helicity(ta):
    H = {}
    H['0', 'V'] = -1j * (ta['0_R'] + ta['0_L'])
    H['0', 'A'] = -1j * (ta['0_R'] - ta['0_L'])
    H['pl', 'V'] = 1j * ((ta['para_R'] + ta['para_L']) + (ta['perp_R'] + ta['perp_L']))/sqrt(2)
    H['pl', 'A'] = 1j * ((ta['para_R'] - ta['para_L']) + (ta['perp_R'] - ta['perp_L']))/sqrt(2)
    H['mi', 'V'] = 1j * ((ta['para_R'] + ta['para_L']) - (ta['perp_R'] + ta['perp_L']))/sqrt(2)
    H['mi', 'A'] = 1j * ((ta['para_R'] - ta['para_L']) - (ta['perp_R'] - ta['perp_L']))/sqrt(2)
    return H
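# Minimal numeric check of the longitudinal ('0') components above: with both
# transversity amplitudes equal to 1, the vector combination is -2j and the
# axial one vanishes. The hypothetical helper below just restates those two
# lines in a self-contained form.
def _demo_H0(ta_0_L, ta_0_R):
    return {'V': -1j * (ta_0_R + ta_0_L), 'A': -1j * (ta_0_R - ta_0_L)}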
def helicity_amps_v(q2, mB, mV, mqh, mql, ml1, ml2, ff, wc, prefactor):
    laB = lambda_K(mB**2, mV**2, q2)
    H = {}
    H['0','V'] = (4 * 1j * mB * mV)/(sqrt(q2) * (mB+mV)) * ((wc['v']-wc['vp']) * (mB+mV) * ff['A12']+mqh * (wc['7']-wc['7p']) * ff['T23'])
    H['0','A'] = 4 * 1j * mB * mV/sqrt(q2) * (wc['a']-wc['ap']) * ff['A12']
    H['pl','V'] = 1j/(2 * (mB+mV)) * (+(wc['v']+wc['vp']) * sqrt(laB) * ff['V']-(mB+mV)**2 * (wc['v']-wc['vp']) * ff['A1'])+1j * mqh/q2 * (+(wc['7']+wc['7p']) * sqrt(laB) * ff['T1']-(wc['7']-wc['7p']) * (mB**2-mV**2) * ff['T2'])
    H['mi','V'] = 1j/(2 * (mB+mV)) * (-(wc['v']+wc['vp']) * sqrt(laB) * ff['V']-(mB+mV)**2 * (wc['v']-wc['vp']) * ff['A1'])+1j * mqh/q2 * (-(wc['7']+wc['7p']) * sqrt(laB) * ff['T1']-(wc['7']-wc['7p']) * (mB**2-mV**2) * ff['T2'])
    H['pl','A'] = 1j/(2 * (mB+mV)) * (+(wc['a']+wc['ap']) * sqrt(laB) * ff['V']-(mB+mV)**2 * (wc['a']-wc['ap']) * ff['A1'])
    H['mi','A'] = 1j/(2 * (mB+mV)) * (-(wc['a']+wc['ap']) * sqrt(laB) * ff['V']-(mB+mV)**2 * (wc['a']-wc['ap']) * ff['A1'])
    H['P'] = 1j * sqrt(laB)/2 * ((wc['p']-wc['pp'])/(mqh+mql)+(ml1+ml2)/q2 * (wc['a']-wc['ap'])) * ff['A0']
    H['S'] = 1j * sqrt(laB)/2 * ((wc['s']-wc['sp'])/(mqh+mql)+(ml1-ml2)/q2 * (wc['v']-wc['vp'])) * ff['A0']
    H['0','T'] = 2 * sqrt(2) * mB * mV/(mB+mV) * (wc['t']+wc['tp']) * ff['T23']
    H['0','Tt'] = 2 * mB * mV/(mB+mV) * (wc['t']-wc['tp']) * ff['T23']
    H['pl','T'] = 1/(sqrt(2) * sqrt(q2)) * (+(wc['t']-wc['tp']) * sqrt(laB) * ff['T1']-(wc['t']+wc['tp']) * (mB**2-mV**2) * ff['T2'])
    H['mi','T'] = 1/(sqrt(2) * sqrt(q2)) * (-(wc['t']-wc['tp']) * sqrt(laB) * ff['T1']-(wc['t']+wc['tp']) * (mB**2-mV**2) * ff['T2'])
    H['pl','Tt'] = 1/(2 * sqrt(q2)) * (+(wc['t']+wc['tp']) * sqrt(laB) * ff['T1']-(wc['t']-wc['tp']) * (mB**2-mV**2) * ff['T2'])
    H['mi','Tt'] = 1/(2 * sqrt(q2)) * (-(wc['t']+wc['tp']) * sqrt(laB) * ff['T1']-(wc['t']-wc['tp']) * (mB**2-mV**2) * ff['T2'])
    return {k: prefactor*v for k, v in H.items()}
def _Re(z):
return z.real
def _Im(z):
return z.imag
def _Co(z):
return complex(z).conjugate()
def angularcoeffs_general_Gbasis_v(H, q2, mB, mV, mqh, mql, ml1, ml2):
laB = lambda_K(mB**2, mV**2, q2)
laGa = lambda_K(q2, ml1**2, ml2**2)
E1 = sqrt(ml1**2+laGa/(4 * q2))
E2 = sqrt(ml2**2+laGa/(4 * q2))
CH = {k: complex(v).conjugate() for k, v in H.items()}
G = {}
G[0,0,0] = (
4/9 * (3 * E1 * E2+laGa/(4 * q2)) * (abs(H['pl','V'])**2+abs(H['mi','V'])**2+abs(H['0','V'])**2+abs(H['pl','A'])**2+abs(H['mi','A'])**2+abs(H['0','A'])**2)
+4 * ml1 * ml2/3 * (abs(H['pl','V'])**2+abs(H['mi','V'])**2+abs(H['0','V'])**2-abs(H['pl','A'])**2-abs(H['mi','A'])**2-abs(H['0','A'])**2)
+4/3 * (E1 * E2-ml1 * ml2+laGa/(4 * q2)) * abs(H['S'])**2+4/3 * (E1 * E2+ml1 * ml2+laGa/(4 * q2)) * abs(H['P'])**2
+16/9 * (3 * (E1 * E2+ml1 * ml2)-laGa/(4 * q2)) * (abs(H['pl','Tt'])**2+abs(H['mi','Tt'])**2+abs(H['0','Tt'])**2)
+8/9 * (3 * (E1 * E2-ml1 * ml2)-laGa/(4 * q2)) * (abs(H['pl','T'])**2+abs(H['mi','T'])**2+abs(H['0','T'])**2)
+16/3 * (ml1 * E2+ml2 * E1) * _Im(H['pl','V'] * CH['pl','Tt']+H['mi','V'] * CH['mi','Tt']+H['0','V'] * CH['0','Tt'])
+8 * sqrt(2)/3 * (ml1 * E2-ml2 * E1) * _Im(H['pl','A'] * CH['pl','T']+H['mi','A'] * CH['mi','T']+H['0','A'] * CH['0','T']))
G[0,1,0] = (4 * sqrt(laGa)/3 * (
_Re(H['pl','V'] * CH['pl','A']-H['mi','V'] * CH['mi','A'])
+2 * sqrt(2)/q2 * (ml1**2-ml2**2) * _Re(H['pl','T'] * CH['pl','Tt']-H['mi','T'] * CH['mi','Tt'])
+2 * (ml1+ml2)/sqrt(q2) * _Im(H['pl','A'] * CH['pl','Tt']-H['mi','A'] * CH['mi','Tt'])
+sqrt(2)*(ml1-ml2)/sqrt(q2) * _Im(H['pl','V'] * CH['pl','T']-H['mi','V'] * CH['mi','T'])
-(ml1-ml2)/sqrt(q2) * _Re(H['0','A'] * CH['P'])-(ml1+ml2)/sqrt(q2) * _Re(H['0','V'] * CH['S'])
+_Im(sqrt(2) * H['0','T'] * CH['P']+2 * H['0','Tt'] * CH['S'])
))
G[0,2,0] = -2/9 * laGa/q2 * (
-abs(H['pl','V'])**2-abs(H['mi','V'])**2+2 * abs(H['0','V'])**2-abs(H['pl','A'])**2-abs(H['mi','A'])**2+2 * abs(H['0','A'])**2
-2 * (-abs(H['pl','T'])**2-abs(H['mi','T'])**2+2 * abs(H['0','T'])**2)-4 * (-abs(H['pl','Tt'])**2-abs(H['mi','Tt'])**2+2 * abs(H['0','Tt'])**2))
G[2,0,0] = (-4/9 * (3 * E1 * E2+laGa/(4 * q2)) * (abs(H['pl','V'])**2+abs(H['mi','V'])**2-2 * abs(H['0','V'])**2+abs(H['pl','A'])**2+abs(H['mi','A'])**2
-2 * abs(H['0','A'])**2)-4 * ml1 * ml2/3 * (abs(H['pl','V'])**2+abs(H['mi','V'])**2-2 * abs(H['0','V'])**2-abs(H['pl','A'])**2
-abs(H['mi','A'])**2+2 * abs(H['0','A'])**2)+8/3 * (E1 * E2-ml1 * ml2+laGa/(4 * q2)) * abs(H['S'])**2
+8/3 * (E1 * E2+ml1 * ml2+laGa/(4 * q2)) * abs(H['P'])**2
-16/9 * (3 * (E1 * E2+ml1 * ml2)-laGa/(4 * q2)) * (abs(H['pl','Tt'])**2+abs(H['mi','Tt'])**2-2 * abs(H['0','Tt'])**2)
-8/9 * (3 * (E1 * E2-ml1 * ml2)-laGa/(4 * q2)) * (abs(H['pl','T'])**2+abs(H['mi','T'])**2-2 * abs(H['0','T'])**2)
-16/3 * (ml1 * E2+ml2 * E1) * _Im(H['pl','V'] * CH['pl','Tt']+H['mi','V'] * CH['mi','Tt']-2 * H['0','V'] * CH['0','Tt'])
-8 * sqrt(2)/3 * (ml1 * E2-ml2 * E1) * _Im(H['pl','A'] * CH['pl','T']+H['mi','A'] * CH['mi','T']-2 * H['0','A'] * CH['0','T']))
G[2,1,0] = (-4 * sqrt(laGa)/3 * (_Re(H['pl','V'] * CH['pl','A']-H['mi','V'] * CH['mi','A'])
+2 * sqrt(2) * (ml1**2-ml2**2)/q2 * _Re(H['pl','T'] * CH['pl','Tt']-H['mi','T'] * CH['mi','Tt'])
+2 * (ml1+ml2)/sqrt(q2) * _Im(H['pl','A'] * CH['pl','Tt']-H['mi','A'] * CH['mi','Tt'])
+sqrt(2) * (ml1-ml2)/sqrt(q2) * _Im(H['pl','V'] * CH['pl','T']-H['mi','V'] * CH['mi','T'])
+2 * (ml1-ml2)/sqrt(q2) * _Re(H['0','A'] * CH['P'])+2 * (ml1+ml2)/sqrt(q2) * _Re(H['0','V'] * CH['S'])
-2 * _Im(sqrt(2) * H['0','T'] * CH['P']+2 * H['0','Tt'] * CH['S'])))
G[2,2,0] = (-2/9 * laGa/q2 * (abs(H['pl','V'])**2+abs(H['mi','V'])**2+4 * abs(H['0','V'])**2+abs(H['pl','A'])**2+abs(H['mi','A'])**2
+4 * abs(H['0','A'])**2-2 * (abs(H['pl','T'])**2+abs(H['mi','T'])**2+4 * abs(H['0','T'])**2)-4 * (abs(H['pl','Tt'])**2+abs(H['mi','Tt'])**2+4 * abs(H['0','Tt'])**2)))
G[2,1,1] = (4/sqrt(3) * sqrt(laGa) * (H['pl','V'] * CH['0','A']+H['pl','A'] * CH['0','V']-H['0','V'] * CH['mi','A']-H['0','A'] * CH['mi','V']
+(ml1+ml2)/sqrt(q2) * (H['pl','V'] * CH['S']+H['S'] * CH['mi','V'])-sqrt(2) * 1j * (H['P'] * CH['mi','T']-H['pl','T'] * CH['P']
+sqrt(2)*(H['S'] * CH['mi','Tt']-H['pl','Tt'] * CH['S']))
+(ml1-ml2)/sqrt(q2) * (H['pl','A'] * CH['P']+H['P'] * CH['mi','A'])
-2 * 1j * (ml1+ml2)/sqrt(q2) * (H['pl','A'] * CH['0','Tt']+H['0','Tt'] * CH['mi','A']-H['pl','Tt'] * CH['0','A']-H['0','A'] * CH['mi','Tt'])
-sqrt(2) * 1j * (ml1-ml2)/sqrt(q2) * (H['pl','V'] * CH['0','T']+H['0','T'] * CH['mi','V']-H['pl','T'] * CH['0','V']-H['0','V'] * CH['mi','T'])
+2 * sqrt(2) * (ml1**2-ml2**2)/q2 * (H['pl','T'] * CH['0','Tt']+H['pl','Tt'] * CH['0','T']-H['0','T'] * CH['mi','Tt']-H['0','Tt'] * CH['mi','T'])))
G[2,2,1] = (4/3 * laGa/q2 * (H['pl','V'] * CH['0','V']+H['0','V'] * CH['mi','V']+H['pl','A'] * CH['0','A']+H['0','A'] * CH['mi','A']
-2 * (H['pl','T'] * CH['0','T']+H['0','T'] * CH['mi','T']+2 * (H['pl','Tt'] * CH['0','Tt']+H['0','Tt'] * CH['mi','Tt']))))
G[2,2,2] = -8/3 * laGa/q2 * (H['pl','V'] * CH['mi','V']+H['pl','A'] * CH['mi','A']-2 * (H['pl','T'] * CH['mi','T']+2 * H['pl','Tt'] * CH['mi','Tt']))
prefactor = sqrt(laB)*sqrt(laGa)/(2**9 * pi**3 * mB**3 * q2)
return {k: prefactor*v for k, v in G.items()}
def angularcoeffs_h_Gbasis_v(phi, H, Htilde, q2, mB, mV, mqh, mql, ml1, ml2):
qp = -cmath.exp(1j * phi) # here it is assumed that q/p is a pure phase, as appropriate for B and Bs mixing
laB = lambda_K(mB**2, mV**2, q2)
laGa = lambda_K(q2, ml1**2, ml2**2)
E1 = sqrt(ml1**2+laGa/(4 * q2))
E2 = sqrt(ml2**2+laGa/(4 * q2))
CH = {k: complex(v).conjugate() for k, v in H.items()}
CHtilde = {k: complex(v).conjugate() for k, v in Htilde.items()}
G = {}
G[0,0,0] = (
4/9 * (3 * E1 * E2+laGa/(4 * q2)) * (2 * _Re(-qp * Htilde['pl','V'] * CH['pl','V'])+2 * _Re(-qp * Htilde['mi','V'] * CH['mi','V'])+2 * _Re(-qp * Htilde['0','V'] * CH['0','V'])+2 * _Re(-qp * Htilde['pl','A'] * CH['pl','A'])+2 * _Re(-qp * Htilde['mi','A'] * CH['mi','A'])+2 * _Re(-qp * Htilde['0','A'] * CH['0','A']))
+4 * ml1 * ml2/3 * (2 * _Re(-qp * Htilde['pl','V'] * CH['pl','V'])+2 * _Re(-qp * Htilde['mi','V'] * CH['mi','V'])+2 * _Re(-qp * Htilde['0','V'] * CH['0','V'])-2 * _Re(-qp * Htilde['pl','A'] * CH['pl','A'])-2 * _Re(-qp * Htilde['mi','A'] * CH['mi','A'])-2 * _Re(-qp * Htilde['0','A'] * CH['0','A']))
+4/3 * (E1 * E2-ml1 * ml2+laGa/(4 * q2)) * 2 * _Re(-qp * Htilde['S'] * CH['S'])+4/3 * (E1 * E2+ml1 * ml2+laGa/(4 * q2)) * 2 * _Re(-qp * Htilde['P'] * CH['P'])
+16/9 * (3 * (E1 * E2+ml1 * ml2)-laGa/(4 * q2)) * (2 * _Re(-qp * Htilde['pl','Tt'] * CH['pl','Tt'])+2 * _Re(-qp * Htilde['mi','Tt'] * CH['mi','Tt'])+2 * _Re(-qp * Htilde['0','Tt'] * CH['0','Tt']))
+8/9 * (3 * (E1 * E2-ml1 * ml2)-laGa/(4 * q2)) * (2 * _Re(-qp * Htilde['pl','T'] * CH['pl','T'])+2 * _Re(-qp * Htilde['mi','T'] * CH['mi','T'])+2 * _Re(-qp * Htilde['0','T'] * CH['0','T']))
+16/3 * (ml1 * E2+ml2 * E1) * _Im((-qp * Htilde['pl','V'] * CH['pl','Tt'] + _Co(-qp) * H['pl','V'] * CHtilde['pl','Tt'])+(-qp * Htilde['mi','V'] * CH['mi','Tt'] + _Co(-qp) * H['mi','V'] * CHtilde['mi','Tt'])+(-qp * Htilde['0','V'] * CH['0','Tt'] + _Co(-qp) * H['0','V'] * CHtilde['0','Tt']))
+8 * sqrt(2)/3 * (ml1 * E2-ml2 * E1) * _Im((-qp * Htilde['pl','A'] * CH['pl','T'] + _Co(-qp) * H['pl','A'] * CHtilde['pl','T'])+(-qp * Htilde['mi','A'] * CH['mi','T'] + _Co(-qp) * H['mi','A'] * CHtilde['mi','T'])+(-qp * Htilde['0','A'] * CH['0','T'] + _Co(-qp) * H['0','A'] * CHtilde['0','T'])))
G[0,1,0] = (4 * sqrt(laGa)/3 * (
_Re((-qp * Htilde['pl','V'] * CH['pl','A'] + _Co(-qp) * H['pl','V'] * CHtilde['pl','A'])-(-qp * Htilde['mi','V'] * CH['mi','A'] + _Co(-qp) * H['mi','V'] * CHtilde['mi','A']))
+2 * sqrt(2)/q2 * (ml1**2-ml2**2) * _Re((-qp * Htilde['pl','T'] * CH['pl','Tt'] + _Co(-qp) * H['pl','T'] * CHtilde['pl','Tt'])-(-qp * Htilde['mi','T'] * CH['mi','Tt'] + _Co(-qp) * H['mi','T'] * CHtilde['mi','Tt']))
+2 * (ml1+ml2)/sqrt(q2) * _Im((-qp * Htilde['pl','A'] * CH['pl','Tt'] + _Co(-qp) * H['pl','A'] * CHtilde['pl','Tt'])-(-qp * Htilde['mi','A'] * CH['mi','Tt'] + _Co(-qp) * H['mi','A'] * CHtilde['mi','Tt']))
+sqrt(2)*(ml1-ml2)/sqrt(q2) * _Im((-qp * Htilde['pl','V'] * CH['pl','T'] + _Co(-qp) * H['pl','V'] * CHtilde['pl','T'])-(-qp * Htilde['mi','V'] * CH['mi','T'] + _Co(-qp) * H['mi','V'] * CHtilde['mi','T']))
-(ml1-ml2)/sqrt(q2) * _Re((-qp * Htilde['0','A'] * CH['P'] + _Co(-qp) * H['0','A'] * CHtilde['P']))-(ml1+ml2)/sqrt(q2) * _Re((-qp * Htilde['0','V'] * CH['S'] + _Co(-qp) * H['0','V'] * CHtilde['S']))
+_Im(sqrt(2) * (-qp * Htilde['0','T'] * CH['P'] + _Co(-qp) * H['0','T'] * CHtilde['P'])+2 * (-qp * Htilde['0','Tt'] * CH['S'] + _Co(-qp) * H['0','Tt'] * CHtilde['S']))
))
G[0,2,0] = -2/9 * laGa/q2 * (
-2 * _Re(-qp * Htilde['pl','V'] * CH['pl','V'])-2 * _Re(-qp * Htilde['mi','V'] * CH['mi','V'])+2 * 2 * _Re(-qp * Htilde['0','V'] * CH['0','V'])-2 * _Re(-qp * Htilde['pl','A'] * CH['pl','A'])-2 * _Re(-qp * Htilde['mi','A'] * CH['mi','A'])+2 * 2 * _Re(-qp * Htilde['0','A'] * CH['0','A'])
-2 * (-2 * _Re(-qp * Htilde['pl','T'] * CH['pl','T'])-2 * _Re(-qp * Htilde['mi','T'] * CH['mi','T'])+2 * 2 * _Re(-qp * Htilde['0','T'] * CH['0','T']))-4 * (-2 * _Re(-qp * Htilde['pl','Tt'] * CH['pl','Tt'])-2 * _Re(-qp * Htilde['mi','Tt'] * CH['mi','Tt'])+2 * 2 * _Re(-qp * Htilde['0','Tt'] * CH['0','Tt'])))
G[2,0,0] = (-4/9 * (3 * E1 * E2+laGa/(4 * q2)) * (2 * _Re(-qp * Htilde['pl','V'] * CH['pl','V'])+2 * _Re(-qp * Htilde['mi','V'] * CH['mi','V'])-2 * 2 * _Re(-qp * Htilde['0','V'] * CH['0','V'])+2 * _Re(-qp * Htilde['pl','A'] * CH['pl','A'])+2 * _Re(-qp * Htilde['mi','A'] * CH['mi','A'])
-2 * 2 * _Re(-qp * Htilde['0','A'] * CH['0','A']))-4 * ml1 * ml2/3 * (2 * _Re(-qp * Htilde['pl','V'] * CH['pl','V'])+2 * _Re(-qp * Htilde['mi','V'] * CH['mi','V'])-2 * 2 * _Re(-qp * Htilde['0','V'] * CH['0','V'])-2 * _Re(-qp * Htilde['pl','A'] * CH['pl','A'])
-2 * _Re(-qp * Htilde['mi','A'] * CH['mi','A'])+2 * 2 * _Re(-qp * Htilde['0','A'] * CH['0','A']))+8/3 * (E1 * E2-ml1 * ml2+laGa/(4 * q2)) * 2 * _Re(-qp * Htilde['S'] * CH['S'])
+8/3 * (E1 * E2+ml1 * ml2+laGa/(4 * q2)) * 2 * _Re(-qp * Htilde['P'] * CH['P'])
-16/9 * (3 * (E1 * E2+ml1 * ml2)-laGa/(4 * q2)) * (2 * _Re(-qp * Htilde['pl','Tt'] * CH['pl','Tt'])+2 * _Re(-qp * Htilde['mi','Tt'] * CH['mi','Tt'])-2 * 2 * _Re(-qp * Htilde['0','Tt'] * CH['0','Tt']))
-8/9 * (3 * (E1 * E2-ml1 * ml2)-laGa/(4 * q2)) * (2 * _Re(-qp * Htilde['pl','T'] * CH['pl','T'])+2 * _Re(-qp * Htilde['mi','T'] * CH['mi','T'])-2 * 2 * _Re(-qp * Htilde['0','T'] * CH['0','T']))
-16/3 * (ml1 * E2+ml2 * E1) * _Im((-qp * Htilde['pl','V'] * CH['pl','Tt'] + _Co(-qp) * H['pl','V'] * CHtilde['pl','Tt'])+(-qp * Htilde['mi','V'] * CH['mi','Tt'] + _Co(-qp) * H['mi','V'] * CHtilde['mi','Tt'])-2 * (-qp * Htilde['0','V'] * CH['0','Tt'] + _Co(-qp) * H['0','V'] * CHtilde['0','Tt']))
-8 * sqrt(2)/3 * (ml1 * E2-ml2 * E1) * _Im((-qp * Htilde['pl','A'] * CH['pl','T'] + _Co(-qp) * H['pl','A'] * CHtilde['pl','T'])+(-qp * Htilde['mi','A'] * CH['mi','T'] + _Co(-qp) * H['mi','A'] * CHtilde['mi','T'])-2 * (-qp * Htilde['0','A'] * CH['0','T'] + _Co(-qp) * H['0','A'] * CHtilde['0','T'])))
G[2,1,0] = (-4 * sqrt(laGa)/3 * (_Re((-qp * Htilde['pl','V'] * CH['pl','A'] + _Co(-qp) * H['pl','V'] * CHtilde['pl','A'])-(-qp * Htilde['mi','V'] * CH['mi','A'] + _Co(-qp) * H['mi','V'] * CHtilde['mi','A']))
+2 * sqrt(2) * (ml1**2-ml2**2)/q2 * _Re((-qp * Htilde['pl','T'] * CH['pl','Tt'] + _Co(-qp) * H['pl','T'] * CHtilde['pl','Tt'])-(-qp * Htilde['mi','T'] * CH['mi','Tt'] + _Co(-qp) * H['mi','T'] * CHtilde['mi','Tt']))
+2 * (ml1+ml2)/sqrt(q2) * _Im((-qp * Htilde['pl','A'] * CH['pl','Tt'] + _Co(-qp) * H['pl','A'] * CHtilde['pl','Tt'])-(-qp * Htilde['mi','A'] * CH['mi','Tt'] + _Co(-qp) * H['mi','A'] * CHtilde['mi','Tt']))
+sqrt(2) * (ml1-ml2)/sqrt(q2) * _Im((-qp * Htilde['pl','V'] * CH['pl','T'] + _Co(-qp) * H['pl','V'] * CHtilde['pl','T'])-(-qp * Htilde['mi','V'] * CH['mi','T'] + _Co(-qp) * H['mi','V'] * CHtilde['mi','T']))
+2 * (ml1-ml2)/sqrt(q2) * _Re((-qp * Htilde['0','A'] * CH['P'] + _Co(-qp) * H['0','A'] * CHtilde['P']))+2 * (ml1+ml2)/sqrt(q2) * _Re((-qp * Htilde['0','V'] * CH['S'] + _Co(-qp) * H['0','V'] * CHtilde['S']))
-2 * _Im(sqrt(2) * (-qp * Htilde['0','T'] * CH['P'] + _Co(-qp) * H['0','T'] * CHtilde['P'])+2 * (-qp * Htilde['0','Tt'] * CH['S'] + _Co(-qp) * H['0','Tt'] * CHtilde['S']))))
G[2,2,0] = (-2/9 * laGa/q2 * (2 * _Re(-qp * Htilde['pl','V'] * CH['pl','V'])+2 * _Re(-qp * Htilde['mi','V'] * CH['mi','V'])+4 * 2 * _Re(-qp * Htilde['0','V'] * CH['0','V'])+2 * _Re(-qp * Htilde['pl','A'] * CH['pl','A'])+2 * _Re(-qp * Htilde['mi','A'] * CH['mi','A'])
+4 * 2 * _Re(-qp * Htilde['0','A'] * CH['0','A'])-2 * (2 * _Re(-qp * Htilde['pl','T'] * CH['pl','T'])+2 * _Re(-qp * Htilde['mi','T'] * CH['mi','T'])+4 * 2 * _Re(-qp * Htilde['0','T'] * CH['0','T']))-4 * (2 * _Re(-qp * Htilde['pl','Tt'] * CH['pl','Tt'])+2 * _Re(-qp * Htilde['mi','Tt'] * CH['mi','Tt'])+4 * 2 * _Re(-qp * Htilde['0','Tt'] * CH['0','Tt']))))
G[2,1,1] = (4/sqrt(3) * sqrt(laGa) * ((-qp * Htilde['pl','V'] * CH['0','A'] + _Co(-qp) * H['pl','V'] * CHtilde['0','A'])+(-qp * Htilde['pl','A'] * CH['0','V'] + _Co(-qp) * H['pl','A'] * CHtilde['0','V'])-(-qp * Htilde['0','V'] * CH['mi','A'] + _Co(-qp) * H['0','V'] * CHtilde['mi','A'])-(-qp * Htilde['0','A'] * CH['mi','V'] + _Co(-qp) * H['0','A'] * CHtilde['mi','V'])
+(ml1+ml2)/sqrt(q2) * ((-qp * Htilde['pl','V'] * CH['S'] + _Co(-qp) * H['pl','V'] * CHtilde['S'])+(-qp * Htilde['S'] * CH['mi','V'] + _Co(-qp) * H['S'] * CHtilde['mi','V']))-sqrt(2) * 1j * ((-qp * Htilde['P'] * CH['mi','T'] + _Co(-qp) * H['P'] * CHtilde['mi','T'])-(-qp * Htilde['pl','T'] * CH['P'] + _Co(-qp) * H['pl','T'] * CHtilde['P'])
+sqrt(2)*((-qp * Htilde['S'] * CH['mi','Tt'] + _Co(-qp) * H['S'] * CHtilde['mi','Tt'])-(-qp * Htilde['pl','Tt'] * CH['S'] + _Co(-qp) * H['pl','Tt'] * CHtilde['S'])))
+(ml1-ml2)/sqrt(q2) * ((-qp * Htilde['pl','A'] * CH['P'] + _Co(-qp) * H['pl','A'] * CHtilde['P'])+(-qp * Htilde['P'] * CH['mi','A'] + _Co(-qp) * H['P'] * CHtilde['mi','A']))
-2 * 1j * (ml1+ml2)/sqrt(q2) * ((-qp * Htilde['pl','A'] * CH['0','Tt'] + _Co(-qp) * H['pl','A'] * CHtilde['0','Tt'])+(-qp * Htilde['0','Tt'] * CH['mi','A'] + _Co(-qp) * H['0','Tt'] * CHtilde['mi','A'])-(-qp * Htilde['pl','Tt'] * CH['0','A'] + _Co(-qp) * H['pl','Tt'] * CHtilde['0','A'])-(-qp * Htilde['0','A'] * CH['mi','Tt'] + _Co(-qp) * H['0','A'] * CHtilde['mi','Tt']))
-sqrt(2) * 1j * (ml1-ml2)/sqrt(q2) * ((-qp * Htilde['pl','V'] * CH['0','T'] + _Co(-qp) * H['pl','V'] * CHtilde['0','T'])+(-qp * Htilde['0','T'] * CH['mi','V'] + _Co(-qp) * H['0','T'] * CHtilde['mi','V'])-(-qp * Htilde['pl','T'] * CH['0','V'] + _Co(-qp) * H['pl','T'] * CHtilde['0','V'])-(-qp * Htilde['0','V'] * CH['mi','T'] + _Co(-qp) * H['0','V'] * CHtilde['mi','T']))
+2 * sqrt(2) * (ml1**2-ml2**2)/q2 * ((-qp * Htilde['pl','T'] * CH['0','Tt'] + _Co(-qp) * H['pl','T'] * CHtilde['0','Tt'])+(-qp * Htilde['pl','Tt'] * CH['0','T'] + _Co(-qp) * H['pl','Tt'] * CHtilde['0','T'])-(-qp * Htilde['0','T'] * CH['mi','Tt'] + _Co(-qp) * H['0','T'] * CHtilde['mi','Tt'])-(-qp * Htilde['0','Tt'] * CH['mi','T'] + _Co(-qp) * H['0','Tt'] * CHtilde['mi','T']))))
G[2,2,1] = (4/3 * laGa/q2 * ((-qp * Htilde['pl','V'] * CH['0','V'] + _Co(-qp) * H['pl','V'] * CHtilde['0','V'])+(-qp * Htilde['0','V'] * CH['mi','V'] + _Co(-qp) * H['0','V'] * CHtilde['mi','V'])+(-qp * Htilde['pl','A'] * CH['0','A'] + _Co(-qp) * H['pl','A'] * CHtilde['0','A'])+(-qp * Htilde['0','A'] * CH['mi','A'] + _Co(-qp) * H['0','A'] * CHtilde['mi','A'])
-2 * ((-qp * Htilde['pl','T'] * CH['0','T'] + _Co(-qp) * H['pl','T'] * CHtilde['0','T'])+(-qp * Htilde['0','T'] * CH['mi','T'] + _Co(-qp) * H['0','T'] * CHtilde['mi','T'])+2 * ((-qp * Htilde['pl','Tt'] * CH['0','Tt'] + _Co(-qp) * H['pl','Tt'] * CHtilde['0','Tt'])+(-qp * Htilde['0','Tt'] * CH['mi','Tt'] + _Co(-qp) * H['0','Tt'] * CHtilde['mi','Tt'])))))
G[2,2,2] = -8/3 * laGa/q2 * ((-qp * Htilde['pl','V'] * CH['mi','V'] + _Co(-qp) * H['pl','V'] * CHtilde['mi','V'])+(-qp * Htilde['pl','A'] * CH['mi','A'] + _Co(-qp) * H['pl','A'] * CHtilde['mi','A'])-2 * ((-qp * Htilde['pl','T'] * CH['mi','T'] + _Co(-qp) * H['pl','T'] * CHtilde['mi','T'])+2 * (-qp * Htilde['pl','Tt'] * CH['mi','Tt'] + _Co(-qp) * H['pl','Tt'] * CHtilde['mi','Tt'])))
prefactor = sqrt(laB)*sqrt(laGa)/(2**9 * pi**3 * mB**3 * q2)
return {k: prefactor*v for k, v in G.items()}
def G_to_g(G):
g = {}
g['1s'] = 1/32 * (8 * G[0,0,0] + 2 * G[0,2,0] - 4 * G[2,0,0] - G[2,2,0] )
g['1c'] = 1/16 * (4 * G[0,0,0] + G[0,2,0] + 4 * G[2,0,0] + G[2,2,0] )
g['2s'] = 3/32 * ( 2 * G[0,2,0] - G[2,2,0] )
g['2c'] = 3/16 * (G[0,2,0] + G[2,2,0] )
g['6s'] = 1/8 * ( 2 * G[0,1,0] - G[2,1,0] )
g['6c'] = 1/4 * ( G[0,1,0] + G[2,1,0] )
g[3] = 3/32 * _Re(G[2,2,2])
g[4] = 3/32 * _Re(G[2,2,1])
g[5] = sqrt(3)/16 * _Re(G[2,1,1])
g[7] = sqrt(3)/16 * _Im(G[2,1,1])
g[8] = 3/32 * _Im(G[2,2,1])
g[9] = 3/32 * _Im(G[2,2,2])
return g
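To make the normalization of the G-to-g basis change concrete, here is a standalone toy run of the same map (copied locally; the helpers `_Re`/`_Im` are re-declared so the sketch is self-contained). With only the total-rate coefficient `G[0,0,0]` switched on, `g['1s']` and `g['1c']` both reduce to `G[0,0,0]/4` and every other coefficient vanishes.

```python
from math import sqrt

def _Re(z): return complex(z).real
def _Im(z): return complex(z).imag

def G_to_g(G):
    # Local copy of the basis change above.
    g = {}
    g['1s'] = 1/32 * (8 * G[0, 0, 0] + 2 * G[0, 2, 0] - 4 * G[2, 0, 0] - G[2, 2, 0])
    g['1c'] = 1/16 * (4 * G[0, 0, 0] + G[0, 2, 0] + 4 * G[2, 0, 0] + G[2, 2, 0])
    g['2s'] = 3/32 * (2 * G[0, 2, 0] - G[2, 2, 0])
    g['2c'] = 3/16 * (G[0, 2, 0] + G[2, 2, 0])
    g['6s'] = 1/8 * (2 * G[0, 1, 0] - G[2, 1, 0])
    g['6c'] = 1/4 * (G[0, 1, 0] + G[2, 1, 0])
    g[3] = 3/32 * _Re(G[2, 2, 2])
    g[4] = 3/32 * _Re(G[2, 2, 1])
    g[5] = sqrt(3)/16 * _Re(G[2, 1, 1])
    g[7] = sqrt(3)/16 * _Im(G[2, 1, 1])
    g[8] = 3/32 * _Im(G[2, 2, 1])
    g[9] = 3/32 * _Im(G[2, 2, 2])
    return g

# Toy input: only the total-rate coefficient is nonzero.
keys = [(0, 0, 0), (0, 1, 0), (0, 2, 0), (2, 0, 0), (2, 1, 0),
        (2, 2, 0), (2, 1, 1), (2, 2, 1), (2, 2, 2)]
G = {k: 0 for k in keys}
G[0, 0, 0] = 1.0
g = G_to_g(G)
assert abs(g['1s'] - 0.25) < 1e-12 and abs(g['1c'] - 0.25) < 1e-12
assert all(abs(g[k]) < 1e-12 for k in g if k not in ('1s', '1c'))
```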
def angularcoeffs_general_v(*args, **kwargs):
G = angularcoeffs_general_Gbasis_v(*args, **kwargs)
g = G_to_g(G)
signflip = [4, '6s', '6c', 7, 9]
J = {k: -8*4/3.*g[k] if k in signflip else 8*4/3.*g[k] for k in g}
return J
def angularcoeffs_h_v(*args, **kwargs):
h = angularcoeffs_h_Gbasis_v(*args, **kwargs)
g_h = G_to_g(h)
signflip = [4, '6s', '6c', 7, 9]
J_h = {k: -8*4/3.*g_h[k] if k in signflip else 8*4/3.*g_h[k] for k in g_h}
return J_h
def helicity_amps_p(q2, mB, mP, mqh, mql, ml1, ml2, ff, wc, prefactor):
laB = lambda_K(mB**2, mP**2, q2)
h = {}
h['V'] = sqrt(laB)/(2*sqrt(q2)) * (
2*mqh/(mB+mP)*(wc['7']+wc['7p'])*ff['fT']+(wc['v']+wc['vp'])*ff['f+'] )
h['A'] = sqrt(laB)/(2*sqrt(q2)) * (wc['a']+wc['ap'])*ff['f+']
h['S'] = (mB**2-mP**2)/2. * ff['f0'] * (
(wc['s']+wc['sp'])/(mqh-mql) + (ml1-ml2)/q2*(wc['v']+wc['vp']) )
h['P'] = (mB**2-mP**2)/2. * ff['f0'] * (
(wc['p']+wc['pp'])/(mqh-mql) + (ml1+ml2)/q2*(wc['a']+wc['ap']) )
h['T'] = -1j*sqrt(laB)/(2*(mB+mP)) * (wc['t']-wc['tp']) * ff['fT']
h['Tt'] = -1j*sqrt(laB)/(2*(mB+mP)) * (wc['t']+wc['tp']) * ff['fT']
return {k: prefactor*v for k, v in h.items()}
def angularcoeffs_general_Gbasis_p(h, q2, mB, mP, mqh, mql, ml1, ml2):
laB = lambda_K(mB**2, mP**2, q2)
laGa = lambda_K(q2, ml1**2, ml2**2)
E1 = sqrt(ml1**2+laGa/(4 * q2))
E2 = sqrt(ml2**2+laGa/(4 * q2))
G = {}
G[0] = (
( 4*(E1*E2 + ml1*ml2) + laGa/(3*q2) ) * abs(h['V'])**2
+ ( 4*(E1*E2 - ml1*ml2) + laGa/(3*q2) ) * abs(h['A'])**2
+ ( 4*(E1*E2 - ml1*ml2) + laGa/( q2) ) * abs(h['S'])**2
+ ( 4*(E1*E2 + ml1*ml2) + laGa/( q2) ) * abs(h['P'])**2
+ 16*(E1*E2 + ml1*ml2 - laGa/(12*q2)) * abs(h['Tt'])**2
+ 8*(E1*E2 - ml1*ml2 - laGa/(12*q2)) * abs(h['T'])**2
+ 16 * (ml1*E2 + ml2*E1) * _Im( h['V'] * _Co(h['Tt']) )
+ 8*sqrt(2)*(ml1*E2 - ml2*E1) * _Im( h['A'] * _Co(h['T']) ) )
G[1] = -4*sqrt(laGa) * (
_Re( (ml1+ml2)/sqrt(q2) * h['V'] * _Co(h['S'])
+ (ml1-ml2)/sqrt(q2) * h['A'] * _Co(h['P']) )
- _Im( 2 * h['Tt'] * _Co(h['S']) + sqrt(2) * h['T'] * _Co(h['P'])) )
G[2] = -4*laGa/(3*q2) * (
abs(h['V'])**2 + abs(h['A'])**2 - 2*abs(h['T'])**2 - 4*abs(h['Tt'])**2 )
prefactor = sqrt(laB)*sqrt(laGa)/(2**9 * pi**3 * mB**3 * q2)
return {k: prefactor*v for k, v in G.items()}
def angularcoeffs_general_p(*args, **kwargs):
G = angularcoeffs_general_Gbasis_p(*args, **kwargs)
J = {}
J['a'] = G[0] - G[2]/2.
J['b'] = G[1]
J['c'] = 3*G[2]/2.
return J
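The pseudoscalar chain can be exercised end to end without the rest of flavio. The sketch below copies `helicity_amps_p` and supplies a local Källén function (the `lambda_K` imported from `flavio.physics.bdecays.common` is assumed to be λ(a,b,c) = a² + b² + c² − 2(ab + bc + ca)); the Wilson coefficients and form factors are dummy numbers, not physical inputs. It checks that the overall `prefactor` simply rescales every helicity amplitude linearly.

```python
from math import sqrt

def lambda_K(a, b, c):
    # Kallen function; assumed to match flavio's lambda_K.
    return a**2 + b**2 + c**2 - 2 * (a*b + b*c + a*c)

def helicity_amps_p(q2, mB, mP, mqh, mql, ml1, ml2, ff, wc, prefactor):
    # Local copy of the B -> P helicity amplitudes above.
    laB = lambda_K(mB**2, mP**2, q2)
    h = {}
    h['V'] = sqrt(laB)/(2*sqrt(q2)) * (
        2*mqh/(mB+mP)*(wc['7']+wc['7p'])*ff['fT'] + (wc['v']+wc['vp'])*ff['f+'])
    h['A'] = sqrt(laB)/(2*sqrt(q2)) * (wc['a']+wc['ap'])*ff['f+']
    h['S'] = (mB**2-mP**2)/2. * ff['f0'] * (
        (wc['s']+wc['sp'])/(mqh-mql) + (ml1-ml2)/q2*(wc['v']+wc['vp']))
    h['P'] = (mB**2-mP**2)/2. * ff['f0'] * (
        (wc['p']+wc['pp'])/(mqh-mql) + (ml1+ml2)/q2*(wc['a']+wc['ap']))
    h['T'] = -1j*sqrt(laB)/(2*(mB+mP)) * (wc['t']-wc['tp']) * ff['fT']
    h['Tt'] = -1j*sqrt(laB)/(2*(mB+mP)) * (wc['t']+wc['tp']) * ff['fT']
    return {k: prefactor*v for k, v in h.items()}

# Dummy (unphysical) inputs, just to exercise the code path.
wc = {k: 0.1 for k in ['7', '7p', 'v', 'vp', 'a', 'ap',
                       's', 'sp', 'p', 'pp', 't', 'tp']}
ff = {'f+': 0.3, 'f0': 0.3, 'fT': 0.25}
args = (4.0, 5.28, 0.49, 4.2, 0.1, 0.106, 0.106, ff, wc)
h1 = helicity_amps_p(*args, prefactor=1.0)
h2 = helicity_amps_p(*args, prefactor=2.0)
# The prefactor rescales every amplitude linearly.
assert all(abs(h2[k] - 2*h1[k]) < 1e-12 for k in h1)
```

With equal primed and unprimed tensor coefficients, `h['T']` vanishes while `h['Tt']` does not, which follows directly from the `(wc['t'] - wc['tp'])` vs. `(wc['t'] + wc['tp'])` structure above.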
# File: components/customer/Customer_tpl.py (repo: bitbuit/billterm, MIT license)
from components.company.Company_tpl import Company_tpl
class Customer_tpl(Company_tpl):
pass
# File: configs/_base_/det_pipelines/bsnet_pipeline_tb5.py (repo: zzx0226/mmocr, Apache-2.0 license)
'''
Description:
Version: 1.0
Author: Zhangzixu
Date: 2022-01-07 11:59:12
LastEditors: Zhangzixu
LastEditTime: 2022-01-07 12:04:41
'''
img_norm_cfg = dict(
    mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True)
# for icdar2015
leval_prop_range_icdar2015 = ((0, 0.4), (0.3, 0.7), (0.6, 1.0))
train_pipeline_icdar2015 = [
dict(type='LoadImageFromFile', color_type='color_ignore_orientation'),
dict(type='LoadTextAnnotations', with_bbox=True,
with_mask=True, poly2mask=False),
dict(type='ColorJitter', brightness=32.0 /
255, saturation=0.5, contrast=0.5),
dict(type='Normalize', **img_norm_cfg),
dict(type='RandomScaling', size=800, scale=(3. / 4, 5. / 2)),
# dict(type='RandomCropFlip', crop_ratio=0.5, iter_num=1, min_area_ratio=0.2),
dict(type='RandomCropPolyInstances', instance_key='gt_masks',
crop_ratio=0.8, min_side_ratio=0.3),
dict(type='RandomRotatePolyInstances', rotate_ratio=0.5,
max_angle=30, pad_with_fixed_color=False),
dict(type='SquareResizePad', target_size=800, pad_ratio=0.6),
dict(type='RandomFlip', flip_ratio=0.5, direction='horizontal'),
dict(type='Pad', size_divisor=32),
dict(type='BSNetTargets_tb', bs_degree=4, cp_num=5,
level_proportion_range=leval_prop_range_icdar2015),
dict(type='CustomFormatBundle', keys=[
'p3_maps', 'p4_maps', 'p5_maps'], visualize=dict(flag=False, boundary_key=None)),
dict(type='Collect', keys=['img', 'p3_maps', 'p4_maps', 'p5_maps'])
]
img_scale_icdar2015 = (2260, 2260)
test_pipeline_icdar2015 = [
dict(type='LoadImageFromFile', color_type='color_ignore_orientation'),
dict(type='MultiScaleFlipAug',
img_scale=img_scale_icdar2015,
flip=False,
transforms=[
dict(type='Resize', img_scale=(1280, 800), keep_ratio=True),
dict(type='Normalize', **img_norm_cfg),
dict(type='Pad', size_divisor=32),
dict(type='ImageToTensor', keys=['img']),
dict(type='Collect', keys=['img']),
])
]
# for ctw1500
leval_prop_range_ctw1500 = ((0, 0.25), (0.2, 0.65), (0.55, 1.0))
train_pipeline_ctw1500 = [
dict(type='LoadImageFromFile', color_type='color_ignore_orientation'),
dict(type='LoadTextAnnotations', with_bbox=True,
with_mask=True, poly2mask=False),
dict(type='ColorJitter', brightness=32.0 /
255, saturation=0.5, contrast=0.5),
dict(type='Normalize', **img_norm_cfg),
dict(type='RandomScaling', size=800, scale=(3. / 4, 5. / 2)),
dict(type='RandomCropFlip', crop_ratio=0.5, iter_num=1, min_area_ratio=0.2),
dict(type='RandomCropPolyInstances', instance_key='gt_masks',
crop_ratio=0.8, min_side_ratio=0.3),
dict(type='RandomRotatePolyInstances', rotate_ratio=0.5,
max_angle=30, pad_with_fixed_color=False),
dict(type='SquareResizePad', target_size=800, pad_ratio=0.6),
dict(type='RandomFlip', flip_ratio=0.5, direction='horizontal'),
dict(type='Pad', size_divisor=32),
dict(type='BSNetTargets_tb', bs_degree=4, cp_num=5,
level_proportion_range=leval_prop_range_ctw1500),
dict(type='CustomFormatBundle', keys=[
'p3_maps', 'p4_maps', 'p5_maps'], visualize=dict(flag=False, boundary_key=None)),
dict(type='Collect', keys=['img', 'p3_maps', 'p4_maps', 'p5_maps'])
]
# img_scale_ctw1500 = (1080, 736)
img_scale_ctw1500 = (800, 800)
test_pipeline_ctw1500 = [
dict(type='LoadImageFromFile', color_type='color_ignore_orientation'),
dict(type='MultiScaleFlipAug',
img_scale=img_scale_ctw1500,
flip=False,
transforms=[
dict(type='Resize', img_scale=(800, 800), keep_ratio=False),
# dict(type='Resize', img_scale=(1280, 800), keep_ratio=True),
dict(type='Normalize', **img_norm_cfg),
dict(type='Pad', size_divisor=32),
dict(type='ImageToTensor', keys=['img']),
dict(type='Collect', keys=['img']),
])
]
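Configs like the ones above are plain Python lists of dicts; a registry in the framework instantiates each stage from its `type` key. The sketch below (a standalone stand-in, not mmocr itself, with an abbreviated re-declared pipeline) shows how such a list can be inspected programmatically, e.g. to list the transform types or pull the BSNet target parameters.

```python
# Abbreviated stand-in for the train pipeline above (same structure, fewer stages).
train_pipeline = [
    dict(type='LoadImageFromFile', color_type='color_ignore_orientation'),
    dict(type='RandomFlip', flip_ratio=0.5, direction='horizontal'),
    dict(type='Pad', size_divisor=32),
    dict(type='BSNetTargets_tb', bs_degree=4, cp_num=5,
         level_proportion_range=((0, 0.4), (0.3, 0.7), (0.6, 1.0))),
    dict(type='Collect', keys=['img', 'p3_maps', 'p4_maps', 'p5_maps']),
]

def transform_types(pipeline):
    # Each stage is a dict whose 'type' key names a registered transform class.
    return [stage['type'] for stage in pipeline]

def find_stage(pipeline, type_name):
    # Return the config dict of the first stage with the given type.
    return next(s for s in pipeline if s['type'] == type_name)

types = transform_types(train_pipeline)
bs_cfg = find_stage(train_pipeline, 'BSNetTargets_tb')
assert types[0] == 'LoadImageFromFile' and types[-1] == 'Collect'
assert bs_cfg['bs_degree'] == 4 and bs_cfg['cp_num'] == 5
```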
#!/usr/bin/env python
# File: my_utils/parsers.py (repo: Minys233/GCN-BMP, MIT license)
# -*- coding: utf-8 -*-
# @Time : 12/8/2018 9:00 PM
# @Author : chinshin
# @FileName: parsers.py
import os
from logging import getLogger
import numpy
import pandas
import pickle
from rdkit import Chem
from tqdm import tqdm
from chainer_chemistry.dataset.parsers.base_parser import BaseFileParser
from chainer_chemistry.dataset.preprocessors.common import MolFeatureExtractionError # NOQA
from chainer_chemistry.dataset.preprocessors.mol_preprocessor import MolPreprocessor # NOQA
from chainer_chemistry.datasets.numpy_tuple_dataset import NumpyTupleDataset
import traceback
class CSVFileParserForPair(BaseFileParser):
"""data frame parser
This FileParser parses pandas dataframe.
It should contain column which contain SMILES as input, and
label column which is the target to predict.
Args:
preprocessor (BasePreprocessor): preprocessor instance
labels (str or list or None): labels column
smiles_cols (list): smiles columns
postprocess_label (Callable): post processing function if necessary
postprocess_fn (Callable): post processing function if necessary
logger:
"""
def __init__(self, preprocessor,
labels=None,
smiles_cols=('smiles_1', 'smiles_2'),
postprocess_label=None, postprocess_fn=None,
logger=None):
super(CSVFileParserForPair, self).__init__(preprocessor)
if isinstance(labels, str):
labels = [labels, ]
self.labels = labels # type: list
if not isinstance(smiles_cols, list):
self.smiles_cols = list(smiles_cols)
else:
self.smiles_cols = smiles_cols
self.postprocess_label = postprocess_label
self.postprocess_fn = postprocess_fn
self.logger = logger or getLogger(__name__)
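The core loop that `parse` implements can be sketched with the standard library alone (a hedged stand-in: no rdkit or chainer_chemistry here, and the `is_valid` callable below stands in for `Chem.MolFromSmiles` returning `None` on failure). It reproduces the same bookkeeping: rows whose SMILES fail validation are skipped and counted as failures, while the rest yield (pair, label) entries.

```python
import csv
import io

def parse_pair_rows(text, smiles_cols=('smiles_1', 'smiles_2'), label='label',
                    is_valid=lambda s: bool(s)):
    # Mimics the success/fail bookkeeping of CSVFileParserForPair.parse:
    # rows whose SMILES fail validation are skipped and counted as failures.
    reader = csv.DictReader(io.StringIO(text))
    pairs, labels, fail = [], [], 0
    for row in reader:
        s1, s2 = row[smiles_cols[0]], row[smiles_cols[1]]
        if not (is_valid(s1) and is_valid(s2)):
            fail += 1
            continue
        pairs.append((s1, s2))
        labels.append(int(row[label]))
    return pairs, labels, fail

text = "smiles_1,smiles_2,label\nCCO,c1ccccc1,1\n,CC,0\nCC,CN,0\n"
pairs, labels, fail = parse_pair_rows(text)
assert pairs == [('CCO', 'c1ccccc1'), ('CC', 'CN')]
assert labels == [1, 0] and fail == 1
```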
def parse(self, filepath, return_smiles_pair=False, return_smiles_pair_original=False, target_index=None,
return_is_successful=False):
"""parse DataFrame using `preprocessor`
Label is extracted from `labels` columns and input features are
extracted from smiles information in `smiles` column.
Args:
filepath (str): file path to be parsed.
            return_smiles_pair (bool): If set to `True`, a list of SMILES
                pairs is returned in the key 'smiles_pair'; it contains the
                pairs from which input features were successfully made.
                If set to `False`, `None` is returned in that key.
            return_smiles_pair_original (bool): If set to `True`, the
                original (non-canonicalized) SMILES pairs are returned in
                the key 'smiles_pair_original'; otherwise `None`.
target_index (list or None): target index list to partially extract
dataset. If None (default), all examples are parsed.
            return_is_successful (bool): If set to `True`, a boolean list is
                returned in the key 'is_successful'. It represents whether
                preprocessing succeeded for each SMILES pair.
                If set to `False`, `None` is returned in that key.
        Returns (dict): dictionary containing the parsed `dataset`, the
            SMILES pair arrays (`smiles_pair`, `smiles_pair_original`) and
            the `is_successful` flags, each of which may be `None`
            depending on the options above.
df = pandas.read_csv(filepath)
logger = self.logger
pp = self.preprocessor
smiles_pair_list = []
smiles_pair_list_original = []
is_successful_list = []
# counter = 0
if isinstance(pp, MolPreprocessor):
# No influence.
if target_index is not None:
df = df.iloc[target_index]
features = None
smiles_1_index = df.columns.get_loc(self.smiles_cols[0])
smiles_2_index = df.columns.get_loc(self.smiles_cols[1])
if self.labels is None:
labels_index = [] # dummy list
else:
labels_index = [df.columns.get_loc(c) for c in self.labels]
total_count = df.shape[0]
fail_count = 0
success_count = 0
# iteration on every row within the csv file
for row in tqdm(df.itertuples(index=False), total=df.shape[0]):
smiles_1 = row[smiles_1_index]
smiles_2 = row[smiles_2_index]
# currently it assumes list
labels = [int(row[i]) for i in labels_index]
try:
mol_1 = Chem.MolFromSmiles(smiles_1)
mol_2 = Chem.MolFromSmiles(smiles_2)
if mol_1 is None or mol_2 is None:
fail_count += 1
if return_is_successful:
is_successful_list.append(False)
continue
# Note that smiles expression is not unique.
# we obtain canonical smiles
# canonical_smiles_1, mol_1 = pp.prepare_smiles_and_mol(mol_1)
# input_features_1 = pp.get_input_features(mol_1)
# canonical_smiles_2, mol_2 = pp.prepare_smiles_and_mol(mol_2)
# input_features_2 = pp.get_input_features(mol_2)
input_features_1 = pp.get_input_features(mol_1)
input_features_2 = pp.get_input_features(mol_2)
# Extract label
if self.postprocess_label is not None:
labels = self.postprocess_label(labels)
# if return_smiles_pair:
# smiles_pair_list.append([canonical_smiles_1, canonical_smiles_2])
if return_smiles_pair:
smiles_pair_list.append([smiles_1, smiles_2])
if return_smiles_pair_original:
smiles_pair_list_original.append([smiles_1, smiles_2])
except MolFeatureExtractionError as e:
# This is expected error that extracting feature failed,
# skip this molecule.
fail_count += 1
if return_is_successful:
is_successful_list.append(False)
continue
except Exception as e:
logger.warning('parse(), type: {}, {}'
.format(type(e).__name__, e.args))
logger.info(traceback.format_exc())
fail_count += 1
if return_is_successful:
is_successful_list.append(False)
continue
# Initialize features: list of list
if features is None:
if isinstance(input_features_1, tuple):
num_features_1 = len(input_features_1)
else:
num_features_1 = 1
if isinstance(input_features_2, tuple):
num_features_2 = len(input_features_2)
else:
num_features_2 = 1
num_features = num_features_1 + num_features_2
if self.labels is not None:
num_features += 1
# list of list, a sublist corresponding to a certain feature
features = [[] for _ in range(num_features)]
# for every row in csv file
if isinstance(input_features_1, tuple):
for i in range(len(input_features_1)):
# features[i] a list containing the i-th feature
features[i].append(input_features_1[i])
else:
features[0].append(input_features_1)
offset = num_features_1  # not len(input_features_1): a single ndarray feature would give its array length, not 1
if isinstance(input_features_2, tuple):
for i in range(len(input_features_2)):
features[offset + i].append(input_features_2[i])
else:
features[offset].append(input_features_2)
# last column corresponding to targeted label
if self.labels is not None:
features[-1].append(labels)
success_count += 1
if return_is_successful:
is_successful_list.append(True)
ret = []
for feature in features:
try:
feat_array = numpy.asarray(feature)
except ValueError:
# Temporary workaround; see
# https://stackoverflow.com/questions/26885508/why-do-i-get-error-trying-to-cast-np-arraysome-list-valueerror-could-not-broa
feat_array = numpy.empty(len(feature), dtype=numpy.ndarray)
feat_array[:] = feature[:]
ret.append(feat_array)
result = tuple(ret)
logger.info('Preprocess finished. FAIL {}, SUCCESS {}, TOTAL {}'
.format(fail_count, success_count, total_count))
else:
raise NotImplementedError
smiles_pairs = numpy.array(smiles_pair_list) if return_smiles_pair else None
smiles_pairs_original = numpy.array(smiles_pair_list_original) if return_smiles_pair_original else None
if return_is_successful:
is_successful = numpy.array(is_successful_list)
else:
is_successful = None
if isinstance(result, tuple):
if self.postprocess_fn is not None:
result = self.postprocess_fn(*result)
dataset = NumpyTupleDataset(*result)
else:
if self.postprocess_fn is not None:
result = self.postprocess_fn(result)
dataset = NumpyTupleDataset(result)
return {"dataset": dataset,
"smiles_pair": smiles_pairs,
"smiles_pair_original": smiles_pairs_original,
"is_successful": is_successful}
def extract_total_num(self, df):
"""Extracts total number of data which can be parsed
We can use this method to determine the value fed to `target_index`
option of `parse` method. For example, if we want to extract input
feature from 10% of whole dataset, we need to know how many samples
are in a file. The returned value of this method may not to be same as
the final dataset size.
Args:
df (pandas.DataFrame): dataframe to be parsed.
Returns (int): total number of dataset can be parsed.
"""
return len(df)
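The `numpy.asarray` fallback in `parse()` above handles ragged feature columns (per-molecule arrays of different shapes) by building an object-dtype array. A minimal, self-contained sketch of that workaround (the helper name is hypothetical, not part of this module):

```python
import numpy

def to_feature_array(feature_list):
    """Stack one feature column; fall back to an object array when ragged."""
    try:
        return numpy.asarray(feature_list)
    except ValueError:
        # Per-molecule shapes differ, so a dense stack is impossible;
        # keep each feature as one element of a 1-d object array instead.
        feat_array = numpy.empty(len(feature_list), dtype=numpy.ndarray)
        feat_array[:] = feature_list[:]
        return feat_array
```

With same-shaped inputs the result is a dense array; with mismatched shapes it degrades to a 1-d object array, which is what `NumpyTupleDataset` receives here.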
class Mol2VecParserForPair(BaseFileParser):
"""data frame parser
This FileParser parses pandas dataframe.
It should contain column which contain SMILES as input, and
label column which is the target to predict.
Args:
preprocessor (BasePreprocessor): preprocessor instance
labels (str or list or None): labels column
smiles_cols (list): smiles columns
postprocess_label (Callable): post processing function if necessary
postprocess_fn (Callable): post processing function if necessary
logger:
"""
def __init__(self, preprocessor,
labels=None,
smiles_cols=('smiles_1', 'smiles_2'),
postprocess_label=None, postprocess_fn=None,
logger=None):
super(Mol2VecParserForPair, self).__init__(preprocessor)
if isinstance(labels, str):
labels = [labels, ]
self.labels = labels # type: list
if not isinstance(smiles_cols, list):
self.smiles_cols = list(smiles_cols)
else:
self.smiles_cols = smiles_cols
self.postprocess_label = postprocess_label
self.postprocess_fn = postprocess_fn
self.logger = logger or getLogger(__name__)
def parse(self, filepath, return_smiles_pair=False, return_smiles_pair_original=False, target_index=None,
return_is_successful=False):
smiles2vec_filename = "smiles2vec.pkl"
smiles2vec_path = "/home/chenx/drug_mining/representation_learning/chainer-chemistry/examples/ddi/dataset/drug_list"
smiles2vec_filepath = os.path.join(smiles2vec_path, smiles2vec_filename)
with open(smiles2vec_filepath, 'rb') as pkl_reader:
smiles2vec = pickle.load(pkl_reader)
df = pandas.read_csv(filepath)
logger = self.logger
pp = self.preprocessor
smiles_pair_list = []
smiles_pair_list_original = []
is_successful_list = []
# counter = 0
if isinstance(pp, MolPreprocessor):
# No influence.
if target_index is not None:
df = df.iloc[target_index]
features = None
smiles_1_index = df.columns.get_loc(self.smiles_cols[0])
smiles_2_index = df.columns.get_loc(self.smiles_cols[1])
if self.labels is None:
labels_index = [] # dummy list
else:
labels_index = [df.columns.get_loc(c) for c in self.labels]
total_count = df.shape[0]
fail_count = 0
success_count = 0
# iteration on every row within the csv file
for row in tqdm(df.itertuples(index=False), total=df.shape[0]):
smiles_1 = row[smiles_1_index]
smiles_2 = row[smiles_2_index]
# labels are collected as a list of ints
labels = [int(row[i]) for i in labels_index]
try:
mol_1 = Chem.MolFromSmiles(smiles_1)
mol_2 = Chem.MolFromSmiles(smiles_2)
if mol_1 is None or mol_2 is None:
fail_count += 1
if return_is_successful:
is_successful_list.append(False)
continue
# input_features_1 = pp.get_input_features(mol_1)
# input_features_2 = pp.get_input_features(mol_2)
input_features_1 = smiles2vec[smiles_1]
input_features_2 = smiles2vec[smiles_2]
# Extract label
if self.postprocess_label is not None:
labels = self.postprocess_label(labels)
# if return_smiles_pair:
# smiles_pair_list.append([canonical_smiles_1, canonical_smiles_2])
if return_smiles_pair:
smiles_pair_list.append([smiles_1, smiles_2])
if return_smiles_pair_original:
smiles_pair_list_original.append([smiles_1, smiles_2])
except MolFeatureExtractionError:
# Feature extraction failed, which is an expected error;
# skip this molecule.
fail_count += 1
if return_is_successful:
is_successful_list.append(False)
continue
except Exception as e:
logger.warning('parse(), type: {}, {}'
.format(type(e).__name__, e.args))
logger.info(traceback.format_exc())
fail_count += 1
if return_is_successful:
is_successful_list.append(False)
continue
# Initialize features: list of list
if features is None:
if isinstance(input_features_1, tuple):
num_features_1 = len(input_features_1)
else:
num_features_1 = 1
if isinstance(input_features_2, tuple):
num_features_2 = len(input_features_2)
else:
num_features_2 = 1
num_features = num_features_1 + num_features_2
if self.labels is not None:
num_features += 1
# list of list, a sublist corresponding to a certain feature
features = [[] for _ in range(num_features)]
# append this row's features to the per-feature columns
if isinstance(input_features_1, tuple):
for i in range(len(input_features_1)):
# features[i] a list containing the i-th feature
features[i].append(input_features_1[i])
else:
features[0].append(input_features_1)
offset = num_features_1
if isinstance(input_features_2, tuple):
for i in range(len(input_features_2)):
features[offset + i].append(input_features_2[i])
else:
features[offset].append(input_features_2)
# last column corresponding to targeted label
if self.labels is not None:
features[-1].append(labels)
success_count += 1
if return_is_successful:
is_successful_list.append(True)
ret = []
for feature in features:
try:
feat_array = numpy.asarray(feature)
except ValueError:
# Temporary workaround; see
# https://stackoverflow.com/questions/26885508/why-do-i-get-error-trying-to-cast-np-arraysome-list-valueerror-could-not-broa
feat_array = numpy.empty(len(feature), dtype=numpy.ndarray)
feat_array[:] = feature[:]
ret.append(feat_array)
result = tuple(ret)
logger.info('Preprocess finished. FAIL {}, SUCCESS {}, TOTAL {}'
.format(fail_count, success_count, total_count))
else:
raise NotImplementedError
smiles_pairs = numpy.array(smiles_pair_list) if return_smiles_pair else None
smiles_pairs_original = numpy.array(smiles_pair_list_original) if return_smiles_pair_original else None
if return_is_successful:
is_successful = numpy.array(is_successful_list)
else:
is_successful = None
if isinstance(result, tuple):
if self.postprocess_fn is not None:
result = self.postprocess_fn(*result)
dataset = NumpyTupleDataset(*result)
else:
if self.postprocess_fn is not None:
result = self.postprocess_fn(result)
dataset = NumpyTupleDataset(result)
return {"dataset": dataset,
"smiles_pair": smiles_pairs,
"smiles_pair_original": smiles_pairs_original,
"is_successful": is_successful}
def extract_total_num(self, df):
"""Extracts total number of data which can be parsed
We can use this method to determine the value fed to `target_index`
option of `parse` method. For example, if we want to extract input
feature from 10% of whole dataset, we need to know how many samples
are in a file. The returned value of this method may not to be same as
the final dataset size.
Args:
df (pandas.DataFrame): dataframe to be parsed.
Returns (int): total number of dataset can be parsed.
"""
return len(df)
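`Mol2VecParserForPair.parse()` assumes a pre-built `smiles2vec.pkl` mapping each SMILES string to a fixed-length vector. A hedged sketch of producing such a lookup table (the vectors here are random placeholders, not real mol2vec embeddings, and the helper name is an assumption):

```python
import os
import pickle
import tempfile

import numpy

def build_smiles2vec(smiles_list, dim=300, out_dir=None):
    """Pickle a {smiles: vector} dict shaped like the one parse() loads.

    Placeholder vectors only; a real pipeline would store actual
    mol2vec embeddings instead of random values."""
    rng = numpy.random.default_rng(0)
    smiles2vec = {s: rng.standard_normal(dim) for s in smiles_list}
    out_dir = out_dir or tempfile.mkdtemp()
    filepath = os.path.join(out_dir, "smiles2vec.pkl")
    with open(filepath, "wb") as pkl_writer:
        pickle.dump(smiles2vec, pkl_writer)
    return filepath
```

A parser pointed at the returned directory instead of the hard-coded `/home/chenx/...` path would then unpickle this table the same way `parse()` does.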
class MolAutoencoderParserForPair(BaseFileParser):
"""Data frame parser for SMILES pairs with precomputed molecular-autoencoder features."""
def __init__(self, preprocessor,
labels=None,
smiles_cols=('smiles_1', 'smiles_2'),
postprocess_label=None, postprocess_fn=None,
logger=None):
super(MolAutoencoderParserForPair, self).__init__(preprocessor)
if isinstance(labels, str):
labels = [labels, ]
self.labels = labels # type: list
if not isinstance(smiles_cols, list):
self.smiles_cols = list(smiles_cols)
else:
self.smiles_cols = smiles_cols
self.postprocess_label = postprocess_label
self.postprocess_fn = postprocess_fn
self.logger = logger or getLogger(__name__)
def parse(self, filepath, return_smiles_pair=False, return_smiles_pair_original=False, target_index=None,
return_is_successful=False):
smiles2molenc_filename = "smiles2molenc.pkl"
smiles2molenc_path = "/home/chenx/drug_mining/representation_learning/chainer-chemistry/examples/ddi/dataset/drug_list"
smiles2vec_filepath = os.path.join(smiles2molenc_path, smiles2molenc_filename)
with open(smiles2vec_filepath, 'rb') as pkl_reader:
smiles2vec = pickle.load(pkl_reader)
df = pandas.read_csv(filepath)
logger = self.logger
pp = self.preprocessor
smiles_pair_list = []
smiles_pair_list_original = []
is_successful_list = []
# counter = 0
if isinstance(pp, MolPreprocessor):
# No influence.
if target_index is not None:
df = df.iloc[target_index]
features = None
smiles_1_index = df.columns.get_loc(self.smiles_cols[0])
smiles_2_index = df.columns.get_loc(self.smiles_cols[1])
if self.labels is None:
labels_index = [] # dummy list
else:
labels_index = [df.columns.get_loc(c) for c in self.labels]
total_count = df.shape[0]
fail_count = 0
success_count = 0
# iteration on every row within the csv file
for row in tqdm(df.itertuples(index=False), total=df.shape[0]):
smiles_1 = row[smiles_1_index]
smiles_2 = row[smiles_2_index]
# labels are collected as a list of ints
labels = [int(row[i]) for i in labels_index]
try:
mol_1 = Chem.MolFromSmiles(smiles_1)
mol_2 = Chem.MolFromSmiles(smiles_2)
if mol_1 is None or mol_2 is None:
fail_count += 1
if return_is_successful:
is_successful_list.append(False)
continue
# input_features_1 = pp.get_input_features(mol_1)
# input_features_2 = pp.get_input_features(mol_2)
input_features_1 = smiles2vec[smiles_1]
input_features_2 = smiles2vec[smiles_2]
# Extract label
if self.postprocess_label is not None:
labels = self.postprocess_label(labels)
# if return_smiles_pair:
# smiles_pair_list.append([canonical_smiles_1, canonical_smiles_2])
if return_smiles_pair:
smiles_pair_list.append([smiles_1, smiles_2])
if return_smiles_pair_original:
smiles_pair_list_original.append([smiles_1, smiles_2])
except MolFeatureExtractionError:
# Feature extraction failed, which is an expected error;
# skip this molecule.
fail_count += 1
if return_is_successful:
is_successful_list.append(False)
continue
except Exception as e:
logger.warning('parse(), type: {}, {}'
.format(type(e).__name__, e.args))
logger.info(traceback.format_exc())
fail_count += 1
if return_is_successful:
is_successful_list.append(False)
continue
# Initialize features: list of list
if features is None:
if isinstance(input_features_1, tuple):
num_features_1 = len(input_features_1)
else:
num_features_1 = 1
if isinstance(input_features_2, tuple):
num_features_2 = len(input_features_2)
else:
num_features_2 = 1
num_features = num_features_1 + num_features_2
if self.labels is not None:
num_features += 1
# list of list, a sublist corresponding to a certain feature
features = [[] for _ in range(num_features)]
# append this row's features to the per-feature columns
if isinstance(input_features_1, tuple):
for i in range(len(input_features_1)):
# features[i] a list containing the i-th feature
features[i].append(input_features_1[i])
else:
features[0].append(input_features_1)
offset = num_features_1
if isinstance(input_features_2, tuple):
for i in range(len(input_features_2)):
features[offset + i].append(input_features_2[i])
else:
features[offset].append(input_features_2)
# last column corresponding to targeted label
if self.labels is not None:
features[-1].append(labels)
success_count += 1
if return_is_successful:
is_successful_list.append(True)
ret = []
for feature in features:
try:
feat_array = numpy.asarray(feature)
except ValueError:
# Temporary workaround; see
# https://stackoverflow.com/questions/26885508/why-do-i-get-error-trying-to-cast-np-arraysome-list-valueerror-could-not-broa
feat_array = numpy.empty(len(feature), dtype=numpy.ndarray)
feat_array[:] = feature[:]
ret.append(feat_array)
result = tuple(ret)
logger.info('Preprocess finished. FAIL {}, SUCCESS {}, TOTAL {}'
.format(fail_count, success_count, total_count))
else:
raise NotImplementedError
smiles_pairs = numpy.array(smiles_pair_list) if return_smiles_pair else None
smiles_pairs_original = numpy.array(smiles_pair_list_original) if return_smiles_pair_original else None
if return_is_successful:
is_successful = numpy.array(is_successful_list)
else:
is_successful = None
if isinstance(result, tuple):
if self.postprocess_fn is not None:
result = self.postprocess_fn(*result)
dataset = NumpyTupleDataset(*result)
else:
if self.postprocess_fn is not None:
result = self.postprocess_fn(result)
dataset = NumpyTupleDataset(result)
return {"dataset": dataset,
"smiles_pair": smiles_pairs,
"smiles_pair_original": smiles_pairs_original,
"is_successful": is_successful}
def extract_total_num(self, df):
"""Extracts total number of data which can be parsed
We can use this method to determine the value fed to `target_index`
option of `parse` method. For example, if we want to extract input
feature from 10% of whole dataset, we need to know how many samples
are in a file. The returned value of this method may not to be same as
the final dataset size.
Args:
df (pandas.DataFrame): dataframe to be parsed.
Returns (int): total number of dataset can be parsed.
"""
return len(df)
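The column layout that `parse()` builds (molecule 1's feature columns first, then molecule 2's starting at offset `num_features_1`, then an optional label column last) can be sketched in isolation. The helper name is hypothetical:

```python
def append_pair_row(features, input_features_1, input_features_2, labels=None):
    """Append one row's features using the same offset scheme as parse().

    Each element of `features` is the column for one feature; molecule 2's
    columns start at num_features_1, and the label column (if any) is last.
    """
    feats_1 = input_features_1 if isinstance(input_features_1, tuple) else (input_features_1,)
    feats_2 = input_features_2 if isinstance(input_features_2, tuple) else (input_features_2,)
    num_features_1 = len(feats_1)
    if features is None:
        # Lazily size the column list on the first successful row.
        num_columns = num_features_1 + len(feats_2) + (1 if labels is not None else 0)
        features = [[] for _ in range(num_columns)]
    for i, f in enumerate(feats_1):
        features[i].append(f)
    for i, f in enumerate(feats_2):
        # offset is the feature *count* of molecule 1, not the length of a raw array
        features[num_features_1 + i].append(f)
    if labels is not None:
        features[-1].append(labels)
    return features
```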
class SSPParserForPair(BaseFileParser):
"""Data frame parser for SMILES pairs with precomputed SSP features."""
def __init__(self, preprocessor,
labels=None,
smiles_cols=('smiles_1', 'smiles_2'),
postprocess_label=None, postprocess_fn=None,
logger=None):
super(SSPParserForPair, self).__init__(preprocessor)
if isinstance(labels, str):
labels = [labels, ]
self.labels = labels # type: list
if not isinstance(smiles_cols, list):
self.smiles_cols = list(smiles_cols)
else:
self.smiles_cols = smiles_cols
self.postprocess_label = postprocess_label
self.postprocess_fn = postprocess_fn
self.logger = logger or getLogger(__name__)
def parse(self, filepath, return_smiles_pair=False, return_smiles_pair_original=False, target_index=None,
return_is_successful=False):
smiles2ssp_filename = "smiles2ssp.pkl"
smiles2ssp_path = "/home/chenx/drug_mining/representation_learning/chainer-chemistry/examples/ddi/dataset/drug_list"
smiles2ssp_filepath = os.path.join(smiles2ssp_path, smiles2ssp_filename)
with open(smiles2ssp_filepath, 'rb') as pkl_reader:
smiles2vec = pickle.load(pkl_reader)
df = pandas.read_csv(filepath)
logger = self.logger
pp = self.preprocessor
smiles_pair_list = []
smiles_pair_list_original = []
is_successful_list = []
# counter = 0
if isinstance(pp, MolPreprocessor):
# No influence.
if target_index is not None:
df = df.iloc[target_index]
features = None
smiles_1_index = df.columns.get_loc(self.smiles_cols[0])
smiles_2_index = df.columns.get_loc(self.smiles_cols[1])
if self.labels is None:
labels_index = [] # dummy list
else:
labels_index = [df.columns.get_loc(c) for c in self.labels]
total_count = df.shape[0]
fail_count = 0
success_count = 0
# iteration on every row within the csv file
for row in tqdm(df.itertuples(index=False), total=df.shape[0]):
smiles_1 = row[smiles_1_index]
smiles_2 = row[smiles_2_index]
# labels are collected as a list of ints
labels = [int(row[i]) for i in labels_index]
try:
mol_1 = Chem.MolFromSmiles(smiles_1)
mol_2 = Chem.MolFromSmiles(smiles_2)
if mol_1 is None or mol_2 is None:
fail_count += 1
if return_is_successful:
is_successful_list.append(False)
continue
# input_features_1 = pp.get_input_features(mol_1)
# input_features_2 = pp.get_input_features(mol_2)
input_features_1 = smiles2vec[smiles_1]
input_features_2 = smiles2vec[smiles_2]
# Extract label
if self.postprocess_label is not None:
labels = self.postprocess_label(labels)
# if return_smiles_pair:
# smiles_pair_list.append([canonical_smiles_1, canonical_smiles_2])
if return_smiles_pair:
smiles_pair_list.append([smiles_1, smiles_2])
if return_smiles_pair_original:
smiles_pair_list_original.append([smiles_1, smiles_2])
except MolFeatureExtractionError:
# Feature extraction failed, which is an expected error;
# skip this molecule.
fail_count += 1
if return_is_successful:
is_successful_list.append(False)
continue
except Exception as e:
logger.warning('parse(), type: {}, {}'
.format(type(e).__name__, e.args))
logger.info(traceback.format_exc())
fail_count += 1
if return_is_successful:
is_successful_list.append(False)
continue
# Initialize features: list of list
if features is None:
if isinstance(input_features_1, tuple):
num_features_1 = len(input_features_1)
else:
num_features_1 = 1
if isinstance(input_features_2, tuple):
num_features_2 = len(input_features_2)
else:
num_features_2 = 1
num_features = num_features_1 + num_features_2
if self.labels is not None:
num_features += 1
# list of list, a sublist corresponding to a certain feature
features = [[] for _ in range(num_features)]
# append this row's features to the per-feature columns
if isinstance(input_features_1, tuple):
for i in range(len(input_features_1)):
# features[i] a list containing the i-th feature
features[i].append(input_features_1[i])
else:
features[0].append(input_features_1)
offset = num_features_1
if isinstance(input_features_2, tuple):
for i in range(len(input_features_2)):
features[offset + i].append(input_features_2[i])
else:
features[offset].append(input_features_2)
# last column corresponding to targeted label
if self.labels is not None:
features[-1].append(labels)
success_count += 1
if return_is_successful:
is_successful_list.append(True)
ret = []
for feature in features:
try:
feat_array = numpy.asarray(feature)
except ValueError:
# Temporary workaround; see
# https://stackoverflow.com/questions/26885508/why-do-i-get-error-trying-to-cast-np-arraysome-list-valueerror-could-not-broa
feat_array = numpy.empty(len(feature), dtype=numpy.ndarray)
feat_array[:] = feature[:]
ret.append(feat_array)
result = tuple(ret)
logger.info('Preprocess finished. FAIL {}, SUCCESS {}, TOTAL {}'
.format(fail_count, success_count, total_count))
else:
raise NotImplementedError
smiles_pairs = numpy.array(smiles_pair_list) if return_smiles_pair else None
smiles_pairs_original = numpy.array(smiles_pair_list_original) if return_smiles_pair_original else None
if return_is_successful:
is_successful = numpy.array(is_successful_list)
else:
is_successful = None
if isinstance(result, tuple):
if self.postprocess_fn is not None:
result = self.postprocess_fn(*result)
dataset = NumpyTupleDataset(*result)
else:
if self.postprocess_fn is not None:
result = self.postprocess_fn(result)
dataset = NumpyTupleDataset(result)
return {"dataset": dataset,
"smiles_pair": smiles_pairs,
"smiles_pair_original": smiles_pairs_original,
"is_successful": is_successful}
def extract_total_num(self, df):
"""Extracts total number of data which can be parsed
We can use this method to determine the value fed to `target_index`
option of `parse` method. For example, if we want to extract input
feature from 10% of whole dataset, we need to know how many samples
are in a file. The returned value of this method may not to be same as
the final dataset size.
Args:
df (pandas.DataFrame): dataframe to be parsed.
Returns (int): total number of dataset can be parsed.
"""
return len(df)
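As the `extract_total_num` docstring suggests, the total row count can be used to build a `target_index` for partial parsing, e.g. the first 10% of rows. A minimal sketch (the helper name is an assumption; the argument can be anything with `len()`, such as a pandas DataFrame):

```python
import numpy

def first_fraction_index(sized, fraction=0.1):
    """Indices for the leading `fraction` of rows, as fed to parse(target_index=...)."""
    total = len(sized)  # what extract_total_num returns for a DataFrame
    return numpy.arange(int(total * fraction))
```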
# coding: utf-8
"""
ThingsBoard REST API
For instructions how to authorize requests please visit <a href='http://thingsboard.io/docs/reference/rest-api/'>REST API documentation page</a>. # noqa: E501
OpenAPI spec version: 2.0
Contact: info@thingsboard.io
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from tb_rest_client.api_client import ApiClient
class AuditLogControllerApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def get_audit_logs_by_customer_id_using_get(self, customer_id, page_size, page, **kwargs): # noqa: E501
"""getAuditLogsByCustomerId # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_audit_logs_by_customer_id_using_get(customer_id, page_size, page, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str customer_id: customerId (required)
:param str page_size: pageSize (required)
:param str page: page (required)
:param str text_search: textSearch
:param str sort_property: sortProperty
:param str sort_order: sortOrder
:param int start_time: startTime
:param int end_time: endTime
:param str action_types: actionTypes
:return: PageDataAuditLog
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_audit_logs_by_customer_id_using_get_with_http_info(customer_id, page_size, page, **kwargs) # noqa: E501
else:
(data) = self.get_audit_logs_by_customer_id_using_get_with_http_info(customer_id, page_size, page, **kwargs) # noqa: E501
return data
def get_audit_logs_by_customer_id_using_get_with_http_info(self, customer_id, page_size, page, **kwargs): # noqa: E501
"""getAuditLogsByCustomerId # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_audit_logs_by_customer_id_using_get_with_http_info(customer_id, page_size, page, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str customer_id: customerId (required)
:param str page_size: pageSize (required)
:param str page: page (required)
:param str text_search: textSearch
:param str sort_property: sortProperty
:param str sort_order: sortOrder
:param int start_time: startTime
:param int end_time: endTime
:param str action_types: actionTypes
:return: PageDataAuditLog
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['customer_id', 'page_size', 'page', 'text_search', 'sort_property', 'sort_order', 'start_time', 'end_time', 'action_types'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_audit_logs_by_customer_id_using_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'customer_id' is set
if ('customer_id' not in params or
params['customer_id'] is None):
raise ValueError("Missing the required parameter `customer_id` when calling `get_audit_logs_by_customer_id_using_get`") # noqa: E501
# verify the required parameter 'page_size' is set
if ('page_size' not in params or
params['page_size'] is None):
raise ValueError("Missing the required parameter `page_size` when calling `get_audit_logs_by_customer_id_using_get`") # noqa: E501
# verify the required parameter 'page' is set
if ('page' not in params or
params['page'] is None):
raise ValueError("Missing the required parameter `page` when calling `get_audit_logs_by_customer_id_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'customer_id' in params:
path_params['customerId'] = params['customer_id'] # noqa: E501
query_params = []
if 'text_search' in params:
query_params.append(('textSearch', params['text_search'])) # noqa: E501
if 'sort_property' in params:
query_params.append(('sortProperty', params['sort_property'])) # noqa: E501
if 'sort_order' in params:
query_params.append(('sortOrder', params['sort_order'])) # noqa: E501
if 'start_time' in params:
query_params.append(('startTime', params['start_time'])) # noqa: E501
if 'end_time' in params:
query_params.append(('endTime', params['end_time'])) # noqa: E501
if 'action_types' in params:
query_params.append(('actionTypes', params['action_types'])) # noqa: E501
if 'page_size' in params:
query_params.append(('pageSize', params['page_size'])) # noqa: E501
if 'page' in params:
query_params.append(('page', params['page'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = ['X-Authorization'] # noqa: E501
return self.api_client.call_api(
'/api/audit/logs/customer/{customerId}{?textSearch,sortProperty,sortOrder,startTime,endTime,actionTypes,pageSize,page}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='PageDataAuditLog', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
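Every generated `*_with_http_info` method in this class follows the same preamble: collect `locals()`, reject unexpected keyword arguments against `all_params`, then check each required parameter for `None`. That validation step can be sketched standalone (a hypothetical helper, not part of the generated client):

```python
def validate_params(method_name, all_params, required, kwargs):
    """Reject unexpected kwargs and missing required params, mirroring
    the checks the generated API methods run before building a request."""
    for key in kwargs:
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method %s" % (key, method_name))
    for name in required:
        if kwargs.get(name) is None:
            raise ValueError(
                "Missing the required parameter `%s` when calling `%s`"
                % (name, method_name))
    return True
```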
def get_audit_logs_by_entity_id_using_get(self, entity_type, entity_id, page_size, page, **kwargs): # noqa: E501
"""getAuditLogsByEntityId # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_audit_logs_by_entity_id_using_get(entity_type, entity_id, page_size, page, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str entity_type: entityType (required)
:param str entity_id: entityId (required)
:param str page_size: pageSize (required)
:param str page: page (required)
:param str text_search: textSearch
:param str sort_property: sortProperty
:param str sort_order: sortOrder
:param int start_time: startTime
:param int end_time: endTime
:param str action_types: actionTypes
:return: PageDataAuditLog
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_audit_logs_by_entity_id_using_get_with_http_info(entity_type, entity_id, page_size, page, **kwargs) # noqa: E501
else:
(data) = self.get_audit_logs_by_entity_id_using_get_with_http_info(entity_type, entity_id, page_size, page, **kwargs) # noqa: E501
return data
def get_audit_logs_by_entity_id_using_get_with_http_info(self, entity_type, entity_id, page_size, page, **kwargs): # noqa: E501
"""getAuditLogsByEntityId # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_audit_logs_by_entity_id_using_get_with_http_info(entity_type, entity_id, page_size, page, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str entity_type: entityType (required)
:param str entity_id: entityId (required)
:param str page_size: pageSize (required)
:param str page: page (required)
:param str text_search: textSearch
:param str sort_property: sortProperty
:param str sort_order: sortOrder
:param int start_time: startTime
:param int end_time: endTime
:param str action_types: actionTypes
:return: PageDataAuditLog
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['entity_type', 'entity_id', 'page_size', 'page', 'text_search', 'sort_property', 'sort_order', 'start_time', 'end_time', 'action_types'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_audit_logs_by_entity_id_using_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'entity_type' is set
if ('entity_type' not in params or
params['entity_type'] is None):
raise ValueError("Missing the required parameter `entity_type` when calling `get_audit_logs_by_entity_id_using_get`") # noqa: E501
# verify the required parameter 'entity_id' is set
if ('entity_id' not in params or
params['entity_id'] is None):
raise ValueError("Missing the required parameter `entity_id` when calling `get_audit_logs_by_entity_id_using_get`") # noqa: E501
# verify the required parameter 'page_size' is set
if ('page_size' not in params or
params['page_size'] is None):
raise ValueError("Missing the required parameter `page_size` when calling `get_audit_logs_by_entity_id_using_get`") # noqa: E501
# verify the required parameter 'page' is set
if ('page' not in params or
params['page'] is None):
raise ValueError("Missing the required parameter `page` when calling `get_audit_logs_by_entity_id_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'entity_type' in params:
path_params['entityType'] = params['entity_type'] # noqa: E501
if 'entity_id' in params:
path_params['entityId'] = params['entity_id'] # noqa: E501
query_params = []
if 'text_search' in params:
query_params.append(('textSearch', params['text_search'])) # noqa: E501
if 'sort_property' in params:
query_params.append(('sortProperty', params['sort_property'])) # noqa: E501
if 'sort_order' in params:
query_params.append(('sortOrder', params['sort_order'])) # noqa: E501
if 'start_time' in params:
query_params.append(('startTime', params['start_time'])) # noqa: E501
if 'end_time' in params:
query_params.append(('endTime', params['end_time'])) # noqa: E501
if 'action_types' in params:
query_params.append(('actionTypes', params['action_types'])) # noqa: E501
if 'page_size' in params:
query_params.append(('pageSize', params['page_size'])) # noqa: E501
if 'page' in params:
query_params.append(('page', params['page'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = ['X-Authorization'] # noqa: E501
return self.api_client.call_api(
'/api/audit/logs/entity/{entityType}/{entityId}{?textSearch,sortProperty,sortOrder,startTime,endTime,actionTypes,pageSize,page}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='PageDataAuditLog', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
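# The `_with_http_info` helpers above all build their query string the same
# way: each optional snake_case argument that is present is emitted under its
# camelCase wire name. A minimal standalone sketch of that mapping (the helper
# name and the mapping table below are illustrative, not part of the generated
# client):

```python
# Hypothetical mirror of the generated query-parameter build-up.
def build_query_params(params, mapping):
    """Collect (camelCaseName, value) pairs for every snake_case key present."""
    query_params = []
    for snake, camel in mapping.items():
        if snake in params and params[snake] is not None:
            query_params.append((camel, params[snake]))
    return query_params

mapping = {
    'text_search': 'textSearch',
    'sort_property': 'sortProperty',
    'page_size': 'pageSize',
    'page': 'page',
}
qp = build_query_params({'page_size': 10, 'page': 0, 'sort_property': 'ts'},
                        mapping)
print(qp)  # [('sortProperty', 'ts'), ('pageSize', 10), ('page', 0)]
```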
def get_audit_logs_by_user_id_using_get(self, user_id, page_size, page, **kwargs): # noqa: E501
"""getAuditLogsByUserId # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_audit_logs_by_user_id_using_get(user_id, page_size, page, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str user_id: userId (required)
:param str page_size: pageSize (required)
:param str page: page (required)
:param str text_search: textSearch
:param str sort_property: sortProperty
:param str sort_order: sortOrder
:param int start_time: startTime
:param int end_time: endTime
:param str action_types: actionTypes
:return: PageDataAuditLog
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_audit_logs_by_user_id_using_get_with_http_info(user_id, page_size, page, **kwargs) # noqa: E501
else:
(data) = self.get_audit_logs_by_user_id_using_get_with_http_info(user_id, page_size, page, **kwargs) # noqa: E501
return data
def get_audit_logs_by_user_id_using_get_with_http_info(self, user_id, page_size, page, **kwargs): # noqa: E501
"""getAuditLogsByUserId # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_audit_logs_by_user_id_using_get_with_http_info(user_id, page_size, page, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str user_id: userId (required)
:param str page_size: pageSize (required)
:param str page: page (required)
:param str text_search: textSearch
:param str sort_property: sortProperty
:param str sort_order: sortOrder
:param int start_time: startTime
:param int end_time: endTime
:param str action_types: actionTypes
:return: PageDataAuditLog
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['user_id', 'page_size', 'page', 'text_search', 'sort_property', 'sort_order', 'start_time', 'end_time', 'action_types'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_audit_logs_by_user_id_using_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'user_id' is set
if ('user_id' not in params or
params['user_id'] is None):
raise ValueError("Missing the required parameter `user_id` when calling `get_audit_logs_by_user_id_using_get`") # noqa: E501
# verify the required parameter 'page_size' is set
if ('page_size' not in params or
params['page_size'] is None):
raise ValueError("Missing the required parameter `page_size` when calling `get_audit_logs_by_user_id_using_get`") # noqa: E501
# verify the required parameter 'page' is set
if ('page' not in params or
params['page'] is None):
raise ValueError("Missing the required parameter `page` when calling `get_audit_logs_by_user_id_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'user_id' in params:
path_params['userId'] = params['user_id'] # noqa: E501
query_params = []
if 'text_search' in params:
query_params.append(('textSearch', params['text_search'])) # noqa: E501
if 'sort_property' in params:
query_params.append(('sortProperty', params['sort_property'])) # noqa: E501
if 'sort_order' in params:
query_params.append(('sortOrder', params['sort_order'])) # noqa: E501
if 'start_time' in params:
query_params.append(('startTime', params['start_time'])) # noqa: E501
if 'end_time' in params:
query_params.append(('endTime', params['end_time'])) # noqa: E501
if 'action_types' in params:
query_params.append(('actionTypes', params['action_types'])) # noqa: E501
if 'page_size' in params:
query_params.append(('pageSize', params['page_size'])) # noqa: E501
if 'page' in params:
query_params.append(('page', params['page'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = ['X-Authorization'] # noqa: E501
return self.api_client.call_api(
'/api/audit/logs/user/{userId}{?textSearch,sortProperty,sortOrder,startTime,endTime,actionTypes,pageSize,page}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='PageDataAuditLog', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_audit_logs_using_get(self, page_size, page, **kwargs): # noqa: E501
"""getAuditLogs # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_audit_logs_using_get(page_size, page, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str page_size: pageSize (required)
:param str page: page (required)
:param str text_search: textSearch
:param str sort_property: sortProperty
:param str sort_order: sortOrder
:param int start_time: startTime
:param int end_time: endTime
:param str action_types: actionTypes
:return: PageDataAuditLog
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_audit_logs_using_get_with_http_info(page_size, page, **kwargs) # noqa: E501
else:
(data) = self.get_audit_logs_using_get_with_http_info(page_size, page, **kwargs) # noqa: E501
return data
def get_audit_logs_using_get_with_http_info(self, page_size, page, **kwargs): # noqa: E501
"""getAuditLogs # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_audit_logs_using_get_with_http_info(page_size, page, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str page_size: pageSize (required)
:param str page: page (required)
:param str text_search: textSearch
:param str sort_property: sortProperty
:param str sort_order: sortOrder
:param int start_time: startTime
:param int end_time: endTime
:param str action_types: actionTypes
:return: PageDataAuditLog
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['page_size', 'page', 'text_search', 'sort_property', 'sort_order', 'start_time', 'end_time', 'action_types'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_audit_logs_using_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'page_size' is set
if ('page_size' not in params or
params['page_size'] is None):
raise ValueError("Missing the required parameter `page_size` when calling `get_audit_logs_using_get`") # noqa: E501
# verify the required parameter 'page' is set
if ('page' not in params or
params['page'] is None):
raise ValueError("Missing the required parameter `page` when calling `get_audit_logs_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'text_search' in params:
query_params.append(('textSearch', params['text_search'])) # noqa: E501
if 'sort_property' in params:
query_params.append(('sortProperty', params['sort_property'])) # noqa: E501
if 'sort_order' in params:
query_params.append(('sortOrder', params['sort_order'])) # noqa: E501
if 'start_time' in params:
query_params.append(('startTime', params['start_time'])) # noqa: E501
if 'end_time' in params:
query_params.append(('endTime', params['end_time'])) # noqa: E501
if 'action_types' in params:
query_params.append(('actionTypes', params['action_types'])) # noqa: E501
if 'page_size' in params:
query_params.append(('pageSize', params['page_size'])) # noqa: E501
if 'page' in params:
query_params.append(('page', params['page'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = ['X-Authorization'] # noqa: E501
return self.api_client.call_api(
'/api/audit/logs{?textSearch,sortProperty,sortOrder,startTime,endTime,actionTypes,pageSize,page}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='PageDataAuditLog', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
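# Each of the audit-log methods above guards its required parameters with the
# same missing-or-None check before any request is built. A self-contained
# sketch of that guard (`check_required` is a hypothetical helper; the real
# generated code inlines one `if` block per parameter):

```python
# Hypothetical standalone version of the generated required-parameter guard.
def check_required(params, required, method):
    """Raise ValueError for any required parameter that is missing or None."""
    for name in required:
        if params.get(name) is None:
            raise ValueError(
                "Missing the required parameter `%s` when calling `%s`"
                % (name, method))

# Passes silently: both required parameters are present and non-None.
check_required({'page_size': 10, 'page': 0}, ['page_size', 'page'],
               'get_audit_logs_using_get')
```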
# File: TrainingExtensions/tensorflow/test/python/test_qc_quantize_op.py
# @@-COPYRIGHT-START-@@
#
# Copyright (c) 2019, Qualcomm Innovation Center, Inc. All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
# 1. Redistributions of source code must retain the above copyright notice,
# this list of conditions and the following disclaimer.
#
# 2. Redistributions in binary form must reproduce the above copyright notice,
# this list of conditions and the following disclaimer in the documentation
# and/or other materials provided with the distribution.
#
# 3. Neither the name of the copyright holder nor the names of its contributors
# may be used to endorse or promote products derived from this software
# without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
# SPDX-License-Identifier: BSD-3-Clause
#
# @@-COPYRIGHT-END-@@
# =============================================================================
import unittest
import numpy as np
import tensorflow as tf
import libpymo
class TestTrainingExtensionsQcQuantizeOp(unittest.TestCase):
def test_qc_quantize_op_cpu(self):
"""
test custom op with CPU
"""
zero_out_module = tf.load_op_library('libaimet_tf_ops.so')
graph = tf.Graph()
config = tf.ConfigProto(log_device_placement=False)
sess = tf.Session(graph=graph, config=config)
with graph.as_default():
# place holder for the input
with tf.device("/device:CPU:0"):
inp = tf.placeholder(tf.float32, shape=[10], name='input')
tensor_quantizer = libpymo.TensorQuantizer(8, libpymo.QuantizationMode.QUANTIZATION_TF_ENHANCED,
libpymo.RoundingMode.ROUND_NEAREST, False)
tensor_quantizer_val = libpymo.PtrToInt64(tensor_quantizer)
tensor_quant_ref = tf.Variable(initial_value=tensor_quantizer_val, trainable=False, dtype=tf.int64)
encoding_min = tf.Variable(initial_value=0.0, trainable=True, dtype=tf.double)
encoding_max = tf.Variable(initial_value=0.0, trainable=True, dtype=tf.double)
bit_width = tf.Variable(initial_value=8, trainable=False, dtype=tf.int8)
use_symmetric_encoding = tf.Variable(initial_value=False, trainable=False, dtype=tf.bool)
mode_var = tf.Variable(initial_value=int(libpymo.TensorQuantizerOpMode.updateStats),
trainable=False, dtype=tf.int32)
sess.run([mode_var.initializer, tensor_quant_ref.initializer, encoding_min.initializer,
encoding_max.initializer, bit_width.initializer, use_symmetric_encoding.initializer])
pass_through_op_output = zero_out_module.qc_quantize(name='quant_op', in_tensor=inp,
op_mode=mode_var,
tensor_quantizer_reference=tensor_quant_ref,
encoding_min=encoding_min,
encoding_max=encoding_max,
bit_width=bit_width,
use_symmetric_encoding=use_symmetric_encoding)
inp_tensor = sess.graph.get_tensor_by_name('input:0')
inp_data = np.random.rand(10)
# get the output
print(inp_data)
out_data = sess.run(pass_through_op_output, feed_dict={inp_tensor: inp_data})
print(out_data)
# compare qc_quantize op's output with input
self.assertTrue(np.allclose(out_data, inp_data))
# compute encodings
self.assertFalse(tensor_quantizer.isEncodingValid)
encoding = tensor_quantizer.computeEncoding()
self.assertTrue(tensor_quantizer.isEncodingValid)
print('min=', encoding.min, ', max=', encoding.max)
# get the output
inp_data = np.random.rand(10) * 2
print(inp_data)
mode_var.load(int(libpymo.TensorQuantizerOpMode.quantizeDequantize), sess)
out_data = sess.run(pass_through_op_output, feed_dict={inp_tensor: inp_data})
print(out_data)
# compare qc_quantize op's output with input
self.assertFalse(np.allclose(out_data, inp_data))
sess.close()
def test_qc_quantize_op_oneshot_cpu(self):
"""
test custom op with CPU
"""
zero_out_module = tf.load_op_library('libaimet_tf_ops.so')
graph = tf.Graph()
config = tf.ConfigProto(log_device_placement=False)
sess = tf.Session(graph=graph, config=config)
with graph.as_default():
# place holder for the input
with tf.device("/device:CPU:0"):
inp = tf.placeholder(tf.float32, shape=[10], name='input')
tensor_quantizer = libpymo.TensorQuantizer(8, libpymo.QuantizationMode.QUANTIZATION_TF_ENHANCED,
libpymo.RoundingMode.ROUND_NEAREST, False)
tensor_quantizer_val = libpymo.PtrToInt64(tensor_quantizer)
tensor_quant_ref = tf.Variable(initial_value=tensor_quantizer_val, trainable=False, dtype=tf.int64)
mode_var = tf.Variable(initial_value=int(libpymo.TensorQuantizerOpMode.oneShotQuantizeDequantize),
trainable=False, dtype=tf.int32)
encoding_min = tf.Variable(initial_value=0.0, trainable=True, dtype=tf.double)
encoding_max = tf.Variable(initial_value=0.0, trainable=True, dtype=tf.double)
bit_width = tf.Variable(initial_value=8, trainable=False, dtype=tf.int8)
use_symmetric_encoding = tf.Variable(initial_value=False, trainable=False, dtype=tf.bool)
sess.run([mode_var.initializer, tensor_quant_ref.initializer, encoding_min.initializer,
encoding_max.initializer, bit_width.initializer, use_symmetric_encoding.initializer])
pass_through_op_output = zero_out_module.qc_quantize(name='quant_op', in_tensor=inp,
op_mode=mode_var,
tensor_quantizer_reference=tensor_quant_ref,
encoding_min=encoding_min,
encoding_max=encoding_max,
bit_width=bit_width,
use_symmetric_encoding=use_symmetric_encoding)
inp_tensor = sess.graph.get_tensor_by_name('input:0')
inp_data = np.random.rand(10) * 256
# get the output
print(inp_data)
out_data = sess.run(pass_through_op_output, feed_dict={inp_tensor: inp_data})
print(out_data)
self.assertTrue(tensor_quantizer.isEncodingValid)
encoding = tensor_quantizer.computeEncoding()
print('min=', encoding.min, ', max=', encoding.max)
# compare qc_quantize op's output with input
self.assertFalse(np.allclose(out_data, inp_data))
sess.close()
def test_qc_quantize_op_gpu(self):
"""
test custom op with GPU
"""
zero_out_module = tf.load_op_library('libaimet_tf_ops.so')
graph = tf.Graph()
config = tf.ConfigProto(log_device_placement=False)
sess = tf.Session(graph=graph, config=config)
with graph.as_default():
inp = tf.placeholder(tf.float32, shape=[10], name='input')
tensor_quantizer = libpymo.TensorQuantizer(8, libpymo.QuantizationMode.QUANTIZATION_TF_ENHANCED,
libpymo.RoundingMode.ROUND_NEAREST, True)
tensor_quantizer_val = libpymo.PtrToInt64(tensor_quantizer)
tensor_quant_ref = tf.Variable(initial_value=tensor_quantizer_val, trainable=False, dtype=tf.int64)
mode_var = tf.Variable(initial_value=int(libpymo.TensorQuantizerOpMode.updateStats),
trainable=False, dtype=tf.int32)
encoding_min = tf.Variable(initial_value=0.0, trainable=True, dtype=tf.double)
encoding_max = tf.Variable(initial_value=0.0, trainable=True, dtype=tf.double)
bit_width = tf.Variable(initial_value=8, trainable=False, dtype=tf.int8)
use_symmetric_encoding = tf.Variable(initial_value=False, trainable=False, dtype=tf.bool)
sess.run([mode_var.initializer, tensor_quant_ref.initializer, encoding_min.initializer,
encoding_max.initializer, bit_width.initializer, use_symmetric_encoding.initializer])
# place holder for the input
with tf.device("/device:GPU:0"):
pass_through_op_output = zero_out_module.qc_quantize(name='quant_op', in_tensor=inp,
op_mode=mode_var,
tensor_quantizer_reference=tensor_quant_ref,
encoding_min=encoding_min,
encoding_max=encoding_max,
bit_width=bit_width,
use_symmetric_encoding=use_symmetric_encoding)
inp_tensor = sess.graph.get_tensor_by_name('input:0')
inp_data = np.random.rand(10)
# get the output
print(inp_data)
with tf.device("/device:GPU:0"):
out_data = sess.run(pass_through_op_output, feed_dict={inp_tensor: inp_data})
print(out_data)
# compare qc_quantize op's output with input
self.assertTrue(np.allclose(out_data, inp_data))
# compute encodings
self.assertFalse(tensor_quantizer.isEncodingValid)
encoding = tensor_quantizer.computeEncoding()
self.assertTrue(tensor_quantizer.isEncodingValid)
print('min=', encoding.min, ', max=', encoding.max)
# get the output
inp_data = np.random.rand(10) * 2
print(inp_data)
mode_var.load(int(libpymo.TensorQuantizerOpMode.quantizeDequantize), sess)
with tf.device("/device:GPU:0"):
out_data = sess.run(pass_through_op_output, feed_dict={inp_tensor: inp_data})
print(out_data)
# compare qc_quantize op's output with input
self.assertFalse(np.allclose(out_data, inp_data))
sess.close()
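# The tests above all exercise the same quantizer lifecycle: collect statistics
# (updateStats), derive an encoding (computeEncoding), then run the op in
# quantizeDequantize mode, which snaps values onto a bit_width-sized grid. A
# pure-Python sketch of that asymmetric linear quantize-dequantize math (an
# illustration of the concept, not libpymo's implementation):

```python
def quantize_dequantize(values, enc_min, enc_max, bit_width=8):
    """Clamp to [enc_min, enc_max], snap to the (2**bit_width - 1)-step grid."""
    steps = 2 ** bit_width - 1
    scale = (enc_max - enc_min) / steps
    out = []
    for v in values:
        clamped = min(max(v, enc_min), enc_max)
        q = round((clamped - enc_min) / scale)  # integer grid index
        out.append(q * scale + enc_min)         # back to float
    return out

# With an encoding of [0, 1], in-range values move to nearby grid points and
# out-of-range values (2.0 here) are clamped to enc_max first.
print(quantize_dequantize([0.0, 0.5, 1.0, 2.0], 0.0, 1.0))
```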
# File: doc/samples/group.py
def task_foo():
    return {'actions': ["echo foo"]}


def task_bar():
    return {'actions': ["echo bar"]}


def task_mygroup():
    return {'actions': None,
            'task_dep': ['foo', 'bar']}
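# `task_mygroup` has no actions of its own; running it pulls in `foo` and
# `bar` through `task_dep`, which run first. A small self-contained sketch of
# that resolution order (an illustration of the concept only, not doit's
# actual scheduler):

```python
# Task dicts as the dodo file above would return them, copied inline so the
# example stands alone.
tasks = {
    'foo': {'actions': ["echo foo"]},
    'bar': {'actions': ["echo bar"]},
    'mygroup': {'actions': None, 'task_dep': ['foo', 'bar']},
}

def resolve(name, order=None):
    """Depth-first walk: every task_dep entry runs before the task itself."""
    if order is None:
        order = []
    for dep in tasks[name].get('task_dep', []):
        if dep not in order:
            resolve(dep, order)
    if name not in order:
        order.append(name)
    return order

print(resolve('mygroup'))  # ['foo', 'bar', 'mygroup']
```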
| 19.7 | 39 | 0.548223 | 24 | 197 | 4.333333 | 0.416667 | 0.201923 | 0.326923 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.233503 | 197 | 9 | 40 | 21.888889 | 0.688742 | 0 | 0 | 0 | 0 | 0 | 0.258883 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | true | 0 | 0 | 0.428571 | 0.857143 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
5398f2ec12ef0dd376455b0714de6c07560beb9d | 23,193 | gyp | Python | tools/distrib/cefclient.gyp | janseM3319/cef | c992ef9d5796cc371a24fc9163d4e215ecbe1721 | [
"BSD-3-Clause"
] | null | null | null | tools/distrib/cefclient.gyp | janseM3319/cef | c992ef9d5796cc371a24fc9163d4e215ecbe1721 | [
"BSD-3-Clause"
] | null | null | null | tools/distrib/cefclient.gyp | janseM3319/cef | c992ef9d5796cc371a24fc9163d4e215ecbe1721 | [
"BSD-3-Clause"
] | 3 | 2015-10-21T07:39:45.000Z | 2018-10-10T04:48:54.000Z | # Copyright (c) 2011 The Chromium Embedded Framework Authors. All rights
# reserved. Use of this source code is governed by a BSD-style license that
# can be found in the LICENSE file.
{
'variables': {
'chromium_code': 1,
'framework_name': 'Chromium Embedded Framework',
'linux_use_gold_binary': 0,
'linux_use_gold_flags': 0,
# Don't use clang with CEF binary releases due to Chromium tree structure dependency.
'clang': 0,
'conditions': [
['sysroot!=""', {
'pkg-config': './pkg-config-wrapper "<(sysroot)" "<(target_arch)"',
}, {
'pkg-config': 'pkg-config'
}],
[ 'OS=="win"', {
'multi_threaded_dll%': 0,
}],
]
},
'includes': [
# Bring in the source file lists for cefclient.
'cef_paths2.gypi',
],
'targets': [
{
'target_name': 'cefclient',
'type': 'executable',
'mac_bundle': 1,
'msvs_guid': '6617FED9-C5D4-4907-BF55-A90062A6683F',
'dependencies': [
'libcef_dll_wrapper',
],
'defines': [
'USING_CEF_SHARED',
],
'include_dirs': [
'.',
],
'sources': [
'<@(includes_common)',
'<@(includes_wrapper)',
],
'mac_bundle_resources': [
'<@(cefclient_bundle_resources_mac)',
],
'mac_bundle_resources!': [
# TODO(mark): Come up with a fancier way to do this (mac_info_plist?)
# that automatically sets the correct INFOPLIST_FILE setting and adds
# the file to a source group.
'cefclient/resources/mac/Info.plist',
],
'xcode_settings': {
'INFOPLIST_FILE': 'cefclient/resources/mac/Info.plist',
# Target build path.
'SYMROOT': 'xcodebuild',
},
'conditions': [
['OS=="win"', {
'variables': {
'win_exe_compatibility_manifest': 'cefclient/resources/win/compatibility.manifest',
},
'actions': [
{
'action_name': 'copy_resources',
'msvs_cygwin_shell': 0,
'inputs': [],
'outputs': [
'<(PRODUCT_DIR)/copy_resources.stamp',
],
'action': [
'xcopy /efy',
'Resources\*',
'$(OutDir)',
],
},
{
'action_name': 'copy_executables',
'msvs_cygwin_shell': 0,
'inputs': [],
'outputs': [
'<(PRODUCT_DIR)/copy_executables.stamp',
],
'action': [
'xcopy /efy',
'$(ConfigurationName)\*.exe',
'$(OutDir)',
],
},
{
'action_name': 'copy_libraries',
'msvs_cygwin_shell': 0,
'inputs': [],
'outputs': [
'<(PRODUCT_DIR)/copy_libraries.stamp',
],
'action': [
'xcopy /efy',
'$(ConfigurationName)\*.dll',
'$(OutDir)',
],
},
{
'action_name': 'copy_bin_files',
'msvs_cygwin_shell': 0,
'inputs': [],
'outputs': [
'<(PRODUCT_DIR)/copy_bin_files.stamp',
],
'action': [
'xcopy /efy',
'$(ConfigurationName)\*.bin',
'$(OutDir)',
],
},
],
'msvs_settings': {
'VCLinkerTool': {
# Set /SUBSYSTEM:WINDOWS.
'SubSystem': '2',
},
'VCManifestTool': {
'AdditionalManifestFiles': [
'cefclient/resources/win/cefclient.exe.manifest',
],
},
},
'link_settings': {
'libraries': [
'-lcomctl32.lib',
'-lshlwapi.lib',
'-lrpcrt4.lib',
'-lopengl32.lib',
'-lglu32.lib',
'-l$(ConfigurationName)/libcef.lib',
],
},
'library_dirs': [
# Needed to find cef_sandbox.lib using #pragma comment(lib, ...).
'$(ConfigurationName)',
],
'sources': [
'<@(includes_win)',
'<@(cefclient_sources_win)',
],
}],
[ 'OS=="win" and multi_threaded_dll', {
'configurations': {
'Debug': {
'msvs_settings': {
'VCCLCompilerTool': {
'RuntimeLibrary': 3,
'WarnAsError': 'false',
},
},
},
'Release': {
'msvs_settings': {
'VCCLCompilerTool': {
'RuntimeLibrary': 2,
'WarnAsError': 'false',
},
},
}
}
}],
[ 'OS=="mac"', {
'product_name': 'cefclient',
'dependencies': [
'cefclient_helper_app',
],
'copies': [
{
# Add libraries and helper app.
'destination': '<(PRODUCT_DIR)/cefclient.app/Contents/Frameworks',
'files': [
'<(PRODUCT_DIR)/cefclient Helper.app',
],
},
],
'postbuilds': [
{
'postbuild_name': 'Add framework',
'action': [
'cp',
'-Rf',
'${CONFIGURATION}/<(framework_name).framework',
'${BUILT_PRODUCTS_DIR}/${PRODUCT_NAME}.app/Contents/Frameworks/'
],
},
{
'postbuild_name': 'Fix Framework Link',
'action': [
'install_name_tool',
'-change',
'@executable_path/<(framework_name)',
'@executable_path/../Frameworks/<(framework_name).framework/<(framework_name)',
'${BUILT_PRODUCTS_DIR}/${EXECUTABLE_PATH}'
],
},
{
# This postbuild step is responsible for creating the following
# helpers:
#
# cefclient Helper EH.app and cefclient Helper NP.app are created
# from cefclient Helper.app.
#
# The EH helper is marked for an executable heap. The NP helper
# is marked for no PIE (ASLR).
'postbuild_name': 'Make More Helpers',
'action': [
'tools/make_more_helpers.sh',
'Frameworks',
'cefclient',
],
},
],
'link_settings': {
'libraries': [
'$(SDKROOT)/System/Library/Frameworks/AppKit.framework',
'$(SDKROOT)/System/Library/Frameworks/OpenGL.framework',
'$(CONFIGURATION)/<(framework_name).framework/<(framework_name)',
],
},
'sources': [
'<@(includes_mac)',
'<@(cefclient_sources_mac)',
],
}],
[ 'OS=="linux" or OS=="freebsd" or OS=="openbsd"', {
'copies': [
{
'destination': '<(PRODUCT_DIR)/files',
'files': [
'<@(cefclient_bundle_resources_linux)',
],
},
{
'destination': '<(PRODUCT_DIR)/',
'files': [
'Resources/cef.pak',
'Resources/cef_100_percent.pak',
'Resources/cef_200_percent.pak',
'Resources/cef_extensions.pak',
'Resources/devtools_resources.pak',
'Resources/icudtl.dat',
'Resources/locales/',
'$(BUILDTYPE)/chrome-sandbox',
'$(BUILDTYPE)/libcef.so',
'$(BUILDTYPE)/natives_blob.bin',
'$(BUILDTYPE)/snapshot_blob.bin',
],
},
],
'dependencies': [
'gtk',
'gtkglext',
],
'link_settings': {
'ldflags': [
# Look for libcef.so in the current directory. Path can also be
# specified using the LD_LIBRARY_PATH environment variable.
'-Wl,-rpath,.',
],
'libraries': [
"$(BUILDTYPE)/libcef.so",
"-lX11",
],
},
'sources': [
'<@(includes_linux)',
'<@(cefclient_sources_linux)',
],
}],
],
},
{
'target_name': 'cefsimple',
'type': 'executable',
'mac_bundle': 1,
'msvs_guid': '5390D142-473F-49A0-BC5E-5F6C609EEDB6',
'dependencies': [
'libcef_dll_wrapper',
],
'defines': [
'USING_CEF_SHARED',
],
'include_dirs': [
'.',
],
'sources': [
'<@(includes_common)',
'<@(includes_wrapper)',
'<@(cefsimple_sources_common)',
],
'mac_bundle_resources': [
'<@(cefsimple_bundle_resources_mac)',
],
'mac_bundle_resources!': [
# TODO(mark): Come up with a fancier way to do this (mac_info_plist?)
# that automatically sets the correct INFOPLIST_FILE setting and adds
# the file to a source group.
'cefsimple/mac/Info.plist',
],
'xcode_settings': {
'INFOPLIST_FILE': 'cefsimple/mac/Info.plist',
# Target build path.
'SYMROOT': 'xcodebuild',
},
'conditions': [
['OS=="win"', {
'variables': {
'win_exe_compatibility_manifest': 'cefsimple/compatibility.manifest',
},
'actions': [
{
'action_name': 'copy_resources',
'msvs_cygwin_shell': 0,
'inputs': [],
'outputs': [
'<(PRODUCT_DIR)/copy_resources.stamp',
],
'action': [
'xcopy /efy',
'Resources\*',
'$(OutDir)',
],
},
{
'action_name': 'copy_executables',
'msvs_cygwin_shell': 0,
'inputs': [],
'outputs': [
'<(PRODUCT_DIR)/copy_executables.stamp',
],
'action': [
'xcopy /efy',
'$(ConfigurationName)\*.exe',
'$(OutDir)',
],
},
{
'action_name': 'copy_libraries',
'msvs_cygwin_shell': 0,
'inputs': [],
'outputs': [
'<(PRODUCT_DIR)/copy_libraries.stamp',
],
'action': [
'xcopy /efy',
'$(ConfigurationName)\*.dll',
'$(OutDir)',
],
},
{
'action_name': 'copy_bin_files',
'msvs_cygwin_shell': 0,
'inputs': [],
'outputs': [
'<(PRODUCT_DIR)/copy_bin_files.stamp',
],
'action': [
'xcopy /efy',
'$(ConfigurationName)\*.bin',
'$(OutDir)',
],
},
],
'msvs_settings': {
'VCLinkerTool': {
# Set /SUBSYSTEM:WINDOWS.
'SubSystem': '2',
},
'VCManifestTool': {
'AdditionalManifestFiles': [
'cefsimple/cefsimple.exe.manifest',
],
},
},
'link_settings': {
'libraries': [
'-lcomctl32.lib',
'-lshlwapi.lib',
'-lrpcrt4.lib',
'-l$(ConfigurationName)/libcef.lib',
],
},
'library_dirs': [
# Needed to find cef_sandbox.lib using #pragma comment(lib, ...).
'$(ConfigurationName)',
],
'sources': [
'<@(includes_win)',
'<@(cefsimple_sources_win)',
],
}],
[ 'OS=="win" and multi_threaded_dll', {
'configurations': {
'Debug': {
'msvs_settings': {
'VCCLCompilerTool': {
'RuntimeLibrary': 3,
'WarnAsError': 'false',
},
},
},
'Release': {
'msvs_settings': {
'VCCLCompilerTool': {
'RuntimeLibrary': 2,
'WarnAsError': 'false',
},
},
}
}
}],
[ 'OS=="mac"', {
'product_name': 'cefsimple',
'dependencies': [
'cefsimple_helper_app',
],
'copies': [
{
# Add libraries and helper app.
'destination': '<(PRODUCT_DIR)/cefsimple.app/Contents/Frameworks',
'files': [
'<(PRODUCT_DIR)/cefsimple Helper.app',
],
},
],
'postbuilds': [
{
'postbuild_name': 'Add framework',
'action': [
'cp',
'-Rf',
'${CONFIGURATION}/<(framework_name).framework',
'${BUILT_PRODUCTS_DIR}/${PRODUCT_NAME}.app/Contents/Frameworks/'
],
},
{
'postbuild_name': 'Fix Framework Link',
'action': [
'install_name_tool',
'-change',
'@executable_path/<(framework_name)',
'@executable_path/../Frameworks/<(framework_name).framework/<(framework_name)',
'${BUILT_PRODUCTS_DIR}/${EXECUTABLE_PATH}'
],
},
{
# This postbuid step is responsible for creating the following
# helpers:
#
# cefsimple Helper EH.app and cefsimple Helper NP.app are created
# from cefsimple Helper.app.
#
# The EH helper is marked for an executable heap. The NP helper
# is marked for no PIE (ASLR).
'postbuild_name': 'Make More Helpers',
'action': [
'tools/make_more_helpers.sh',
'Frameworks',
'cefsimple',
],
},
],
'link_settings': {
'libraries': [
'$(SDKROOT)/System/Library/Frameworks/AppKit.framework',
'$(CONFIGURATION)/<(framework_name).framework/<(framework_name)',
],
},
'sources': [
'<@(includes_mac)',
'<@(cefsimple_sources_mac)',
],
}],
[ 'OS=="linux" or OS=="freebsd" or OS=="openbsd"', {
'copies': [
{
'destination': '<(PRODUCT_DIR)/',
'files': [
'Resources/cef.pak',
'Resources/cef_100_percent.pak',
'Resources/cef_200_percent.pak',
'Resources/cef_extensions.pak',
'Resources/devtools_resources.pak',
'Resources/icudtl.dat',
'Resources/locales/',
'$(BUILDTYPE)/chrome-sandbox',
'$(BUILDTYPE)/libcef.so',
'$(BUILDTYPE)/natives_blob.bin',
'$(BUILDTYPE)/snapshot_blob.bin',
],
},
],
'link_settings': {
'ldflags': [
# Look for libcef.so in the current directory. Path can also be
# specified using the LD_LIBRARY_PATH environment variable.
'-Wl,-rpath,.',
],
'libraries': [
"$(BUILDTYPE)/libcef.so",
"-lX11",
],
},
'sources': [
'<@(includes_linux)',
'<@(cefsimple_sources_linux)',
],
}],
],
},
{
'target_name': 'libcef_dll_wrapper',
'type': 'static_library',
'msvs_guid': 'A9D6DC71-C0DC-4549-AEA0-3B15B44E86A9',
'defines': [
'USING_CEF_SHARED',
],
'include_dirs': [
'.',
],
'sources': [
'<@(includes_common)',
'<@(includes_capi)',
'<@(includes_wrapper)',
'<@(libcef_dll_wrapper_sources_common)',
],
'xcode_settings': {
# Target build path.
'SYMROOT': 'xcodebuild',
},
'conditions': [
[ 'OS=="win" and multi_threaded_dll', {
'configurations': {
'Debug': {
'msvs_settings': {
'VCCLCompilerTool': {
'RuntimeLibrary': 3,
'WarnAsError': 'false',
},
},
},
'Release': {
'msvs_settings': {
'VCCLCompilerTool': {
'RuntimeLibrary': 2,
'WarnAsError': 'false',
},
},
}
}
}],
],
},
],
'conditions': [
['OS=="mac"', {
'targets': [
{
'target_name': 'cefclient_helper_app',
'type': 'executable',
'variables': { 'enable_wexit_time_destructors': 1, },
'product_name': 'cefclient Helper',
'mac_bundle': 1,
'dependencies': [
'libcef_dll_wrapper',
],
'defines': [
'USING_CEF_SHARED',
],
'include_dirs': [
'.',
],
'link_settings': {
'libraries': [
'$(SDKROOT)/System/Library/Frameworks/AppKit.framework',
'$(CONFIGURATION)/<(framework_name).framework/<(framework_name)',
],
},
'sources': [
'<@(cefclient_sources_mac_helper)',
],
# TODO(mark): Come up with a fancier way to do this. It should only
# be necessary to list helper-Info.plist once, not the three times it
# is listed here.
'mac_bundle_resources!': [
'cefclient/resources/mac/helper-Info.plist',
],
# TODO(mark): For now, don't put any resources into this app. Its
# resources directory will be a symbolic link to the browser app's
# resources directory.
'mac_bundle_resources/': [
['exclude', '.*'],
],
'xcode_settings': {
'INFOPLIST_FILE': 'cefclient/resources/mac/helper-Info.plist',
},
'postbuilds': [
{
# The framework defines its load-time path
# (DYLIB_INSTALL_NAME_BASE) relative to the main executable
# (chrome). A different relative path needs to be used in
# cefclient_helper_app.
'postbuild_name': 'Fix Framework Link',
'action': [
'install_name_tool',
'-change',
'@executable_path/<(framework_name)',
'@executable_path/../../../../Frameworks/<(framework_name).framework/<(framework_name)',
'${BUILT_PRODUCTS_DIR}/${EXECUTABLE_PATH}'
],
},
],
}, # target cefclient_helper_app
{
'target_name': 'cefsimple_helper_app',
'type': 'executable',
'variables': { 'enable_wexit_time_destructors': 1, },
'product_name': 'cefsimple Helper',
'mac_bundle': 1,
'dependencies': [
'libcef_dll_wrapper',
],
'defines': [
'USING_CEF_SHARED',
],
'include_dirs': [
'.',
],
'link_settings': {
'libraries': [
'$(SDKROOT)/System/Library/Frameworks/AppKit.framework',
'$(CONFIGURATION)/<(framework_name).framework/<(framework_name)',
],
},
'sources': [
'<@(cefsimple_sources_mac_helper)',
],
# TODO(mark): Come up with a fancier way to do this. It should only
# be necessary to list helper-Info.plist once, not the three times it
# is listed here.
'mac_bundle_resources!': [
'cefsimple/mac/helper-Info.plist',
],
# TODO(mark): For now, don't put any resources into this app. Its
# resources directory will be a symbolic link to the browser app's
# resources directory.
'mac_bundle_resources/': [
['exclude', '.*'],
],
'xcode_settings': {
'INFOPLIST_FILE': 'cefsimple/mac/helper-Info.plist',
},
'postbuilds': [
{
# The framework defines its load-time path
# (DYLIB_INSTALL_NAME_BASE) relative to the main executable
# (chrome). A different relative path needs to be used in
# cefsimple_helper_app.
'postbuild_name': 'Fix Framework Link',
'action': [
'install_name_tool',
'-change',
'@executable_path/<(framework_name)',
'@executable_path/../../../../Frameworks/<(framework_name).framework/<(framework_name)',
'${BUILT_PRODUCTS_DIR}/${EXECUTABLE_PATH}'
],
},
],
}, # target cefsimple_helper_app
],
}], # OS=="mac"
[ 'OS=="linux" or OS=="freebsd" or OS=="openbsd"', {
'targets': [
{
'target_name': 'gtk',
'type': 'none',
'variables': {
# gtk requires gmodule, but it does not list it as a dependency
# in some misconfigured systems.
'gtk_packages': 'gmodule-2.0 gtk+-2.0 gthread-2.0 gtk+-unix-print-2.0',
},
'direct_dependent_settings': {
'cflags': [
'$(shell <(pkg-config) --cflags <(gtk_packages))',
],
},
'link_settings': {
'ldflags': [
'$(shell <(pkg-config) --libs-only-L --libs-only-other <(gtk_packages))',
],
'libraries': [
'$(shell <(pkg-config) --libs-only-l <(gtk_packages))',
],
},
},
{
'target_name': 'gtkglext',
'type': 'none',
'variables': {
# gtkglext is required by the cefclient OSR example.
'gtk_packages': 'gtkglext-1.0',
},
'direct_dependent_settings': {
'cflags': [
'$(shell <(pkg-config) --cflags <(gtk_packages))',
],
},
'link_settings': {
'ldflags': [
'$(shell <(pkg-config) --libs-only-L --libs-only-other <(gtk_packages))',
],
'libraries': [
'$(shell <(pkg-config) --libs-only-l <(gtk_packages))',
],
},
},
],
}], # OS=="linux" or OS=="freebsd" or OS=="openbsd"
],
}
| 31.814815 | 104 | 0.431596 | 1,708 | 23,193 | 5.653396 | 0.177986 | 0.030965 | 0.022784 | 0.013256 | 0.825911 | 0.823115 | 0.806752 | 0.788525 | 0.781172 | 0.771023 | 0 | 0.009623 | 0.426508 | 23,193 | 728 | 105 | 31.858516 | 0.716337 | 0.124477 | 0 | 0.733434 | 0 | 0.004518 | 0.427703 | 0.188427 | 0 | 0 | 0 | 0.001374 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0.001506 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
# tests/terraform/checks/resource/azure/test_NetworkInterfacePublicIPAddressId.py
# (from kylelaker/checkov, Apache-2.0)
import unittest
import hcl2

from checkov.terraform.checks.resource.azure.NetworkInterfacePublicIPAddressId import check
from checkov.common.models.enums import CheckResult


class TestNetworkInterfacePublicIPAddressId(unittest.TestCase):

    def test_failure(self):
        hcl_res = hcl2.loads("""
        resource "azurerm_network_interface" "example" {
          name                = "example-nic"
          location            = azurerm_resource_group.example.location
          resource_group_name = azurerm_resource_group.example.name

          ip_configuration {
            name                          = "internal"
            subnet_id                     = azurerm_subnet.example.id
            private_ip_address_allocation = "Dynamic"
          }

          ip_configuration {
            name                          = "internal2"
            subnet_id                     = azurerm_subnet.example.id2
            private_ip_address_allocation = "Dynamic"
            public_ip_address_id          = azurerm_subnet.example2.id
          }
        }
        """)
        resource_conf = hcl_res['resource'][0]['azurerm_network_interface']['example']
        scan_result = check.scan_resource_conf(conf=resource_conf)
        self.assertEqual(CheckResult.FAILED, scan_result)

    def test_success(self):
        hcl_res = hcl2.loads("""
        resource "azurerm_network_interface" "example" {
          name                = "example-nic"
          location            = azurerm_resource_group.example.location
          resource_group_name = azurerm_resource_group.example.name

          ip_configuration {
            name                          = "internal"
            subnet_id                     = azurerm_subnet.example.id
            private_ip_address_allocation = "Dynamic"
          }

          ip_configuration {
            name                          = "internal2"
            subnet_id                     = azurerm_subnet.example.id2
            private_ip_address_allocation = "Dynamic"
          }

          enable_ip_forwarding = false
        }
        """)
        resource_conf = hcl_res['resource'][0]['azurerm_network_interface']['example']
        scan_result = check.scan_resource_conf(conf=resource_conf)
        self.assertEqual(CheckResult.PASSED, scan_result)

    def test_success_no_param(self):
        hcl_res = hcl2.loads("""
        resource "azurerm_network_interface" "example" {
          name                = "example-nic"
          location            = azurerm_resource_group.example.location
          resource_group_name = azurerm_resource_group.example.name
        }
        """)
        resource_conf = hcl_res['resource'][0]['azurerm_network_interface']['example']
        scan_result = check.scan_resource_conf(conf=resource_conf)
        self.assertEqual(CheckResult.PASSED, scan_result)


if __name__ == '__main__':
    unittest.main()

# tests/test_data/test_blending.py
# (from mmaction2, Apache-2.0)
import torch
from mmaction.datasets import CutmixBlending, MixupBlending


def test_mixup():
    alpha = 0.2
    num_classes = 10
    label = torch.randint(0, num_classes, (4, ))
    mixup = MixupBlending(num_classes, alpha)

    # NCHW imgs
    imgs = torch.randn(4, 4, 3, 32, 32)
    mixed_imgs, mixed_label = mixup(imgs, label)
    assert mixed_imgs.shape == torch.Size((4, 4, 3, 32, 32))
    assert mixed_label.shape == torch.Size((4, num_classes))

    # NCTHW imgs
    imgs = torch.randn(4, 4, 2, 3, 32, 32)
    mixed_imgs, mixed_label = mixup(imgs, label)
    assert mixed_imgs.shape == torch.Size((4, 4, 2, 3, 32, 32))
    assert mixed_label.shape == torch.Size((4, num_classes))


def test_cutmix():
    alpha = 0.2
    num_classes = 10
    label = torch.randint(0, num_classes, (4, ))
    mixup = CutmixBlending(num_classes, alpha)

    # NCHW imgs
    imgs = torch.randn(4, 4, 3, 32, 32)
    mixed_imgs, mixed_label = mixup(imgs, label)
    assert mixed_imgs.shape == torch.Size((4, 4, 3, 32, 32))
    assert mixed_label.shape == torch.Size((4, num_classes))

    # NCTHW imgs
    imgs = torch.randn(4, 4, 2, 3, 32, 32)
    mixed_imgs, mixed_label = mixup(imgs, label)
    assert mixed_imgs.shape == torch.Size((4, 4, 2, 3, 32, 32))
    assert mixed_label.shape == torch.Size((4, num_classes))

# tests/asp/AllAnswerSets/nontight/test3.gus.gringo.test.py
# (from bernardocuteri/wasp, Apache-2.0)
input = """
1 2 0 0
1 3 0 0
1 4 0 0
1 5 0 0
1 6 0 0
1 7 0 0
1 8 0 0
1 9 0 0
1 10 0 0
1 11 0 0
1 12 0 0
1 13 0 0
1 14 0 0
1 15 0 0
1 16 0 0
1 17 0 0
1 18 0 0
1 19 0 0
1 20 0 0
1 21 0 0
1 22 0 0
1 23 0 0
1 24 0 0
1 25 0 0
1 26 0 0
1 27 0 0
1 28 0 0
1 29 0 0
1 30 0 0
1 31 0 0
1 32 0 0
1 33 0 0
1 34 0 0
1 35 0 0
1 36 0 0
1 37 0 0
1 38 0 0
1 39 0 0
1 40 0 0
1 41 0 0
1 42 0 0
1 43 0 0
1 44 0 0
1 45 0 0
1 46 0 0
1 47 0 0
1 48 0 0
1 49 0 0
1 50 0 0
1 51 0 0
1 52 0 0
1 53 0 0
1 54 0 0
1 55 2 1 56 57
1 56 2 1 55 57
1 57 0 0
1 58 2 1 59 60
1 59 2 1 58 60
1 60 0 0
1 61 2 1 62 63
1 62 2 1 61 63
1 63 0 0
1 64 2 1 65 66
1 65 2 1 64 66
1 66 0 0
1 67 2 1 68 69
1 68 2 1 67 69
1 69 0 0
1 70 2 1 71 72
1 71 2 1 70 72
1 72 0 0
1 73 2 1 74 75
1 74 2 1 73 75
1 75 0 0
1 76 2 1 77 78
1 77 2 1 76 78
1 78 0 0
1 79 2 1 80 81
1 80 2 1 79 81
1 81 0 0
1 82 2 1 83 84
1 83 2 1 82 84
1 84 0 0
1 85 2 1 86 87
1 86 2 1 85 87
1 87 0 0
1 88 2 1 89 90
1 89 2 1 88 90
1 90 0 0
1 91 2 1 92 93
1 92 2 1 91 93
1 93 0 0
1 94 2 1 95 96
1 95 2 1 94 96
1 96 0 0
1 97 2 1 98 99
1 98 2 1 97 99
1 99 0 0
1 100 2 1 101 102
1 101 2 1 100 102
1 102 0 0
1 103 2 1 104 105
1 104 2 1 103 105
1 105 0 0
1 106 2 1 107 108
1 107 2 1 106 108
1 108 0 0
1 109 2 1 110 111
1 110 2 1 109 111
1 111 0 0
1 112 2 1 113 114
1 113 2 1 112 114
1 114 0 0
1 115 2 1 116 117
1 116 2 1 115 117
1 117 0 0
1 118 2 1 119 120
1 119 2 1 118 120
1 120 0 0
1 121 2 1 122 123
1 122 2 1 121 123
1 123 0 0
1 124 2 1 125 126
1 125 2 1 124 126
1 126 0 0
1 127 2 1 128 129
1 128 2 1 127 129
1 129 0 0
1 130 2 1 131 132
1 131 2 1 130 132
1 132 0 0
1 133 2 1 134 135
1 134 2 1 133 135
1 135 0 0
1 136 2 1 137 138
1 137 2 1 136 138
1 138 0 0
1 139 2 1 140 141
1 140 2 1 139 141
1 141 0 0
1 142 2 1 143 144
1 143 2 1 142 144
1 144 0 0
1 145 2 1 146 147
1 146 2 1 145 147
1 147 0 0
1 148 2 1 149 150
1 149 2 1 148 150
1 150 0 0
1 151 2 1 152 153
1 152 2 1 151 153
1 153 0 0
1 154 2 1 155 156
1 155 2 1 154 156
1 156 0 0
1 157 2 1 158 159
1 158 2 1 157 159
1 159 0 0
1 160 2 1 161 162
1 161 2 1 160 162
1 162 0 0
1 163 2 1 164 165
1 164 2 1 163 165
1 165 0 0
1 166 2 1 167 168
1 167 2 1 166 168
1 168 0 0
1 169 2 1 170 171
1 170 2 1 169 171
1 171 0 0
1 172 2 1 173 174
1 173 2 1 172 174
1 174 0 0
1 175 2 1 176 177
1 176 2 1 175 177
1 177 0 0
1 178 2 1 179 180
1 179 2 1 178 180
1 180 0 0
1 181 1 0 70
1 182 1 0 79
1 183 1 0 76
1 184 1 0 73
1 185 2 0 181 55
1 186 2 0 181 67
1 187 2 0 181 64
1 184 2 0 181 61
1 188 2 0 181 58
1 185 2 0 184 103
1 188 2 0 184 106
1 182 2 0 184 109
1 181 2 0 184 100
1 185 2 0 183 133
1 182 2 0 183 142
1 189 2 0 183 139
1 190 2 0 183 136
1 185 2 0 182 163
1 183 2 0 182 178
1 189 2 0 182 175
1 190 2 0 182 172
1 184 2 0 182 169
1 188 2 0 182 166
1 181 2 0 185 70
1 182 2 0 185 79
1 183 2 0 185 76
1 184 2 0 185 73
1 181 2 0 188 82
1 190 2 0 188 88
1 187 2 0 188 91
1 186 2 0 188 94
1 182 2 0 188 97
1 184 2 0 188 85
1 189 2 0 190 115
1 183 2 0 190 118
1 182 2 0 190 121
1 188 2 0 190 112
1 190 2 0 189 124
1 182 2 0 189 130
1 183 2 0 189 127
1 181 2 0 187 145
1 186 2 0 187 151
1 188 2 0 187 148
1 181 2 0 186 154
1 187 2 0 186 160
1 188 2 0 186 157
1 1 2 0 55 67
1 1 2 0 55 64
1 1 2 0 55 61
1 1 2 0 55 58
1 1 2 0 58 55
1 1 2 0 58 67
1 1 2 0 58 64
1 1 2 0 58 61
1 1 2 0 61 55
1 1 2 0 61 67
1 1 2 0 61 64
1 1 2 0 61 58
1 1 2 0 64 55
1 1 2 0 64 67
1 1 2 0 64 61
1 1 2 0 64 58
1 1 2 0 67 55
1 1 2 0 67 64
1 1 2 0 67 61
1 1 2 0 67 58
1 1 2 0 70 79
1 1 2 0 70 76
1 1 2 0 70 73
1 1 2 0 73 70
1 1 2 0 73 79
1 1 2 0 73 76
1 1 2 0 76 70
1 1 2 0 76 79
1 1 2 0 76 73
1 1 2 0 79 70
1 1 2 0 79 76
1 1 2 0 79 73
1 1 2 0 82 88
1 1 2 0 82 91
1 1 2 0 82 94
1 1 2 0 82 97
1 1 2 0 82 85
1 1 2 0 85 82
1 1 2 0 85 88
1 1 2 0 85 91
1 1 2 0 85 94
1 1 2 0 85 97
1 1 2 0 88 82
1 1 2 0 88 91
1 1 2 0 88 94
1 1 2 0 88 97
1 1 2 0 88 85
1 1 2 0 91 82
1 1 2 0 91 88
1 1 2 0 91 94
1 1 2 0 91 97
1 1 2 0 91 85
1 1 2 0 94 82
1 1 2 0 94 88
1 1 2 0 94 91
1 1 2 0 94 97
1 1 2 0 94 85
1 1 2 0 97 82
1 1 2 0 97 88
1 1 2 0 97 91
1 1 2 0 97 94
1 1 2 0 97 85
1 1 2 0 100 103
1 1 2 0 100 106
1 1 2 0 100 109
1 1 2 0 103 106
1 1 2 0 103 109
1 1 2 0 103 100
1 1 2 0 106 103
1 1 2 0 106 109
1 1 2 0 106 100
1 1 2 0 109 103
1 1 2 0 109 106
1 1 2 0 109 100
1 1 2 0 112 115
1 1 2 0 112 118
1 1 2 0 112 121
1 1 2 0 115 118
1 1 2 0 115 121
1 1 2 0 115 112
1 1 2 0 118 115
1 1 2 0 118 121
1 1 2 0 118 112
1 1 2 0 121 115
1 1 2 0 121 118
1 1 2 0 121 112
1 1 2 0 124 130
1 1 2 0 124 127
1 1 2 0 127 124
1 1 2 0 127 130
1 1 2 0 130 124
1 1 2 0 130 127
1 1 2 0 133 142
1 1 2 0 133 139
1 1 2 0 133 136
1 1 2 0 136 133
1 1 2 0 136 142
1 1 2 0 136 139
1 1 2 0 139 133
1 1 2 0 139 142
1 1 2 0 139 136
1 1 2 0 142 133
1 1 2 0 142 139
1 1 2 0 142 136
1 1 2 0 145 151
1 1 2 0 145 148
1 1 2 0 148 145
1 1 2 0 148 151
1 1 2 0 151 145
1 1 2 0 151 148
1 1 2 0 154 160
1 1 2 0 154 157
1 1 2 0 157 154
1 1 2 0 157 160
1 1 2 0 160 154
1 1 2 0 160 157
1 1 2 0 163 178
1 1 2 0 163 175
1 1 2 0 163 172
1 1 2 0 163 169
1 1 2 0 163 166
1 1 2 0 166 163
1 1 2 0 166 178
1 1 2 0 166 175
1 1 2 0 166 172
1 1 2 0 166 169
1 1 2 0 169 163
1 1 2 0 169 178
1 1 2 0 169 175
1 1 2 0 169 172
1 1 2 0 169 166
1 1 2 0 172 163
1 1 2 0 172 178
1 1 2 0 172 175
1 1 2 0 172 169
1 1 2 0 172 166
1 1 2 0 175 163
1 1 2 0 175 178
1 1 2 0 175 172
1 1 2 0 175 169
1 1 2 0 175 166
1 1 2 0 178 163
1 1 2 0 178 175
1 1 2 0 178 172
1 1 2 0 178 169
1 1 2 0 178 166
1 1 2 0 55 103
1 1 2 0 55 163
1 1 2 0 55 133
1 1 2 0 58 106
1 1 2 0 58 166
1 1 2 0 58 157
1 1 2 0 58 148
1 1 2 0 58 112
1 1 2 0 61 169
1 1 2 0 61 85
1 1 2 0 61 73
1 1 2 0 64 91
1 1 2 0 64 160
1 1 2 0 67 94
1 1 2 0 67 151
1 1 2 0 70 154
1 1 2 0 70 145
1 1 2 0 70 100
1 1 2 0 70 82
1 1 2 0 73 61
1 1 2 0 73 169
1 1 2 0 73 85
1 1 2 0 76 118
1 1 2 0 76 178
1 1 2 0 76 127
1 1 2 0 79 97
1 1 2 0 79 142
1 1 2 0 79 130
1 1 2 0 79 109
1 1 2 0 79 121
1 1 2 0 82 70
1 1 2 0 82 154
1 1 2 0 82 145
1 1 2 0 82 100
1 1 2 0 85 61
1 1 2 0 85 169
1 1 2 0 85 73
1 1 2 0 88 172
1 1 2 0 88 136
1 1 2 0 88 124
1 1 2 0 91 160
1 1 2 0 91 64
1 1 2 0 94 151
1 1 2 0 94 67
1 1 2 0 97 142
1 1 2 0 97 130
1 1 2 0 97 109
1 1 2 0 97 121
1 1 2 0 97 79
1 1 2 0 100 70
1 1 2 0 100 154
1 1 2 0 100 145
1 1 2 0 100 82
1 1 2 0 103 163
1 1 2 0 103 133
1 1 2 0 103 55
1 1 2 0 106 166
1 1 2 0 106 157
1 1 2 0 106 148
1 1 2 0 106 112
1 1 2 0 106 58
1 1 2 0 109 97
1 1 2 0 109 142
1 1 2 0 109 130
1 1 2 0 109 121
1 1 2 0 109 79
1 1 2 0 112 106
1 1 2 0 112 166
1 1 2 0 112 157
1 1 2 0 112 148
1 1 2 0 112 58
1 1 2 0 115 175
1 1 2 0 115 139
1 1 2 0 118 178
1 1 2 0 118 127
1 1 2 0 118 76
1 1 2 0 121 97
1 1 2 0 121 142
1 1 2 0 121 130
1 1 2 0 121 109
1 1 2 0 121 79
1 1 2 0 124 88
1 1 2 0 124 172
1 1 2 0 124 136
1 1 2 0 127 118
1 1 2 0 127 178
1 1 2 0 127 76
1 1 2 0 130 97
1 1 2 0 130 142
1 1 2 0 130 109
1 1 2 0 130 121
1 1 2 0 130 79
1 1 2 0 133 103
1 1 2 0 133 163
1 1 2 0 133 55
1 1 2 0 136 88
1 1 2 0 136 172
1 1 2 0 136 124
1 1 2 0 139 115
1 1 2 0 139 175
1 1 2 0 142 97
1 1 2 0 142 130
1 1 2 0 142 109
1 1 2 0 142 121
1 1 2 0 142 79
1 1 2 0 145 70
1 1 2 0 145 154
1 1 2 0 145 100
1 1 2 0 145 82
1 1 2 0 148 106
1 1 2 0 148 166
1 1 2 0 148 157
1 1 2 0 148 112
1 1 2 0 148 58
1 1 2 0 151 94
1 1 2 0 151 67
1 1 2 0 154 70
1 1 2 0 154 145
1 1 2 0 154 100
1 1 2 0 154 82
1 1 2 0 157 106
1 1 2 0 157 166
1 1 2 0 157 148
1 1 2 0 157 112
1 1 2 0 157 58
1 1 2 0 160 91
1 1 2 0 160 64
1 1 2 0 163 103
1 1 2 0 163 133
1 1 2 0 163 55
1 1 2 0 166 106
1 1 2 0 166 157
1 1 2 0 166 148
1 1 2 0 166 112
1 1 2 0 166 58
1 1 2 0 169 61
1 1 2 0 169 85
1 1 2 0 169 73
1 1 2 0 172 88
1 1 2 0 172 136
1 1 2 0 172 124
1 1 2 0 175 115
1 1 2 0 175 139
1 1 2 0 178 118
1 1 2 0 178 127
1 1 2 0 178 76
1 1 1 1 181
1 1 1 1 181
1 1 1 1 181
1 1 1 1 181
1 1 1 1 181
1 1 1 1 185
1 1 1 1 185
1 1 1 1 185
1 1 1 1 185
1 1 1 1 188
1 1 1 1 188
1 1 1 1 188
1 1 1 1 188
1 1 1 1 188
1 1 1 1 188
1 1 1 1 184
1 1 1 1 184
1 1 1 1 184
1 1 1 1 184
1 1 1 1 190
1 1 1 1 190
1 1 1 1 190
1 1 1 1 190
1 1 1 1 189
1 1 1 1 189
1 1 1 1 189
1 1 1 1 183
1 1 1 1 183
1 1 1 1 183
1 1 1 1 183
1 1 1 1 187
1 1 1 1 187
1 1 1 1 187
1 1 1 1 186
1 1 1 1 186
1 1 1 1 186
1 1 1 1 182
1 1 1 1 182
1 1 1 1 182
1 1 1 1 182
1 1 1 1 182
1 1 1 1 182
0
45 vertex(0)
46 vertex(1)
47 vertex(2)
48 vertex(3)
49 vertex(4)
50 vertex(5)
51 vertex(6)
52 vertex(7)
53 vertex(8)
54 vertex(9)
3 arc(0,1)
4 arc(0,2)
5 arc(0,3)
6 arc(0,7)
7 arc(0,8)
8 arc(1,0)
9 arc(1,3)
10 arc(1,6)
11 arc(1,9)
12 arc(2,0)
13 arc(2,3)
14 arc(2,4)
15 arc(2,7)
16 arc(2,8)
17 arc(2,9)
18 arc(3,0)
19 arc(3,1)
20 arc(3,2)
21 arc(3,9)
22 arc(4,2)
23 arc(4,5)
24 arc(4,6)
25 arc(4,9)
26 arc(5,4)
27 arc(5,6)
28 arc(5,9)
29 arc(6,1)
30 arc(6,4)
31 arc(6,5)
32 arc(6,9)
33 arc(7,0)
34 arc(7,2)
35 arc(7,8)
36 arc(8,0)
37 arc(8,2)
38 arc(8,7)
39 arc(9,1)
40 arc(9,2)
41 arc(9,3)
42 arc(9,4)
43 arc(9,5)
44 arc(9,6)
181 reached(0)
182 reached(9)
183 reached(6)
184 reached(3)
185 reached(1)
186 reached(8)
187 reached(7)
188 reached(2)
189 reached(5)
190 reached(4)
56 out_hm(0,1)
59 out_hm(0,2)
62 out_hm(0,3)
65 out_hm(0,7)
68 out_hm(0,8)
71 out_hm(1,0)
74 out_hm(1,3)
77 out_hm(1,6)
80 out_hm(1,9)
83 out_hm(2,0)
86 out_hm(2,3)
89 out_hm(2,4)
92 out_hm(2,7)
95 out_hm(2,8)
98 out_hm(2,9)
101 out_hm(3,0)
104 out_hm(3,1)
107 out_hm(3,2)
110 out_hm(3,9)
113 out_hm(4,2)
116 out_hm(4,5)
119 out_hm(4,6)
122 out_hm(4,9)
125 out_hm(5,4)
128 out_hm(5,6)
131 out_hm(5,9)
134 out_hm(6,1)
137 out_hm(6,4)
140 out_hm(6,5)
143 out_hm(6,9)
146 out_hm(7,0)
149 out_hm(7,2)
152 out_hm(7,8)
155 out_hm(8,0)
158 out_hm(8,2)
161 out_hm(8,7)
164 out_hm(9,1)
167 out_hm(9,2)
170 out_hm(9,3)
173 out_hm(9,4)
176 out_hm(9,5)
179 out_hm(9,6)
55 in_hm(0,1)
58 in_hm(0,2)
61 in_hm(0,3)
64 in_hm(0,7)
67 in_hm(0,8)
70 in_hm(1,0)
73 in_hm(1,3)
76 in_hm(1,6)
79 in_hm(1,9)
82 in_hm(2,0)
85 in_hm(2,3)
88 in_hm(2,4)
91 in_hm(2,7)
94 in_hm(2,8)
97 in_hm(2,9)
100 in_hm(3,0)
103 in_hm(3,1)
106 in_hm(3,2)
109 in_hm(3,9)
112 in_hm(4,2)
115 in_hm(4,5)
118 in_hm(4,6)
121 in_hm(4,9)
124 in_hm(5,4)
127 in_hm(5,6)
130 in_hm(5,9)
133 in_hm(6,1)
136 in_hm(6,4)
139 in_hm(6,5)
142 in_hm(6,9)
145 in_hm(7,0)
148 in_hm(7,2)
151 in_hm(7,8)
154 in_hm(8,0)
157 in_hm(8,2)
160 in_hm(8,7)
163 in_hm(9,1)
166 in_hm(9,2)
169 in_hm(9,3)
172 in_hm(9,4)
175 in_hm(9,5)
178 in_hm(9,6)
2 start(1)
0
B+
0
B-
1
0
1
"""
output = """
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(5,4), out_hm(5,6), out_hm(4,9), in_hm(1,6), out_hm(9,6), out_hm(4,6), out_hm(1,9), out_hm(1,3), out_hm(1,0), out_hm(6,1), out_hm(2,4), out_hm(9,5), out_hm(9,1), out_hm(0,1), in_hm(3,1), out_hm(3,9), out_hm(3,2), out_hm(3,0), out_hm(2,3), in_hm(8,7), out_hm(2,7), out_hm(0,7), out_hm(8,2), out_hm(8,0), out_hm(9,3), in_hm(0,3), out_hm(0,2), out_hm(0,8), in_hm(2,8), out_hm(7,8), out_hm(2,9), out_hm(2,0), in_hm(7,0), out_hm(7,2), in_hm(6,4), out_hm(9,4), out_hm(6,5), out_hm(6,9), in_hm(4,5), in_hm(5,9), out_hm(4,2), in_hm(9,2)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(2,4), in_hm(1,6), out_hm(5,6), out_hm(9,6), out_hm(4,6), out_hm(1,9), out_hm(1,3), out_hm(1,0), out_hm(6,1), out_hm(0,1), out_hm(9,5), in_hm(8,7), out_hm(2,7), out_hm(0,7), out_hm(8,2), out_hm(8,0), out_hm(9,3), out_hm(6,4), in_hm(5,4), out_hm(9,4), out_hm(5,9), in_hm(6,5), out_hm(4,5), out_hm(6,9), out_hm(3,0), out_hm(2,3), in_hm(0,3), out_hm(0,2), out_hm(0,8), in_hm(2,8), out_hm(7,8), out_hm(2,9), out_hm(2,0), in_hm(7,0), out_hm(7,2), out_hm(9,1), in_hm(3,1), out_hm(3,9), out_hm(3,2), in_hm(4,9), out_hm(4,2), in_hm(9,2)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(2,4), in_hm(1,6), out_hm(5,6), out_hm(9,6), out_hm(4,6), out_hm(1,9), out_hm(1,3), out_hm(1,0), out_hm(6,1), out_hm(0,1), out_hm(9,5), out_hm(6,4), in_hm(8,7), out_hm(2,7), out_hm(0,7), out_hm(8,2), out_hm(8,0), out_hm(9,3), in_hm(5,4), out_hm(9,4), in_hm(6,5), out_hm(5,9), out_hm(4,5), out_hm(6,9), in_hm(9,1), out_hm(3,0), out_hm(2,3), out_hm(3,1), out_hm(9,2), in_hm(4,2), in_hm(0,3), out_hm(7,2), out_hm(3,2), out_hm(0,2), out_hm(4,9), out_hm(0,8), in_hm(2,8), out_hm(7,8), out_hm(2,9), out_hm(2,0), in_hm(3,9), in_hm(7,0)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(2,4), in_hm(1,6), out_hm(5,6), out_hm(9,6), out_hm(4,6), out_hm(1,9), out_hm(1,3), out_hm(1,0), out_hm(6,1), out_hm(0,1), out_hm(9,5), out_hm(6,4), in_hm(8,7), out_hm(2,7), out_hm(0,7), out_hm(8,2), out_hm(8,0), out_hm(9,3), out_hm(5,4), out_hm(4,9), out_hm(9,1), out_hm(3,2), in_hm(9,4), in_hm(3,1), out_hm(9,2), in_hm(4,2), out_hm(3,9), out_hm(3,0), out_hm(7,2), out_hm(0,2), out_hm(4,5), out_hm(2,3), in_hm(6,5), in_hm(0,3), out_hm(6,9), in_hm(5,9), out_hm(0,8), in_hm(2,8), out_hm(2,9), out_hm(7,8), out_hm(2,0), in_hm(7,0)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(2,4), in_hm(1,6), out_hm(5,6), out_hm(9,6), out_hm(4,6), out_hm(1,9), out_hm(1,3), out_hm(1,0), out_hm(6,1), out_hm(9,3), out_hm(0,1), out_hm(9,5), out_hm(5,4), out_hm(4,9), out_hm(9,1), in_hm(3,1), out_hm(3,9), out_hm(3,2), out_hm(3,0), out_hm(2,3), in_hm(0,3), out_hm(0,2), out_hm(0,8), out_hm(0,7), out_hm(6,4), in_hm(9,4), out_hm(8,7), in_hm(2,7), out_hm(9,2), in_hm(4,2), out_hm(2,9), out_hm(2,8), out_hm(2,0), out_hm(7,2), out_hm(8,2), out_hm(4,5), in_hm(7,8), in_hm(6,5), out_hm(7,0), out_hm(6,9), in_hm(5,9), in_hm(8,0)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(2,4), in_hm(1,6), out_hm(5,6), out_hm(9,6), out_hm(4,6), out_hm(1,9), out_hm(1,3), out_hm(1,0), out_hm(6,1), out_hm(9,3), out_hm(0,1), out_hm(9,5), out_hm(5,4), out_hm(4,9), out_hm(9,1), in_hm(6,4), in_hm(3,1), out_hm(9,4), out_hm(6,5), out_hm(6,9), out_hm(8,7), out_hm(3,9), out_hm(3,2), out_hm(3,0), in_hm(4,5), out_hm(2,3), out_hm(4,2), in_hm(9,2), in_hm(0,3), out_hm(8,2), out_hm(7,2), out_hm(0,2), out_hm(0,8), out_hm(0,7), in_hm(2,7), out_hm(2,9), out_hm(2,8), out_hm(2,0), in_hm(5,9), in_hm(7,8), out_hm(7,0), in_hm(8,0)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(2,4), in_hm(1,6), out_hm(5,6), out_hm(9,6), out_hm(4,6), out_hm(1,9), out_hm(1,3), out_hm(1,0), out_hm(6,1), out_hm(9,3), out_hm(0,1), out_hm(9,5), in_hm(5,4), out_hm(6,4), in_hm(6,5), out_hm(9,4), out_hm(5,9), out_hm(4,5), out_hm(6,9), out_hm(8,7), out_hm(3,0), out_hm(2,3), in_hm(0,3), out_hm(0,2), out_hm(0,8), out_hm(0,7), in_hm(2,7), out_hm(2,9), out_hm(2,8), out_hm(2,0), in_hm(7,8), out_hm(7,2), out_hm(7,0), in_hm(8,0), out_hm(8,2), out_hm(9,1), in_hm(3,1), out_hm(3,9), out_hm(3,2), in_hm(4,9), out_hm(4,2), in_hm(9,2)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(2,4), in_hm(1,6), out_hm(5,6), out_hm(9,6), out_hm(4,6), out_hm(1,9), out_hm(1,3), out_hm(1,0), out_hm(6,1), out_hm(9,3), out_hm(0,1), out_hm(9,5), in_hm(5,4), out_hm(6,4), in_hm(6,5), out_hm(9,4), out_hm(5,9), out_hm(4,5), out_hm(6,9), out_hm(8,7), in_hm(9,1), out_hm(3,0), out_hm(2,3), out_hm(3,1), out_hm(9,2), in_hm(4,2), in_hm(0,3), out_hm(7,2), out_hm(8,2), out_hm(3,2), out_hm(0,2), out_hm(4,9), out_hm(0,8), out_hm(0,7), in_hm(2,7), out_hm(2,9), out_hm(2,8), out_hm(2,0), in_hm(3,9), in_hm(7,8), out_hm(7,0), in_hm(8,0)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(2,4), in_hm(1,6), out_hm(5,6), out_hm(9,6), out_hm(4,6), out_hm(1,9), out_hm(1,3), out_hm(1,0), out_hm(6,1), out_hm(9,3), out_hm(0,1), in_hm(9,5), out_hm(6,5), out_hm(4,5), out_hm(9,4), out_hm(9,2), in_hm(4,2), out_hm(9,1), out_hm(7,2), out_hm(8,2), out_hm(3,2), out_hm(0,2), out_hm(4,9), in_hm(3,1), out_hm(3,9), out_hm(3,0), out_hm(2,3), in_hm(0,3), out_hm(0,8), out_hm(0,7), in_hm(5,4), out_hm(6,4), in_hm(6,9), out_hm(5,9), out_hm(2,9), out_hm(8,7), in_hm(2,7), out_hm(2,8), out_hm(2,0), in_hm(7,8), out_hm(7,0), in_hm(8,0)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(2,4), out_hm(9,3), in_hm(1,6), out_hm(5,6), out_hm(9,6), out_hm(4,6), out_hm(1,9), out_hm(1,3), out_hm(1,0), out_hm(6,1), in_hm(9,5), out_hm(6,5), out_hm(4,5), out_hm(9,4), out_hm(9,2), in_hm(4,2), out_hm(9,1), out_hm(7,2), out_hm(8,2), out_hm(3,2), out_hm(0,2), out_hm(4,9), out_hm(0,1), in_hm(3,1), out_hm(3,9), out_hm(3,0), out_hm(2,3), in_hm(0,3), out_hm(0,8), out_hm(0,7), in_hm(5,4), in_hm(8,7), out_hm(6,4), in_hm(6,9), out_hm(5,9), out_hm(2,7), in_hm(2,8), out_hm(8,0), out_hm(2,9), out_hm(7,8), out_hm(2,0), in_hm(7,0)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(2,4), out_hm(9,3), out_hm(1,6), out_hm(0,1), in_hm(9,5), out_hm(6,5), out_hm(4,5), out_hm(9,6), out_hm(9,4), out_hm(9,2), out_hm(9,1), out_hm(1,3), in_hm(2,3), out_hm(0,3), out_hm(2,9), out_hm(2,8), out_hm(2,7), out_hm(2,0), out_hm(3,1), in_hm(6,1), out_hm(6,4), out_hm(6,9), in_hm(5,4), out_hm(5,6), out_hm(5,9), in_hm(4,6), out_hm(4,9), out_hm(4,2), in_hm(1,0), out_hm(3,0), out_hm(7,0), out_hm(8,0), out_hm(1,9), in_hm(3,9), out_hm(3,2), in_hm(8,7), out_hm(0,7), in_hm(0,8), out_hm(8,2), out_hm(0,2), out_hm(7,8), in_hm(7,2)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(2,4), out_hm(0,1), out_hm(1,6), out_hm(9,3), in_hm(9,5), out_hm(6,5), out_hm(4,5), out_hm(9,6), out_hm(9,4), out_hm(9,2), out_hm(9,1), in_hm(1,0), out_hm(2,0), out_hm(3,0), out_hm(7,0), out_hm(8,0), out_hm(1,3), out_hm(1,9), in_hm(2,3), out_hm(0,3), out_hm(2,9), in_hm(3,9), out_hm(2,8), out_hm(2,7), out_hm(3,1), out_hm(8,7), in_hm(0,7), out_hm(4,9), out_hm(5,9), out_hm(6,9), out_hm(3,2), in_hm(6,1), out_hm(0,2), out_hm(0,8), out_hm(6,4), in_hm(7,8), in_hm(5,4), out_hm(7,2), out_hm(5,6), in_hm(4,6), out_hm(4,2), in_hm(8,2)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(2,4), out_hm(0,1), out_hm(1,6), out_hm(9,3), in_hm(9,5), out_hm(6,5), out_hm(4,5), out_hm(9,6), out_hm(9,4), out_hm(9,2), out_hm(9,1), out_hm(1,0), out_hm(2,9), in_hm(4,2), out_hm(7,2), out_hm(8,2), out_hm(3,2), out_hm(0,2), out_hm(4,9), out_hm(4,6), in_hm(5,6), out_hm(5,9), out_hm(5,4), in_hm(6,4), out_hm(6,9), out_hm(6,1), in_hm(3,1), out_hm(3,9), in_hm(1,9), out_hm(3,0), out_hm(2,3), out_hm(1,3), in_hm(0,3), out_hm(0,8), out_hm(0,7), out_hm(2,8), in_hm(2,7), in_hm(7,8), out_hm(8,7), out_hm(2,0), out_hm(7,0), in_hm(8,0)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(2,4), out_hm(1,6), out_hm(0,1), out_hm(1,0), out_hm(9,3), out_hm(2,9), out_hm(1,3), in_hm(9,5), in_hm(2,8), out_hm(6,5), out_hm(4,5), out_hm(9,6), out_hm(9,4), out_hm(9,2), in_hm(4,2), out_hm(9,1), out_hm(7,8), out_hm(0,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), out_hm(7,2), out_hm(8,2), out_hm(3,2), out_hm(0,2), out_hm(4,9), out_hm(4,6), in_hm(0,3), in_hm(5,6), out_hm(0,7), out_hm(5,9), out_hm(5,4), in_hm(8,7), in_hm(6,4), out_hm(8,0), out_hm(6,9), out_hm(6,1), in_hm(3,1), out_hm(3,9), in_hm(1,9), out_hm(3,0), in_hm(7,0)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(2,4), out_hm(1,6), out_hm(9,2), out_hm(0,1), out_hm(1,0), out_hm(9,3), out_hm(9,5), out_hm(2,9), in_hm(4,2), out_hm(7,2), out_hm(8,2), out_hm(3,2), out_hm(0,2), out_hm(4,9), out_hm(4,6), out_hm(4,5), in_hm(9,6), in_hm(5,4), in_hm(6,5), out_hm(5,6), out_hm(9,4), out_hm(9,1), out_hm(6,4), out_hm(5,9), out_hm(6,9), out_hm(6,1), in_hm(3,1), out_hm(3,9), in_hm(1,9), out_hm(3,0), out_hm(2,3), out_hm(1,3), in_hm(0,3), out_hm(0,8), out_hm(0,7), in_hm(2,8), out_hm(7,8), out_hm(2,7), out_hm(2,0), in_hm(8,7), out_hm(8,0), in_hm(7,0)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), out_hm(2,4), out_hm(9,2), out_hm(0,1), out_hm(1,0), out_hm(9,3), out_hm(9,5), out_hm(2,9), in_hm(4,2), out_hm(2,8), out_hm(7,2), out_hm(8,2), out_hm(3,2), out_hm(0,2), out_hm(4,9), out_hm(4,6), out_hm(4,5), in_hm(9,6), in_hm(5,4), in_hm(6,5), out_hm(5,6), out_hm(9,4), out_hm(9,1), out_hm(6,4), out_hm(5,9), out_hm(6,9), out_hm(6,1), in_hm(3,1), out_hm(3,9), in_hm(1,9), out_hm(3,0), out_hm(2,3), out_hm(1,3), in_hm(0,3), out_hm(0,8), in_hm(2,7), out_hm(0,7), in_hm(7,8), out_hm(8,7), out_hm(2,0), out_hm(7,0), in_hm(8,0)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), out_hm(2,4), out_hm(9,2), out_hm(9,1), out_hm(9,5), out_hm(0,1), out_hm(1,0), out_hm(4,2), out_hm(9,3), in_hm(2,9), in_hm(1,3), out_hm(4,9), out_hm(3,9), out_hm(5,9), out_hm(6,9), out_hm(1,9), out_hm(2,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), in_hm(3,0), out_hm(0,3), out_hm(7,0), out_hm(8,0), out_hm(3,2), out_hm(3,1), out_hm(5,4), in_hm(6,1), out_hm(6,4), out_hm(6,5), in_hm(9,4), in_hm(4,5), out_hm(9,6), out_hm(4,6), in_hm(5,6), out_hm(0,8), in_hm(0,7), in_hm(7,8), out_hm(0,2), out_hm(8,7), out_hm(7,2), in_hm(8,2)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), out_hm(2,4), out_hm(9,2), out_hm(9,1), out_hm(9,5), out_hm(0,1), out_hm(1,0), out_hm(4,2), out_hm(9,3), in_hm(2,9), in_hm(1,3), in_hm(0,8), out_hm(4,9), out_hm(3,9), out_hm(5,9), out_hm(6,9), out_hm(1,9), out_hm(2,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), in_hm(3,0), out_hm(0,3), out_hm(0,2), out_hm(7,8), out_hm(0,7), out_hm(7,0), out_hm(8,0), out_hm(3,2), out_hm(3,1), in_hm(8,7), out_hm(5,4), in_hm(6,1), out_hm(8,2), out_hm(6,4), out_hm(6,5), in_hm(9,4), in_hm(7,2), in_hm(4,5), out_hm(9,6), out_hm(4,6), in_hm(5,6)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), out_hm(2,4), out_hm(9,2), out_hm(9,1), out_hm(9,3), out_hm(9,5), out_hm(0,1), in_hm(1,0), out_hm(2,0), out_hm(3,0), out_hm(7,0), out_hm(8,0), out_hm(1,3), out_hm(1,9), out_hm(2,9), in_hm(3,9), out_hm(4,9), out_hm(5,9), out_hm(6,9), out_hm(3,2), out_hm(3,1), out_hm(4,2), out_hm(5,4), in_hm(6,1), out_hm(6,4), out_hm(6,5), in_hm(9,4), in_hm(4,5), out_hm(9,6), out_hm(4,6), in_hm(5,6), out_hm(0,3), in_hm(2,3), out_hm(2,8), out_hm(2,7), out_hm(0,8), in_hm(0,7), in_hm(7,8), out_hm(0,2), out_hm(8,7), out_hm(7,2), in_hm(8,2)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), out_hm(2,4), out_hm(9,2), out_hm(9,1), out_hm(9,3), out_hm(9,5), out_hm(2,9), in_hm(1,0), out_hm(2,0), out_hm(3,0), out_hm(7,0), out_hm(8,0), out_hm(1,3), out_hm(1,9), in_hm(3,9), out_hm(0,1), in_hm(0,8), out_hm(4,9), out_hm(5,9), out_hm(6,9), out_hm(3,2), out_hm(3,1), out_hm(0,2), out_hm(0,3), out_hm(7,8), out_hm(2,8), out_hm(0,7), out_hm(4,2), out_hm(5,4), in_hm(6,1), in_hm(2,3), out_hm(6,4), out_hm(6,5), in_hm(9,4), out_hm(2,7), in_hm(4,5), out_hm(9,6), in_hm(8,7), out_hm(4,6), out_hm(8,2), in_hm(5,6), in_hm(7,2)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), out_hm(2,4), out_hm(9,2), out_hm(9,1), out_hm(9,3), out_hm(9,5), out_hm(2,9), out_hm(1,0), in_hm(0,1), in_hm(4,2), in_hm(1,3), out_hm(0,8), out_hm(0,7), out_hm(0,3), out_hm(0,2), out_hm(6,1), out_hm(3,1), out_hm(7,2), out_hm(8,2), out_hm(3,2), out_hm(4,9), out_hm(4,6), out_hm(4,5), in_hm(9,6), in_hm(5,4), out_hm(2,3), out_hm(1,9), in_hm(3,9), in_hm(6,5), out_hm(5,6), out_hm(9,4), out_hm(6,4), out_hm(5,9), out_hm(6,9), out_hm(3,0), out_hm(2,8), in_hm(2,7), in_hm(7,8), out_hm(8,7), out_hm(2,0), out_hm(7,0), in_hm(8,0)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), out_hm(2,4), out_hm(9,2), out_hm(9,1), out_hm(9,3), out_hm(9,5), out_hm(2,9), in_hm(2,8), out_hm(1,0), in_hm(0,1), in_hm(4,2), in_hm(1,3), out_hm(7,8), out_hm(0,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), out_hm(0,7), out_hm(0,3), out_hm(0,2), out_hm(6,1), out_hm(3,1), out_hm(7,2), out_hm(8,2), out_hm(3,2), out_hm(4,9), out_hm(4,6), out_hm(4,5), in_hm(9,6), in_hm(5,4), out_hm(1,9), in_hm(3,9), in_hm(8,7), in_hm(6,5), out_hm(5,6), out_hm(9,4), out_hm(6,4), out_hm(5,9), out_hm(6,9), out_hm(3,0), out_hm(8,0), in_hm(7,0)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), out_hm(2,4), out_hm(9,2), out_hm(9,1), out_hm(9,3), in_hm(9,5), out_hm(6,5), out_hm(4,5), out_hm(9,6), out_hm(9,4), in_hm(2,9), in_hm(1,3), out_hm(4,9), out_hm(3,9), out_hm(5,9), out_hm(6,9), out_hm(1,9), out_hm(2,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), out_hm(0,3), out_hm(1,0), in_hm(3,0), out_hm(7,0), out_hm(8,0), out_hm(3,2), out_hm(3,1), out_hm(0,1), in_hm(6,1), out_hm(6,4), in_hm(5,4), out_hm(5,6), in_hm(4,6), out_hm(4,2), out_hm(0,8), in_hm(0,7), in_hm(7,8), out_hm(0,2), out_hm(8,7), out_hm(7,2), in_hm(8,2)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), out_hm(0,1), out_hm(2,4), in_hm(9,5), out_hm(9,2), out_hm(9,1), out_hm(9,3), out_hm(1,0), out_hm(6,5), out_hm(4,5), out_hm(9,6), out_hm(9,4), in_hm(2,9), in_hm(1,3), in_hm(0,8), out_hm(4,9), out_hm(3,9), out_hm(5,9), out_hm(6,9), out_hm(1,9), out_hm(2,8), in_hm(3,0), out_hm(2,7), out_hm(2,3), out_hm(2,0), out_hm(0,3), out_hm(0,2), out_hm(7,8), out_hm(0,7), out_hm(7,0), out_hm(8,0), out_hm(3,2), out_hm(3,1), in_hm(8,7), in_hm(6,1), out_hm(8,2), out_hm(6,4), in_hm(5,4), out_hm(5,6), in_hm(4,6), out_hm(4,2), in_hm(7,2)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), out_hm(0,1), in_hm(2,4), out_hm(5,4), out_hm(6,4), out_hm(9,4), out_hm(2,9), out_hm(2,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), in_hm(9,5), out_hm(6,5), out_hm(4,5), out_hm(9,6), out_hm(9,3), out_hm(9,2), out_hm(9,1), out_hm(1,0), in_hm(3,0), in_hm(1,3), out_hm(7,0), out_hm(8,0), out_hm(3,9), out_hm(3,2), out_hm(3,1), out_hm(0,3), out_hm(1,9), in_hm(6,1), out_hm(6,9), in_hm(4,9), out_hm(5,9), out_hm(4,6), out_hm(4,2), in_hm(5,6), out_hm(0,8), in_hm(0,7), in_hm(7,8), out_hm(0,2), out_hm(8,7), out_hm(7,2), in_hm(8,2)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), in_hm(2,4), out_hm(5,4), out_hm(6,4), out_hm(9,4), out_hm(2,9), out_hm(2,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), in_hm(9,5), out_hm(6,5), out_hm(4,5), out_hm(9,6), out_hm(9,3), out_hm(9,2), out_hm(9,1), out_hm(0,1), out_hm(1,0), in_hm(3,0), in_hm(1,3), in_hm(0,8), out_hm(7,0), out_hm(8,0), out_hm(3,9), out_hm(3,2), out_hm(3,1), out_hm(0,3), out_hm(1,9), out_hm(0,2), out_hm(7,8), out_hm(0,7), in_hm(6,1), in_hm(8,7), out_hm(6,9), in_hm(4,9), out_hm(8,2), out_hm(5,9), out_hm(4,6), out_hm(4,2), in_hm(5,6), in_hm(7,2)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), in_hm(2,4), out_hm(5,4), out_hm(6,4), out_hm(9,4), out_hm(2,9), out_hm(2,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), out_hm(0,3), out_hm(0,1), out_hm(9,5), in_hm(1,0), out_hm(3,0), out_hm(7,0), out_hm(8,0), out_hm(1,3), out_hm(1,9), in_hm(9,3), out_hm(9,6), out_hm(9,2), out_hm(9,1), out_hm(4,2), out_hm(0,8), in_hm(0,7), in_hm(7,8), out_hm(0,2), out_hm(8,7), out_hm(7,2), out_hm(3,9), out_hm(6,1), in_hm(3,1), out_hm(3,2), in_hm(8,2), in_hm(5,6), out_hm(4,6), in_hm(4,5), out_hm(5,9), out_hm(6,5), out_hm(4,9), in_hm(6,9)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), in_hm(2,4), out_hm(5,4), out_hm(6,4), out_hm(9,4), out_hm(2,9), out_hm(2,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), out_hm(9,5), out_hm(0,3), out_hm(0,1), in_hm(1,0), out_hm(3,0), out_hm(7,0), out_hm(8,0), out_hm(1,3), out_hm(1,9), in_hm(9,3), out_hm(9,6), out_hm(9,2), out_hm(9,1), out_hm(4,2), out_hm(0,8), in_hm(0,7), in_hm(7,8), out_hm(0,2), out_hm(8,7), out_hm(7,2), out_hm(3,9), out_hm(6,1), out_hm(5,6), in_hm(4,6), out_hm(4,9), in_hm(3,1), out_hm(4,5), out_hm(3,2), in_hm(6,5), in_hm(8,2), out_hm(6,9), in_hm(5,9)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), in_hm(2,4), out_hm(5,4), out_hm(6,4), out_hm(9,4), out_hm(2,9), out_hm(2,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), out_hm(9,5), out_hm(0,3), out_hm(0,1), in_hm(1,0), out_hm(3,0), out_hm(7,0), out_hm(8,0), out_hm(1,3), out_hm(1,9), in_hm(9,3), out_hm(9,6), out_hm(9,2), out_hm(9,1), out_hm(3,9), out_hm(6,1), in_hm(3,1), out_hm(3,2), out_hm(4,2), in_hm(0,8), out_hm(0,2), out_hm(7,8), out_hm(0,7), in_hm(8,7), out_hm(8,2), in_hm(7,2), in_hm(5,6), out_hm(4,6), in_hm(4,5), out_hm(5,9), out_hm(6,5), out_hm(4,9), in_hm(6,9)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), in_hm(2,4), out_hm(5,4), out_hm(6,4), out_hm(9,4), out_hm(2,9), out_hm(2,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), out_hm(9,5), out_hm(0,3), out_hm(0,1), in_hm(1,0), out_hm(3,0), out_hm(7,0), out_hm(8,0), out_hm(1,3), out_hm(1,9), in_hm(9,3), out_hm(3,9), out_hm(9,6), out_hm(9,2), out_hm(9,1), out_hm(6,1), in_hm(3,1), out_hm(3,2), out_hm(4,2), in_hm(0,8), out_hm(0,2), out_hm(7,8), out_hm(0,7), out_hm(5,6), in_hm(4,6), out_hm(4,9), in_hm(8,7), out_hm(4,5), out_hm(8,2), in_hm(6,5), in_hm(7,2), out_hm(6,9), in_hm(5,9)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), in_hm(2,4), out_hm(5,4), out_hm(6,4), out_hm(9,4), out_hm(2,9), out_hm(2,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), out_hm(9,5), out_hm(0,3), out_hm(0,1), out_hm(1,0), in_hm(3,0), out_hm(7,0), out_hm(8,0), out_hm(3,9), out_hm(3,2), out_hm(3,1), out_hm(4,2), in_hm(9,3), out_hm(1,3), out_hm(9,6), out_hm(9,2), out_hm(9,1), in_hm(6,1), in_hm(1,9), out_hm(6,5), out_hm(6,9), out_hm(4,9), out_hm(5,9), in_hm(4,5), out_hm(4,6), in_hm(5,6), in_hm(0,8), out_hm(0,2), out_hm(7,8), out_hm(0,7), in_hm(8,7), out_hm(8,2), in_hm(7,2)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), in_hm(2,4), out_hm(5,4), out_hm(6,4), out_hm(9,4), out_hm(2,9), out_hm(2,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), out_hm(1,0), in_hm(3,0), out_hm(9,5), out_hm(0,3), out_hm(0,1), out_hm(7,0), out_hm(8,0), out_hm(3,9), out_hm(3,2), out_hm(3,1), out_hm(4,2), in_hm(9,3), out_hm(1,3), out_hm(9,6), out_hm(9,2), out_hm(9,1), out_hm(0,8), in_hm(0,7), in_hm(6,1), in_hm(7,8), out_hm(0,2), out_hm(8,7), in_hm(1,9), out_hm(6,5), out_hm(6,9), out_hm(7,2), out_hm(4,9), out_hm(5,9), in_hm(4,5), in_hm(8,2), out_hm(4,6), in_hm(5,6)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), in_hm(2,4), out_hm(5,4), out_hm(6,4), out_hm(9,4), out_hm(2,9), out_hm(2,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), out_hm(1,0), in_hm(3,0), out_hm(9,5), out_hm(9,3), in_hm(1,3), out_hm(0,3), out_hm(0,1), out_hm(7,0), out_hm(8,0), out_hm(3,9), out_hm(3,2), out_hm(3,1), out_hm(1,9), out_hm(4,2), out_hm(9,6), out_hm(6,1), in_hm(9,1), out_hm(9,2), out_hm(0,8), in_hm(0,7), in_hm(7,8), out_hm(0,2), out_hm(8,7), out_hm(7,2), in_hm(8,2), in_hm(5,6), out_hm(4,6), in_hm(4,5), out_hm(5,9), out_hm(6,5), out_hm(4,9), in_hm(6,9)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), in_hm(2,4), out_hm(5,4), out_hm(6,4), out_hm(9,4), out_hm(2,9), out_hm(2,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), out_hm(1,0), in_hm(3,0), out_hm(9,5), out_hm(9,3), in_hm(1,3), out_hm(0,3), out_hm(0,1), out_hm(7,0), out_hm(8,0), out_hm(3,9), out_hm(3,2), out_hm(3,1), out_hm(1,9), out_hm(4,2), out_hm(9,6), out_hm(6,1), in_hm(9,1), out_hm(9,2), out_hm(0,8), in_hm(0,7), in_hm(7,8), out_hm(0,2), out_hm(8,7), out_hm(5,6), in_hm(4,6), out_hm(4,9), out_hm(7,2), out_hm(4,5), in_hm(8,2), in_hm(6,5), out_hm(6,9), in_hm(5,9)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), in_hm(2,4), out_hm(5,4), out_hm(6,4), out_hm(9,4), out_hm(2,9), out_hm(2,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), out_hm(1,0), in_hm(3,0), out_hm(9,5), out_hm(9,3), in_hm(1,3), out_hm(0,3), out_hm(0,1), out_hm(7,0), out_hm(8,0), out_hm(3,9), out_hm(3,2), out_hm(3,1), out_hm(1,9), out_hm(4,2), out_hm(9,6), in_hm(0,8), out_hm(6,1), out_hm(0,2), out_hm(7,8), out_hm(0,7), in_hm(9,1), in_hm(8,7), out_hm(9,2), out_hm(8,2), in_hm(7,2), out_hm(5,6), in_hm(4,6), out_hm(4,9), out_hm(4,5), in_hm(6,5), out_hm(6,9), in_hm(5,9)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), in_hm(2,4), out_hm(5,4), out_hm(6,4), out_hm(9,4), out_hm(2,9), out_hm(2,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), out_hm(1,0), in_hm(3,0), out_hm(9,5), out_hm(9,3), in_hm(1,3), out_hm(0,3), out_hm(0,1), out_hm(7,0), out_hm(8,0), out_hm(3,9), out_hm(3,2), out_hm(3,1), out_hm(1,9), out_hm(4,2), out_hm(9,6), in_hm(0,8), in_hm(5,6), out_hm(6,1), out_hm(0,2), out_hm(7,8), out_hm(0,7), out_hm(4,6), in_hm(4,5), out_hm(5,9), in_hm(9,1), in_hm(8,7), out_hm(6,5), out_hm(4,9), out_hm(9,2), out_hm(8,2), in_hm(6,9), in_hm(7,2)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), in_hm(2,4), in_hm(9,6), out_hm(5,4), out_hm(6,4), out_hm(9,4), out_hm(2,9), out_hm(2,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), out_hm(1,0), in_hm(3,0), out_hm(9,5), out_hm(9,3), in_hm(1,3), out_hm(0,3), out_hm(0,1), out_hm(5,6), out_hm(4,6), out_hm(9,2), out_hm(9,1), out_hm(7,0), out_hm(8,0), out_hm(3,9), out_hm(3,2), out_hm(3,1), out_hm(1,9), out_hm(4,9), in_hm(6,1), out_hm(4,2), out_hm(6,5), out_hm(6,9), in_hm(4,5), in_hm(5,9), in_hm(0,8), out_hm(0,2), out_hm(7,8), out_hm(0,7), in_hm(8,7), out_hm(8,2), in_hm(7,2)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), in_hm(2,4), in_hm(9,6), out_hm(5,4), out_hm(6,4), out_hm(9,4), out_hm(2,9), out_hm(2,8), out_hm(2,7), out_hm(2,3), out_hm(2,0), out_hm(1,0), in_hm(3,0), out_hm(9,5), out_hm(9,3), in_hm(1,3), out_hm(0,8), in_hm(0,7), out_hm(0,3), out_hm(0,1), out_hm(5,6), out_hm(4,6), out_hm(9,2), out_hm(9,1), out_hm(7,0), out_hm(8,0), out_hm(3,9), out_hm(3,2), out_hm(3,1), out_hm(1,9), in_hm(7,8), out_hm(0,2), out_hm(8,7), out_hm(4,9), in_hm(6,1), out_hm(4,2), out_hm(7,2), out_hm(6,5), out_hm(6,9), in_hm(4,5), in_hm(8,2), in_hm(5,9)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), out_hm(2,4), in_hm(9,5), in_hm(0,1), out_hm(9,2), out_hm(9,1), out_hm(9,3), in_hm(1,3), out_hm(6,5), out_hm(4,5), out_hm(9,6), out_hm(9,4), out_hm(0,8), out_hm(0,7), out_hm(0,3), out_hm(0,2), out_hm(6,1), out_hm(3,1), out_hm(2,9), out_hm(2,3), out_hm(1,9), in_hm(3,9), out_hm(1,0), in_hm(4,2), out_hm(4,9), out_hm(5,9), out_hm(6,9), out_hm(3,2), out_hm(3,0), out_hm(7,2), out_hm(8,2), out_hm(4,6), in_hm(5,6), out_hm(5,4), in_hm(6,4), out_hm(2,8), in_hm(2,7), in_hm(7,8), out_hm(8,7), out_hm(2,0), out_hm(7,0), in_hm(8,0)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), out_hm(1,6), in_hm(2,8), out_hm(2,4), in_hm(9,5), in_hm(0,1), out_hm(9,2), out_hm(9,1), out_hm(9,3), in_hm(1,3), out_hm(7,8), out_hm(0,8), out_hm(2,9), out_hm(2,7), out_hm(2,3), out_hm(2,0), out_hm(6,5), out_hm(4,5), out_hm(9,6), out_hm(9,4), out_hm(0,7), out_hm(0,3), out_hm(0,2), out_hm(6,1), out_hm(3,1), out_hm(1,9), in_hm(3,9), out_hm(1,0), in_hm(4,2), in_hm(8,7), out_hm(4,9), out_hm(5,9), out_hm(6,9), out_hm(3,2), out_hm(3,0), out_hm(7,2), out_hm(8,2), out_hm(4,6), out_hm(8,0), in_hm(5,6), in_hm(7,0), out_hm(5,4), in_hm(6,4)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), in_hm(1,6), out_hm(5,6), out_hm(9,6), out_hm(4,6), out_hm(1,9), out_hm(1,3), out_hm(1,0), out_hm(6,1), out_hm(2,4), in_hm(9,3), out_hm(2,3), out_hm(0,3), out_hm(9,4), out_hm(9,5), out_hm(9,2), out_hm(9,1), out_hm(3,9), in_hm(0,1), out_hm(0,8), out_hm(0,7), out_hm(0,2), out_hm(3,1), out_hm(2,9), out_hm(4,2), in_hm(3,2), out_hm(7,2), out_hm(8,2), out_hm(3,0), in_hm(6,5), out_hm(4,5), out_hm(6,9), out_hm(6,4), in_hm(5,4), out_hm(5,9), in_hm(4,9), in_hm(2,8), out_hm(7,8), out_hm(2,7), out_hm(2,0), in_hm(8,7), out_hm(8,0), in_hm(7,0)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), in_hm(1,6), out_hm(5,6), out_hm(9,6), out_hm(4,6), out_hm(1,9), out_hm(1,3), out_hm(1,0), out_hm(6,1), out_hm(2,4), in_hm(9,3), out_hm(2,3), out_hm(0,3), out_hm(9,4), out_hm(9,5), out_hm(9,2), out_hm(9,1), out_hm(3,9), in_hm(0,1), out_hm(0,8), out_hm(0,7), out_hm(0,2), out_hm(3,1), out_hm(2,9), out_hm(4,2), in_hm(3,2), out_hm(7,2), out_hm(8,2), out_hm(3,0), in_hm(6,5), out_hm(4,5), out_hm(6,9), out_hm(6,4), out_hm(2,8), in_hm(2,7), in_hm(5,4), in_hm(7,8), out_hm(8,7), out_hm(2,0), out_hm(5,9), in_hm(4,9), out_hm(7,0), in_hm(8,0)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), in_hm(1,6), out_hm(5,6), out_hm(9,6), out_hm(4,6), out_hm(1,9), out_hm(1,3), out_hm(1,0), out_hm(6,1), out_hm(2,4), in_hm(9,3), out_hm(2,3), out_hm(0,3), out_hm(9,4), out_hm(9,5), out_hm(9,2), out_hm(9,1), out_hm(3,9), in_hm(0,1), out_hm(0,8), out_hm(0,7), out_hm(0,2), out_hm(3,1), out_hm(2,9), out_hm(6,5), in_hm(6,4), out_hm(4,2), in_hm(3,2), in_hm(4,5), out_hm(5,4), out_hm(6,9), out_hm(7,2), out_hm(8,2), out_hm(3,0), out_hm(4,9), in_hm(5,9), out_hm(2,8), in_hm(2,7), in_hm(7,8), out_hm(8,7), out_hm(2,0), out_hm(7,0), in_hm(8,0)}
{start(1), arc(0,1), arc(0,2), arc(0,3), arc(0,7), arc(0,8), arc(1,0), arc(1,3), arc(1,6), arc(1,9), arc(2,0), arc(2,3), arc(2,4), arc(2,7), arc(2,8), arc(2,9), arc(3,0), arc(3,1), arc(3,2), arc(3,9), arc(4,2), arc(4,5), arc(4,6), arc(4,9), arc(5,4), arc(5,6), arc(5,9), arc(6,1), arc(6,4), arc(6,5), arc(6,9), arc(7,0), arc(7,2), arc(7,8), arc(8,0), arc(8,2), arc(8,7), arc(9,1), arc(9,2), arc(9,3), arc(9,4), arc(9,5), arc(9,6), vertex(0), vertex(1), vertex(2), vertex(3), vertex(4), vertex(5), vertex(6), vertex(7), vertex(8), vertex(9), reached(0), reached(1), reached(2), reached(3), reached(4), reached(5), reached(6), reached(7), reached(8), reached(9), in_hm(1,6), out_hm(5,6), out_hm(9,6), out_hm(4,6), out_hm(1,9), out_hm(1,3), out_hm(1,0), out_hm(6,1), out_hm(2,4), in_hm(9,3), out_hm(2,3), out_hm(0,3), out_hm(9,4), out_hm(9,5), out_hm(9,2), out_hm(9,1), out_hm(3,9), in_hm(0,1), in_hm(2,8), out_hm(0,8), out_hm(0,7), out_hm(0,2), out_hm(3,1), out_hm(2,9), out_hm(6,5), in_hm(6,4), out_hm(4,2), in_hm(3,2), out_hm(7,8), out_hm(2,7), out_hm(2,0), in_hm(4,5), out_hm(5,4), out_hm(6,9), out_hm(7,2), out_hm(8,2), out_hm(3,0), in_hm(8,7), out_hm(4,9), in_hm(5,9), out_hm(8,0), in_hm(7,0)}
"""
| 82.335518 | 1,196 | 0.584795 | 18,563 | 62,822 | 1.87502 | 0.010828 | 0.208297 | 0.025254 | 0.033557 | 0.907688 | 0.831667 | 0.826007 | 0.818221 | 0.811814 | 0.809458 | 0 | 0.253222 | 0.128108 | 62,822 | 762 | 1,197 | 82.44357 | 0.382225 | 0 | 0 | 0.057743 | 0 | 0.057743 | 0.999507 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 12 |
54ce3345807c270f7849354c7b331df13c05a4eb | 16,481 | py | Python | mpf/tests/test_Timer.py | atummons/mpf | 0830578680ddf5ab50515a69a3070756a45b1ed0 | [
"MIT"
] | 163 | 2015-01-25T02:19:50.000Z | 2022-03-26T12:00:28.000Z | mpf/tests/test_Timer.py | atummons/mpf | 0830578680ddf5ab50515a69a3070756a45b1ed0 | [
"MIT"
] | 1,086 | 2015-03-23T19:53:17.000Z | 2022-03-24T20:46:11.000Z | mpf/tests/test_Timer.py | atummons/mpf | 0830578680ddf5ab50515a69a3070756a45b1ed0 | [
"MIT"
] | 148 | 2015-01-28T02:31:39.000Z | 2022-03-22T13:54:01.000Z | # TODO: test remaining actions
# TODO: test empty control_events
from mpf.tests.MpfFakeGameTestCase import MpfFakeGameTestCase

class TestTimer(MpfFakeGameTestCase):

    def get_config_file(self):
        return 'test_timer.yaml'

    def get_machine_path(self):
        return 'tests/machine_files/timer/'

    def _timer_start(self, **kwargs):
        del kwargs
        self.started = True

    def _timer_tick(self, **kwargs):
        del kwargs
        self.tick += 1

    def _timer_complete(self, **kwargs):
        del kwargs
        self.started = False

    def test_start_with_game(self):
        self.start_game()
        self.advance_time_and_run()
        self.assertIn(self.machine.modes["mode_with_timers2"],
                      self.machine.mode_controller.active_modes)
        self.assertIn(self.machine.modes["game"],
                      self.machine.mode_controller.active_modes)

    def test_timer_down_outside_of_game(self):
        self.machine.events.add_handler("timer_timer_down_tick", self._timer_tick)
        self.machine.events.add_handler("timer_timer_down_started", self._timer_start)
        self.machine.events.add_handler("timer_timer_down_complete", self._timer_complete)
        self.machine.events.add_handler("timer_timer_down_stopped", self._timer_complete)

        self.assertFalse(self.machine.modes["mode_with_timers"].active)

        self.tick = 0
        self.started = False

        # timer should not start when mode is not running
        self.machine.events.post('start_timer_down')
        self.advance_time_and_run(10)
        self.assertFalse(self.started)
        self.assertEqual(0, self.tick)

        # mode should not start automatically
        self.machine.events.post('start_mode_with_timers')
        self.advance_time_and_run(10)
        self.assertTrue(self.machine.modes["mode_with_timers"].active)
        self.assertFalse(self.started)
        self.assertEqual(0, self.tick)

        self.machine.events.post('stop_mode_with_timers')
        self.advance_time_and_run(10)
        self.assertFalse(self.machine.modes["mode_with_timers"].active)

        # start mode
        self.machine.events.post('start_mode_with_timers')
        self.advance_time_and_run(10)
        self.assertTrue(self.machine.modes["mode_with_timers"].active)
        self.assertFalse(self.started)
        self.assertEqual(0, self.tick)

        timer = self.machine.timers['timer_down']
        self.assertFalse(timer.running)

        # timer should start now
        self.machine.events.post('start_timer_down')
        self.advance_time_and_run(1)
        self.assertTrue(timer.running)
        self.assertTrue(self.started)
        self.assertEqual(1, self.tick)
        self.assertEqual(5, timer.ticks)

        self.advance_time_and_run(.6)
        self.assertTrue(self.started)
        self.assertEqual(2, self.tick)
        self.assertEqual(4, timer.ticks)

        self.post_event("add_timer_down")
        self.assertEqual(6, timer.ticks)
        self.post_event("subtract_timer_down")
        self.assertEqual(4, timer.ticks)

        self.advance_time_and_run(1.5)
        self.assertEqual(3, self.tick)
        self.assertEqual(3, timer.ticks)

        self.post_event("pause_timer_down")
        self.advance_time_and_run(0.5)
        self.assertEqual(3, self.tick)
        self.assertEqual(3, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(3, self.tick)
        self.assertEqual(3, timer.ticks)
        self.advance_time_and_run(1)

        # Resuming the paused timer re-ticks at the last number,
        # so self.tick increases but timer.ticks does not
        self.assertEqual(4, self.tick)
        self.assertEqual(3, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(5, self.tick)
        self.assertEqual(2, timer.ticks)
        self.advance_time_and_run(1.5)
        self.assertEqual(6, self.tick)
        self.assertEqual(1, timer.ticks)
        self.advance_time_and_run(1.5)
        self.assertEqual(6, self.tick)
        self.assertEqual(0, timer.ticks)
        self.advance_time_and_run(1.5)
        self.assertEqual(6, self.tick)

        # and complete at some point
        self.assertFalse(self.started)

        # stay off
        self.advance_time_and_run(20)
        self.assertEqual(6, self.tick)
        self.assertFalse(self.started)

        # cannot be started without a reset
        self.post_event("start_timer_down")
        self.advance_time_and_run()
        self.assertEqual(0, timer.ticks)
        self.assertFalse(timer.running)

        self.post_event("reset_timer_down")
        self.advance_time_and_run()
        self.assertEqual(5, timer.ticks)
        self.assertFalse(timer.running)

        self.post_event("start_timer_down")
        self.advance_time_and_run()
        self.assertTrue(timer.running)
        self.assertEqual(5, timer.ticks)
        self.advance_time_and_run()
        self.assertEqual(4, timer.ticks)

    def test_change_tick(self):
        self.start_mode("mode_with_timers")
        self.mock_event("timer_timer_change_tick_tick")
        self.advance_time_and_run()
        self.post_event("timer_change_tick_start")
        self.advance_time_and_run(.1)
        self.assertEventCalled("timer_timer_change_tick_tick", 1)
        self.advance_time_and_run(1)
        self.assertEventCalled("timer_timer_change_tick_tick", 2)
        self.post_event("timer_change_tick_event")
        self.assertEventCalled("timer_timer_change_tick_tick", 2)
        self.advance_time_and_run(1)
        self.assertEventCalled("timer_timer_change_tick_tick", 12)

    def test_set_tick_fixed(self):
        self.start_mode("mode_with_timers")
        self.mock_event("timer_timer_change_tick_tick")
        self.advance_time_and_run()
        self.post_event("timer_change_tick_start")
        self.advance_time_and_run(.1)
        self.assertEventCalled("timer_timer_change_tick_tick", 1)
        self.advance_time_and_run(1)
        self.assertEventCalled("timer_timer_change_tick_tick", 2)
        self.post_event("timer_set_tick_event_fixed")
        self.assertEventCalled("timer_timer_change_tick_tick", 2)
        self.advance_time_and_run(1)
        self.assertEventCalled("timer_timer_change_tick_tick", 7)

    def test_set_tick_kwarg(self):
        self.start_mode("mode_with_timers")
        self.mock_event("timer_timer_change_tick_tick")
        self.advance_time_and_run()
        self.post_event("timer_change_tick_start")
        self.advance_time_and_run(.1)
        self.assertEventCalled("timer_timer_change_tick_tick", 1)
        self.advance_time_and_run(1)
        self.assertEventCalled("timer_timer_change_tick_tick", 2)
        self.post_event_with_params("timer_set_tick_event_kwarg", event_value=0.1)
        self.assertEventCalled("timer_timer_change_tick_tick", 2)
        self.advance_time_and_run(1)
        self.assertEventCalled("timer_timer_change_tick_tick", 12)
        self.post_event_with_params("timer_set_tick_event_kwarg", event_value=0.2)
        self.assertEventCalled("timer_timer_change_tick_tick", 12)
        self.advance_time_and_run(1)
        self.assertEventCalled("timer_timer_change_tick_tick", 17)

    def test_start_running(self):
        # add a fake player
        self.start_game()
        self.mock_event("timer_timer_start_running_complete")

        # start mode
        self.machine.events.post('start_mode_with_timers')
        self.machine_run()

        timer = self.machine.timers['timer_start_running']
        self.assertTrue(timer.running)
        self.assertEqual(0, timer.ticks)
        self.assertEqual(0, self._events['timer_timer_start_running_complete'])
        self.advance_time_and_run(1)
        self.assertEqual(1, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(0, self._events['timer_timer_start_running_complete'])
        self.advance_time_and_run(1)
        self.advance_time_and_run(1)
        self.advance_time_and_run(1)
        self.advance_time_and_run(1)
        self.advance_time_and_run(1)
        self.advance_time_and_run(1)
        self.advance_time_and_run(1)
        self.assertTrue(timer.running)
        self.advance_time_and_run(1)
        self.assertFalse(timer.running)
        self.advance_time_and_run(1)
        self.assertEqual(1, self._events['timer_timer_start_running_complete'])

    def test_restart_on_complete(self):
        # add a fake player
        self.start_game()

        # start mode
        self.machine.events.post('start_mode_with_timers')
        self.machine_run()

        timer = self.machine.timers['timer_restart_on_complete']
        self.assertTrue(timer.running)
        self.assertEqual(0, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(1, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(2, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(3, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(4, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(0, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(1, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(2, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(3, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(4, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(0, timer.ticks)
        self.assertTrue(timer.running)

    def test_timer_events(self):
        # add a fake player
        self.start_game()

        # start mode
        self.machine.events.post('start_mode_with_timers')
        self.advance_time_and_run(10)
        self.assertTrue(self.machine.modes["mode_with_timers"].active)

        timer = self.machine.timers['timer_up']
        self.assertFalse(timer.running)

        # timer should start now
        self.post_event('start_timer_up')
        self.assertTrue(timer.running)
        self.assertEqual(0, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(1, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(2, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(3, timer.ticks)

        self.post_event('reset_timer_up')
        self.assertEqual(0, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(1, timer.ticks)

        self.post_event('stop_timer_up')
        self.assertEqual(1, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(1, timer.ticks)

        self.post_event('restart_timer_up')
        self.assertEqual(0, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(1, timer.ticks)

        self.post_event('jump_timer_up')
        self.assertEqual(5, timer.ticks)

        self.post_event_with_params("change_tick_interval_timer_up", change=1)
        # 1s * 4 = 4s
        self.advance_time_and_run(1)
        self.assertEqual(5, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(5, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(5, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(6, timer.ticks)
        self.advance_time_and_run(4)
        self.assertEqual(7, timer.ticks)

        self.post_event("change_tick_interval_timer_up")
        # 4s * 4 = 16s
        self.advance_time_and_run(8)
        self.assertEqual(7, timer.ticks)
        self.advance_time_and_run(8)
        self.assertEqual(8, timer.ticks)

        self.post_event("set_tick_interval_timer_up")
        # back to 2s
        self.advance_time_and_run(1)
        self.assertEqual(8, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(9, timer.ticks)
        self.advance_time_and_run(2)
        self.assertEqual(10, timer.ticks)

        # and complete at some point
        self.assertFalse(timer.running)

        self.post_event('jump_timer_up')
        self.assertEqual(5, timer.ticks)
        self.post_event('jump_over_max_timer_up')
        self.assertEqual(15, timer.ticks)
        self.post_event('add_timer_up')
        self.assertEqual(15, timer.ticks)

        self.post_event('restart_timer_up')
        self.post_event("reset_tick_interval")
        self.advance_time_and_run()
        self.assertEqual(1, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(2, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(3, timer.ticks)

        self.post_event("change_tick_interval_timer_up")
        self.advance_time_and_run(4)
        self.assertEqual(4, timer.ticks)

        self.post_event("reset_tick_interval")
        self.advance_time_and_run()
        self.assertEqual(5, timer.ticks)

    def test_interrupt_timer_by_mode_stop_with_player(self):
        self.machine.events.add_handler("timer_timer_down_tick", self._timer_tick)
        self.machine.events.add_handler("timer_timer_down_started", self._timer_start)
        self.machine.events.add_handler("timer_timer_down_complete", self._timer_complete)
        self.machine.events.add_handler("timer_timer_down_stopped", self._timer_complete)

        # add a fake player
        self.start_game()

        self.assertFalse(self.machine.modes["mode_with_timers"].active)

        self.tick = 0
        self.started = False

        # start mode
        self.machine.events.post('start_mode_with_timers')
        self.advance_time_and_run(10)
        self.assertTrue(self.machine.modes["mode_with_timers"].active)
        self.assertFalse(self.started)
        self.assertEqual(0, self.tick)

        # timer should start now
        self.machine.events.post('start_timer_down')
        self.advance_time_and_run(1)
        self.assertTrue(self.started)
        self.assertEqual(1, self.tick)
        self.advance_time_and_run(.6)
        self.assertTrue(self.started)
        self.assertEqual(2, self.tick)
        self.advance_time_and_run(1.5)
        self.assertEqual(3, self.tick)

        # stop mode. timer should stop
        self.machine.events.post('stop_mode_with_timers')
        self.advance_time_and_run(10)
        self.assertFalse(self.machine.modes["mode_with_timers"].active)
        self.assertFalse(self.started)

        # and stay off
        self.advance_time_and_run(20)
        self.assertEqual(3, self.tick)
        self.assertFalse(self.started)

    def test_mode_timer_with_player_var(self):
        # add a fake player
        self.start_game()

        # start mode. no player vars
        self.mock_event("timer_timer_player_var_complete")
        self.machine.events.post('start_mode_with_timers')
        self.advance_time_and_run(.1)
        self.assertEventCalled("timer_timer_player_var_complete")
        self.machine.events.post('stop_mode_with_timers')
        self.advance_time_and_run(.1)

        # set player vars. timer should run 5s
        self.machine.game.player.start = 2
        self.machine.game.player.end = 7
        self.machine.log.debug("START")
        self.mock_event("timer_timer_player_var_complete")
        self.machine.events.post('start_mode_with_timers')
        self.advance_time_and_run(4.5)
        self.machine.log.debug("END")
        self.assertEventNotCalled("timer_timer_player_var_complete")
        self.advance_time_and_run(0.6)
        self.assertEventCalled("timer_timer_player_var_complete")
        self.machine.events.post('stop_mode_with_timers')

    def test_timer_control_with_player_var(self):
        self.start_game()
        self.machine.game.player.timer_amount = 10

        timer = self.machine.timers['timer_with_player_var_control_events']
        self.machine.events.post('start_player_var_timer')
        self.advance_time_and_run(.1)
        self.assertTrue(timer.running)
        self.assertEqual(0, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(1, timer.ticks)

        self.machine.events.post('add_player_var_timer')
        self.advance_time_and_run(.1)
        self.assertEqual(11, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(12, timer.ticks)

        self.machine.game.player.timer_amount = 5
        self.machine.events.post('subtract_player_var_timer')
        self.advance_time_and_run(.1)
        self.assertEqual(7, timer.ticks)
        self.advance_time_and_run(1)
        self.assertEqual(8, timer.ticks)
| 37.713959 | 90 | 0.677022 | 2,205 | 16,481 | 4.739683 | 0.062585 | 0.096833 | 0.132045 | 0.158454 | 0.87915 | 0.844991 | 0.803751 | 0.782413 | 0.760501 | 0.725385 | 0 | 0.017951 | 0.222559 | 16,481 | 436 | 91 | 37.800459 | 0.797705 | 0.042291 | 0 | 0.802899 | 0 | 0 | 0.141243 | 0.104996 | 0 | 0 | 0 | 0.002294 | 0.4 | 1 | 0.046377 | false | 0 | 0.002899 | 0.005797 | 0.057971 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
54d2ebc73e8d37ae9a8021dfb5fa136a2397f219 | 416 | py | Python | amatino/tests/derived/__init__.py | Amatino-Code/amatino-python | 6c5f66b2e61bede5bf9d3e6eee8130a16f511a5f | [
"MIT"
] | 2 | 2018-07-20T20:00:33.000Z | 2020-10-08T15:49:06.000Z | amatino/tests/derived/__init__.py | Amatino-Code/amatino-python | 6c5f66b2e61bede5bf9d3e6eee8130a16f511a5f | [
"MIT"
] | 1 | 2020-05-21T02:49:29.000Z | 2020-05-21T02:49:29.000Z | amatino/tests/derived/__init__.py | Amatino-Code/amatino-python | 6c5f66b2e61bede5bf9d3e6eee8130a16f511a5f | [
"MIT"
] | 3 | 2018-09-03T09:31:31.000Z | 2020-05-21T05:30:00.000Z | from amatino.tests.derived.ledger import LedgerTest
from amatino.tests.derived.recursive_ledger import RecursiveLedgerTest
from amatino.tests.derived.balance import BalanceTest
from amatino.tests.derived.recursive_balance import RecursiveBalanceTest
from amatino.tests.derived.performance import PerformanceTest
from amatino.tests.derived.position import PositionTest
from amatino.tests.derived.tree import TreeTest
| 52 | 72 | 0.882212 | 51 | 416 | 7.156863 | 0.352941 | 0.210959 | 0.306849 | 0.441096 | 0.175342 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.067308 | 416 | 7 | 73 | 59.428571 | 0.940722 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |