hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
bd65bbffcae46e5bc4e6fd5b09271e31d4a40f35 | 36 | py | Python | hello-world/hello_world.py | philcleveland/exercism_python | bf0be451bbddf40ccc9967149d7259f4810d9972 | [
"MIT"
] | null | null | null | hello-world/hello_world.py | philcleveland/exercism_python | bf0be451bbddf40ccc9967149d7259f4810d9972 | [
"MIT"
] | null | null | null | hello-world/hello_world.py | philcleveland/exercism_python | bf0be451bbddf40ccc9967149d7259f4810d9972 | [
"MIT"
] | null | null | null | def hello(): return "Hello, World!"
| 18 | 35 | 0.666667 | 5 | 36 | 4.8 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.138889 | 36 | 1 | 36 | 36 | 0.774194 | 0 | 0 | 0 | 0 | 0 | 0.361111 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | true | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
bde2617d84f22b4b3f2416c6551aeb32ab6a1d84 | 47 | py | Python | Topsis_Paras_101916051/__init__.py | Paras-Sood/Topsis_Paras_101916051 | bacf3b724c66582b31bb7f6fc46d029f5fdb5c66 | [
"MIT"
] | null | null | null | Topsis_Paras_101916051/__init__.py | Paras-Sood/Topsis_Paras_101916051 | bacf3b724c66582b31bb7f6fc46d029f5fdb5c66 | [
"MIT"
] | null | null | null | Topsis_Paras_101916051/__init__.py | Paras-Sood/Topsis_Paras_101916051 | bacf3b724c66582b31bb7f6fc46d029f5fdb5c66 | [
"MIT"
] | null | null | null | from Topsis_Paras_101916051.topsis import solve | 47 | 47 | 0.914894 | 7 | 47 | 5.857143 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.204545 | 0.06383 | 47 | 1 | 47 | 47 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
da2d0376bf5cef2c161820e22a664e7c5f270592 | 199 | py | Python | aaem/__init__.py | gina-alaska/alaska_affordable_energy_model | 96fed0137152985ce280ea37e0affec131e3087f | [
"MIT-feh"
] | 1 | 2022-01-23T07:18:36.000Z | 2022-01-23T07:18:36.000Z | aaem/__init__.py | gina-alaska/alaska_affordable_energy_model | 96fed0137152985ce280ea37e0affec131e3087f | [
"MIT-feh"
] | 5 | 2017-07-14T21:56:46.000Z | 2017-07-14T21:59:15.000Z | aaem/__init__.py | gina-alaska/alaska_affordable_energy_model | 96fed0137152985ce280ea37e0affec131e3087f | [
"MIT-feh"
] | 2 | 2020-04-28T18:12:55.000Z | 2021-01-13T01:56:57.000Z | __version__ = '1.0.0'
__url__ = 'https://github.com/gina-alaska/alaska_affordable_energy_model'
__download_url__ = 'https://github.com/gina-alaska/alaska_affordable_energy_model/releases/tag/v1.0.0'
| 49.75 | 102 | 0.80402 | 30 | 199 | 4.7 | 0.533333 | 0.028369 | 0.198582 | 0.241135 | 0.765957 | 0.765957 | 0.765957 | 0.765957 | 0.765957 | 0.765957 | 0 | 0.031579 | 0.045226 | 199 | 3 | 103 | 66.333333 | 0.710526 | 0 | 0 | 0 | 0 | 0.333333 | 0.738693 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
e58024a87a1906623bff2fa5c65a9d9478cfda38 | 206 | py | Python | src/reliapy/monte_carlo/__init__.py | reliapy/reliapy | 3efd48af5cc3bedbcbc5de64fb43e6c5625e3f6d | [
"BSD-3-Clause"
] | null | null | null | src/reliapy/monte_carlo/__init__.py | reliapy/reliapy | 3efd48af5cc3bedbcbc5de64fb43e6c5625e3f6d | [
"BSD-3-Clause"
] | null | null | null | src/reliapy/monte_carlo/__init__.py | reliapy/reliapy | 3efd48af5cc3bedbcbc5de64fb43e6c5625e3f6d | [
"BSD-3-Clause"
] | null | null | null |
from reliapy.monte_carlo._monte_carlo import MonteCarlo
from reliapy.monte_carlo._importance import Importance
from reliapy.monte_carlo._monte_carlo import *
from reliapy.monte_carlo._importance import *
| 29.428571 | 55 | 0.864078 | 28 | 206 | 6 | 0.25 | 0.357143 | 0.380952 | 0.5 | 0.880952 | 0.880952 | 0.440476 | 0 | 0 | 0 | 0 | 0 | 0.087379 | 206 | 6 | 56 | 34.333333 | 0.893617 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 10 |
e5806ee0d952ddff1bf9930131e2e6d26888240e | 10,734 | py | Python | pyparallelproj/wrapper.py | gschramm/parallelproj | 0e6eadb81b8d961b9ed932420d77b56fd87c4bf0 | [
"MIT"
] | 5 | 2021-01-27T15:05:03.000Z | 2022-03-18T08:40:13.000Z | pyparallelproj/wrapper.py | gschramm/parallelproj | 0e6eadb81b8d961b9ed932420d77b56fd87c4bf0 | [
"MIT"
] | 13 | 2021-02-10T12:15:29.000Z | 2021-09-23T10:38:53.000Z | pyparallelproj/wrapper.py | gschramm/parallelproj | 0e6eadb81b8d961b9ed932420d77b56fd87c4bf0 | [
"MIT"
] | 2 | 2021-02-14T21:26:32.000Z | 2021-09-19T18:43:48.000Z | import ctypes
from pyparallelproj.config import lib_parallelproj_c, lib_parallelproj_cuda, n_visible_gpus
def calc_chunks(nLORs, n_chunks):
""" calculate indices to split an array of length nLORs into n_chunks chunks
example: splitting an array of length 10 into 3 chunks returns [0,4,7,10]
"""
rem = nLORs % n_chunks
div = (nLORs // n_chunks)
chunks = [0]
for i in range(n_chunks):
if i < rem:
nLORs_chunck = div + 1
else:
nLORs_chunck = div
chunks.append(chunks[i] + nLORs_chunck)
return chunks
#------------------
def joseph3d_fwd(xstart, xend, img, img_origin, voxsize, img_fwd, nLORs, img_dim,
threadsperblock = 64, n_chunks = 1):
if n_visible_gpus > 0:
nvox = ctypes.c_longlong(img_dim[0]*img_dim[1]*img_dim[2])
# send image to all devices
d_img = lib_parallelproj_cuda.copy_float_array_to_all_devices(img.ravel(), nvox)
# split call to GPU lib into chunks (useful for systems with limited memory)
ic = calc_chunks(nLORs, n_chunks)
for i in range(n_chunks):
ok = lib_parallelproj_cuda.joseph3d_fwd_cuda(xstart[(3*ic[i]):(3*ic[i+1])], xend[(3*ic[i]):(3*ic[i+1])],
d_img, img_origin, voxsize,
img_fwd[ic[i]:ic[i+1]], ic[i+1] - ic[i],
img_dim, threadsperblock)
# free image device arrays
lib_parallelproj_cuda.free_float_array_on_all_devices(d_img, nvox)
else:
ok = lib_parallelproj_c.joseph3d_fwd(xstart, xend, img, img_origin, voxsize, img_fwd, nLORs, img_dim)
return ok
#------------------
def joseph3d_back(xstart, xend, back_img, img_origin, voxsize, sino, nLORs, img_dim,
threadsperblock = 64, n_chunks = 1):
if n_visible_gpus > 0:
nvox = ctypes.c_longlong(img_dim[0]*img_dim[1]*img_dim[2])
# send image to all devices
d_back_img = lib_parallelproj_cuda.copy_float_array_to_all_devices(back_img, nvox)
# split call to GPU lib into chunks (useful for systems with limited memory)
ic = calc_chunks(nLORs, n_chunks)
for i in range(n_chunks):
ok = lib_parallelproj_cuda.joseph3d_back_cuda(xstart[(3*ic[i]):(3*ic[i+1])], xend[(3*ic[i]):(3*ic[i+1])],
d_back_img, img_origin, voxsize,
sino[ic[i]:ic[i+1]], ic[i+1] - ic[i],
img_dim, threadsperblock)
# sum all device arrays in the first device
lib_parallelproj_cuda.sum_float_arrays_on_first_device(d_back_img, nvox)
# copy summed image back from first device
lib_parallelproj_cuda.get_float_array_from_device(d_back_img, nvox, 0, back_img)
# free image device arrays
lib_parallelproj_cuda.free_float_array_on_all_devices(d_back_img, nvox)
else:
ok = lib_parallelproj_c.joseph3d_back(xstart, xend, back_img, img_origin, voxsize, sino, nLORs, img_dim)
return ok
#------------------
def joseph3d_fwd_tof_sino(xstart, xend, img, img_origin, voxsize, img_fwd, nLORs, img_dim,
tofbin_width, sigma_tof, tofcenter_offset, nsigmas, ntofbins,
threadsperblock = 64, n_chunks = 1):
if n_visible_gpus > 0:
nvox = ctypes.c_longlong(img_dim[0]*img_dim[1]*img_dim[2])
# send image to all devices
d_img = lib_parallelproj_cuda.copy_float_array_to_all_devices(img.ravel(), nvox)
# split call to GPU lib into chunks (useful for systems with limited memory)
ic = calc_chunks(nLORs, n_chunks)
for i in range(n_chunks):
ok = lib_parallelproj_cuda.joseph3d_fwd_tof_sino_cuda(xstart[(3*ic[i]):(3*ic[i+1])],
xend[(3*ic[i]):(3*ic[i+1])],
d_img, img_origin, voxsize,
img_fwd[(ntofbins*ic[i]):(ntofbins*ic[i+1])],
ic[i+1] - ic[i], img_dim,
tofbin_width, sigma_tof[ic[i]:ic[i+1]],
tofcenter_offset[ic[i]:ic[i+1]],
nsigmas, ntofbins, threadsperblock)
# free image device arrays
lib_parallelproj_cuda.free_float_array_on_all_devices(d_img, nvox)
else:
ok = lib_parallelproj_c.joseph3d_fwd_tof_sino(xstart, xend, img, img_origin, voxsize,
img_fwd, nLORs, img_dim,
tofbin_width, sigma_tof, tofcenter_offset,
nsigmas, ntofbins)
return ok
#------------------
def joseph3d_back_tof_sino(xstart, xend, back_img, img_origin, voxsize, sino, nLORs, img_dim,
tofbin_width, sigma_tof, tofcenter_offset, nsigmas, ntofbins,
threadsperblock = 64, n_chunks = 1):
if n_visible_gpus > 0:
nvox = ctypes.c_longlong(img_dim[0]*img_dim[1]*img_dim[2])
# send image to all devices
d_back_img = lib_parallelproj_cuda.copy_float_array_to_all_devices(back_img, nvox)
# split call to GPU lib into chunks (useful for systems with limited memory)
ic = calc_chunks(nLORs, n_chunks)
for i in range(n_chunks):
ok = lib_parallelproj_cuda.joseph3d_back_tof_sino_cuda(xstart[(3*ic[i]):(3*ic[i+1])],
xend[(3*ic[i]):(3*ic[i+1])],
d_back_img, img_origin, voxsize,
sino[(ntofbins*ic[i]):(ntofbins*ic[i+1])],
ic[i+1] - ic[i], img_dim,
tofbin_width, sigma_tof[ic[i]:ic[i+1]],
tofcenter_offset[ic[i]:ic[i+1]],
nsigmas, ntofbins, threadsperblock)
# sum all device arrays in the first device
lib_parallelproj_cuda.sum_float_arrays_on_first_device(d_back_img, nvox)
# copy summed image back from first device
lib_parallelproj_cuda.get_float_array_from_device(d_back_img, nvox, 0, back_img)
# free image device arrays
lib_parallelproj_cuda.free_float_array_on_all_devices(d_back_img, nvox)
else:
ok = lib_parallelproj_c.joseph3d_back_tof_sino(xstart, xend, back_img, img_origin, voxsize,
sino, nLORs, img_dim,
tofbin_width, sigma_tof, tofcenter_offset,
nsigmas, ntofbins)
return ok
#------------------
def joseph3d_fwd_tof_lm(xstart, xend, img, img_origin, voxsize, img_fwd, nLORs, img_dim,
tofbin_width, sigma_tof, tofcenter_offset, nsigmas, tofbin,
threadsperblock = 64, n_chunks = 1):
if n_visible_gpus > 0:
nvox = ctypes.c_longlong(img_dim[0]*img_dim[1]*img_dim[2])
# send image to all devices
d_img = lib_parallelproj_cuda.copy_float_array_to_all_devices(img.ravel(), nvox)
# split call to GPU lib into chunks (useful for systems with limited memory)
ic = calc_chunks(nLORs, n_chunks)
for i in range(n_chunks):
ok = lib_parallelproj_cuda.joseph3d_fwd_tof_lm_cuda(xstart[(3*ic[i]):(3*ic[i+1])],
xend[(3*ic[i]):(3*ic[i+1])],
d_img, img_origin, voxsize,
img_fwd[ic[i]:ic[i+1]], ic[i+1] - ic[i], img_dim,
tofbin_width, sigma_tof[ic[i]:ic[i+1]],
tofcenter_offset[ic[i]:ic[i+1]],
nsigmas, tofbin[ic[i]:ic[i+1]], threadsperblock)
# free image device arrays
lib_parallelproj_cuda.free_float_array_on_all_devices(d_img, nvox)
else:
ok = lib_parallelproj_c.joseph3d_fwd_tof_lm(xstart, xend, img, img_origin, voxsize,
img_fwd, nLORs, img_dim,
tofbin_width, sigma_tof, tofcenter_offset,
nsigmas, tofbin)
return ok
#------------------
def joseph3d_back_tof_lm(xstart, xend, back_img, img_origin, voxsize, lst, nLORs, img_dim,
tofbin_width, sigma_tof, tofcenter_offset, nsigmas, tofbin,
threadsperblock = 64, n_chunks = 1):
if n_visible_gpus > 0:
nvox = ctypes.c_longlong(img_dim[0]*img_dim[1]*img_dim[2])
# send image to all devices
d_back_img = lib_parallelproj_cuda.copy_float_array_to_all_devices(back_img, nvox)
# split call to GPU lib into chunks (useful for systems with limited memory)
ic = calc_chunks(nLORs, n_chunks)
for i in range(n_chunks):
ok = lib_parallelproj_cuda.joseph3d_back_tof_lm_cuda(xstart[(3*ic[i]):(3*ic[i+1])],
xend[(3*ic[i]):(3*ic[i+1])],
d_back_img, img_origin, voxsize,
lst[ic[i]:ic[i+1]], ic[i+1] - ic[i], img_dim,
tofbin_width, sigma_tof[ic[i]:ic[i+1]],
tofcenter_offset[ic[i]:ic[i+1]],
nsigmas, tofbin[ic[i]:ic[i+1]], threadsperblock)
# sum all device arrays in the first device
lib_parallelproj_cuda.sum_float_arrays_on_first_device(d_back_img, nvox)
# copy summed image back from first device
lib_parallelproj_cuda.get_float_array_from_device(d_back_img, nvox, 0, back_img)
# free image device arrays
lib_parallelproj_cuda.free_float_array_on_all_devices(d_back_img, nvox)
else:
ok = lib_parallelproj_c.joseph3d_back_tof_lm(xstart, xend, back_img, img_origin, voxsize,
lst, nLORs, img_dim,
tofbin_width, sigma_tof, tofcenter_offset,
nsigmas, tofbin)
return ok
| 44.17284 | 111 | 0.539128 | 1,353 | 10,734 | 3.980044 | 0.070953 | 0.037883 | 0.025255 | 0.06351 | 0.938904 | 0.934819 | 0.930362 | 0.926277 | 0.926277 | 0.926277 | 0 | 0.019174 | 0.363518 | 10,734 | 242 | 112 | 44.355372 | 0.769028 | 0.117477 | 0 | 0.733813 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05036 | false | 0 | 0.014388 | 0 | 0.115108 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e5994d4760de9f226b655f41a8a23db22f8b8e64 | 83 | py | Python | torchgan/trainer/__init__.py | kevinoop0/torchgan | 72eceeda32b7a80a9d619885886b794a44fd1250 | [
"MIT"
] | null | null | null | torchgan/trainer/__init__.py | kevinoop0/torchgan | 72eceeda32b7a80a9d619885886b794a44fd1250 | [
"MIT"
] | null | null | null | torchgan/trainer/__init__.py | kevinoop0/torchgan | 72eceeda32b7a80a9d619885886b794a44fd1250 | [
"MIT"
] | null | null | null | from .base_trainer import *
from .trainer import *
from .parallel_trainer import *
| 20.75 | 31 | 0.783133 | 11 | 83 | 5.727273 | 0.454545 | 0.619048 | 0.539683 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.144578 | 83 | 3 | 32 | 27.666667 | 0.887324 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
e5e8827601f883ff8a24b7c7594a67284ecd451a | 4,618 | py | Python | AseModel/models.py | 5wimming/ase | 0d506add3a83caf9afd01f216c256c4678010918 | [
"Apache-2.0"
] | 10 | 2021-07-13T02:15:15.000Z | 2022-02-21T07:27:54.000Z | AseModel/models.py | 5wimming/ase | 0d506add3a83caf9afd01f216c256c4678010918 | [
"Apache-2.0"
] | 3 | 2022-02-21T08:59:01.000Z | 2022-03-05T02:45:34.000Z | AseModel/models.py | 5wimming/ase | 0d506add3a83caf9afd01f216c256c4678010918 | [
"Apache-2.0"
] | 3 | 2021-08-19T07:54:39.000Z | 2022-02-21T07:27:55.000Z | from django.db import models
from django.db import models
from .fields import RestrictedFileField
import time
class ScanPort(models.Model):
ip = models.CharField(max_length=255, verbose_name="target")
domain = models.CharField(max_length=1022, null=True, blank=True)
port = models.CharField(max_length=255)
service_name = models.CharField(max_length=255, null=True, blank=True)
application = models.CharField(max_length=1022, null=True, blank=True)
version = models.CharField(max_length=1022, null=True, blank=True)
vendor = models.CharField(max_length=1022, null=True, blank=True)
scan_time = models.DateTimeField(auto_now=True, null=True, blank=True)
scan_engine = models.CharField(max_length=255, null=True, blank=True) # 扫描平台
scan_task = models.CharField(max_length=255, null=True, blank=True)
strategy_id = models.CharField(max_length=255, null=True, blank=True) # 策略id
scan_node_id = models.CharField(max_length=255, null=True, blank=True) # 扫描器节点
remarks = models.CharField(max_length=1022, null=True, blank=True) # 备注
cpe = models.CharField(max_length=1022, null=True, blank=True)
state = models.CharField(max_length=255, null=True, blank=True) # nmap状态
extra_info = models.CharField(max_length=1022, null=True, blank=True) # nmap额外信息
hostname = models.CharField(max_length=255, null=True, blank=True)
# process_name = models.CharField(max_length=255, null=True, blank=True)
# process_path = models.CharField(max_length=1022, null=True, blank=True)
# process_pid = models.CharField(max_length=255, null=True, blank=True)
proto = models.CharField(max_length=255, null=True, blank=True) # 传输层协议
class Meta:
verbose_name = 'port info'
verbose_name_plural = 'port info'
def upload_to(instance, filename):
# 后缀
sub = filename.split('.')[-1]
name = filename.split('.')[-2]
t = time.strftime('%Y%m%d%H%M%S', time.localtime())
return 'files/%s_%s.%s' % (name, t, sub)
class ScanVuln(models.Model):
ip = models.CharField(max_length=255, verbose_name="target")
domain = models.CharField(max_length=1022, null=True, blank=True)
port = models.CharField(max_length=255, default=0)
vuln_desc = models.CharField(max_length=1022, null=True, blank=True, verbose_name="vuln name")
scan_time = models.DateTimeField(auto_now=True, null=True, blank=True)
scan_engine = models.CharField(max_length=255, null=True, blank=True)
scan_type = models.CharField(max_length=255, null=True, blank=True) # nvd or vuln strategy
base_score = models.CharField(max_length=255, null=True, blank=True, verbose_name="score")
scan_task = models.CharField(max_length=255, null=True, blank=True)
strategy_id = models.CharField(max_length=255, null=True, blank=True)
scan_node_id = models.CharField(max_length=255, null=True, blank=True)
remarks = models.CharField(max_length=1022, null=True, blank=True)
cpe = models.CharField(max_length=1022, null=True, blank=True)
class Meta:
verbose_name = 'vuln info'
verbose_name_plural = 'vuln info'
class ScanWeb(models.Model):
url = models.CharField(max_length=2046, null=True, blank=True)
target = models.CharField(max_length=2046, null=True, blank=True)
port = models.CharField(max_length=255, default=0)
status = models.IntegerField(null=True, blank=True)
title = models.CharField(max_length=1022, null=True, blank=True)
headers = models.TextField(null=True, blank=True)
body_size = models.IntegerField(null=True, blank=True)
body_content = models.TextField(null=True, blank=True)
redirect_url = models.CharField(max_length=1022, null=True, blank=True)
application = models.CharField(max_length=1022, null=True, blank=True)
scan_time = models.DateTimeField(auto_now=True, null=True, blank=True)
scan_engine = models.CharField(max_length=255, null=True, blank=True) # 扫描平台
scan_task = models.CharField(max_length=255, null=True, blank=True)
strategy_id = models.CharField(max_length=255, null=True, blank=True) # 策略id
scan_node_id = models.CharField(max_length=255, null=True, blank=True) # 扫描器节点
remarks = models.CharField(max_length=1022, null=True, blank=True) # 备注
def short_headers(self):
if len(str(self.headers)) > 20:
return '{}...'.format(str(self.headers)[0:50])
else:
return str(self.headers)
short_headers.allow_tags = True
short_headers.short_description = "headers"
class Meta:
verbose_name = 'web info'
verbose_name_plural = 'web info'
| 49.655914 | 98 | 0.718493 | 655 | 4,618 | 4.920611 | 0.170992 | 0.111697 | 0.181508 | 0.237357 | 0.787465 | 0.780949 | 0.716413 | 0.716413 | 0.716413 | 0.547006 | 0 | 0.04 | 0.155479 | 4,618 | 92 | 99 | 50.195652 | 0.78641 | 0.064097 | 0 | 0.44 | 0 | 0 | 0.02741 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.026667 | false | 0 | 0.053333 | 0 | 0.826667 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 7 |
e5fb435045b631c53f812db75b511bb3cd58e0c7 | 3,530 | py | Python | html5print/test/test_html5print.py | hydrobuilder/html5print | 489e8b6046a7332405d4d8025c783018f5025faf | [
"Apache-2.0"
] | 31 | 2015-02-19T08:35:07.000Z | 2022-02-24T15:49:37.000Z | html5print/test/test_html5print.py | hydrobuilder/html5print | 489e8b6046a7332405d4d8025c783018f5025faf | [
"Apache-2.0"
] | 10 | 2015-02-16T19:33:23.000Z | 2022-03-31T20:33:14.000Z | html5print/test/test_html5print.py | hydrobuilder/html5print | 489e8b6046a7332405d4d8025c783018f5025faf | [
"Apache-2.0"
] | 13 | 2015-02-24T17:19:11.000Z | 2022-01-31T23:04:12.000Z | from __future__ import unicode_literals
from __future__ import absolute_import
import pytest
import os
import sys
import textwrap
@pytest.fixture
def html5_beautify():
import sys
abspath = os.path.abspath('.')
sys.path.insert(0, abspath)
from html5print import HTMLBeautifier
return HTMLBeautifier.beautify
@pytest.fixture
def html_fragment():
html = '''
<div id="page-wrap">
<img alt="Some binary data="pic" src="data:image/png;base64,
iVBORw0KGgoAAAANSUhEUgAAAOgAAAEvCAYAAABPM43AAAAAAXNSR0IArs4c6QAAAAlwSFlzAAAL
EwAACxMBAJqcGAAAAAd0SU1FB90FCgQSFKAOxjcAAAAdaVRYdENvbW1lbnQAAAAAAENyZWF0ZWQg
d2l0aCBHSU1QZC5lBwAAIABJREFUeNrsvVmSJElyJfiYRVTVzNwjIzd09xThCHOK+Z7vukOfoqiu
g5+5TROB8DuNAhqoyiUi3MxUVYTng1lEWMTUoxJd4dgmg8gpNnczNVXh7fHjx4Rff/3669dfAAAR
YQAzgMW+Tvb7bF/vAHwP4L8C+BbA/0NE/+Mtryn++lh+/fWfzMgIALuv4P4cATwBeA/gGzOyb+zv
TwAuZoRf279/Z7+/A3C2n2cAnHNOOef/AeBXA/3116+/DqLc7CLca1Hu/wDw3+zP35gBzi46+q/J
jO8X/fr5xx+3v/+Hf3j63e9+x7///e/zrwb666//bFGOhujG9m/TQZT7+pUoVyLcNwC+sigX3OuG
g/f4S68d+7bhD3//9/jbv/3br3/zm98EAL8a6K+//kMZYDBD8/XcYv+2DFHuv9nXX7koV77/7L5K
lOQ3vnj97bX/Tgm3lxe8vLxQSun7bz5+jAC2Xw3011//VlFujHB8EOV8FLvY/71zEXCs5cLBl38P
+ksN7HNG5r+n+/P4bwf/JzlDcgaHQEL0X/8Q4wTg+quB/vrrXyPKza9Euf/qotzXrpYr0e1iv5/s
3/8yI/tLotxoXKORifzFb0/MiCFQIPqry+Xypjb0q4H+54ly9EqU+xoNlfx6iHJfDVGu1HqnV6Jc
7u6/fv/7dWJ+4JTyOo/qAwKsSfUOUlqvEDfTbvc0vXt33O12+vHjR6493T9+/z3/01uWe3v8GT3+
F2hm2H/e+SvoAAAAAElFTkSuQmCC"/>'''
return html
@pytest.fixture
def fixture_dir():
import os
import sys
abspath = os.path.abspath('.')
return os.path.join(abspath, 'html5print', 'test', 'fixture')
def test_html_beautify_multiline_tag(html5_beautify, html_fragment):
func = html5_beautify
got = func(html_fragment)
expected = textwrap.dedent('''
<html>
<head>
</head>
<body>
<div id="page-wrap">
<img alt="Some binary data=" pic"="" src="data:image/png;base64,
iVBORw0KGgoAAAANSUhEUgAAAOgAAAEvCAYAAABPM43AAAAAAXNSR0IArs4c6QAAAAlwSFlzAAAL
EwAACxMBAJqcGAAAAAd0SU1FB90FCgQSFKAOxjcAAAAdaVRYdENvbW1lbnQAAAAAAENyZWF0ZWQg
d2l0aCBHSU1QZC5lBwAAIABJREFUeNrsvVmSJElyJfiYRVTVzNwjIzd09xThCHOK+Z7vukOfoqiu
g5+5TROB8DuNAhqoyiUi3MxUVYTng1lEWMTUoxJd4dgmg8gpNnczNVXh7fHjx4Rff/3669dfAAAR
YQAzgMW+Tvb7bF/vAHwP4L8C+BbA/0NE/+Mtryn++lh+/fWfzMgIALuv4P4cATwBeA/gGzOyb+zv
TwAuZoRf279/Z7+/A3C2n2cAnHNOOef/AeBXA/3116+/DqLc7CLca1Hu/wDw3+zP35gBzi46+q/J
jO8X/fr5xx+3v/+Hf3j63e9+x7///e/zrwb666//bFGOhujG9m/TQZT7+pUoVyLcNwC+sigX3OuG
g/f4S68d+7bhD3//9/jbv/3br3/zm98EAL8a6K+//kMZYDBD8/XcYv+2DFHuv9nXX7koV77/7L5K
lOQ3vnj97bX/Tgm3lxe8vLxQSun7bz5+jAC2Xw3011//VlFujHB8EOV8FLvY/71zEXCs5cLBl38P
+ksN7HNG5r+n+/P4bwf/JzlDcgaHQEL0X/8Q4wTg+quB/vrrXyPKza9Euf/qotzXrpYr0e1iv5/s
3/8yI/tLotxoXKORifzFb0/MiCFQIPqry+Xypjb0q4H+54ly9EqU+xoNlfx6iHJfDVGu1HqnV6Jc
7u6/fv/7dWJ+4JTyOo/qAwKsSfUOUlqvEDfTbvc0vXt33O12+vHjR6493T9+/z3/01uWe3v8GT3+
F2hm2H/e+SvoAAAAAElFTkSuQmCC"/>
</div>
</body>
</html>
''')
expected = expected.replace('\n', '', 1)
assert got == expected, 'Expected "{}", Got "{}"'.format(expected, got)
| 45.844156 | 96 | 0.694051 | 302 | 3,530 | 8.043046 | 0.437086 | 0.016468 | 0.019761 | 0.013998 | 0.774804 | 0.774804 | 0.750926 | 0.750926 | 0.750926 | 0.750926 | 0 | 0.113025 | 0.210482 | 3,530 | 76 | 97 | 46.447368 | 0.758522 | 0 | 0 | 0.537313 | 0 | 0.059701 | 0.770822 | 0.549575 | 0 | 1 | 0 | 0 | 0.014925 | 1 | 0.059701 | false | 0 | 0.149254 | 0 | 0.253731 | 0.029851 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
dab1379eef2d828bb95c994e2e8ac4c55e1de924 | 2,505 | py | Python | tests/unit/command/test_data_sync.py | jackwellsxyz/dvc | 6a64f861783f3c2eadfc0364725ab06aa3ebb387 | [
"Apache-2.0"
] | null | null | null | tests/unit/command/test_data_sync.py | jackwellsxyz/dvc | 6a64f861783f3c2eadfc0364725ab06aa3ebb387 | [
"Apache-2.0"
] | null | null | null | tests/unit/command/test_data_sync.py | jackwellsxyz/dvc | 6a64f861783f3c2eadfc0364725ab06aa3ebb387 | [
"Apache-2.0"
] | null | null | null | from dvc.cli import parse_args
from dvc.command.data_sync import CmdDataFetch, CmdDataPull, CmdDataPush
def test_fetch(mocker):
cli_args = parse_args(
[
"fetch",
"target1",
"target2",
"--jobs",
"2",
"--remote",
"remote",
"--all-branches",
"--all-tags",
"--all-commits",
"--with-deps",
"--recursive",
]
)
assert cli_args.func == CmdDataFetch
cmd = cli_args.func(cli_args)
m = mocker.patch.object(cmd.repo, "fetch", autospec=True, return_value=0)
assert cmd.run() == 0
m.assert_called_once_with(
targets=["target1", "target2"],
jobs=2,
remote="remote",
all_branches=True,
all_tags=True,
all_commits=True,
with_deps=True,
recursive=True,
)
def test_pull(mocker):
cli_args = parse_args(
[
"pull",
"target1",
"target2",
"--jobs",
"2",
"--remote",
"remote",
"--all-branches",
"--all-tags",
"--all-commits",
"--with-deps",
"--force",
"--recursive",
]
)
assert cli_args.func == CmdDataPull
cmd = cli_args.func(cli_args)
m = mocker.patch.object(cmd.repo, "pull", autospec=True)
assert cmd.run() == 0
m.assert_called_once_with(
targets=["target1", "target2"],
jobs=2,
remote="remote",
all_branches=True,
all_tags=True,
all_commits=True,
with_deps=True,
force=True,
recursive=True,
)
def test_push(mocker):
cli_args = parse_args(
[
"push",
"target1",
"target2",
"--jobs",
"2",
"--remote",
"remote",
"--all-branches",
"--all-tags",
"--all-commits",
"--with-deps",
"--recursive",
]
)
assert cli_args.func == CmdDataPush
cmd = cli_args.func(cli_args)
m = mocker.patch.object(cmd.repo, "push", autospec=True, return_value=0)
assert cmd.run() == 0
m.assert_called_once_with(
targets=["target1", "target2"],
jobs=2,
remote="remote",
all_branches=True,
all_tags=True,
all_commits=True,
with_deps=True,
recursive=True,
)
| 22.168142 | 77 | 0.475848 | 246 | 2,505 | 4.670732 | 0.191057 | 0.073107 | 0.093995 | 0.099217 | 0.835509 | 0.72846 | 0.72846 | 0.72846 | 0.72846 | 0.72846 | 0 | 0.014867 | 0.382435 | 2,505 | 112 | 78 | 22.366071 | 0.72786 | 0 | 0 | 0.71134 | 0 | 0 | 0.149701 | 0 | 0 | 0 | 0 | 0 | 0.092784 | 1 | 0.030928 | false | 0 | 0.020619 | 0 | 0.051546 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
dac466ed89e09451be48391c605bd9373088c890 | 27,862 | py | Python | tests/test_backend_six.py | 166MMX/hiro-python-library | fb29e3247a8fe1b0f7dc4e68141cf7340a8dd0a5 | [
"MIT"
] | null | null | null | tests/test_backend_six.py | 166MMX/hiro-python-library | fb29e3247a8fe1b0f7dc4e68141cf7340a8dd0a5 | [
"MIT"
] | null | null | null | tests/test_backend_six.py | 166MMX/hiro-python-library | fb29e3247a8fe1b0f7dc4e68141cf7340a8dd0a5 | [
"MIT"
] | null | null | null | import json
from datetime import date, datetime, time, timezone
from types import MappingProxyType
from typing import Generator, ContextManager
from uuid import uuid4
# noinspection PyPackageRequirements
import pytest
from requests import Response
from arago.hiro.client.client import HiroClient
from arago.hiro.model.auth import SessionCredentials, AccessToken
from arago.hiro.model.graph.edge import Edge
from arago.hiro.model.graph.history import HistoryFormat, HistoryEntry, HistoryDiff
from arago.hiro.model.graph.vertex import VertexId, Vertex
from arago.hiro.model.storage import BlobVertex, TimeSeriesValue, TimeSeriesVertex
from arago.hiro.utils.datetime import datetime_to_timestamp_ms
from arago.ogit import OgitEntity, OgitVerb
from arago.ontology import Attribute
def uuid() -> str:
return str(uuid4())
class TestClassMetaInfo:
def test_info_rest(self, client: HiroClient):
from arago.hiro.backend.six.meta import Hiro6MetaRest
meta = Hiro6MetaRest(client)
res = meta.info()
res.raise_for_status()
assert isinstance(res, Response)
pass
def test_info_data(self, client: HiroClient):
from arago.hiro.backend.six.meta import Hiro6MetaData
meta = Hiro6MetaData(client)
res = meta.info()
assert isinstance(res, dict)
pass
def test_info_model(self, client: HiroClient):
from arago.hiro.backend.six.meta import Hiro6MetaModel
meta = Hiro6MetaModel(client)
res = meta.info()
assert isinstance(res, dict)
pass
def test_version_rest(self, client: HiroClient):
from arago.hiro.backend.six.meta import Hiro6MetaRest
meta = Hiro6MetaRest(client)
with pytest.raises(NotImplementedError):
res = meta.version()
res.raise_for_status()
assert isinstance(res, Response)
pass
def test_version_data(self, client: HiroClient):
from arago.hiro.backend.six.meta import Hiro6MetaData
meta = Hiro6MetaData(client)
res = meta.version()
assert isinstance(res, dict)
pass
def test_version_model(self, client: HiroClient):
from arago.hiro.backend.six.meta import Hiro6MetaModel
meta = Hiro6MetaModel(client)
res = meta.version()
assert isinstance(res, dict)
pass
def test_versions_rest(self, client: HiroClient):
from arago.hiro.backend.six.meta import Hiro6MetaRest
meta = Hiro6MetaRest(client)
with pytest.raises(NotImplementedError):
res = meta.versions()
res.raise_for_status()
assert isinstance(res, Response)
pass
def test_versions_data(self, client: HiroClient):
from arago.hiro.backend.six.meta import Hiro6MetaData
meta = Hiro6MetaData(client)
res = meta.versions()
assert isinstance(res, dict)
pass
def test_versions_model(self, client: HiroClient):
from arago.hiro.backend.six.meta import Hiro6MetaModel
meta = Hiro6MetaModel(client)
res = meta.versions()
assert isinstance(res, dict)
pass
class TestClassHealth:
def test_health_rest(self, client: HiroClient):
from arago.hiro.backend.six.health import Hiro6HealthRest
health = Hiro6HealthRest(client)
res = health.check()
res.raise_for_status()
assert isinstance(res, Response)
pass
def test_health_data(self, client: HiroClient):
from arago.hiro.backend.six.health import Hiro6HealthData
health = Hiro6HealthData(client)
res = health.check()
assert isinstance(res, dict)
pass
def test_health_model(self, client: HiroClient):
from arago.hiro.backend.six.health import Hiro6HealthModel
health = Hiro6HealthModel(client)
res = health.check()
assert isinstance(res, dict)
pass
# noinspection PyUnusedLocal
class TestClassAuth:
def test_token_get_rest(self, client: HiroClient, credentials: SessionCredentials):
from arago.hiro.backend.six.auth import Hiro6AuthRest
auth = Hiro6AuthRest(client)
req_data = {
'client_id': credentials.client.id,
'client_secret': credentials.client.secret,
'username': credentials.account.username,
'password': credentials.account.password,
}
res = auth.password(req_data)
res.raise_for_status()
assert isinstance(res, Response)
pass
def test_token_get_data(self, client: HiroClient, credentials: SessionCredentials):
from arago.hiro.backend.six.auth import Hiro6AuthData
auth = Hiro6AuthData(client)
res = auth.password(
credentials.client.id,
credentials.client.secret,
credentials.account.username,
credentials.account.password,
)
assert isinstance(res, dict)
pass
def test_token_get_model(self, client: HiroClient, credentials: SessionCredentials):
from arago.hiro.backend.six.auth import Hiro6AuthModel
auth = Hiro6AuthModel(client)
res = auth.password(credentials)
assert isinstance(res, AccessToken)
pass
@pytest.mark.skip
def test_token_refresh_rest(self, client: HiroClient):
from arago.hiro.backend.six.auth import Hiro6AuthRest
auth = Hiro6AuthRest(client)
pass
@pytest.mark.skip
def test_token_refresh_data(self, client: HiroClient):
from arago.hiro.backend.six.auth import Hiro6AuthData
auth = Hiro6AuthData(client)
pass
@pytest.mark.skip
def test_token_refresh_model(self, client: HiroClient):
from arago.hiro.backend.six.auth import Hiro6AuthModel
auth = Hiro6AuthModel(client)
pass
def test_token_revoke_rest(self, client: HiroClient, credentials: SessionCredentials):
from arago.hiro.backend.six.auth import Hiro6AuthRest
auth = Hiro6AuthRest(client)
req_data = {
'client_id': credentials.client.id,
}
res = auth.revoke(req_data)
res.raise_for_status()
assert isinstance(res, Response)
def test_token_revoke_data(self, client: HiroClient, credentials: SessionCredentials):
from arago.hiro.backend.six.auth import Hiro6AuthData
auth = Hiro6AuthData(client)
auth.revoke(credentials.client.id)
def test_token_revoke_model(self, client: HiroClient, credentials: SessionCredentials):
from arago.hiro.backend.six.auth import Hiro6AuthModel
auth = Hiro6AuthModel(client)
auth.revoke(credentials.client)
class TestClassGraph:
def test_vertex_create_rest(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphRest
graph = Hiro6GraphRest(client)
vertex_type = OgitEntity.OGIT_COMMENT.value.name.uri
res = graph.vertex.create(vertex_type, {})
res.raise_for_status()
assert isinstance(res, Response)
pass
def test_vertex_create_data(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphData
graph = Hiro6GraphData(client)
vertex_type = OgitEntity.OGIT_COMMENT.value.name.uri
res = graph.vertex.create(vertex_type)
assert isinstance(res, dict)
pass
def test_vertex_create_model(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphModel
graph = Hiro6GraphModel(client)
vertex_type = OgitEntity.OGIT_COMMENT
res = graph.vertex.create(vertex_type)
assert isinstance(res, Vertex)
pass
def test_vertex_get_rest(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphRest
graph = Hiro6GraphRest(client)
vertex_id = OgitEntity.OGIT_COMMENT.value.name.uri
res = graph.vertex.get(vertex_id)
res.raise_for_status()
assert isinstance(res, Response)
pass
def test_vertex_get_data(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphData
graph = Hiro6GraphData(client)
vertex_id = OgitEntity.OGIT_COMMENT.value.name.uri
res = graph.vertex.get(vertex_id)
assert isinstance(res, dict)
pass
def test_vertex_get_model(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphModel
graph = Hiro6GraphModel(client)
vertex_id = VertexId(OgitEntity.OGIT_COMMENT.value.name.uri)
res = graph.vertex.get(vertex_id)
assert isinstance(res, Vertex)
pass
def test_vertex_update_rest(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphModel
from arago.hiro.backend.six.graph import Hiro6GraphRest
graph_m = Hiro6GraphModel(client)
graph = Hiro6GraphRest(client)
comment_v = graph_m.vertex.create(OgitEntity.OGIT_COMMENT)
res = graph.vertex.update(comment_v.id, {})
res.raise_for_status()
assert isinstance(res, Response)
pass
def test_vertex_update_data(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphModel
from arago.hiro.backend.six.graph import Hiro6GraphData
graph_m = Hiro6GraphModel(client)
graph = Hiro6GraphData(client)
comment_v = graph_m.vertex.create(OgitEntity.OGIT_COMMENT)
res = graph.vertex.update(comment_v.id, {})
assert isinstance(res, dict)
pass
def test_vertex_update_model(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphModel
graph = Hiro6GraphModel(client)
comment_v = graph.vertex.create(OgitEntity.OGIT_COMMENT)
res = graph.vertex.update(comment_v, {})
assert isinstance(res, Vertex)
pass
def test_vertex_delete_rest(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphModel
from arago.hiro.backend.six.graph import Hiro6GraphRest
graph_m = Hiro6GraphModel(client)
graph = Hiro6GraphRest(client)
comment_v = graph_m.vertex.create(OgitEntity.OGIT_COMMENT)
res = graph.vertex.delete(comment_v.id)
res.raise_for_status()
assert isinstance(res, Response)
pass
def test_vertex_delete_data(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphModel
from arago.hiro.backend.six.graph import Hiro6GraphData
graph_m = Hiro6GraphModel(client)
graph = Hiro6GraphData(client)
comment_v = graph_m.vertex.create(OgitEntity.OGIT_COMMENT)
res = graph.vertex.delete(comment_v.id)
assert isinstance(res, dict)
pass
def test_vertex_delete_model(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphModel
graph = Hiro6GraphModel(client)
comment_v = graph.vertex.create(OgitEntity.OGIT_COMMENT)
res = graph.vertex.delete(comment_v)
assert isinstance(res, Vertex)
pass
def test_vertex_history_rest(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphRest
res = client.root.model.search.index(rf'''ogit\/_type:"{OgitEntity.OGIT_LICENSE_REQUEST.value.name.uri!s}"''')
vertex = next(res)
graph = Hiro6GraphRest(client)
v_res = graph.vertex.history(vertex.id)
v_res.raise_for_status()
assert isinstance(v_res, Response)
pass
def test_vertex_history_data(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphData
res_1 = client.root.model.search.index(rf'''ogit\/_type:"{OgitEntity.OGIT_LICENSE_REQUEST.value.name.uri!s}"''')
vertex = next(res_1)
graph = Hiro6GraphData(client)
res_2 = graph.vertex.history(vertex.id)
assert isinstance(res_2, Generator)
vertex = next(res_2)
assert isinstance(vertex, dict)
pass
def test_vertex_history_model_element(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphModel
res_1 = client.root.model.search.index(rf'''ogit\/_type:"{OgitEntity.OGIT_LICENSE_REQUEST.value.name.uri!s}"''')
vertex = next(res_1)
graph = Hiro6GraphModel(client)
res_2 = graph.vertex.history(vertex, res_format=HistoryFormat.ELEMENT)
assert isinstance(res_2, Generator)
vertex = next(res_2)
assert isinstance(vertex, Vertex)
pass
def test_vertex_history_model_full(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphModel
res_1 = client.root.model.search.index(rf'''ogit\/_type:"{OgitEntity.OGIT_LICENSE_REQUEST.value.name.uri!s}"''')
vertex = next(res_1)
graph = Hiro6GraphModel(client)
res_2 = graph.vertex.history(vertex, res_format=HistoryFormat.FULL)
assert isinstance(res_2, Generator)
entry = next(res_2)
assert isinstance(entry, HistoryEntry)
vertex = entry.data
assert isinstance(vertex, Vertex)
pass
def test_vertex_history_model_diff(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphModel
res_1 = client.root.model.search.index(rf'''ogit\/_type:"{OgitEntity.OGIT_LICENSE_REQUEST.value.name.uri!s}"''')
vertex = next(res_1)
graph = Hiro6GraphModel(client)
res_2 = graph.vertex.history(vertex, res_format=HistoryFormat.DIFF)
assert isinstance(res_2, Generator)
diff = next(res_2)
diff = next(res_2)
assert isinstance(diff, HistoryDiff)
replaced = diff.replaced
assert isinstance(replaced, MappingProxyType)
keys = iter(replaced)
key = next(keys)
assert isinstance(key, Attribute)
pass
def test_edge_create_rest(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphModel
from arago.hiro.backend.six.graph import Hiro6GraphRest
graph_m = Hiro6GraphModel(client)
graph = Hiro6GraphRest(client)
vertex_a = graph_m.vertex.create(OgitEntity.OGIT_ATTACHMENT)
vertex_b = graph_m.vertex.create(OgitEntity.OGIT_COMMENT)
req_data = {
'out': str(vertex_a.id),
'in': str(vertex_b.id),
}
edge_type = OgitVerb.OGIT_BELONGS.value.name.uri
res = graph.edge.create(edge_type, req_data)
res.raise_for_status()
isinstance(res, Response)
pass
def test_edge_create_data(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphModel
from arago.hiro.backend.six.graph import Hiro6GraphData
graph_m = Hiro6GraphModel(client)
graph = Hiro6GraphData(client)
vertex_a = graph_m.vertex.create(OgitEntity.OGIT_ATTACHMENT)
vertex_b = graph_m.vertex.create(OgitEntity.OGIT_COMMENT)
edge_type = OgitVerb.OGIT_BELONGS.value.name.uri
res = graph.edge.create(vertex_a.id, edge_type, vertex_b.id)
isinstance(res, dict)
pass
def test_edge_create_model(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphModel
graph = Hiro6GraphModel(client)
vertex_a = graph.vertex.create(OgitEntity.OGIT_ATTACHMENT)
vertex_b = graph.vertex.create(OgitEntity.OGIT_COMMENT)
edge_type = OgitVerb.OGIT_BELONGS
res = graph.edge.create(vertex_a, edge_type, vertex_b)
isinstance(res, Edge)
pass
def test_edge_delete_rest(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphModel
from arago.hiro.backend.six.graph import Hiro6GraphRest
graph_m = Hiro6GraphModel(client)
graph = Hiro6GraphRest(client)
vertex_a = graph_m.vertex.create(OgitEntity.OGIT_ATTACHMENT)
vertex_b = graph_m.vertex.create(OgitEntity.OGIT_COMMENT)
edge_c = graph_m.edge.create(vertex_a, OgitVerb.OGIT_BELONGS, vertex_b)
res = graph.edge.delete(edge_c.id)
res.raise_for_status()
isinstance(res, Response)
pass
def test_edge_delete_data(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphModel
from arago.hiro.backend.six.graph import Hiro6GraphData
graph_m = Hiro6GraphModel(client)
graph = Hiro6GraphData(client)
vertex_a = graph_m.vertex.create(OgitEntity.OGIT_ATTACHMENT)
vertex_b = graph_m.vertex.create(OgitEntity.OGIT_COMMENT)
edge_c = graph_m.edge.create(vertex_a, OgitVerb.OGIT_BELONGS, vertex_b)
res = graph.edge.delete(edge_c.id)
isinstance(res, dict)
pass
def test_edge_delete_model(self, client: HiroClient):
from arago.hiro.backend.six.graph import Hiro6GraphModel
graph = Hiro6GraphModel(client)
vertex_a = graph.vertex.create(OgitEntity.OGIT_ATTACHMENT)
vertex_b = graph.vertex.create(OgitEntity.OGIT_COMMENT)
edge_c = graph.edge.create(vertex_a, OgitVerb.OGIT_BELONGS, vertex_b)
res = graph.edge.delete(edge_c)
isinstance(res, Edge)
pass
class TestClassFoo:
@pytest.mark.skip
def test_search_blob(self, client: HiroClient):
from arago.hiro.backend.six.search import Hiro6SearchModel
search = Hiro6SearchModel(client)
res = search.index(
rf'+ogit\/_type:"{OgitEntity.OGIT_ATTACHMENT.id}"'
r' -ogit\/_creator:"jharth@arago.co"'
r' -ogit\/_creator:"mgrohrock@arago.co"'
r' -ogit\/_creator:"cschulz@arago.co"'
)
import pprint
for vertex in res:
pprint.pprint(vertex.to_dict())
pass
@pytest.mark.skip
def test_search_ts(self, client: HiroClient):
from arago.hiro.backend.six.search import Hiro6SearchModel
search = Hiro6SearchModel(client)
with search.index(
rf'+ogit\/_type:"{OgitEntity.OGIT_TIME_SERIES.id}"'
r' -\/DataName:"Time 99 percentile"'
) as res:
yield from res
pass
class TestClassSearch:
def test_index_rest(self, client: HiroClient):
from arago.hiro.backend.six.search import Hiro6SearchRest
search = Hiro6SearchRest(client)
res = search.index({'query': r'+ogit\/_id="ogit/Node"'})
res.raise_for_status()
assert isinstance(res, Response)
pass
def test_index_data(self, client: HiroClient):
from arago.hiro.backend.six.search import Hiro6SearchData
search = Hiro6SearchData(client)
res = search.index(r'+ogit\/_id:"ogit/Node"')
assert isinstance(res, Generator)
i = next(res)
assert isinstance(i, dict)
pass
def test_index_model(self, client: HiroClient):
from arago.hiro.backend.six.search import Hiro6SearchModel
search = Hiro6SearchModel(client)
res = search.index(r'+ogit\/_id="ogit/Node"')
assert isinstance(res, Generator)
i = next(res)
assert isinstance(i, Vertex)
pass
def test_graph_rest(self, client: HiroClient):
from arago.hiro.backend.six.search import Hiro6SearchRest
search = Hiro6SearchRest(client)
res = search.graph({'root': 'ogit/Node', 'query': 'out()'})
res.raise_for_status()
assert isinstance(res, Response)
pass
def test_graph_data(self, client: HiroClient):
from arago.hiro.backend.six.search import Hiro6SearchData
search = Hiro6SearchData(client)
res = search.graph('ogit/Node', 'out()')
assert isinstance(res, Generator)
i = next(res)
assert isinstance(i, dict)
pass
def test_graph_model(self, client: HiroClient):
from arago.hiro.backend.six.search import Hiro6SearchModel
search = Hiro6SearchModel(client)
res = search.graph(VertexId('ogit/Node'), 'out()')
assert isinstance(res, Generator)
i = next(res)
assert isinstance(i, Vertex)
pass
def test_graph_model_2(self, client: HiroClient):
from arago.hiro.backend.six.search import Hiro6SearchModel
search = Hiro6SearchModel(client)
res = search.graph(VertexId('ogit/Node'), 'outE()', result_type=Edge)
assert isinstance(res, Generator)
i = next(res)
assert isinstance(i, Edge)
pass
class TestClassStorage:
@pytest.fixture
def empty_blob_vertex(self, client: HiroClient) -> Generator[BlobVertex, None, None]:
graph = client.model.graph
vertex = graph.vertex.create(OgitEntity.OGIT_ATTACHMENT)
yield vertex
vertex.delete()
@pytest.fixture
def empty_ts_vertex(self, client: HiroClient) -> Generator[TimeSeriesVertex, None, None]:
graph = client.model.graph
vertex = graph.vertex.create(OgitEntity.OGIT_TIME_SERIES)
yield vertex
vertex.delete()
@pytest.fixture
def existing_ts_vertex(self, client: HiroClient) -> Generator[TimeSeriesVertex, None, None]:
search = client.model.search
res = search.index(
rf'+ogit\/_type:"{OgitEntity.OGIT_TIME_SERIES.value.name.uri}"'
r' -\/DataName:"Time 99 percentile"',
limit=1
)
yield from res
@pytest.fixture
def isaac_asimov_birth_day(self) -> datetime:
return datetime.combine(date.fromisoformat('1920-01-02'), time(), timezone.utc)
def test_blob_set_rest(self, client: HiroClient, empty_blob_vertex: BlobVertex, png_img: bytes):
from arago.hiro.backend.six.storage import Hiro6StorageRest
storage = Hiro6StorageRest(client)
res = storage.blob.set(empty_blob_vertex.id, png_img)
res.raise_for_status()
assert isinstance(res, Response)
pass
def test_blob_set_data(self, client: HiroClient, empty_blob_vertex: BlobVertex, png_img: bytes):
from arago.hiro.backend.six.storage import Hiro6StorageData
storage = Hiro6StorageData(client)
storage.blob.set(empty_blob_vertex.id, png_img)
pass
def test_blob_set_model(self, client: HiroClient, empty_blob_vertex: BlobVertex, png_img: bytes):
from arago.hiro.backend.six.storage import Hiro6StorageModel
storage = Hiro6StorageModel(client)
storage.blob.set(empty_blob_vertex, png_img)
pass
def test_blob_get_rest(self, client: HiroClient, empty_blob_vertex: BlobVertex, png_img: bytes):
from arago.hiro.backend.six.storage import Hiro6StorageRest
storage = Hiro6StorageRest(client)
storage.blob.set(empty_blob_vertex.id, png_img)
res = storage.blob.get(empty_blob_vertex.id)
res.raise_for_status()
assert isinstance(res, Response)
pass
def test_blob_get_data(self, client: HiroClient, empty_blob_vertex: BlobVertex, png_img: bytes):
from arago.hiro.backend.six.storage import Hiro6StorageData
storage = Hiro6StorageData(client)
storage.blob.set(empty_blob_vertex.id, png_img)
res = storage.blob.get(empty_blob_vertex.id)
assert isinstance(res, ContextManager)
with res as g:
assert isinstance(g, Generator)
i = next(g)
assert isinstance(i, bytes)
pass
def test_blob_get_model(self, client: HiroClient, empty_blob_vertex: BlobVertex, png_img: bytes):
from arago.hiro.backend.six.storage import Hiro6StorageModel
storage = Hiro6StorageModel(client)
storage.blob.set(empty_blob_vertex, png_img)
res = storage.blob.get(empty_blob_vertex)
assert isinstance(res, ContextManager)
with res as g:
assert isinstance(g, Generator)
i = next(g)
assert isinstance(i, bytes)
pass
@pytest.mark.skip
def test_log_get_rest(self, client: HiroClient):
from arago.hiro.backend.six.storage import Hiro6StorageRest
storage = Hiro6StorageRest(client)
res = storage.log.get()
res.raise_for_status()
assert isinstance(res, Response)
pass
@pytest.mark.skip
def test_log_get_data(self, client: HiroClient):
from arago.hiro.backend.six.storage import Hiro6StorageData
storage = Hiro6StorageData(client)
res = storage.log.get()
assert isinstance(res, Generator)
i = next(res)
assert isinstance(i, dict)
pass
@pytest.mark.skip
def test_log_get_model(self, client: HiroClient):
from arago.hiro.backend.six.storage import Hiro6StorageModel
storage = Hiro6StorageModel(client)
res = storage.log.get()
assert isinstance(res, Generator)
i = next(res)
assert isinstance(i, Edge)
pass
def test_ts_get_rest(self, client: HiroClient, existing_ts_vertex: Vertex):
from arago.hiro.backend.six.storage import Hiro6StorageRest
storage = Hiro6StorageRest(client)
res = storage.ts.get(existing_ts_vertex.id)
res.raise_for_status()
assert isinstance(res, Response)
pass
def test_ts_get_data(self, client: HiroClient, existing_ts_vertex: Vertex):
from arago.hiro.backend.six.storage import Hiro6StorageData
storage = Hiro6StorageData(client)
res = storage.ts.get(existing_ts_vertex.id)
assert isinstance(res, Generator)
i = next(res)
assert isinstance(i, dict)
pass
def test_ts_get_model(self, client: HiroClient, existing_ts_vertex: TimeSeriesVertex):
from arago.hiro.backend.six.storage import Hiro6StorageModel
storage = Hiro6StorageModel(client)
res = storage.ts.get(existing_ts_vertex)
assert isinstance(res, Generator)
i = next(res)
assert isinstance(i, TimeSeriesValue)
pass
def test_ts_add_rest(self, client: HiroClient, empty_ts_vertex: Vertex):
from arago.hiro.backend.six.storage import Hiro6StorageRest
storage = Hiro6StorageRest(client)
body = json.dumps({
'timestamp': datetime_to_timestamp_ms(
datetime.combine(date.fromisoformat('1920-01-02'), time(), timezone.utc)),
'value': 'Isaac Asimov',
})
res = storage.ts.add(empty_ts_vertex.id, body)
res.raise_for_status()
assert isinstance(res, Response)
pass
def test_ts_add_data(self, client: HiroClient, empty_ts_vertex: Vertex):
def g():
yield {
'timestamp': datetime_to_timestamp_ms(
datetime.combine(date.fromisoformat('1920-01-02'), time(), timezone.utc)),
'value': 'Isaac Asimov',
}
from arago.hiro.backend.six.storage import Hiro6StorageData
storage = Hiro6StorageData(client)
storage.ts.add(empty_ts_vertex.id, g())
pass
def test_ts_add_model(self, client: HiroClient, empty_ts_vertex: TimeSeriesVertex,
isaac_asimov_birth_day: datetime):
def g():
yield TimeSeriesValue(
timestamp=datetime.combine(date.fromisoformat('1920-01-02'), time(), timezone.utc),
value='Isaac Asimov',
)
from arago.hiro.backend.six.storage import Hiro6StorageModel
storage = Hiro6StorageModel(client)
storage.ts.add(empty_ts_vertex, g())
res = storage.ts.get(empty_ts_vertex,
start=datetime.combine(date.fromisoformat('1900-01-01'), time(), timezone.utc))
assert isinstance(res, Generator)
i = next(res)
assert isinstance(i, TimeSeriesValue)
pass
| 39.18706 | 120 | 0.671847 | 3,245 | 27,862 | 5.6151 | 0.060401 | 0.041985 | 0.059217 | 0.08342 | 0.872949 | 0.837605 | 0.820976 | 0.803908 | 0.785632 | 0.748587 | 0 | 0.010421 | 0.23882 | 27,862 | 710 | 121 | 39.242254 | 0.848736 | 0.002189 | 0 | 0.683788 | 0 | 0 | 0.034067 | 0.023059 | 0 | 0 | 0 | 0 | 0.117175 | 1 | 0.120385 | false | 0.11236 | 0.149278 | 0.00321 | 0.284109 | 0.00321 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
971a508b3b19df26239489f2de55f967d5176afa | 36 | py | Python | pyblazing/pyblazing/apiv2/algebra/__init__.py | msadang/blazingsql | 5fe3e418dbee4a3961998b0e25ec81100a1a1490 | [
"Apache-2.0"
] | null | null | null | pyblazing/pyblazing/apiv2/algebra/__init__.py | msadang/blazingsql | 5fe3e418dbee4a3961998b0e25ec81100a1a1490 | [
"Apache-2.0"
] | null | null | null | pyblazing/pyblazing/apiv2/algebra/__init__.py | msadang/blazingsql | 5fe3e418dbee4a3961998b0e25ec81100a1a1490 | [
"Apache-2.0"
] | null | null | null | from .json_plan import get_json_plan | 36 | 36 | 0.888889 | 7 | 36 | 4.142857 | 0.714286 | 0.551724 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 36 | 1 | 36 | 36 | 0.878788 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
975d43b5000138d6dc21260ddd6b1c9dfaf67be5 | 8,181 | py | Python | tests/test_io.py | brews/dearprudence | c73d32346d6f0b276dfd5a9b3c8791b17a1fe026 | [
"Apache-2.0"
] | 2 | 2021-11-24T14:24:20.000Z | 2021-11-30T17:14:55.000Z | tests/test_io.py | brews/dearprudence | c73d32346d6f0b276dfd5a9b3c8791b17a1fe026 | [
"Apache-2.0"
] | null | null | null | tests/test_io.py | brews/dearprudence | c73d32346d6f0b276dfd5a9b3c8791b17a1fe026 | [
"Apache-2.0"
] | null | null | null | import io
from dearprudence import read_params, write_params, Cmip6Record, SimpleRun
def test_read_params_reads():
"""
Test that dodola.read_params reads two params from JSON string in parameter file
"""
expected_p1 = SimpleRun(
target="historical",
variable_id="tasmax",
historical=Cmip6Record(
activity_id="CMIP",
experiment_id="historical",
table_id="day",
variable_id="tasmax",
source_id="ACCESS-CM2",
institution_id="CSIRO-ARCCSS",
member_id="r1i1p1f1",
grid_label="gn",
version="20191108",
),
ssp=Cmip6Record(
activity_id="ScenarioMIP",
experiment_id="ssp370",
table_id="day",
variable_id="tasmax",
source_id="ACCESS-CM2",
institution_id="CSIRO-ARCCSS",
member_id="r1i1p1f1",
grid_label="gn",
version="20191108",
),
)
expected_p2 = SimpleRun(
target="ssp",
variable_id="tasmax",
historical=Cmip6Record(
activity_id="CMIP",
experiment_id="historical",
table_id="day",
variable_id="tasmax",
source_id="ACCESS-CM2",
institution_id="CSIRO-ARCCSS",
member_id="r1i1p1f1",
grid_label="gn",
version="20191108",
),
ssp=Cmip6Record(
activity_id="ScenarioMIP",
experiment_id="ssp370",
table_id="day",
variable_id="tasmax",
source_id="ACCESS-CM2",
institution_id="CSIRO-ARCCSS",
member_id="r1i1p1f1",
grid_label="gn",
version="20191108",
),
)
fl = io.StringIO(
"""jobs: |
[
{
"target": "historical",
"variable_id": "tasmax",
"historical": {
"activity_id": "CMIP",
"experiment_id": "historical",
"table_id": "day",
"variable_id": "tasmax",
"source_id": "ACCESS-CM2",
"institution_id": "CSIRO-ARCCSS",
"member_id": "r1i1p1f1",
"grid_label": "gn",
"version": "20191108"
},
"ssp": {
"activity_id": "ScenarioMIP",
"experiment_id": "ssp370",
"table_id": "day",
"variable_id": "tasmax",
"source_id": "ACCESS-CM2",
"institution_id": "CSIRO-ARCCSS",
"member_id": "r1i1p1f1",
"grid_label": "gn",
"version": "20191108"
}
},
{
"target": "ssp",
"variable_id": "tasmax",
"historical": {
"activity_id": "CMIP",
"experiment_id": "historical",
"table_id": "day",
"variable_id": "tasmax",
"source_id": "ACCESS-CM2",
"institution_id": "CSIRO-ARCCSS",
"member_id": "r1i1p1f1",
"grid_label": "gn",
"version": "20191108"
},
"ssp": {
"activity_id": "ScenarioMIP",
"experiment_id": "ssp370",
"table_id": "day",
"variable_id": "tasmax",
"source_id": "ACCESS-CM2",
"institution_id": "CSIRO-ARCCSS",
"member_id": "r1i1p1f1",
"grid_label": "gn",
"version": "20191108"
}
}
]
"""
)
p1, p2 = read_params(fl)
assert p1 == expected_p1
assert p2 == expected_p2
def test_write_params_writes(tmp_path):
"""
Test that dodola.write_params writes two SimpleRuns to a parameter file.
"""
params = [
SimpleRun(
target="historical",
variable_id="tasmax",
historical=Cmip6Record(
activity_id="CMIP",
experiment_id="historical",
table_id="day",
variable_id="tasmax",
source_id="ACCESS-CM2",
institution_id="CSIRO-ARCCSS",
member_id="r1i1p1f1",
grid_label="gn",
version="20191108",
),
ssp=Cmip6Record(
activity_id="ScenarioMIP",
experiment_id="ssp370",
table_id="day",
variable_id="tasmax",
source_id="ACCESS-CM2",
institution_id="CSIRO-ARCCSS",
member_id="r1i1p1f1",
grid_label="gn",
version="20191108",
),
),
SimpleRun(
target="ssp",
variable_id="tasmax",
historical=Cmip6Record(
activity_id="CMIP",
experiment_id="historical",
table_id="day",
variable_id="tasmax",
source_id="ACCESS-CM2",
institution_id="CSIRO-ARCCSS",
member_id="r1i1p1f1",
grid_label="gn",
version="20191108",
),
ssp=Cmip6Record(
activity_id="ScenarioMIP",
experiment_id="ssp370",
table_id="day",
variable_id="tasmax",
source_id="ACCESS-CM2",
institution_id="CSIRO-ARCCSS",
member_id="r1i1p1f1",
grid_label="gn",
version="20191108",
),
),
]
path = tmp_path / "test_write_params_writes.yaml"
expected_contents = """jobs: |
[
{
"target": "historical",
"variable_id": "tasmax",
"historical": {
"activity_id": "CMIP",
"experiment_id": "historical",
"table_id": "day",
"variable_id": "tasmax",
"source_id": "ACCESS-CM2",
"institution_id": "CSIRO-ARCCSS",
"member_id": "r1i1p1f1",
"grid_label": "gn",
"version": "20191108"
},
"ssp": {
"activity_id": "ScenarioMIP",
"experiment_id": "ssp370",
"table_id": "day",
"variable_id": "tasmax",
"source_id": "ACCESS-CM2",
"institution_id": "CSIRO-ARCCSS",
"member_id": "r1i1p1f1",
"grid_label": "gn",
"version": "20191108"
}
},
{
"target": "ssp",
"variable_id": "tasmax",
"historical": {
"activity_id": "CMIP",
"experiment_id": "historical",
"table_id": "day",
"variable_id": "tasmax",
"source_id": "ACCESS-CM2",
"institution_id": "CSIRO-ARCCSS",
"member_id": "r1i1p1f1",
"grid_label": "gn",
"version": "20191108"
},
"ssp": {
"activity_id": "ScenarioMIP",
"experiment_id": "ssp370",
"table_id": "day",
"variable_id": "tasmax",
"source_id": "ACCESS-CM2",
"institution_id": "CSIRO-ARCCSS",
"member_id": "r1i1p1f1",
"grid_label": "gn",
"version": "20191108"
}
}
]
"""
write_params(path, params)
contents = path.read_text()
assert contents == expected_contents
def test_reread_written_params(tmp_path):
"""
Test dodola.read_params reads matching params written by dodola.write_params.
"""
params_written = [
SimpleRun(
target="ssp",
variable_id="tasmax",
historical=Cmip6Record(
activity_id="CMIP",
experiment_id="historical",
table_id="day",
variable_id="tasmax",
source_id="ACCESS-CM2",
institution_id="CSIRO-ARCCSS",
member_id="r1i1p1f1",
grid_label="gn",
version="20191108",
),
ssp=Cmip6Record(
activity_id="ScenarioMIP",
experiment_id="ssp370",
table_id="day",
variable_id="tasmax",
source_id="ACCESS-CM2",
institution_id="CSIRO-ARCCSS",
member_id="r1i1p1f1",
grid_label="gn",
version="20191108",
),
)
]
path = tmp_path / "test_reread_written_params.yaml"
write_params(path, params_written)
params_reread = read_params(str(path))
assert params_written == params_reread
| 29.113879 | 84 | 0.496883 | 721 | 8,181 | 5.368932 | 0.092926 | 0.069749 | 0.111599 | 0.083699 | 0.82666 | 0.82666 | 0.82666 | 0.82666 | 0.82666 | 0.82666 | 0 | 0.054002 | 0.366214 | 8,181 | 280 | 85 | 29.217857 | 0.692575 | 0.028236 | 0 | 0.802885 | 0 | 0 | 0.345867 | 0.009323 | 0 | 0 | 0 | 0 | 0.019231 | 1 | 0.014423 | false | 0 | 0.009615 | 0 | 0.024038 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
9769e15a0e8f6b1b8aa3d8b270e8ae468a2f6cfa | 147 | py | Python | MeshToolkit/Tool/__init__.py | microy/MeshToolkit | df239e73fcd78c726e14c6b92eef7318da5e4297 | [
"MIT"
] | 4 | 2017-03-03T15:18:02.000Z | 2020-01-26T23:23:34.000Z | MeshToolkit/Tool/__init__.py | microy/PyMeshToolkit | df239e73fcd78c726e14c6b92eef7318da5e4297 | [
"MIT"
] | null | null | null | MeshToolkit/Tool/__init__.py | microy/PyMeshToolkit | df239e73fcd78c726e14c6b92eef7318da5e4297 | [
"MIT"
] | 2 | 2017-07-16T08:59:30.000Z | 2018-10-19T15:54:35.000Z | from . import Colormap
from .Colormap import *
from . import Primitive
from .Primitive import *
from . import Statistics
from .Statistics import *
| 21 | 25 | 0.77551 | 18 | 147 | 6.333333 | 0.277778 | 0.263158 | 0.280702 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163265 | 147 | 6 | 26 | 24.5 | 0.926829 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
977bcf73d6fdd0190968ee07e9b979ca2d43165e | 30,633 | py | Python | transformer/Models.py | bahducoup/factorized_training | 0af38f16338a9bcfcc11091b1a6b75befd67f234 | [
"MIT"
] | null | null | null | transformer/Models.py | bahducoup/factorized_training | 0af38f16338a9bcfcc11091b1a6b75befd67f234 | [
"MIT"
] | null | null | null | transformer/Models.py | bahducoup/factorized_training | 0af38f16338a9bcfcc11091b1a6b75befd67f234 | [
"MIT"
] | null | null | null | ''' Define the Transformer model '''
import torch
import torch.nn as nn
import numpy as np
from transformer.Layers import EncoderLayer, DecoderLayer, LowRankEncoderLayer, LowRankDecoderLayer, LowRankResidualEncoderLayer, LowRankResidualDecoderLayer
from torch.cuda.amp import autocast
import logging
logging.basicConfig()
logger = logging.getLogger()
logger.setLevel(logging.INFO)
__author__ = "Yu-Hsiang Huang"
def get_pad_mask(seq, pad_idx):
return (seq != pad_idx).unsqueeze(-2)
def get_subsequent_mask(seq):
''' For masking out the subsequent info. '''
sz_b, len_s = seq.size()
subsequent_mask = (1 - torch.triu(
torch.ones((1, len_s, len_s), device=seq.device), diagonal=1)).bool()
return subsequent_mask
class PositionalEncoding(nn.Module):
def __init__(self, d_hid, n_position=200):
super(PositionalEncoding, self).__init__()
# Not a parameter
self.register_buffer('pos_table', self._get_sinusoid_encoding_table(n_position, d_hid))
def _get_sinusoid_encoding_table(self, n_position, d_hid):
''' Sinusoid position encoding table '''
# TODO: make it with torch instead of numpy
def get_position_angle_vec(position):
return [position / np.power(10000, 2 * (hid_j // 2) / d_hid) for hid_j in range(d_hid)]
sinusoid_table = np.array([get_position_angle_vec(pos_i) for pos_i in range(n_position)])
sinusoid_table[:, 0::2] = np.sin(sinusoid_table[:, 0::2]) # dim 2i
sinusoid_table[:, 1::2] = np.cos(sinusoid_table[:, 1::2]) # dim 2i+1
return torch.FloatTensor(sinusoid_table).unsqueeze(0)
@autocast()
def forward(self, x):
return x + self.pos_table[:, :x.size(1)].clone().detach()
class Encoder(nn.Module):
''' A encoder model with self attention mechanism. '''
def __init__(
self, n_src_vocab, d_word_vec, n_layers, n_head, d_k, d_v,
d_model, d_inner, pad_idx, dropout=0.1, n_position=200, scale_emb=False):
super().__init__()
self.src_word_emb = nn.Embedding(n_src_vocab, d_word_vec, padding_idx=pad_idx)
self.position_enc = PositionalEncoding(d_word_vec, n_position=n_position)
self.dropout = nn.Dropout(p=dropout)
self.layer_stack = nn.ModuleList([
EncoderLayer(d_model, d_inner, n_head, d_k, d_v, dropout=dropout)
for _ in range(n_layers)])
self.layer_norm = nn.LayerNorm(d_model, eps=1e-6)
self.scale_emb = scale_emb
self.d_model = d_model
@autocast()
def forward(self, src_seq, src_mask, return_attns=False):
enc_slf_attn_list = []
# -- Forward
enc_output = self.src_word_emb(src_seq)
if self.scale_emb:
enc_output *= self.d_model ** 0.5
enc_output = self.dropout(self.position_enc(enc_output))
enc_output = self.layer_norm(enc_output)
for enc_layer in self.layer_stack:
enc_output, enc_slf_attn = enc_layer(enc_output, slf_attn_mask=src_mask)
enc_slf_attn_list += [enc_slf_attn] if return_attns else []
if return_attns:
return enc_output, enc_slf_attn_list
return enc_output,
class LowRankEncoder(nn.Module):
''' A encoder model with self attention mechanism. '''
def __init__(
self, n_src_vocab, d_word_vec, n_layers, n_head, d_k, d_v,
d_model, d_inner, pad_idx, dropout=0.1, n_position=200, scale_emb=False):
super().__init__()
self.src_word_emb = nn.Embedding(n_src_vocab, d_word_vec, padding_idx=pad_idx)
self.position_enc = PositionalEncoding(d_word_vec, n_position=n_position)
self.dropout = nn.Dropout(p=dropout)
#self.layer_stack = nn.ModuleList([
# LowRankEncoderLayer(d_model, d_inner, n_head, d_k, d_v, dropout=dropout)
# for _ in range(n_layers)])
layer_stack = []
for layer_index in range(n_layers):
if layer_index < 1:
layer_stack.append(EncoderLayer(d_model, d_inner, n_head, d_k, d_v, dropout=dropout))
else:
layer_stack.append(LowRankEncoderLayer(d_model, d_inner, n_head, d_k, d_v, dropout=dropout))
self.layer_stack = nn.ModuleList(layer_stack)
self.layer_norm = nn.LayerNorm(d_model, eps=1e-6)
self.scale_emb = scale_emb
self.d_model = d_model
@autocast()
def forward(self, src_seq, src_mask, return_attns=False):
enc_slf_attn_list = []
# -- Forward
enc_output = self.src_word_emb(src_seq)
if self.scale_emb:
enc_output *= self.d_model ** 0.5
enc_output = self.dropout(self.position_enc(enc_output))
enc_output = self.layer_norm(enc_output)
for enc_layer in self.layer_stack:
enc_output, enc_slf_attn = enc_layer(enc_output, slf_attn_mask=src_mask)
enc_slf_attn_list += [enc_slf_attn] if return_attns else []
if return_attns:
return enc_output, enc_slf_attn_list
return enc_output,
class LowRankResidualEncoder(nn.Module):
''' A encoder model with self attention mechanism. '''
def __init__(
self, n_src_vocab, d_word_vec, n_layers, n_head, d_k, d_v,
d_model, d_inner, pad_idx, dropout=0.1, n_position=200, scale_emb=False):
super().__init__()
self.src_word_emb = nn.Embedding(n_src_vocab, d_word_vec, padding_idx=pad_idx)
self.position_enc = PositionalEncoding(d_word_vec, n_position=n_position)
self.dropout = nn.Dropout(p=dropout)
#self.layer_stack = nn.ModuleList([
# LowRankEncoderLayer(d_model, d_inner, n_head, d_k, d_v, dropout=dropout)
# for _ in range(n_layers)])
layer_stack = []
for layer_index in range(n_layers):
if layer_index < 1:
layer_stack.append(EncoderLayer(d_model, d_inner, n_head, d_k, d_v, dropout=dropout))
else:
layer_stack.append(LowRankResidualEncoderLayer(d_model, d_inner, n_head, d_k, d_v, dropout=dropout))
self.layer_stack = nn.ModuleList(layer_stack)
self.layer_norm = nn.LayerNorm(d_model, eps=1e-6)
self.scale_emb = scale_emb
self.d_model = d_model
@autocast()
def forward(self, src_seq, src_mask, return_attns=False):
enc_slf_attn_list = []
# -- Forward
enc_output = self.src_word_emb(src_seq)
if self.scale_emb:
enc_output *= self.d_model ** 0.5
enc_output = self.dropout(self.position_enc(enc_output))
enc_output = self.layer_norm(enc_output)
for enc_layer in self.layer_stack:
enc_output, enc_slf_attn = enc_layer(enc_output, slf_attn_mask=src_mask)
enc_slf_attn_list += [enc_slf_attn] if return_attns else []
if return_attns:
return enc_output, enc_slf_attn_list
return enc_output,
class AdaptEncoder(nn.Module):
''' A encoder model with self attention mechanism. '''
def __init__(
self, n_src_vocab, d_word_vec, n_layers, n_head, d_k, d_v,
d_model, d_inner, pad_idx, dropout=0.1, n_position=200, scale_emb=False, est_ranks=None):
super().__init__()
self.src_word_emb = nn.Embedding(n_src_vocab, d_word_vec, padding_idx=pad_idx)
#self.src_word_emb_u = nn.Embedding(n_src_vocab, int(d_word_vec/4), padding_idx=pad_idx)
#self.src_word_emb_v = nn.Linear(int(d_word_vec/4), d_word_vec)
self.position_enc = PositionalEncoding(d_word_vec, n_position=n_position)
self.dropout = nn.Dropout(p=dropout)
encoder_counter = 0
layer_stack = []
for layer_index in range(n_layers):
if layer_index < 1:
layer_stack.append(
EncoderLayer(d_model, d_inner, n_head, d_k, d_v, dropout=dropout)
)
else:
layer_stack.append(
AdaptEncoderLayer(d_model, d_inner, n_head, d_k, d_v,
dropout=dropout,
est_rank_enc=est_ranks[encoder_counter:encoder_counter+6])
)
encoder_counter += 6
self.layer_stack = nn.ModuleList(layer_stack)
self.layer_norm = nn.LayerNorm(d_model, eps=1e-6)
self.scale_emb = scale_emb
self.d_model = d_model
@autocast()
def forward(self, src_seq, src_mask, return_attns=False):
enc_slf_attn_list = []
# -- Forward
enc_output = self.src_word_emb_v(self.src_word_emb_u(src_seq))
if self.scale_emb:
enc_output *= self.d_model ** 0.5
enc_output = self.dropout(self.position_enc(enc_output))
enc_output = self.layer_norm(enc_output)
for enc_layer in self.layer_stack:
enc_output, enc_slf_attn = enc_layer(enc_output, slf_attn_mask=src_mask)
enc_slf_attn_list += [enc_slf_attn] if return_attns else []
if return_attns:
return enc_output, enc_slf_attn_list
return enc_output,
class Decoder(nn.Module):
''' A decoder model with self attention mechanism. '''
def __init__(
self, n_trg_vocab, d_word_vec, n_layers, n_head, d_k, d_v,
d_model, d_inner, pad_idx, n_position=200, dropout=0.1, scale_emb=False):
super().__init__()
self.trg_word_emb = nn.Embedding(n_trg_vocab, d_word_vec, padding_idx=pad_idx)
self.position_enc = PositionalEncoding(d_word_vec, n_position=n_position)
self.dropout = nn.Dropout(p=dropout)
self.layer_stack = nn.ModuleList([
DecoderLayer(d_model, d_inner, n_head, d_k, d_v, dropout=dropout)
for _ in range(n_layers)])
self.layer_norm = nn.LayerNorm(d_model, eps=1e-6)
self.scale_emb = scale_emb
self.d_model = d_model
@autocast()
def forward(self, trg_seq, trg_mask, enc_output, src_mask, return_attns=False):
dec_slf_attn_list, dec_enc_attn_list = [], []
# -- Forward
dec_output = self.trg_word_emb(trg_seq)
if self.scale_emb:
dec_output *= self.d_model ** 0.5
dec_output = self.dropout(self.position_enc(dec_output))
dec_output = self.layer_norm(dec_output)
for dec_layer in self.layer_stack:
dec_output, dec_slf_attn, dec_enc_attn = dec_layer(
dec_output, enc_output, slf_attn_mask=trg_mask, dec_enc_attn_mask=src_mask)
dec_slf_attn_list += [dec_slf_attn] if return_attns else []
dec_enc_attn_list += [dec_enc_attn] if return_attns else []
if return_attns:
return dec_output, dec_slf_attn_list, dec_enc_attn_list
return dec_output,
class LowRankDecoder(nn.Module):
''' A decoder model with self attention mechanism. '''
def __init__(
self, n_trg_vocab, d_word_vec, n_layers, n_head, d_k, d_v,
d_model, d_inner, pad_idx, n_position=200, dropout=0.1, scale_emb=False):
super().__init__()
self.trg_word_emb = nn.Embedding(n_trg_vocab, d_word_vec, padding_idx=pad_idx)
self.position_enc = PositionalEncoding(d_word_vec, n_position=n_position)
self.dropout = nn.Dropout(p=dropout)
#self.layer_stack = nn.ModuleList([
# LowRankDecoderLayer(d_model, d_inner, n_head, d_k, d_v, dropout=dropout)
# for _ in range(n_layers)])
layer_stack = []
for layer_index in range(n_layers):
if layer_index < 1:
layer_stack.append(DecoderLayer(d_model, d_inner, n_head, d_k, d_v, dropout=dropout))
else:
layer_stack.append(LowRankDecoderLayer(d_model, d_inner, n_head, d_k, d_v, dropout=dropout))
self.layer_stack = nn.ModuleList(layer_stack)
self.layer_norm = nn.LayerNorm(d_model, eps=1e-6)
self.scale_emb = scale_emb
self.d_model = d_model
@autocast()
def forward(self, trg_seq, trg_mask, enc_output, src_mask, return_attns=False):
dec_slf_attn_list, dec_enc_attn_list = [], []
# -- Forward
dec_output = self.trg_word_emb(trg_seq)
if self.scale_emb:
dec_output *= self.d_model ** 0.5
dec_output = self.dropout(self.position_enc(dec_output))
dec_output = self.layer_norm(dec_output)
for dec_layer in self.layer_stack:
dec_output, dec_slf_attn, dec_enc_attn = dec_layer(
dec_output, enc_output, slf_attn_mask=trg_mask, dec_enc_attn_mask=src_mask)
dec_slf_attn_list += [dec_slf_attn] if return_attns else []
dec_enc_attn_list += [dec_enc_attn] if return_attns else []
if return_attns:
return dec_output, dec_slf_attn_list, dec_enc_attn_list
return dec_output,
class LowRankResidualDecoder(nn.Module):
''' A decoder model with self attention mechanism. '''
def __init__(
self, n_trg_vocab, d_word_vec, n_layers, n_head, d_k, d_v,
d_model, d_inner, pad_idx, n_position=200, dropout=0.1, scale_emb=False):
super().__init__()
self.trg_word_emb = nn.Embedding(n_trg_vocab, d_word_vec, padding_idx=pad_idx)
self.position_enc = PositionalEncoding(d_word_vec, n_position=n_position)
self.dropout = nn.Dropout(p=dropout)
#self.layer_stack = nn.ModuleList([
# LowRankDecoderLayer(d_model, d_inner, n_head, d_k, d_v, dropout=dropout)
# for _ in range(n_layers)])
layer_stack = []
for layer_index in range(n_layers):
if layer_index < 1:
layer_stack.append(DecoderLayer(d_model, d_inner, n_head, d_k, d_v, dropout=dropout))
else:
layer_stack.append(LowRankResidualDecoderLayer(d_model, d_inner, n_head, d_k, d_v, dropout=dropout))
self.layer_stack = nn.ModuleList(layer_stack)
self.layer_norm = nn.LayerNorm(d_model, eps=1e-6)
self.scale_emb = scale_emb
self.d_model = d_model
@autocast()
def forward(self, trg_seq, trg_mask, enc_output, src_mask, return_attns=False):
dec_slf_attn_list, dec_enc_attn_list = [], []
# -- Forward
dec_output = self.trg_word_emb(trg_seq)
if self.scale_emb:
dec_output *= self.d_model ** 0.5
dec_output = self.dropout(self.position_enc(dec_output))
dec_output = self.layer_norm(dec_output)
for dec_layer in self.layer_stack:
dec_output, dec_slf_attn, dec_enc_attn = dec_layer(
dec_output, enc_output, slf_attn_mask=trg_mask, dec_enc_attn_mask=src_mask)
dec_slf_attn_list += [dec_slf_attn] if return_attns else []
dec_enc_attn_list += [dec_enc_attn] if return_attns else []
if return_attns:
return dec_output, dec_slf_attn_list, dec_enc_attn_list
return dec_output,
class AdaptDecoder(nn.Module):
''' A decoder model with self attention mechanism. '''
def __init__(
self, n_trg_vocab, d_word_vec, n_layers, n_head, d_k, d_v,
d_model, d_inner, pad_idx, n_position=200, dropout=0.1, scale_emb=False, est_ranks=None):
super().__init__()
self.trg_word_emb = nn.Embedding(n_trg_vocab, d_word_vec, padding_idx=pad_idx)
#self.trg_word_emb_u = nn.Embedding(n_trg_vocab, int(d_word_vec/4), padding_idx=pad_idx)
#self.trg_word_emb_v = nn.Linear(int(d_word_vec/4), d_word_vec)
self.position_enc = PositionalEncoding(d_word_vec, n_position=n_position)
self.dropout = nn.Dropout(p=dropout)
decoder_counter = int(6 * (n_layers - 1))
layer_stack = []
for layer_index in range(n_layers):
if layer_index < 1:
layer_stack.append(DecoderLayer(d_model, d_inner, n_head, d_k, d_v, dropout=dropout))
else:
layer_stack.append(AdaptDecoderLayer(d_model, d_inner, n_head, d_k, d_v, dropout=dropout,
est_rank_dec=est_ranks[decoder_counter:decoder_counter+10]))
decoder_counter += 10
self.layer_stack = nn.ModuleList(layer_stack)
self.layer_norm = nn.LayerNorm(d_model, eps=1e-6)
self.scale_emb = scale_emb
self.d_model = d_model
@autocast()
def forward(self, trg_seq, trg_mask, enc_output, src_mask, return_attns=False):
dec_slf_attn_list, dec_enc_attn_list = [], []
# -- Forward
dec_output = self.trg_word_emb_v(self.trg_word_emb_u(trg_seq))
if self.scale_emb:
dec_output *= self.d_model ** 0.5
dec_output = self.dropout(self.position_enc(dec_output))
dec_output = self.layer_norm(dec_output)
for dec_layer in self.layer_stack:
dec_output, dec_slf_attn, dec_enc_attn = dec_layer(
dec_output, enc_output, slf_attn_mask=trg_mask, dec_enc_attn_mask=src_mask)
dec_slf_attn_list += [dec_slf_attn] if return_attns else []
dec_enc_attn_list += [dec_enc_attn] if return_attns else []
if return_attns:
return dec_output, dec_slf_attn_list, dec_enc_attn_list
return dec_output,
class Transformer(nn.Module):
''' A sequence to sequence model with attention mechanism. '''
def __init__(
self, n_src_vocab, n_trg_vocab, src_pad_idx, trg_pad_idx,
d_word_vec=512, d_model=512, d_inner=2048,
n_layers=6, n_head=8, d_k=64, d_v=64, dropout=0.1, n_position=200,
trg_emb_prj_weight_sharing=True, emb_src_trg_weight_sharing=True,
scale_emb_or_prj='prj'):
super().__init__()
self.src_pad_idx, self.trg_pad_idx = src_pad_idx, trg_pad_idx
# In section 3.4 of paper "Attention Is All You Need", there is such detail:
# "In our model, we share the same weight matrix between the two
# embedding layers and the pre-softmax linear transformation...
# In the embedding layers, we multiply those weights by \sqrt{d_model}".
#
# Options here:
# 'emb': multiply \sqrt{d_model} to embedding output
# 'prj': multiply (\sqrt{d_model} ^ -1) to linear projection output
# 'none': no multiplication
assert scale_emb_or_prj in ['emb', 'prj', 'none']
scale_emb = (scale_emb_or_prj == 'emb') if trg_emb_prj_weight_sharing else False
self.scale_prj = (scale_emb_or_prj == 'prj') if trg_emb_prj_weight_sharing else False
self.d_model = d_model
self.encoder = Encoder(
n_src_vocab=n_src_vocab, n_position=n_position,
d_word_vec=d_word_vec, d_model=d_model, d_inner=d_inner,
n_layers=n_layers, n_head=n_head, d_k=d_k, d_v=d_v,
pad_idx=src_pad_idx, dropout=dropout, scale_emb=scale_emb)
self.decoder = Decoder(
n_trg_vocab=n_trg_vocab, n_position=n_position,
d_word_vec=d_word_vec, d_model=d_model, d_inner=d_inner,
n_layers=n_layers, n_head=n_head, d_k=d_k, d_v=d_v,
pad_idx=trg_pad_idx, dropout=dropout, scale_emb=scale_emb)
self.trg_word_prj = nn.Linear(d_model, n_trg_vocab, bias=False)
for p in self.parameters():
if p.dim() > 1:
nn.init.xavier_uniform_(p)
assert d_model == d_word_vec, \
'To facilitate the residual connections, \
the dimensions of all module outputs shall be the same.'
if trg_emb_prj_weight_sharing:
# Share the weight between target word embedding & last dense layer
self.trg_word_prj.weight = self.decoder.trg_word_emb.weight
if emb_src_trg_weight_sharing:
self.encoder.src_word_emb.weight = self.decoder.trg_word_emb.weight
@autocast()
def forward(self, src_seq, trg_seq):
src_mask = get_pad_mask(src_seq, self.src_pad_idx)
trg_mask = get_pad_mask(trg_seq, self.trg_pad_idx) & get_subsequent_mask(trg_seq)
enc_output, *_ = self.encoder(src_seq, src_mask)
dec_output, *_ = self.decoder(trg_seq, trg_mask, enc_output, src_mask)
seq_logit = self.trg_word_prj(dec_output)
if self.scale_prj:
seq_logit *= self.d_model ** -0.5
return seq_logit
#return seq_logit.view(-1, seq_logit.size(2))
class LowRankTransformer(nn.Module):
''' A sequence to sequence model with attention mechanism. '''
def __init__(
self, n_src_vocab, n_trg_vocab, src_pad_idx, trg_pad_idx,
d_word_vec=512, d_model=512, d_inner=2048,
n_layers=6, n_head=8, d_k=64, d_v=64, dropout=0.1, n_position=200,
trg_emb_prj_weight_sharing=True, emb_src_trg_weight_sharing=True,
scale_emb_or_prj='prj'):
super().__init__()
self.src_pad_idx, self.trg_pad_idx = src_pad_idx, trg_pad_idx
# In section 3.4 of paper "Attention Is All You Need", there is such detail:
# "In our model, we share the same weight matrix between the two
# embedding layers and the pre-softmax linear transformation...
# In the embedding layers, we multiply those weights by \sqrt{d_model}".
#
# Options here:
# 'emb': multiply \sqrt{d_model} to embedding output
# 'prj': multiply (\sqrt{d_model} ^ -1) to linear projection output
# 'none': no multiplication
assert scale_emb_or_prj in ['emb', 'prj', 'none']
scale_emb = (scale_emb_or_prj == 'emb') if trg_emb_prj_weight_sharing else False
self.scale_prj = (scale_emb_or_prj == 'prj') if trg_emb_prj_weight_sharing else False
self.d_model = d_model
self.encoder = LowRankEncoder(
n_src_vocab=n_src_vocab, n_position=n_position,
d_word_vec=d_word_vec, d_model=d_model, d_inner=d_inner,
n_layers=n_layers, n_head=n_head, d_k=d_k, d_v=d_v,
pad_idx=src_pad_idx, dropout=dropout, scale_emb=scale_emb)
self.decoder = LowRankDecoder(
n_trg_vocab=n_trg_vocab, n_position=n_position,
d_word_vec=d_word_vec, d_model=d_model, d_inner=d_inner,
n_layers=n_layers, n_head=n_head, d_k=d_k, d_v=d_v,
pad_idx=trg_pad_idx, dropout=dropout, scale_emb=scale_emb)
self.trg_word_prj = nn.Linear(d_model, n_trg_vocab, bias=False)
for p in self.parameters():
if p.dim() > 1:
nn.init.xavier_uniform_(p)
assert d_model == d_word_vec, \
'To facilitate the residual connections, \
the dimensions of all module outputs shall be the same.'
if trg_emb_prj_weight_sharing:
# Share the weight between target word embedding & last dense layer
self.trg_word_prj.weight = self.decoder.trg_word_emb.weight
if emb_src_trg_weight_sharing:
self.encoder.src_word_emb.weight = self.decoder.trg_word_emb.weight
@autocast()
def forward(self, src_seq, trg_seq):
src_mask = get_pad_mask(src_seq, self.src_pad_idx)
trg_mask = get_pad_mask(trg_seq, self.trg_pad_idx) & get_subsequent_mask(trg_seq)
enc_output, *_ = self.encoder(src_seq, src_mask)
dec_output, *_ = self.decoder(trg_seq, trg_mask, enc_output, src_mask)
seq_logit = self.trg_word_prj(dec_output)
if self.scale_prj:
seq_logit *= self.d_model ** -0.5
return seq_logit
class AdaptTransformer(nn.Module):
''' A sequence to sequence model with attention mechanism. '''
def __init__(
self, n_src_vocab, n_trg_vocab, src_pad_idx, trg_pad_idx,
d_word_vec=512, d_model=512, d_inner=2048,
n_layers=6, n_head=8, d_k=64, d_v=64, dropout=0.1, n_position=200,
trg_emb_prj_weight_sharing=True, emb_src_trg_weight_sharing=True,
scale_emb_or_prj='prj', est_ranks=None):
super().__init__()
self.src_pad_idx, self.trg_pad_idx = src_pad_idx, trg_pad_idx
# In section 3.4 of paper "Attention Is All You Need", there is such detail:
# "In our model, we share the same weight matrix between the two
# embedding layers and the pre-softmax linear transformation...
# In the embedding layers, we multiply those weights by \sqrt{d_model}".
#
# Options here:
# 'emb': multiply \sqrt{d_model} to embedding output
# 'prj': multiply (\sqrt{d_model} ^ -1) to linear projection output
# 'none': no multiplication
assert scale_emb_or_prj in ['emb', 'prj', 'none']
scale_emb = (scale_emb_or_prj == 'emb') if trg_emb_prj_weight_sharing else False
self.scale_prj = (scale_emb_or_prj == 'prj') if trg_emb_prj_weight_sharing else False
self.d_model = d_model
self.encoder = AdaptEncoder(
n_src_vocab=n_src_vocab, n_position=n_position,
d_word_vec=d_word_vec, d_model=d_model, d_inner=d_inner,
n_layers=n_layers, n_head=n_head, d_k=d_k, d_v=d_v,
pad_idx=src_pad_idx, dropout=dropout, scale_emb=scale_emb,
est_ranks=est_ranks)
self.decoder = AdaptDecoder(
n_trg_vocab=n_trg_vocab, n_position=n_position,
d_word_vec=d_word_vec, d_model=d_model, d_inner=d_inner,
n_layers=n_layers, n_head=n_head, d_k=d_k, d_v=d_v,
pad_idx=trg_pad_idx, dropout=dropout, scale_emb=scale_emb,
est_ranks=est_ranks)
#self.trg_word_prj = nn.Linear(d_model, n_trg_vocab, bias=False)
self.trg_word_prj_u = nn.Linear(d_model, int(d_model/4), bias=False)
self.trg_word_prj_v = nn.Linear(int(d_model/4), n_trg_vocab, bias=False)
for p in self.parameters():
if p.dim() > 1:
nn.init.xavier_uniform_(p)
assert d_model == d_word_vec, \
'To facilitate the residual connections, \
the dimensions of all module outputs shall be the same.'
if trg_emb_prj_weight_sharing:
# Share the weight between target word embedding & last dense layer
self.trg_word_prj.weight = self.decoder.trg_word_emb.weight
#self.trg_word_prj_u.weight.data = self.decoder.trg_word_emb_v.weight.data.transpose(0, 1)
#self.trg_word_prj_v.weight = self.decoder.trg_word_emb_u.weight
if emb_src_trg_weight_sharing:
self.encoder.src_word_emb.weight = self.decoder.trg_word_emb.weight
#self.encoder.src_word_emb_u.weight = self.decoder.trg_word_emb_u.weight
#self.encoder.src_word_emb_v.weight = self.decoder.trg_word_emb_v.weight
#self.encoder.src_word_emb_v.bias = self.decoder.trg_word_emb_v.bias
@autocast()
def forward(self, src_seq, trg_seq):
src_mask = get_pad_mask(src_seq, self.src_pad_idx)
trg_mask = get_pad_mask(trg_seq, self.trg_pad_idx) & get_subsequent_mask(trg_seq)
enc_output, *_ = self.encoder(src_seq, src_mask)
dec_output, *_ = self.decoder(trg_seq, trg_mask, enc_output, src_mask)
seq_logit = self.trg_word_prj(dec_output)
#seq_logit = self.trg_word_prj_v(self.trg_word_prj_u(dec_output))
#print(dec_output.size(), self.trg_word_prj_u.weight.size())
#a = self.trg_word_prj_u(dec_output)
#seq_logit = self.trg_word_prj_v(a)
if self.scale_prj:
seq_logit *= self.d_model ** -0.5
return seq_logit
class LowRankResidualTransformer(nn.Module):
''' A sequence to sequence model with attention mechanism. '''
def __init__(
self, n_src_vocab, n_trg_vocab, src_pad_idx, trg_pad_idx,
d_word_vec=512, d_model=512, d_inner=2048,
n_layers=6, n_head=8, d_k=64, d_v=64, dropout=0.1, n_position=200,
trg_emb_prj_weight_sharing=True, emb_src_trg_weight_sharing=True,
scale_emb_or_prj='prj'):
super().__init__()
self.src_pad_idx, self.trg_pad_idx = src_pad_idx, trg_pad_idx
# In section 3.4 of paper "Attention Is All You Need", there is such detail:
# "In our model, we share the same weight matrix between the two
# embedding layers and the pre-softmax linear transformation...
# In the embedding layers, we multiply those weights by \sqrt{d_model}".
#
# Options here:
# 'emb': multiply \sqrt{d_model} to embedding output
# 'prj': multiply (\sqrt{d_model} ^ -1) to linear projection output
# 'none': no multiplication
assert scale_emb_or_prj in ['emb', 'prj', 'none']
scale_emb = (scale_emb_or_prj == 'emb') if trg_emb_prj_weight_sharing else False
self.scale_prj = (scale_emb_or_prj == 'prj') if trg_emb_prj_weight_sharing else False
self.d_model = d_model
self.encoder = LowRankResidualEncoder(
n_src_vocab=n_src_vocab, n_position=n_position,
d_word_vec=d_word_vec, d_model=d_model, d_inner=d_inner,
n_layers=n_layers, n_head=n_head, d_k=d_k, d_v=d_v,
pad_idx=src_pad_idx, dropout=dropout, scale_emb=scale_emb)
self.decoder = LowRankResidualDecoder(
n_trg_vocab=n_trg_vocab, n_position=n_position,
d_word_vec=d_word_vec, d_model=d_model, d_inner=d_inner,
n_layers=n_layers, n_head=n_head, d_k=d_k, d_v=d_v,
pad_idx=trg_pad_idx, dropout=dropout, scale_emb=scale_emb)
self.trg_word_prj = nn.Linear(d_model, n_trg_vocab, bias=False)
for p in self.parameters():
if p.dim() > 1:
nn.init.xavier_uniform_(p)
assert d_model == d_word_vec, \
'To facilitate the residual connections, \
the dimensions of all module outputs shall be the same.'
if trg_emb_prj_weight_sharing:
# Share the weight between target word embedding & last dense layer
self.trg_word_prj.weight = self.decoder.trg_word_emb.weight
if emb_src_trg_weight_sharing:
self.encoder.src_word_emb.weight = self.decoder.trg_word_emb.weight
@autocast()
def forward(self, src_seq, trg_seq):
src_mask = get_pad_mask(src_seq, self.src_pad_idx)
trg_mask = get_pad_mask(trg_seq, self.trg_pad_idx) & get_subsequent_mask(trg_seq)
enc_output, *_ = self.encoder(src_seq, src_mask)
dec_output, *_ = self.decoder(trg_seq, trg_mask, enc_output, src_mask)
seq_logit = self.trg_word_prj(dec_output)
if self.scale_prj:
seq_logit *= self.d_model ** -0.5
return seq_logit | 40.519841 | 157 | 0.653544 | 4,595 | 30,633 | 3.964091 | 0.048313 | 0.037222 | 0.022289 | 0.013066 | 0.900412 | 0.896733 | 0.889047 | 0.880373 | 0.875103 | 0.870766 | 0 | 0.010123 | 0.25182 | 30,633 | 756 | 158 | 40.519841 | 0.784633 | 0.14331 | 0 | 0.842767 | 0 | 0 | 0.003835 | 0 | 0 | 0 | 0 | 0.001323 | 0.016771 | 1 | 0.062893 | false | 0 | 0.012579 | 0.006289 | 0.155136 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
978aad74526c4b51efe4123d53a7d5d3bf35b8cc | 1,308 | py | Python | tests/test_fields.py | goerz/docxcompose | def1f70d64385253a170ae8f88fedea61b4e7cf7 | [
"MIT"
] | null | null | null | tests/test_fields.py | goerz/docxcompose | def1f70d64385253a170ae8f88fedea61b4e7cf7 | [
"MIT"
] | null | null | null | tests/test_fields.py | goerz/docxcompose | def1f70d64385253a170ae8f88fedea61b4e7cf7 | [
"MIT"
] | null | null | null | from docxcompose.properties import FieldBase
class FieldForTesting(FieldBase):
def _get_fieldname_string(self):
return self.node
class TestFieldNameParsing(object):
def test_can_parse_quoted_property_names(self):
node = ' DOCPROPERTY "Propertyname" \\* MERGEFORMAT '
assert "Propertyname" == FieldForTesting(node).name
def test_can_parse_unquoted_property_names(self):
node = ' DOCPROPERTY Propertyname \\* MERGEFORMAT '
assert "Propertyname" == FieldForTesting(node).name
def test_can_parse_quoted_property_names_with_spaces(self):
node = ' DOCPROPERTY "Text Property" \\* MERGEFORMAT '
assert "Text Property" == FieldForTesting(node).name
def test_can_parse_unquoted_property_names_with_spaces(self):
node = ' DOCPROPERTY Text Property \\* MERGEFORMAT '
assert "Text Property" == FieldForTesting(node).name
def test_can_parse_quoted_property_names_with_extra_spaces(self):
node = ' DOCPROPERTY "Text Property" \\* MERGEFORMAT '
assert "Text Property" == FieldForTesting(node).name
def test_can_parse_unquoted_property_names_with_extra_spaces(self):
node = ' DOCPROPERTY Text Property \\* MERGEFORMAT '
assert "Text Property" == FieldForTesting(node).name
| 37.371429 | 71 | 0.718654 | 139 | 1,308 | 6.453237 | 0.230216 | 0.107023 | 0.06689 | 0.100334 | 0.845039 | 0.845039 | 0.845039 | 0.821628 | 0.821628 | 0.821628 | 0 | 0 | 0.194954 | 1,308 | 34 | 72 | 38.470588 | 0.851852 | 0 | 0 | 0.434783 | 0 | 0 | 0.264526 | 0 | 0 | 0 | 0 | 0 | 0.26087 | 1 | 0.304348 | false | 0 | 0.043478 | 0.043478 | 0.478261 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
97c58ee8c563ece38985e9137906949cb50e2bf9 | 84,049 | py | Python | pybind/slxos/v17r_2_00/cpu_state/top/__init__.py | extremenetworks/pybind | 44c467e71b2b425be63867aba6e6fa28b2cfe7fb | [
"Apache-2.0"
] | null | null | null | pybind/slxos/v17r_2_00/cpu_state/top/__init__.py | extremenetworks/pybind | 44c467e71b2b425be63867aba6e6fa28b2cfe7fb | [
"Apache-2.0"
] | null | null | null | pybind/slxos/v17r_2_00/cpu_state/top/__init__.py | extremenetworks/pybind | 44c467e71b2b425be63867aba6e6fa28b2cfe7fb | [
"Apache-2.0"
] | 1 | 2021-11-05T22:15:42.000Z | 2021-11-05T22:15:42.000Z |
from operator import attrgetter
import pyangbind.lib.xpathhelper as xpathhelper
from pyangbind.lib.yangtypes import RestrictedPrecisionDecimalType, RestrictedClassType, TypedListType
from pyangbind.lib.yangtypes import YANGBool, YANGListType, YANGDynClass, ReferenceType
from pyangbind.lib.base import PybindBase
from decimal import Decimal
from bitarray import bitarray
import __builtin__
import cpu_top_process_information
class top(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module brocade-RAS-operational - based on the path /cpu-state/top. Each member element of
the container is represented as a class variable - with a specific
YANG type.
YANG Description: Top CPU utilization
"""
__slots__ = ('_pybind_generated_by', '_path_helper', '_yang_name', '_rest_name', '_extmethods', '__cpu_curr_time','__cpu_system_uptime','__cpu_no_of_users','__cpu_load_average_one_min','__cpu_load_average_five_min','__cpu_load_average_fifteen_min','__cpu_total_task','__cpu_running_task','__cpu_sleeping_task','__cpu_stopped_task','__cpu_zombie_task','__cpu_util_user','__cpu_util_kernel','__cpu_util_nice','__cpu_util_idle','__cpu_util_iowait','__cpu_util_hi','__cpu_util_si','__cpu_util_st','__cpu_total_mem','__cpu_used_mem','__cpu_free_mem','__cpu_buffer_mem','__cpu_total_mem_swap','__cpu_used_mem_swap','__cpu_free_mem_swap','__cpu_cache_mem_swap','__cpu_top_process_information',)
_yang_name = 'top'
_rest_name = 'top'
_pybind_generated_by = 'container'
def __init__(self, *args, **kwargs):
path_helper_ = kwargs.pop("path_helper", None)
if path_helper_ is False:
self._path_helper = False
elif path_helper_ is not None and isinstance(path_helper_, xpathhelper.YANGPathHelper):
self._path_helper = path_helper_
elif hasattr(self, "_parent"):
path_helper_ = getattr(self._parent, "_path_helper", False)
self._path_helper = path_helper_
else:
self._path_helper = False
extmethods = kwargs.pop("extmethods", None)
if extmethods is False:
self._extmethods = False
elif extmethods is not None and isinstance(extmethods, dict):
self._extmethods = extmethods
elif hasattr(self, "_parent"):
extmethods = getattr(self._parent, "_extmethods", None)
self._extmethods = extmethods
else:
self._extmethods = False
self.__cpu_no_of_users = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-no-of-users", rest_name="cpu-no-of-users", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)
self.__cpu_util_kernel = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-kernel", rest_name="cpu-util-kernel", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
self.__cpu_running_task = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-running-task", rest_name="cpu-running-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)
self.__cpu_sleeping_task = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-sleeping-task", rest_name="cpu-sleeping-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)
self.__cpu_stopped_task = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-stopped-task", rest_name="cpu-stopped-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)
self.__cpu_cache_mem_swap = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-cache-mem-swap", rest_name="cpu-cache-mem-swap", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
self.__cpu_total_mem_swap = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-total-mem-swap", rest_name="cpu-total-mem-swap", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
self.__cpu_buffer_mem = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-buffer-mem", rest_name="cpu-buffer-mem", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
self.__cpu_util_iowait = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-iowait", rest_name="cpu-util-iowait", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
self.__cpu_total_task = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-total-task", rest_name="cpu-total-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)
self.__cpu_util_nice = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-nice", rest_name="cpu-util-nice", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
self.__cpu_load_average_fifteen_min = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-load-average-fifteen-min", rest_name="cpu-load-average-fifteen-min", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
self.__cpu_load_average_five_min = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-load-average-five-min", rest_name="cpu-load-average-five-min", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
self.__cpu_util_hi = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-hi", rest_name="cpu-util-hi", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
self.__cpu_util_idle = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-idle", rest_name="cpu-util-idle", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
self.__cpu_system_uptime = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-system-uptime", rest_name="cpu-system-uptime", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
self.__cpu_util_user = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-user", rest_name="cpu-util-user", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
self.__cpu_load_average_one_min = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-load-average-one-min", rest_name="cpu-load-average-one-min", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
self.__cpu_used_mem = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-used-mem", rest_name="cpu-used-mem", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
self.__cpu_curr_time = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-curr-time", rest_name="cpu-curr-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
self.__cpu_total_mem = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-total-mem", rest_name="cpu-total-mem", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
self.__cpu_top_process_information = YANGDynClass(base=YANGListType("cpu_process_id",cpu_top_process_information.cpu_top_process_information, yang_name="cpu-top-process-information", rest_name="cpu-top-process-information", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='cpu-process-id', extensions={u'tailf-common': {u'callpoint': u'RAS-cpu-top-process-information', u'cli-suppress-show-path': None}}), is_container='list', yang_name="cpu-top-process-information", rest_name="cpu-top-process-information", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'callpoint': u'RAS-cpu-top-process-information', u'cli-suppress-show-path': None}}, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='list', is_config=False)
self.__cpu_free_mem = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-free-mem", rest_name="cpu-free-mem", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
self.__cpu_zombie_task = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-zombie-task", rest_name="cpu-zombie-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)
self.__cpu_util_si = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-si", rest_name="cpu-util-si", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
self.__cpu_util_st = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-st", rest_name="cpu-util-st", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
self.__cpu_free_mem_swap = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-free-mem-swap", rest_name="cpu-free-mem-swap", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
self.__cpu_used_mem_swap = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-used-mem-swap", rest_name="cpu-used-mem-swap", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path()+[self._yang_name]
else:
return [u'cpu-state', u'top']
def _rest_path(self):
if hasattr(self, "_parent"):
if self._rest_name:
return self._parent._rest_path()+[self._rest_name]
else:
return self._parent._rest_path()
else:
return [u'cpu-state', u'top']
def _get_cpu_curr_time(self):
"""
Getter method for cpu_curr_time, mapped from YANG variable /cpu_state/top/cpu_curr_time (string)
YANG Description: Current time of the system
"""
return self.__cpu_curr_time
def _set_cpu_curr_time(self, v, load=False):
"""
Setter method for cpu_curr_time, mapped from YANG variable /cpu_state/top/cpu_curr_time (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_curr_time is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_curr_time() directly.
YANG Description: Current time of the system
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="cpu-curr-time", rest_name="cpu-curr-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_curr_time must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-curr-time", rest_name="cpu-curr-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)""",
})
self.__cpu_curr_time = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_curr_time(self):
self.__cpu_curr_time = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-curr-time", rest_name="cpu-curr-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
def _get_cpu_system_uptime(self):
"""
Getter method for cpu_system_uptime, mapped from YANG variable /cpu_state/top/cpu_system_uptime (string)
YANG Description: System uptime since last boot
"""
return self.__cpu_system_uptime
def _set_cpu_system_uptime(self, v, load=False):
"""
Setter method for cpu_system_uptime, mapped from YANG variable /cpu_state/top/cpu_system_uptime (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_system_uptime is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_system_uptime() directly.
YANG Description: System uptime since last boot
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="cpu-system-uptime", rest_name="cpu-system-uptime", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_system_uptime must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-system-uptime", rest_name="cpu-system-uptime", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)""",
})
self.__cpu_system_uptime = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_system_uptime(self):
self.__cpu_system_uptime = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-system-uptime", rest_name="cpu-system-uptime", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
def _get_cpu_no_of_users(self):
"""
Getter method for cpu_no_of_users, mapped from YANG variable /cpu_state/top/cpu_no_of_users (uint32)
YANG Description: Current number of users logged in
"""
return self.__cpu_no_of_users
def _set_cpu_no_of_users(self, v, load=False):
"""
Setter method for cpu_no_of_users, mapped from YANG variable /cpu_state/top/cpu_no_of_users (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_no_of_users is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_no_of_users() directly.
YANG Description: Current number of users logged in
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-no-of-users", rest_name="cpu-no-of-users", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_no_of_users must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-no-of-users", rest_name="cpu-no-of-users", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)""",
})
self.__cpu_no_of_users = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_no_of_users(self):
self.__cpu_no_of_users = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-no-of-users", rest_name="cpu-no-of-users", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)
def _get_cpu_load_average_one_min(self):
"""
Getter method for cpu_load_average_one_min, mapped from YANG variable /cpu_state/top/cpu_load_average_one_min (decimal64)
YANG Description: CPU load average in the last one minute
"""
return self.__cpu_load_average_one_min
def _set_cpu_load_average_one_min(self, v, load=False):
"""
Setter method for cpu_load_average_one_min, mapped from YANG variable /cpu_state/top/cpu_load_average_one_min (decimal64)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_load_average_one_min is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_load_average_one_min() directly.
YANG Description: CPU load average in the last one minute
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-load-average-one-min", rest_name="cpu-load-average-one-min", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_load_average_one_min must be of a type compatible with decimal64""",
'defined-type': "decimal64",
'generated-type': """YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-load-average-one-min", rest_name="cpu-load-average-one-min", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)""",
})
self.__cpu_load_average_one_min = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_load_average_one_min(self):
self.__cpu_load_average_one_min = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-load-average-one-min", rest_name="cpu-load-average-one-min", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
def _get_cpu_load_average_five_min(self):
"""
Getter method for cpu_load_average_five_min, mapped from YANG variable /cpu_state/top/cpu_load_average_five_min (decimal64)
YANG Description: CPU load average in the last five minute
"""
return self.__cpu_load_average_five_min
def _set_cpu_load_average_five_min(self, v, load=False):
"""
Setter method for cpu_load_average_five_min, mapped from YANG variable /cpu_state/top/cpu_load_average_five_min (decimal64)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_load_average_five_min is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_load_average_five_min() directly.
YANG Description: CPU load average in the last five minute
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-load-average-five-min", rest_name="cpu-load-average-five-min", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_load_average_five_min must be of a type compatible with decimal64""",
'defined-type': "decimal64",
'generated-type': """YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-load-average-five-min", rest_name="cpu-load-average-five-min", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)""",
})
self.__cpu_load_average_five_min = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_load_average_five_min(self):
self.__cpu_load_average_five_min = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-load-average-five-min", rest_name="cpu-load-average-five-min", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
def _get_cpu_load_average_fifteen_min(self):
"""
Getter method for cpu_load_average_fifteen_min, mapped from YANG variable /cpu_state/top/cpu_load_average_fifteen_min (decimal64)
YANG Description: CPU load average in the last fifteen minute
"""
return self.__cpu_load_average_fifteen_min
def _set_cpu_load_average_fifteen_min(self, v, load=False):
"""
Setter method for cpu_load_average_fifteen_min, mapped from YANG variable /cpu_state/top/cpu_load_average_fifteen_min (decimal64)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_load_average_fifteen_min is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_load_average_fifteen_min() directly.
YANG Description: CPU load average in the last fifteen minute
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-load-average-fifteen-min", rest_name="cpu-load-average-fifteen-min", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_load_average_fifteen_min must be of a type compatible with decimal64""",
'defined-type': "decimal64",
'generated-type': """YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-load-average-fifteen-min", rest_name="cpu-load-average-fifteen-min", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)""",
})
self.__cpu_load_average_fifteen_min = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_load_average_fifteen_min(self):
self.__cpu_load_average_fifteen_min = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-load-average-fifteen-min", rest_name="cpu-load-average-fifteen-min", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
def _get_cpu_total_task(self):
"""
Getter method for cpu_total_task, mapped from YANG variable /cpu_state/top/cpu_total_task (uint32)
YANG Description: Total number of tasks running
"""
return self.__cpu_total_task
def _set_cpu_total_task(self, v, load=False):
"""
Setter method for cpu_total_task, mapped from YANG variable /cpu_state/top/cpu_total_task (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_total_task is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_total_task() directly.
YANG Description: Total number of tasks running
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-total-task", rest_name="cpu-total-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_total_task must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-total-task", rest_name="cpu-total-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)""",
})
self.__cpu_total_task = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_total_task(self):
self.__cpu_total_task = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-total-task", rest_name="cpu-total-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)
def _get_cpu_running_task(self):
"""
Getter method for cpu_running_task, mapped from YANG variable /cpu_state/top/cpu_running_task (uint32)
YANG Description: Number of running tasks
"""
return self.__cpu_running_task
def _set_cpu_running_task(self, v, load=False):
"""
Setter method for cpu_running_task, mapped from YANG variable /cpu_state/top/cpu_running_task (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_running_task is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_running_task() directly.
YANG Description: Number of running tasks
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-running-task", rest_name="cpu-running-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_running_task must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-running-task", rest_name="cpu-running-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)""",
})
self.__cpu_running_task = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_running_task(self):
self.__cpu_running_task = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-running-task", rest_name="cpu-running-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)
def _get_cpu_sleeping_task(self):
"""
Getter method for cpu_sleeping_task, mapped from YANG variable /cpu_state/top/cpu_sleeping_task (uint32)
YANG Description: N umber of sleeping tasks
"""
return self.__cpu_sleeping_task
def _set_cpu_sleeping_task(self, v, load=False):
"""
Setter method for cpu_sleeping_task, mapped from YANG variable /cpu_state/top/cpu_sleeping_task (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_sleeping_task is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_sleeping_task() directly.
YANG Description: N umber of sleeping tasks
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-sleeping-task", rest_name="cpu-sleeping-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_sleeping_task must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-sleeping-task", rest_name="cpu-sleeping-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)""",
})
self.__cpu_sleeping_task = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_sleeping_task(self):
self.__cpu_sleeping_task = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-sleeping-task", rest_name="cpu-sleeping-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)
def _get_cpu_stopped_task(self):
"""
Getter method for cpu_stopped_task, mapped from YANG variable /cpu_state/top/cpu_stopped_task (uint32)
YANG Description: Number of stopped tasks
"""
return self.__cpu_stopped_task
def _set_cpu_stopped_task(self, v, load=False):
"""
Setter method for cpu_stopped_task, mapped from YANG variable /cpu_state/top/cpu_stopped_task (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_stopped_task is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_stopped_task() directly.
YANG Description: Number of stopped tasks
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-stopped-task", rest_name="cpu-stopped-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_stopped_task must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-stopped-task", rest_name="cpu-stopped-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)""",
})
self.__cpu_stopped_task = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_stopped_task(self):
self.__cpu_stopped_task = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-stopped-task", rest_name="cpu-stopped-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)
def _get_cpu_zombie_task(self):
"""
Getter method for cpu_zombie_task, mapped from YANG variable /cpu_state/top/cpu_zombie_task (uint32)
YANG Description: Number of zombie tasks
"""
return self.__cpu_zombie_task
def _set_cpu_zombie_task(self, v, load=False):
"""
Setter method for cpu_zombie_task, mapped from YANG variable /cpu_state/top/cpu_zombie_task (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_zombie_task is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_zombie_task() directly.
YANG Description: Number of zombie tasks
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-zombie-task", rest_name="cpu-zombie-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_zombie_task must be of a type compatible with uint32""",
'defined-type': "uint32",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-zombie-task", rest_name="cpu-zombie-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)""",
})
self.__cpu_zombie_task = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_zombie_task(self):
self.__cpu_zombie_task = YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="cpu-zombie-task", rest_name="cpu-zombie-task", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='uint32', is_config=False)
def _get_cpu_util_user(self):
"""
Getter method for cpu_util_user, mapped from YANG variable /cpu_state/top/cpu_util_user (decimal64)
YANG Description: CPU utlization % by user processes
"""
return self.__cpu_util_user
def _set_cpu_util_user(self, v, load=False):
"""
Setter method for cpu_util_user, mapped from YANG variable /cpu_state/top/cpu_util_user (decimal64)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_util_user is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_util_user() directly.
YANG Description: CPU utlization % by user processes
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-user", rest_name="cpu-util-user", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_util_user must be of a type compatible with decimal64""",
'defined-type': "decimal64",
'generated-type': """YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-user", rest_name="cpu-util-user", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)""",
})
self.__cpu_util_user = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_util_user(self):
self.__cpu_util_user = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-user", rest_name="cpu-util-user", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
def _get_cpu_util_kernel(self):
"""
Getter method for cpu_util_kernel, mapped from YANG variable /cpu_state/top/cpu_util_kernel (decimal64)
YANG Description: CPU utlization % by kernel processes
"""
return self.__cpu_util_kernel
def _set_cpu_util_kernel(self, v, load=False):
"""
Setter method for cpu_util_kernel, mapped from YANG variable /cpu_state/top/cpu_util_kernel (decimal64)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_util_kernel is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_util_kernel() directly.
YANG Description: CPU utlization % by kernel processes
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-kernel", rest_name="cpu-util-kernel", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_util_kernel must be of a type compatible with decimal64""",
'defined-type': "decimal64",
'generated-type': """YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-kernel", rest_name="cpu-util-kernel", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)""",
})
self.__cpu_util_kernel = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_util_kernel(self):
self.__cpu_util_kernel = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-kernel", rest_name="cpu-util-kernel", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
def _get_cpu_util_nice(self):
"""
Getter method for cpu_util_nice, mapped from YANG variable /cpu_state/top/cpu_util_nice (decimal64)
YANG Description: CPU utlization % by processes with nice value
"""
return self.__cpu_util_nice
def _set_cpu_util_nice(self, v, load=False):
"""
Setter method for cpu_util_nice, mapped from YANG variable /cpu_state/top/cpu_util_nice (decimal64)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_util_nice is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_util_nice() directly.
YANG Description: CPU utlization % by processes with nice value
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-nice", rest_name="cpu-util-nice", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_util_nice must be of a type compatible with decimal64""",
'defined-type': "decimal64",
'generated-type': """YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-nice", rest_name="cpu-util-nice", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)""",
})
self.__cpu_util_nice = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_util_nice(self):
self.__cpu_util_nice = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-nice", rest_name="cpu-util-nice", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
def _get_cpu_util_idle(self):
"""
Getter method for cpu_util_idle, mapped from YANG variable /cpu_state/top/cpu_util_idle (decimal64)
YANG Description: CPU utlization % at idle state
"""
return self.__cpu_util_idle
def _set_cpu_util_idle(self, v, load=False):
"""
Setter method for cpu_util_idle, mapped from YANG variable /cpu_state/top/cpu_util_idle (decimal64)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_util_idle is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_util_idle() directly.
YANG Description: CPU utlization % at idle state
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-idle", rest_name="cpu-util-idle", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_util_idle must be of a type compatible with decimal64""",
'defined-type': "decimal64",
'generated-type': """YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-idle", rest_name="cpu-util-idle", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)""",
})
self.__cpu_util_idle = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_util_idle(self):
self.__cpu_util_idle = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-idle", rest_name="cpu-util-idle", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
def _get_cpu_util_iowait(self):
"""
Getter method for cpu_util_iowait, mapped from YANG variable /cpu_state/top/cpu_util_iowait (decimal64)
YANG Description: CPU utlization % waiting for I/O
"""
return self.__cpu_util_iowait
def _set_cpu_util_iowait(self, v, load=False):
"""
Setter method for cpu_util_iowait, mapped from YANG variable /cpu_state/top/cpu_util_iowait (decimal64)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_util_iowait is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_util_iowait() directly.
YANG Description: CPU utlization % waiting for I/O
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-iowait", rest_name="cpu-util-iowait", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_util_iowait must be of a type compatible with decimal64""",
'defined-type': "decimal64",
'generated-type': """YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-iowait", rest_name="cpu-util-iowait", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)""",
})
self.__cpu_util_iowait = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_util_iowait(self):
self.__cpu_util_iowait = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-iowait", rest_name="cpu-util-iowait", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
def _get_cpu_util_hi(self):
"""
Getter method for cpu_util_hi, mapped from YANG variable /cpu_state/top/cpu_util_hi (decimal64)
YANG Description: CPU utlization % for hardware interrupt
"""
return self.__cpu_util_hi
def _set_cpu_util_hi(self, v, load=False):
"""
Setter method for cpu_util_hi, mapped from YANG variable /cpu_state/top/cpu_util_hi (decimal64)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_util_hi is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_util_hi() directly.
YANG Description: CPU utlization % for hardware interrupt
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-hi", rest_name="cpu-util-hi", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_util_hi must be of a type compatible with decimal64""",
'defined-type': "decimal64",
'generated-type': """YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-hi", rest_name="cpu-util-hi", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)""",
})
self.__cpu_util_hi = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_util_hi(self):
self.__cpu_util_hi = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-hi", rest_name="cpu-util-hi", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
def _get_cpu_util_si(self):
"""
Getter method for cpu_util_si, mapped from YANG variable /cpu_state/top/cpu_util_si (decimal64)
YANG Description: CPU utlization % for software interrupt
"""
return self.__cpu_util_si
def _set_cpu_util_si(self, v, load=False):
"""
Setter method for cpu_util_si, mapped from YANG variable /cpu_state/top/cpu_util_si (decimal64)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_util_si is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_util_si() directly.
YANG Description: CPU utlization % for software interrupt
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-si", rest_name="cpu-util-si", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_util_si must be of a type compatible with decimal64""",
'defined-type': "decimal64",
'generated-type': """YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-si", rest_name="cpu-util-si", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)""",
})
self.__cpu_util_si = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_util_si(self):
self.__cpu_util_si = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-si", rest_name="cpu-util-si", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
def _get_cpu_util_st(self):
"""
Getter method for cpu_util_st, mapped from YANG variable /cpu_state/top/cpu_util_st (decimal64)
YANG Description: CPU utlization % for steal time
"""
return self.__cpu_util_st
def _set_cpu_util_st(self, v, load=False):
"""
Setter method for cpu_util_st, mapped from YANG variable /cpu_state/top/cpu_util_st (decimal64)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_util_st is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_util_st() directly.
YANG Description: CPU utlization % for steal time
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-st", rest_name="cpu-util-st", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_util_st must be of a type compatible with decimal64""",
'defined-type': "decimal64",
'generated-type': """YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-st", rest_name="cpu-util-st", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)""",
})
self.__cpu_util_st = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_util_st(self):
self.__cpu_util_st = YANGDynClass(base=RestrictedPrecisionDecimalType(precision=2), is_leaf=True, yang_name="cpu-util-st", rest_name="cpu-util-st", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='decimal64', is_config=False)
def _get_cpu_total_mem(self):
"""
Getter method for cpu_total_mem, mapped from YANG variable /cpu_state/top/cpu_total_mem (string)
YANG Description: Total memory
"""
return self.__cpu_total_mem
def _set_cpu_total_mem(self, v, load=False):
"""
Setter method for cpu_total_mem, mapped from YANG variable /cpu_state/top/cpu_total_mem (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_total_mem is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_total_mem() directly.
YANG Description: Total memory
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="cpu-total-mem", rest_name="cpu-total-mem", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_total_mem must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-total-mem", rest_name="cpu-total-mem", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)""",
})
self.__cpu_total_mem = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_total_mem(self):
self.__cpu_total_mem = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-total-mem", rest_name="cpu-total-mem", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
def _get_cpu_used_mem(self):
"""
Getter method for cpu_used_mem, mapped from YANG variable /cpu_state/top/cpu_used_mem (string)
YANG Description: Total used memory
"""
return self.__cpu_used_mem
def _set_cpu_used_mem(self, v, load=False):
"""
Setter method for cpu_used_mem, mapped from YANG variable /cpu_state/top/cpu_used_mem (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_used_mem is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_used_mem() directly.
YANG Description: Total used memory
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="cpu-used-mem", rest_name="cpu-used-mem", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_used_mem must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-used-mem", rest_name="cpu-used-mem", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)""",
})
self.__cpu_used_mem = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_used_mem(self):
self.__cpu_used_mem = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-used-mem", rest_name="cpu-used-mem", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
def _get_cpu_free_mem(self):
"""
Getter method for cpu_free_mem, mapped from YANG variable /cpu_state/top/cpu_free_mem (string)
YANG Description: Total Free memory
"""
return self.__cpu_free_mem
def _set_cpu_free_mem(self, v, load=False):
"""
Setter method for cpu_free_mem, mapped from YANG variable /cpu_state/top/cpu_free_mem (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_free_mem is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_free_mem() directly.
YANG Description: Total Free memory
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="cpu-free-mem", rest_name="cpu-free-mem", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_free_mem must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-free-mem", rest_name="cpu-free-mem", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)""",
})
self.__cpu_free_mem = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_free_mem(self):
self.__cpu_free_mem = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-free-mem", rest_name="cpu-free-mem", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
def _get_cpu_buffer_mem(self):
"""
Getter method for cpu_buffer_mem, mapped from YANG variable /cpu_state/top/cpu_buffer_mem (string)
YANG Description: Total memory used for buffers
"""
return self.__cpu_buffer_mem
def _set_cpu_buffer_mem(self, v, load=False):
"""
Setter method for cpu_buffer_mem, mapped from YANG variable /cpu_state/top/cpu_buffer_mem (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_buffer_mem is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_buffer_mem() directly.
YANG Description: Total memory used for buffers
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="cpu-buffer-mem", rest_name="cpu-buffer-mem", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_buffer_mem must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-buffer-mem", rest_name="cpu-buffer-mem", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)""",
})
self.__cpu_buffer_mem = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_buffer_mem(self):
self.__cpu_buffer_mem = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-buffer-mem", rest_name="cpu-buffer-mem", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
def _get_cpu_total_mem_swap(self):
"""
Getter method for cpu_total_mem_swap, mapped from YANG variable /cpu_state/top/cpu_total_mem_swap (string)
YANG Description: Total swap memory
"""
return self.__cpu_total_mem_swap
def _set_cpu_total_mem_swap(self, v, load=False):
"""
Setter method for cpu_total_mem_swap, mapped from YANG variable /cpu_state/top/cpu_total_mem_swap (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_total_mem_swap is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_total_mem_swap() directly.
YANG Description: Total swap memory
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="cpu-total-mem-swap", rest_name="cpu-total-mem-swap", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_total_mem_swap must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-total-mem-swap", rest_name="cpu-total-mem-swap", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)""",
})
self.__cpu_total_mem_swap = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_total_mem_swap(self):
self.__cpu_total_mem_swap = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-total-mem-swap", rest_name="cpu-total-mem-swap", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
def _get_cpu_used_mem_swap(self):
"""
Getter method for cpu_used_mem_swap, mapped from YANG variable /cpu_state/top/cpu_used_mem_swap (string)
YANG Description: Total used swap memory
"""
return self.__cpu_used_mem_swap
def _set_cpu_used_mem_swap(self, v, load=False):
"""
Setter method for cpu_used_mem_swap, mapped from YANG variable /cpu_state/top/cpu_used_mem_swap (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_used_mem_swap is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_used_mem_swap() directly.
YANG Description: Total used swap memory
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="cpu-used-mem-swap", rest_name="cpu-used-mem-swap", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_used_mem_swap must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-used-mem-swap", rest_name="cpu-used-mem-swap", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)""",
})
self.__cpu_used_mem_swap = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_used_mem_swap(self):
self.__cpu_used_mem_swap = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-used-mem-swap", rest_name="cpu-used-mem-swap", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
def _get_cpu_free_mem_swap(self):
"""
Getter method for cpu_free_mem_swap, mapped from YANG variable /cpu_state/top/cpu_free_mem_swap (string)
YANG Description: Total Free swap memory
"""
return self.__cpu_free_mem_swap
def _set_cpu_free_mem_swap(self, v, load=False):
"""
Setter method for cpu_free_mem_swap, mapped from YANG variable /cpu_state/top/cpu_free_mem_swap (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_free_mem_swap is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_free_mem_swap() directly.
YANG Description: Total Free swap memory
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="cpu-free-mem-swap", rest_name="cpu-free-mem-swap", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_free_mem_swap must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-free-mem-swap", rest_name="cpu-free-mem-swap", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)""",
})
self.__cpu_free_mem_swap = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_free_mem_swap(self):
self.__cpu_free_mem_swap = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-free-mem-swap", rest_name="cpu-free-mem-swap", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
def _get_cpu_cache_mem_swap(self):
"""
Getter method for cpu_cache_mem_swap, mapped from YANG variable /cpu_state/top/cpu_cache_mem_swap (string)
YANG Description: Total memory used by cache
"""
return self.__cpu_cache_mem_swap
def _set_cpu_cache_mem_swap(self, v, load=False):
"""
Setter method for cpu_cache_mem_swap, mapped from YANG variable /cpu_state/top/cpu_cache_mem_swap (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_cache_mem_swap is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_cache_mem_swap() directly.
YANG Description: Total memory used by cache
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=unicode, is_leaf=True, yang_name="cpu-cache-mem-swap", rest_name="cpu-cache-mem-swap", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_cache_mem_swap must be of a type compatible with string""",
'defined-type': "string",
'generated-type': """YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-cache-mem-swap", rest_name="cpu-cache-mem-swap", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)""",
})
self.__cpu_cache_mem_swap = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_cache_mem_swap(self):
self.__cpu_cache_mem_swap = YANGDynClass(base=unicode, is_leaf=True, yang_name="cpu-cache-mem-swap", rest_name="cpu-cache-mem-swap", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='string', is_config=False)
def _get_cpu_top_process_information(self):
"""
Getter method for cpu_top_process_information, mapped from YANG variable /cpu_state/top/cpu_top_process_information (list)
YANG Description: Process information list from the top
"""
return self.__cpu_top_process_information
def _set_cpu_top_process_information(self, v, load=False):
"""
Setter method for cpu_top_process_information, mapped from YANG variable /cpu_state/top/cpu_top_process_information (list)
If this variable is read-only (config: false) in the
source YANG file, then _set_cpu_top_process_information is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_cpu_top_process_information() directly.
YANG Description: Process information list from the top
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGListType("cpu_process_id",cpu_top_process_information.cpu_top_process_information, yang_name="cpu-top-process-information", rest_name="cpu-top-process-information", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='cpu-process-id', extensions={u'tailf-common': {u'callpoint': u'RAS-cpu-top-process-information', u'cli-suppress-show-path': None}}), is_container='list', yang_name="cpu-top-process-information", rest_name="cpu-top-process-information", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'callpoint': u'RAS-cpu-top-process-information', u'cli-suppress-show-path': None}}, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='list', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """cpu_top_process_information must be of a type compatible with list""",
'defined-type': "list",
'generated-type': """YANGDynClass(base=YANGListType("cpu_process_id",cpu_top_process_information.cpu_top_process_information, yang_name="cpu-top-process-information", rest_name="cpu-top-process-information", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='cpu-process-id', extensions={u'tailf-common': {u'callpoint': u'RAS-cpu-top-process-information', u'cli-suppress-show-path': None}}), is_container='list', yang_name="cpu-top-process-information", rest_name="cpu-top-process-information", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'callpoint': u'RAS-cpu-top-process-information', u'cli-suppress-show-path': None}}, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='list', is_config=False)""",
})
self.__cpu_top_process_information = t
if hasattr(self, '_set'):
self._set()
def _unset_cpu_top_process_information(self):
self.__cpu_top_process_information = YANGDynClass(base=YANGListType("cpu_process_id",cpu_top_process_information.cpu_top_process_information, yang_name="cpu-top-process-information", rest_name="cpu-top-process-information", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='cpu-process-id', extensions={u'tailf-common': {u'callpoint': u'RAS-cpu-top-process-information', u'cli-suppress-show-path': None}}), is_container='list', yang_name="cpu-top-process-information", rest_name="cpu-top-process-information", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'callpoint': u'RAS-cpu-top-process-information', u'cli-suppress-show-path': None}}, namespace='urn:brocade.com:mgmt:brocade-RAS-operational', defining_module='brocade-RAS-operational', yang_type='list', is_config=False)
cpu_curr_time = __builtin__.property(_get_cpu_curr_time)
cpu_system_uptime = __builtin__.property(_get_cpu_system_uptime)
cpu_no_of_users = __builtin__.property(_get_cpu_no_of_users)
cpu_load_average_one_min = __builtin__.property(_get_cpu_load_average_one_min)
cpu_load_average_five_min = __builtin__.property(_get_cpu_load_average_five_min)
cpu_load_average_fifteen_min = __builtin__.property(_get_cpu_load_average_fifteen_min)
cpu_total_task = __builtin__.property(_get_cpu_total_task)
cpu_running_task = __builtin__.property(_get_cpu_running_task)
cpu_sleeping_task = __builtin__.property(_get_cpu_sleeping_task)
cpu_stopped_task = __builtin__.property(_get_cpu_stopped_task)
cpu_zombie_task = __builtin__.property(_get_cpu_zombie_task)
cpu_util_user = __builtin__.property(_get_cpu_util_user)
cpu_util_kernel = __builtin__.property(_get_cpu_util_kernel)
cpu_util_nice = __builtin__.property(_get_cpu_util_nice)
cpu_util_idle = __builtin__.property(_get_cpu_util_idle)
cpu_util_iowait = __builtin__.property(_get_cpu_util_iowait)
cpu_util_hi = __builtin__.property(_get_cpu_util_hi)
cpu_util_si = __builtin__.property(_get_cpu_util_si)
cpu_util_st = __builtin__.property(_get_cpu_util_st)
cpu_total_mem = __builtin__.property(_get_cpu_total_mem)
cpu_used_mem = __builtin__.property(_get_cpu_used_mem)
cpu_free_mem = __builtin__.property(_get_cpu_free_mem)
cpu_buffer_mem = __builtin__.property(_get_cpu_buffer_mem)
cpu_total_mem_swap = __builtin__.property(_get_cpu_total_mem_swap)
cpu_used_mem_swap = __builtin__.property(_get_cpu_used_mem_swap)
cpu_free_mem_swap = __builtin__.property(_get_cpu_free_mem_swap)
cpu_cache_mem_swap = __builtin__.property(_get_cpu_cache_mem_swap)
cpu_top_process_information = __builtin__.property(_get_cpu_top_process_information)
_pyangbind_elements = {'cpu_curr_time': cpu_curr_time, 'cpu_system_uptime': cpu_system_uptime, 'cpu_no_of_users': cpu_no_of_users, 'cpu_load_average_one_min': cpu_load_average_one_min, 'cpu_load_average_five_min': cpu_load_average_five_min, 'cpu_load_average_fifteen_min': cpu_load_average_fifteen_min, 'cpu_total_task': cpu_total_task, 'cpu_running_task': cpu_running_task, 'cpu_sleeping_task': cpu_sleeping_task, 'cpu_stopped_task': cpu_stopped_task, 'cpu_zombie_task': cpu_zombie_task, 'cpu_util_user': cpu_util_user, 'cpu_util_kernel': cpu_util_kernel, 'cpu_util_nice': cpu_util_nice, 'cpu_util_idle': cpu_util_idle, 'cpu_util_iowait': cpu_util_iowait, 'cpu_util_hi': cpu_util_hi, 'cpu_util_si': cpu_util_si, 'cpu_util_st': cpu_util_st, 'cpu_total_mem': cpu_total_mem, 'cpu_used_mem': cpu_used_mem, 'cpu_free_mem': cpu_free_mem, 'cpu_buffer_mem': cpu_buffer_mem, 'cpu_total_mem_swap': cpu_total_mem_swap, 'cpu_used_mem_swap': cpu_used_mem_swap, 'cpu_free_mem_swap': cpu_free_mem_swap, 'cpu_cache_mem_swap': cpu_cache_mem_swap, 'cpu_top_process_information': cpu_top_process_information, }
| 71.047337 | 1,094 | 0.755892 | 11,937 | 84,049 | 5.016252 | 0.019938 | 0.041083 | 0.054243 | 0.03487 | 0.953439 | 0.933599 | 0.904374 | 0.881995 | 0.863809 | 0.853905 | 0 | 0.008587 | 0.121548 | 84,049 | 1,182 | 1,095 | 71.107445 | 0.802419 | 0.186713 | 0 | 0.523052 | 0 | 0.044515 | 0.376235 | 0.226773 | 0 | 0 | 0 | 0 | 0 | 1 | 0.138315 | false | 0 | 0.014308 | 0 | 0.259141 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c104d14fe19c6e7ee0eb568a17da8d50fdfcc779 | 73 | py | Python | up/tasks/det/plugins/efl/utils/__init__.py | ModelTC/EOD | 164bff80486e9ae6a095a97667b365c46ceabd86 | [
"Apache-2.0"
] | 196 | 2021-10-30T05:15:36.000Z | 2022-03-30T18:43:40.000Z | eod/tasks/det/plugins/efl/utils/__init__.py | YZW-explorer/EOD | f10e64de86c0f356ebf5c7e923f4042eec4207b1 | [
"Apache-2.0"
] | 12 | 2021-10-30T11:33:28.000Z | 2022-03-31T14:22:58.000Z | eod/tasks/det/plugins/efl/utils/__init__.py | YZW-explorer/EOD | f10e64de86c0f356ebf5c7e923f4042eec4207b1 | [
"Apache-2.0"
] | 23 | 2021-11-01T07:26:17.000Z | 2022-03-27T05:55:37.000Z | from .optimizer_helper import * # noqa
from .hook_helper import * # noqa
| 24.333333 | 38 | 0.753425 | 10 | 73 | 5.3 | 0.6 | 0.45283 | 0.603774 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.164384 | 73 | 2 | 39 | 36.5 | 0.868852 | 0.123288 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
c108101d20a44e2229262e2d7cad756063c0af66 | 88 | py | Python | cloudmesh/sh/cm.py | JulienPalard/cloudmesh | 1759b88daef3a13917492d028fdabe08f03ca996 | [
"Apache-2.0"
] | null | null | null | cloudmesh/sh/cm.py | JulienPalard/cloudmesh | 1759b88daef3a13917492d028fdabe08f03ca996 | [
"Apache-2.0"
] | 4 | 2021-06-08T20:20:08.000Z | 2022-03-11T23:30:22.000Z | cloudmesh/sh/cm.py | JulienPalard/cloudmesh | 1759b88daef3a13917492d028fdabe08f03ca996 | [
"Apache-2.0"
] | null | null | null | from sh import cm as cm_sh
def shell(*args, **kwargs):
return cm_sh(args, kwargs)
| 14.666667 | 30 | 0.681818 | 16 | 88 | 3.625 | 0.625 | 0.137931 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.204545 | 88 | 5 | 31 | 17.6 | 0.828571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
c1a628f6f27f8177750dd6224445ba7b9ce48960 | 96 | py | Python | pysatMadrigal/__init__.py | pysat/pysatMadrigal | 02681e2be6aff3a360d1bf8c48a11988d94ad3ce | [
"BSD-3-Clause"
] | 1 | 2022-01-19T16:37:56.000Z | 2022-01-19T16:37:56.000Z | pysatMadrigal/__init__.py | pysat/pysatMadrigal | 02681e2be6aff3a360d1bf8c48a11988d94ad3ce | [
"BSD-3-Clause"
] | 42 | 2020-04-24T04:33:49.000Z | 2022-03-01T19:11:16.000Z | pysatMadrigal/__init__.py | pysat/pysatMadrigal | 02681e2be6aff3a360d1bf8c48a11988d94ad3ce | [
"BSD-3-Clause"
] | 1 | 2021-08-19T21:21:18.000Z | 2021-08-19T21:21:18.000Z | from pysatMadrigal import instruments # noqa F401
from pysatMadrigal import utils # noqa F401
| 32 | 50 | 0.8125 | 12 | 96 | 6.5 | 0.583333 | 0.435897 | 0.589744 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075 | 0.166667 | 96 | 2 | 51 | 48 | 0.9 | 0.197917 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
a9c6e5b3bd1dcd49706f614e12bea5b880b991b6 | 46,407 | py | Python | preprocessing/frequent_words.py | vendi12/tweet_cluster | f84f0bb237f91822b4560952115011a3388b43bd | [
"MIT"
] | null | null | null | preprocessing/frequent_words.py | vendi12/tweet_cluster | f84f0bb237f91822b4560952115011a3388b43bd | [
"MIT"
] | null | null | null | preprocessing/frequent_words.py | vendi12/tweet_cluster | f84f0bb237f91822b4560952115011a3388b43bd | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
'''
Top 5,000 lemmas list from http://www.wordfrequency.info
'''
STOPLIST = ['limited', 'similarity', 'magnetic', 'personally', 'dynamic', 'yellow', 'four', 'protest', 'sleep', 'controversial', 'mansion', 'grandparent', 'ridiculous', 'captain', 'hate', 'aggression', 'forget', 'whose', 'voter', 'violate', 'eligible', 'electricity', 'disability', 'bike', 'restriction', 'teaspoon', 'under', 'teaching', 'sorry', 'pride', 'worth', 'merchant', 'statute', 'risk', 'blanket', 'rise', 'every', 'govern', 'affect', 'demographic', 'vast', 'school', 'scholar', 'investigator', 'wooden', 'conceive', 'solution', 'frozen', 'convenience', 'debris', 'math', 'calendar', 'cholesterol', 'enhance', 'reliable', 'triumph', 'clothes', 'enjoy', 'disclose', 'charter', 'force', 'tired', 'awake', 'consistent', 'foreigner', 'direct', 'pulse', 'horn', 'chef', 'elegant', 'second', 'street', 'estimated', 'ideology', 'monster', 'even', 'employ', 'disk', 'hide', 'pace', 'integrity', 'cooking', 'troubled', 'liberty', 'spokesman', 'hostile', 'above', 'conduct', 'supplier', 'new', 'net', 'increasing', 'ever', 'specialist', 'hero', 'reporter', 'herb', 'never', 'here', 'protection', 'studio', 'pursuit', 'active', 'path', 'interpret', 'celebration', 'dry', 'voice', 'daughter', 'forum', 'auction', 'study', 'economics', 'jail', 'controversy', 'credit', 'mentor', 'smoke', 'permit', 'military', 'suitable', 'punishment', 'diplomat', 'criticism', 'golden', 'fantastic', 'divide', 'campaign', 'straw', 'replace', 'county', 'visible', 'moral', 'diabetes', 'glance', 'total', 'unit', 'plot', 'would', 'army', 'carpet', 'hospital', 'hungry', 'negative', 'bronze', 'afterward', 'elect', 'foster', 'call', 'therefore', 'recommend', 'strike', 'survive', 'sexy', 'type', 'tell', 'coin', 'breathe', 'holy', 'relax', 'successful', 'remark', 'expose', 'depression', 'aware', 'warn', 'phone', 'warm', 'adult', 'excellent', 'organic', 'hole', 'hold', 'squad', 'trait', 'must', 'shoot', 'join', 'room', 'characterize', 'pursue', 'work', 'gasoline', 'roof', 'obstacle', 'modify', 'era', 'install', 'elbow', 'my', 'example', 'phrase', 'boyfriend', 'impose', 'estate', 'give', 'household', 'organized', 'frown', 'involve', 'currency', 'hormone', 'want', 'counseling', 'attract', 'rid', 'guarantee', 'ceremony', 'autonomy', 'end', 'recovery', 'thing', 'provide', 'mouse', 'travel', 'awareness', 'feature', 'machine', 'how', 'hot', 'significance', 'answer', 'gate', 'ordinary', 'chase', 'beach', 'classify', 'massive', 'badly', 'regional', 'minority', 'beauty', 'mess', 'ladder', 'after', 'loyalty', 'lab', 'wrong', 'fragile', 'lay', 'curiosity', 'president', 'law', 'lap', 'excited', 'slavery', 'purchase', 'attempt', 'third', 'amid', 'appreciate', 'Japanese', 'greet', 'childhood', 'patent', 'complexity', 'maintain', 'green', 'ultimate', 'enter', 'diary', 'democratic', 'order', 'wind', 'wine', 'origin', 'interpretation', 'feedback', 'office', 'deck', 'consent', 'over', 'vary', 'suspend', 'oven', 'innovative', 'lightning', 'TRUE', 'mayor', 'before', 'forehead', 'heritage', 'fit', 'personal', 'fix', 'striking', 'writing', 'better', 'production', 'compelling', 'encounter', 'fade', 'persist', 'carve', 'hidden', 'overcome', 'virtually', 'versus', 'then', 'them', 'tourist', 'combination', 'spray', 'weakness', 'workout', 'safe', 'disturb', 'break', 'band', 'therapy', 'they', 'interrupt', 'altogether', 'one', 'tourism', 'silver', 'bank', 'bread', 'rhetoric', 'meat', 'oxygen', 'inventory', 'debut', 'leading', 'explode', 'victory', 'reasonable', 'each', 'side', 'bone', 'mean', 'prohibit', 'financial', 'telescope', 'fairly', 'series', 'initiate', 'aide', 'carry', 'used', 'trading', 'rental', 'laboratory', 'dawn', 'collector', 'ring', 'whip', 'contend', 'network', 'driving', 'specialty', 'scent', 'ha', 'crucial', 'forty', 'content', 're', 'encourage', 'rare', 'reader', 'medicine', 'surprise', 'newly', 'engineer', 'forth', 'independence', 'foundation', 'Cuban', 'perception', 'barrier', 'associate', 'rail', 'given', 'free', 'standard', 'missionary', 'ancient', 'formation', 'struggle', 'estimate', 'kit', 'publication', 'aisle', 'enormous', 'refugee', 'ritual', 'genre', 'economically', 'adjustment', 'oppose', 'signature', 'kingdom', 'onto', 'overnight', 'user', 'already', 'render', 'Palestinian', 'researcher', 'primary', 'rank', 'hearing', 'restrict', 'wash', 'instruct', 'alarm', 'another', 'AIDS', 'thick', 'electronic', 'illustrate', 'inmate', 'agriculture', 'basketball', 'fence', 'Congress', 'attach', 'top', 'engagement', 'historian', 'approximately', 'fiction', 'gut', 'master', 'too', 'architect', 'percentage', 'cycle', 'bitter', 'listen', 'urban', 'ceiling', 'murder', 'asset', 'radiation', 'happily', 'tool', 'serve', 'wisdom', 'ankle', 'western', 'somewhat', 'evil', 'crawl', 'symptom', 'distance', 'anxiety', 'FALSE', 'target', 'frankly', 'surrounding', 'hike', 'tree', 'likely', 'cite', 'project', 'matter', 'persuade', 'flame', 'cabin', 'historical', 'feeling', 'solely', 'brilliant', 'acquisition', 'bridge', 'fashion', 'willingness', 'runner', 'modern', 'mind', 'mine', 'raw', 'rat', 'seed', 'manner', 'plunge', 'crew', 'seem', 'regulate', 'seek', 'curve', 'seminar', 'relatively', 'cage', 'abroad', 'strength', 'Christian', 'allegedly', 'concrete', 'thoroughly', 'latter', 'responsible', 'snow', 'casual', 'chest', 'contact', 'educator', 'transmit', 'effectiveness', 'unfair', 'hook', 'quarterback', 'blue', 'nobody', 'lion', 'though', 'object', 'sue', 'landmark', 'regular', 'mouth', 'letter', 'entry', 'phase', 'panic', 'grave', 'singer', 'stupid', 'episode', 'observation', 'professor', 'camp', 'metal', 'dog', 'treaty', 'storm', 'aunt', 'principle', 'deeply', 'voting', 'consumer', 'notion', 'dot', 'reserve', 'incorporate', 'selected', 'bomb', 'inspire', 'ambassador', 'visitor', 'retire', 'anniversary', 'radio', 'participate', 'earth', 'availability', 'European', 'busy', 'proposed', 'spite', 'headline', 'elite', 'explain', 'unemployment', 'myth', 'sugar', 'upstairs', 'theme', 'screening', 'rich', 'integrate', 'announce', 'adequate', 'rice', 'personality', 'do', 'assignment', 'mixture', 'colorful', 'commodity', 'endorse', 'stop', 'perceive', 'coast', 'pocket', 'despite', 'report', 'oak', 'Soviet', 'comply', 'unfortunately', 'hall', 'earn', 'bar', 'noon', 'officially', 'passing', 'method', 'twice', 'bad', 'softly', 'troop', 'isolation', 'release', 'implication', 'steak', 'steal', 'steam', 'secretary', 'respond', 'ethical', 'human', 'fair', 'habit', 'bacteria', 'nut', 'testing', 'reporting', 'Christianity', 'resist', 'result', 'cattle', 'fail', 'simultaneously', 'midst', 'best', 'subject', 'peanut', 'instance', 'capacity', 'guilt', 'lots', 'away', 'sail', 'gentleman', 'artificial', 'dock', 'grandmother', 'mud', 'score', 'finger', 'cooperation', 'hopefully', 'approach', 'discovery', 'preserve', 'wage', 'we', 'terms', 'extend', 'nature', 'confusion', 'lover', 'weak', 'however', 'boss', 'southeast', 'retirement', 'wear', 'extent', 'news', 'debt', 'improve', 'cop', 'protect', 'accident', 'cow', 'country', 'ill', 'cup', 'against', 'pat', 'steer', 'receiver', 'distinction', 'contribution', 'argue', 'old-fashioned', 'negotiation', 'prospect', 'presumably', 'tough', 'royal', 'epidemic', 'height', 'hers', 'written', 'initiative', 'pregnancy', 'trust', 'speak', 'conference', 'bathroom', 'clip', 'beef', 'basis', 'union', 'patch', 'three', 'tiny', 'quickly', 'subsidy', 'commission', 'beer', 'much', 'monkey', 'interest', 'basic', 'privilege', 'flexible', 'lovely', 'Thanksgiving', 'warmth', 'teenager', 'life', 'innocent', 'mushroom', 'eastern', 'concerning', 'dismiss', 'worker', 'wish', 'sprinkle', 'lift', 'alley', 'child', 'catch', 'spin', 'chill', 'physician', 'exception', 'east', 'tank', 'conviction', 'publicly', 'intimate', 'strict', 'resume', 'air', 'aim', 'ugly', 'near', 'conspiracy', 'suppose', 'aid', 'property', 'elaborate', 'procedure', 'mistake', 'seven', 'metropolitan', 'dictate', 'it', 'player', 'tissue', 'brake', 'brutal', 'in', 'vendor', 'spectrum', 'ie', 'disappear', 'if', 'intact', 'spy', 'prior', 'perform', 'suggest', 'make', 'wound', 'airport', 'beside', 'complex', 'potentially', 'split', 'evaluate', 'vegetable', 'silly', 'several', 'bid', 'wheel', 'independent', 'satellite', 'swell', 'pick', 'hang', 'rain', 'hand', 'delight', 'suburb', 'depict', 'garlic', 'ownership', 'headquarters', 'opportunity', 'tune', 'kid', 'butter', 'specialize', 'shortly', 'scenario', 'suffering', 'inherit', 'ocean', 'academic', 'client', 'greatest', 'mother', 'clerk', 'the', 'corporate', 'musical', 'left', 'authorize', 'endless', 'Dutch', 'background', 'just', 'sentence', 'athletic', 'photo', 'distribute', 'terrorist', 'speculation', 'identify', 'thanks', 'victim', 'yes', 'yet', 'previous', 'terrific', 'unique', 'candidate', 'dining', 'photographer', 'ease', 'character', 'ideal', 'hay', 'shout', 'spread', 'board', 'easy', 'prison', 'consideration', 'save', 'hat', 'humanity', 'opt', 'desire', 'tactic', 'dignity', 'survival', 'possible', 'possibly', 'cultural', 'birth', 'destroy', 'judge', 'shadow', 'highly', 'advanced', 'apart', 'shoulder', 'psychological', 'gift', 'patience', 'gifted', 'manual', 'specific', 'remind', 'officer', 'night', 'security', 'soar', 'sometime', 'attorney', 'right', 'old', 'continuous', 'deal', 'people', 'successfully', 'somehow', 'dead', 'transaction', 'born', 'election', 'escape', 'dear', 'multiple', 'extended', 'guess', 'library', 'short-term', 'consensus', 'toxic', 'humor', 'for', 'bottom', 'purple', 'armed', 'opposite', 'ice', 'creative', 'everything', 'queen', 'fog', 'landscape', 'convict', 'hurricane', 'participation', 'cord', 'core', 'bold', 'peel', 'marketing', 'payment', 'burn', 'envision', 'deadly', 'confrontation', 'defensive', 'peer', 'post', 'manufacturing', 'bury', 'cemetery', 'chapter', 'limitation', 'acquire', 'rim', 'donation', 'surround', 'magazine', 'dedicate', 'ensure', 'afternoon', 'horizon', 'steadily', 'commit', 'permission', 'coastal', 'slightly', 'sweater', 'automatically', 'match', 'bath', 'nerve', 'motivate', 'respondent', 'backyard', 'formerly', 'facility', 'civil', 'material', 'puzzle', 'float', 'profession', 'intellectual', 'two', 'down', 'prisoner', 'doctrine', 'wrap', 'Korean', 'stumble', 'parade', 'rely', 'crowd', 'African-American', 'medal', 'support', 'initial', 'legislation', 'transform', 'sunlight', 'fight', 'fucking', 'clinical', 'editor', 'way', 'music', 'regain', 'war', 'happy', 'fork', 'head', 'medium', 'DNA', 'form', 'offer', 'heal', 'lawn', 'landing', 'failure', 'heat', 'hear', 'elderly', 'solar', 'analyst', 'counsel', 'detective', 'array', 'assessment', 'inside', 'maximum', 'until', 'crystal', 'decorate', 'emotional', 'accelerate', 'diagnose', 'passenger', 'adopt', 'liberal', 'classic', 'toss', 'wealthy', 'tournament', 'one-third', 'textbook', 'evidence', 'harmony', 'exist', 'prayer', 'accounting', 'ship', 'sympathy', 'trip', 'shit', 'physical', 'dying', 'floor', 'whereas', 'stake', 'generally', 'actor', 'reality', 'mm-hmm', 'interested', 'role', 'ambitious', 'digital', 'test', 'tie', 'smell', 'roll', 'picture', 'intend', "o'clock", 'football', 'update', 'diet', 'journey', 'intent', 'award', 'businessman', 'variable', 'dense', 'hurt', 'weekend', 'exhibition', 'billion', 'grief', 'regime', 'faster', 'bullet', 'glass', 'assume', 'typically', 'interval', 'flag', 'daily', 'jacket', 'rough', 'reception', 'time', 'push', 'serious', 'quantity', 'cooperate', 'hope', 'slope', 'coach', 'chain', 'impression', 'dance', 'skip', 'global', 'rod', 'focus', 'invent', 'integration', 'manager', 'selection', 'grateful', 'skin', 'battle', 'chair', 'milk', 'row', 'certainly', 'suicide', 'depend', 'zone', 'chew', 'technique', 'father', 'passage', 'environment', 'finally', 'behave', 'rhythm', 'terror', 'sovereignty', 'retreat', 'southwest', 'me', 'protective', 'string', 'advantage', 'recipe', 'seemingly', 'choice', 'liability', 'anonymous', 'cook', 'word', 'trouble', 'blast', 'exact', 'minute', 'cool', 'perceived', 'impressive', 'level', 'tear', 'die', 'dig', 'brother', 'invade', 'leave', 'item', 'settle', 'offering', 'theological', 'team', 'quick', 'guy', 'dip', 'round', 'added', 'prevent', 'spiritual', 'revolution', 'regret', 'trend', 'sigh', 'discover', 'sign', 'cost', 'dried', 'sexually', 'run', 'bake', 'chicken', 'corridor', 'cargo', 'educate', 'appear', 'candle', 'assistance', 'mall', 'melt', 'supporter', 'current', 'health', 'suspect', 'bombing', 'international', 'appeal', 'boost', 'corn', 'terrorism', 'satisfy', 'pose', 'jury', 'mechanic', 'heavily', 'funeral', 'makeup', 'understanding', 'water', 'contemplate', 'baseball', 'twentieth', 'Ms', 'Mr', 'alone', 'along', 'teacher', 'change', 'wait', 'box', 'boy', 'enroll', 'thirty', 'discount', 'healthy', 'shift', 'guilty', 'commonly', 'proud', 'trial', 'suggestion', 'usually', 'weird', 'pillow', 'bolt', 'emerge', 'teenage', 'treasure', 'love', 'extra', 'merely', 'prefer', 'shark', 'logical', 'bloody', 'flexibility', 'texture', 'divorce', 'touchdown', 'rarely', 'crisis', 'market', 'everybody', 'wealth', 'working', 'prove', 'sake', 'positive', 'angry', 'scan', 'visit', 'tightly', 'intense', 'live', 'opposed', 'memory', 'cocaine', 'scope', 'associated', 'theoretical', 'today', 'stomach', 'entrance', 'fierce', 'riot', 'irony', 'club', 'flying', 'acceptable', 'curriculum', 'envelope', 'validity', 'chocolate', 'downtown', 'fuel', 'visual', 'drown', 'everywhere', 'virtue', 'effort', 'behalf', 'fly', 'honey', 'instructional', 'German', 'car', 'originally', 'cap', 'abortion', 'soul', 'cat', 'counselor', 'soup', 'can', 'tragic', 'growing', 'cab', 'arrive', 'laughter', 'streak', 'claim', 'crazy', 'performer', 'figure', 'predict', 'attribute', 'chip', 'arise', 'agent', 'sample', 'drawer', 'critic', 'chin', 'council', 'sharp', 'offense', 'provoke', 'clothing', 'occur', 'pink', 'grandchild', 'productive', 'winter', 'fortunately', 'discussion', 'Persian', 'dinner', 'pine', 'write', 'immune', 'vital', 'fourth', 'elephant', 'tile', 'plus', 'absolute', 'economy', 'map', 'product', 'designer', 'laundry', 'huge', 'may', 'neat', 'sacrifice', 'southern', 'lucky', 'membership', 'produce', 'mad', 'date', 'such', 'suck', 'truly', 'data', 'grow', 'man', 'classroom', 'stress', 'natural', 'remember', 'adviser', 'beam', 'whenever', 'maybe', 'wheelchair', 'explicit', 'tale', 'inform', 'switch', 'so', 'silence', 'basket', 'tall', 'talk', 'typical', 'exclusive', 'serving', 'Olympics', 'indeed', 'mainly', 'denial', 'damage', 'stability', 'brain', 'statistical', 'shake', 'dilemma', 'cold', 'still', 'tendency', 'derive', 'group', 'thank', 'acknowledge', 'interesting', 'presence', 'amazing', 'sensitivity', 'platform', 'window', 'suspicion', 'farmer', 'policy', 'mail', 'main', 'instantly', 'lonely', 'garbage', 'finance', 'thereby', 'civilian', 'nod', 'killer', 'introduce', 'nation', 'interview', 'half', 'not', 'developmental', 'now', 'provision', 'discuss', 'nor', 'possess', 'administer', 'term', 'ancestor', 'equality', 'name', 'entrepreneur', 'opera', 'uncertain', 'drop', 'realistic', 'peasant', 'fisherman', 'rock', 'entirely', 'quarter', 'English', 'tide', 'domain', 'square', 'significantly', 'improved', 'yeah', 'balanced', 'fever', 'sponsor', 'year', 'assumption', 'girl', 'worried', 'description', 'album', 'living', 'ultimately', 'accomplish', 'container', 'space', 'profit', 'nonprofit', 'furthermore', 'factory', 'increase', 'seriously', 'investigation', 'trauma', 'rational', 'formula', 'dominate', 'sensor', 'correct', 'juror', 'inevitably', 'theory', 'steep', 'unexpected', 'lawmaker', 'million', 'seventh', 'possibility', 'quite', 'complicated', 'rebel', 'besides', 'uncover', 'obligation', 'marine', 'inevitable', 'card', 'care', 'advance', 'training', 'language', 'ministry', 'transition', 'programming', 'modest', 'honest', 'motion', 'turn', 'place', 'husband', 'swing', 'promotion', 'widow', 'think', 'frequent', 'first', 'emotion', 'cheese', 'constantly', 'saving', 'revenue', 'sock', 'coming', 'symbolic', 'yourself', 'long', 'directly', 'invasion', 'vote', 'impossible', 'message', 'accountability', 'open', 'tomorrow', 'accommodate', 'size', 'sheep', 'city', 'little', 'necessarily', 'sheet', 'silent', 'district', 'meaningful', 'bite', 'waist', 'plastic', 'anyone', 'indicate', 'conservation', 'draft', 'convention', 'legally', 'white', 'frame', 'friend', 'hug', 'fraction', 'mostly', 'that', 'season', 'universe', 'huh', 'accurately', 'artifact', 'broadcast', 'Hispanic', 'criticize', 'butt', 'copy', 'than', 'specify', 'population', 'wide', 'television', 'translate', 'effective', 'depressed', 'crack', 'require', 'crowded', 'diplomatic', 'recruit', 'future', 'venture', 'aesthetic', 'surprisingly', 'asleep', 'outcome', 'driveway', 'and', 'gathering', 'illness', 'angel', 'pro', 'parking', 'argument', 'slap', 'sad', 'realm', 'say', 'slam', 'rent', 'anger', 'breakfast', 'recover', 'any', 'elsewhere', 'publicity', 'descend', 'conversion', 'uncertainty', 'Jewish', 'veteran', 'isolate', 'aside', 'note', 'equipment', 'emphasis', 'potential', 'take', 'online', 'objective', 'performance', 'wonder', 'atop', 'channel', 'jungle', 'butterfly', 'urge', 'begin', 'sure', 'pain', 'shade', 'trace', 'normal', 'track', 'price', 'correlation', 'molecule', 'pastor', 'longtime', 'assault', 'importantly', 'mate', 'pair', 'knee', 'collar', 'mainstream', 'icon', 'forever', 'operate', 'especially', 'surprising', 'egg', 'average', 'later', 'steady', 'drive', 'tooth', 'managing', 'nuclear', 'professional', 'senior', 'salt', 'repeatedly', 'shop', 'rating', 'walking', 'shot', 'show', 'cheat', 'feminist', 'cheap', 'contemporary', 'merit', 'fifty', 'bright', 'therapist', 'shoe', 'pump', 'threshold', 'corner', 'aggressive', 'fifth', 'ground', 'slot', 'slow', 'ratio', 'stair', 'title', 'devote', 'proportion', 'enough', 'fluid', 'crime', 'only', 'wood', 'black', 'hockey', 'enthusiasm', 'congressional', 'dispute', 'explosion', 'correctly', 'get', 'hostage', 'assistant', 'horrible', 'expectation', 'mission', 'condemn', 'nearly', 'plaintiff', 'boundary', 'celebrate', 'secondary', 'prime', 'reveal', 'regarding', 'pistol', 'resource', 'skull', 'aluminum', 'artist', 'leather', 'seldom', 'middle-class', 'borrow', 'yield', 'morning', 'naked', 'scientist', 'dough', 'tent', 'where', 'vision', 'concert', 'enact', 'burst', 'physically', 'globe', 'seat', 'relative', 'infrastructure', 'college', 'existence', 'surgery', 'sport', 'dancing', 'concern', 'lost', 'detect', 'brave', 'vertical', 'mortgage', 'federal', 'subsequent', 'review', 'label', 'outside', 'bureau', 'between', 'import', 'reading', 'across', 'uh', 'arrival', 'notice', 'Bible', 'parent', 'parental', 'screen', 'preach', 'supermarket', 'killing', 'blame', 'concentrate', 'spare', 'article', 'spark', 'interfere', 'come', 'undermine', 'recall', 'reaction', 'installation', 'talented', 'tuck', 'many', 'region', 'according', 'contract', 'somewhere', 'tour', 'columnist', 'workplace', 'senator', 'expression', 'nearby', 'duty', 'among', 'cancer', 'color', 'pot', 'jeans', 'period', 'insist', 'satisfaction', 'twin', 'medication', 'learning', 'approval', 'cancel', 'moreover', 'poll', 'boat', 'logic', 'turkey', 'unusual', 'boil', 'teammate', 'capable', 'stretch', 'west', 'rebuild', 'vacation', 'undertake', 'mark', 'breath', 'workshop', 'combined', 'peaceful', 'helicopter', 'hardly', 'abstract', 'differently', 'engine', 'direction', 'enable', 'transportation', 'confuse', 'supervisor', 'filter', 'thousand', 'blessing', 'observe', 'wake', 'minister', 'instant', 'Greek', 'careful', 'spirit', 'those', 'pilot', 'case', 'alike', 'myself', 'developing', 'these', 'sauce', 'mount', 'conception', 'cash', "n't", 'cast', 'diverse', 'ongoing', 'pizza', 'residence', 'newspaper', 'situation', 'margin', 'cart', 'fool', 'orientation', 'eventually', 'determination', 'soil', 'mill', 'purse', 'coffee', 'quiet', 'middle', 'good', 'embrace', 'somebody', 'protein', 'technology', 'everyday', 'worry', 'helmet', 'different', 'participant', 'harsh', 'considerable', 'author', 'pay', 'nail', 'bowl', 'doorway', 'same', 'check', 'fragment', 'speech', 'reminder', 'document', 'pan', 'disappointment', 'week', 'exhaust', 'finish', 'closest', 'I', 'socially', 'nest', 'assist', 'driver', 'companion', 'director', 'running', 'fruit', 'widespread', 'delicate', 'statue', 'lemon', 'totally', 'tradition', 'mentally', 'weave', 'drain', 'theater', 'largely', 'minimize', 'no', 'constitutional', 'roughly', 'objection', 'severe', 'without', 'solve', 'relief', 'bottle', 'coordinate', 'model', 'reward', 'dimension', 'justify', 'summer', 'United', 'being', 'money', 'rest', 'civic', 'violent', 'kill', 'aspect', 'touch', 'flavor', 'speed', 'weekly', 'blow', 'announcement', 'death', 'flood', 'thinking', 'rose', 'and/or', 'except', 'improvement', 'instrument', 'setting', 'pile', 'treatment', 'republic', 'extensive', 'structural', 'pill', 'momentum', 'real', 'boast', 'around', 'spectacular', 'desperately', 'read', 'colony', 'bow', 'dark', 'warehouse', 'inflation', 'traffic', 'pop', 'mom', 'vacuum', 'world', 'railroad', 'execution', 'lady', 'calculate', 'dare', 'furniture', 'accusation', 'fortune', 'stranger', 'collective', 'pole', 'preference', 'shrink', 'annually', 'chamber', 'benefit', 'maker', 'either', 'fully', 'output', 'tower', 'sidewalk', 'bench', 'twelve', 'verbal', 'doll', 'competition', 'cognitive', 'shallow', 'astronomer', 'respect', 'root', 'racial', 'divine', 'deputy', 'provided', 'slice', 'mood', 'confirm', 'colonial', 'tube', 'legal', 'conservative', 'pioneer', 'throat', 'critical', 'exit', 'deficit', 'provider', 'moderate', 'decent', 'knife', 'refer', 'voluntary', 'welcome', 'assembly', 'scientific', 'power', 'quit', 'sixth', 'equivalent', 'fitness', 'communicate', 'broker', 'hunt', 'broken', 'evolve', 'leadership', 'inherent', 'assemble', 'exciting', 'throw', 'manufacturer', 'on', 'stone', 'ok', 'oh', 'island', 'industry', 'violence', 'favorite', 'drag', 'meal', 'practical', 'tremendous', 'wolf', 'stand', 'neighbor', 'act', 'mixed', 'bond', 'or', 'road', 'tribe', 'quietly', 'burning', 'communication', 'image', 'skilled', 'rolling', 'secure', 'involvement', 'concede', 'determine', 'strip', 'elementary', 'valuable', 'fantasy', 'your', 'intervention', 'stare', 'tropical', 'legacy', 'log', 'prepare', 'area', 'strictly', 'there', 'hey', 'start', 'gaze', 'low', 'lot', 'valley', 'bubble', 'fish', 'shower', 'complete', 'regard', 'pleased', 'cabinet', 'two-thirds', 'cottage', 'promote', 'with', 'feather', 'handsome', 'pull', 'hire', 'waste', 'romantic', 'rage', 'pond', 'potato', 'bucket', 'dirty', 'rehabilitation', 'organize', 'grass', 'agree', 'connect', 'strongly', 'toilet', 'ad', 'taste', 'certain', 'describe', 'deer', 'sales', 'deep', 'general', 'imagination', 'examine', 'reliability', 'at', 'file', 'girlfriend', 'politics', 'bishop', 'film', 'fill', 'again', 'cling', 'deserve', 'storage', 'casualty', 'insurance', 'rubber', 'field', 'valid', 'lifestyle', 'acceptance', 'you', 'trash', 'poor', 'briefly', 'championship', 'regardless', 'separate', 'shelter', 'symbol', 'narrative', 'drift', 'important', 'peak', 'coverage', 'tackle', 'spouse', 'pool', 'psychology', 'building', 'bulk', 'Mexican', 'wife', 'invest', 'odds', 'mask', 'gross', 'custody', 'dramatic', 'mass', 'behavioral', 'nutrient', 'resolution', 'original', 'external', 'represent', 'all', 'consider', 'founder', 'suburban', 'lack', 'month', 'concept', 'deadline', 'cooperative', 'welfare', 'dish', 'follow', 'settlement', 'religious', 'reluctant', 'glimpse', 'apartment', 'remarkable', 'hunting', 'whoever', 'former', 'to', 'tail', 'program', 'ecosystem', 'painter', 'scratch', 'smile', 'self-esteem', 'presentation', 'norm', 'adapt', 'sound', 'trim', 'woman', 'appointment', 'song', 'very', 'horror', 'fat', 'psychologist', 'fan', 'decide', 'fall', 'awful', 'ticket', 'difference', 'condition', 'trigger', 'delivery', 'heaven', 'cable', 'stimulus', 'evaluation', 'celebrity', 'frontier', 'list', 'mild', 'punish', 'grandfather', 'large', 'sand', 'adjust', 'consultant', 'small', 'biological', 'tribal', 'elevator', 'neighborhood', 'ten', 'tea', 'straighten', 'past', 'rate', 'invention', 'design', 'perspective', 'lawyer', 'pass', 'investment', 'ribbon', 'conscience', 'trick', 'what', 'Latin', 'darkness', 'clock', 'sun', 'section', 'dirt', 'fatigue', 'brief', 'devastating', 'crush', 'version', 'nurse', 'public', 'contrast', 'movement', 'full', 'loose', 'component', 'supposedly', 'forgive', 'operating', 'tolerance', 'trunk', 'grape', 'strong', 'tragedy', 'arena', 'publisher', 'search', 'substance', 'naval', 'ahead', 'extraordinary', 'compliance', 'qualify', 'experience', 'youngster', 'soldier', 'amount', 'advertising', 'social', 'action', 'narrow', 'trainer', 'via', 'depart', 'family', 'suddenly', 'transit', 'ask', 'sanction', 'actress', 'establish', 'select', 'readily', 'conventional', 'attendance', 'eye', 'proceed', 'injure', 'distinct', 'revelation', 'etc', 'charge', 'T-shirt', 'almost', 'competitor', 'Russian', 'injury', 'minor', 'more', 'teen', 'flat', 'diamond', 'door', 'incentive', 'division', 'company', 'emission', 'telephone', 'excuse', 'American', 'stick', 'impulse', 'particular', 'known', 'commander', 'glad', 'town', 'none', 'fatal', 'shared', 'hour', 'cluster', 'science', 'equation', 'coalition', 'dramatically', 'plea', 'remain', 'sudden', 'nine', 'learn', 'abandon', 'male', 'scramble', 'history', 'beautiful', 'compare', 'prompt', 'brown', 'credibility', 'share', 'accept', 'sphere', 'minimum', 'sense', 'guitar', 'station', 'airplane', 'dress', 'species', 'biography', 'reputation', 'information', 'cure', 'court', 'goal', 'liver', 'rather', 'technological', 'comfort', 'utilize', 'fame', 'occasionally', 'earnings', 'sacred', 'reject', 'stir', 'goat', 'creature', 'recording', 'plant', 'sandwich', 'okay', 'circuit', 'flip', 'advice', 'catalog', 'ego', 'reflect', 'interior', 'plane', 'lighting', 'blood', 'faculty', 'develop', 'ethics', 'response', 'nonetheless', 'a', 'refuse', 'short', 'resemble', 'coat', 'doctor', 'departure', 'coal', 'shore', 'responsibility', 'fundamental', 'media', 'pleasure', 'dream', 'infant', 'help', 'pant', 'essence', 'soon', 'trade', 'disabled', 'radar', 'attitude', 'paper', 'through', 'committee', 'hell', 'suffer', 'its', 'register', 'romance', 'style', 'rapidly', 'starting', 'taxpayer', 'sodium', 'pray', 'sale', 'unity', 'actually', 'late', 'grade', 'absence', 'projection', 'radical', 'inquiry', 'pork', 'soccer', 'might', 'alter', 'ally', 'colleague', 'evening', 'return', 'calm', 'severely', 'food', 'propose', 'adoption', 'hunter', 'predator', 'encouraging', 'framework', 'compound', 'foot', 'speculate', 'complain', 'association', 'mystery', 'easily', 'holiday', 'pregnant', 'always', 'policeman', 'normally', 'capability', 'someone', 'eager', 'boring', 'found', 'friendship', 'button', 'trailer', 'heavy', 'status', 'harm', 'everyone', 'mental', 'weight', 'generation', 'house', 'energy', 'hard', 'reduce', 'idea', 'gun', 'kneel', 'oil', 'expect', 'fist', 'measurement', 'practitioner', 'operation', 'beyond', 'event', 'really', 'anxious', 'flower', 'funding', 'alcohol', 'since', 'prosecution', 'testify', 'publish', 'research', 'safety', 'hill', 'wagon', 'print', 'issue', 'highway', 'ass', 'fiber', 'strain', 'belief', 'rumor', 'circumstance', 'disagree', 'confidence', 'story', 'pleasant', 'difficulty', 'reason', 'base', 'cookie', 'imagine', 'put', 'teach', 'weed', 'beginning', 'generate', 'temperature', 'revolutionary', 'definition', 'service', 'thread', 'launch', 'similarly', 'terrible', 'enforce', 'terribly', 'uncle', 'threat', 'twenty', 'consequence', 'undergo', 'assign', 'feed', 'major', 'upper', 'feel', 'relate', 'number', 'well-known', 'elder', 'Indian', 'construct', 'changing', 'halfway', 'traveler', 'miss', 'horse', 'guest', 'jet', 'script', 'introduction', 'together', 'interact', 'bay', 'least', 'cartoon', 'paint', 'regulation', 'wonderful', 'expand', 'statement', 'marble', 'protocol', 'compromise', 'hundred', 'banana', 'store', 'headache', 'option', 'relationship', 'behind', 'hotel', 'park', 'immediate', 'prediction', 'appreciation', 'part', 'favorable', 'rifle', 'servant', 'translation', 'believe', 'convinced', 'grace', 'reflection', 'king', 'kind', 'scheme', 'dam', 'mortality', 'double', 'anticipate', 'instruction', 'vocal', 'youth', 'nevertheless', 'marriage', 'supposed', 'sole', 'toward', 'declare', 'risky', 'motivation', 'ranch', 'basically', 'outstanding', 'strengthen', 'God', 'plead', 'relieve', 'juice', 'substantial', 'contest', 'blond', 'concentration', 'vulnerable', 'collapse', 'sell', 'lie', 'breathing', 'offender', 'drunk', 'depending', 'superior', 'essentially', 'self', 'cave', 'port', 'Arab', 'majority', 'ironically', 'internal', 'lip', 'chairman', 'finding', 'donor', 'gym', 'exploit', 'province', 'play', 'towards', 'electric', 'retired', 'needle', 'quote', 'reach', 'chart', 'react', 'most', 'virus', 'shareholder', 'plan', 'significant', 'nothing', 'faint', 'extremely', 'achievement', 'constitution', 'salary', 'mineral', 'bug', 'seize', 'mobile', 'clear', 'sometimes', 'cover', 'traditional', 'artistic', 'clean', 'weigh', 'discourage', 'scholarship', 'physics', 'sector', 'particularly', 'phenomenon', 'gold', 'consult', 'disappointed', 'session', 'relation', 'carefully', 'routinely', 'uniform', 'fine', 'find', 'occupation', 'impact', 'access', 'giant', 'northern', 'justice', 'nervous', 'ruin', 'writer', 'French', 'penalty', 'jazz', 'pretty', 'factor', 'circle', 'hip', 'his', 'hit', 'meanwhile', 'dependent', 'secular', 'express', 'sunny', 'grocery', 'blank', 'famous', 'courage', 'breast', 'closely', 'reply', 'during', 'silk', 'him', 'enemy', 'Olympic', 'resolve', 'progressive', 'cry', 'remove', 'proclaim', 'banking', 'investigate', 'common', 'activity', 'sustainable', 'cease', 'river', 'rear', 'spell', 'set', 'art', 'dump', 'principal', 'intelligence', 'sex', 'culture', 'see', 'defense', 'bare', 'migration', 'sea', 'tender', 'close', 'expertise', 'arm', 'barn', 'seal', 'preparation', 'stance', 'expert', 'bride', 'movie', 'wow', 'exchange', 'please', 'pickup', 'various', 'happiness', 'probably', 'donate', 'numerous', 'corruption', 'thrive', 'endure', 'available', 'recently', 'iron', 'missing', 'initially', 'attention', 'punch', 'incident', 'succeed', 'accessible', 'opposition', 'African', 'distinguish', 'observer', 'legislature', 'both', 'full-time', 'last', 'reverse', 'license', 'restaurant', 'oversee', 'drinking', 'annual', 'foreign', 'sensitive', 'connection', 'context', 'residential', 'long-term', 'poverty', 'whole', 'experimental', 'load', 'consume', 'point', 'simple', 'sweet', 'loan', 'address', 'sweep', 'community', 'dancer', 'simply', 'village', 'vessel', 'throughout', 'expensive', 'belt', 'decline', 'devil', 'raise', 'monthly', 'create', 'overlook', 'political', 'due', 'strategy', 'whom', 'secret', 'damn', 'maintenance', 'threaten', 'brick', 'territory', 'meeting', 'premise', 'empty', 'firm', 'dialogue', 'alongside', 'flight', 'champion', 'gay', 'buck', 'fire', 'wet', 'gas', 'convert', 'else', 'fund', 'understand', 'specifically', 'demand', 'efficient', 'wedding', 'instructor', 'politician', 'look', 'solid', 'belly', 'straight', 'bill', 'tolerate', 'budget', 'governor', 'admire', 'technical', 'while', 'sexuality', 'kick', 'behavior', 'error', 'fun', 'fleet', 'guide', 'loop', 'pack', 'exclude', 'swim', 'pound', 'costly', 'emerging', 'century', 'wherever', 'rip', 'itself', 'ready', 'carbon', 'vanish', 'rib', 'chop', 'increased', 'funny', 'shirt', 'anymore', 'grant', 'belong', 'discourse', 'emphasize', 'widely', 'grand', 'advise', 'dozen', 'composition', 'conflict', 'development', 'hallway', 'calculation', 'literature', 'temporary', 'fiscal', 'regulator', 'mountain', 'bonus', 'comprehensive', 'gallery', 'defeat', 'yesterday', 'high-tech', 'moment', 'traditionally', 'purpose', 'necessity', 'genuine', 'yours', 'implement', 'recent', 'early', 'lower', 'task', 'database', 'substantially', 'cheek', 'suspicious', 'anybody', 'analysis', 'obviously', 'person', 'cheer', 'edge', 'withdraw', 'grip', 'organization', 'chemistry', 'prize', 'spend', 'prevention', 'know', 'coup', 'overwhelm', 'exotic', 'competitive', 'isolated', 'shape', 'openly', 'Internet', 'useful', 'alternative', 'couch', 'fur', 'continent', 'timber', 'discipline', 'porch', 'cut', 'presidency', 'also', 'informal', 'recipient', 'admission', 'danger', 'automobile', 'organism', 'exclusively', 'contractor', 'source', 'cue', 'deliberately', 'location', 'homeland', 'snap', 'input', 'administrative', 'remaining', 'surprised', 'build', 'shove', 'complaint', 'confession', 'march', 'emergency', 'format', 'big', 'couple', 'advocate', 'game', 'quest', 'insert', 'towel', 'bit', 'buddy', 'formal', 'insect', 'knock', 'promising', 'OK', 'onion', 'spine', 'confront', 'signal', 'clay', 'glove', 'rider', 'ignore', 'collect', 'destination', 'continue', 'indication', 'popular', 'disorder', 'essential', 'privately', 'premium', 'scare', 'often', 'absolutely', 'spring', 'deposit', 'creation', 'some', 'back', 'bounce', 'ah', 'marker', 'economic', 'palm', 'fabric', 'sight', 'mirror', 'curious', 'accomplishment', 'pale', 'ourselves', 'gradually', 'scale', 'jewelry', 'pet', 'decision', 'shall', 'audience', 'per', 'seller', 'religion', 'pen', 'civilization', 'eliminate', 'innovation', 'temple', 'minimal', 'nose', 'be', 'patient', 'rub', 'salmon', 'processing', 'continuing', 'agreement', 'stem', 'outdoor', 'step', 'nowhere', 'metaphor', 'by', 'vaccine', 'shine', 'faith', 'wildlife', 'anything', 'cigarette', 'graduation', 'drama', 'range', 'bless', 'spit', 'charm', 'consequently', 'block', 'pollution', 'repair', 'plenty', 'reinforce', 'into', 'within', 'retailer', 'shed', 'compel', 'appropriate', 'racism', 'primarily', 'contributor', 'lesson', 'guideline', 'bankruptcy', 'arrow', 'statistics', 'classical', 'eleven', 'spending', 'apparently', 'question', 'fast', 'custom', 'occupy', 'doubt', 'refrigerator', 'suit', 'forward', 'analyze', 'athlete', 'opponent', 'considerably', 'himself', 'invite', 'poster', 'precise', 'criteria', 'immigration', 'cloth', 'manipulate', 'properly', 'link', 'alive', 'line', 'junior', 'motive', 'Irish', 'bicycle', 'consist', 'characteristic', 'up', 'us', 'planet', 'greatly', 'exploration', 'prosecutor', 'sum', 'highlight', 'weaken', 'similar', 'medical', 'bell', 'possession', 'popularity', 'constant', 'expedition', 'flow', 'influence', 'commissioner', 'package', 'coordinator', 'single', 'warning', 'defender', 'guidance', 'organizational', 'rally', 'legend', 'prevail', 'questionnaire', 'cotton', 'echo', 'loyal', 'supportive', 'Jew', 'TV', 'ballot', 'peace', 'legislator', 'occasional', 'application', 'definitely', 'electrical', 'transport', 'income', 'department', 'nice', 'actively', 'draw', 'apparent', 'reportedly', 'lend', 'tablespoon', 'enterprise', 'AM', 'resign', 'helpful', 'lens', 'correspondent', 'meaning', 'reservation', 'clue', 'desert', 'structure', 'ago', 'lane', 'land', 'fighter', 'practically', 'blend', 'age', 'required', 'orbit', 'scream', 'depth', 'summit', 'Asian', 'literally', 'far', 'fresh', 'hello', 'once', 'essay', 'code', 'partial', 'rabbit', 'hint', 'resistance', 'existing', 'alien', 'tip', 'gang', 'go', 'golf', 'moon', 'compose', 'carrot', 'concerned', 'hunger', 'young', 'send', 'literary', 'stable', 'suite', 'include', 'friendly', 'ingredient', 'random', 'garden', 'Canadian', 'dessert', 'genius', 'wave', 'sensation', 'spoon', 'accuse', 'homework', 'surgeon', 'continued', 'wipe', 'arrange', 'entire', 'magic', 'stiff', 'transmission', 'gender', 'marry', 'shock', 'fewer', 'try', 'tunnel', 'race', 'verdict', 'comprise', 'inspection', 'challenge', 'flash', 'someday', 'pretend', 'biology', 'crop', 'jump', 'fold', 'imply', 'video', 'booth', 'uncomfortable', 'subtle', 'wheat', 'dynamics', 'acid', 'odd', 'click', 'folk', 'index', 'poke', 'business', 'dealer', 'squeeze', 'slight', 'cell', 'consistently', 'experiment', 'eating', 'jurisdiction', 'capital', 'bird', 'exercise', 'body', 'freely', 'nominee', 'drill', 'degree', 'leg', 'respectively', 'commercial', 'indicator', 'following', 'explore', 'northeast', 'let', 'sink', 'separation', 'others', 'sing', 'fifteen', 'extreme', 'great', 'engage', 'talent', 'Chinese', 'receive', 'involved', 'casino', 'dissolve', 'implementation', 'survey', 'sword', 'menu', 'opinion', 'climb', 'gene', 'honor', 'Muslim', 'apple', 'heart', 'forbid', 'win', 'manage', 'private', 'pasta', 'decrease', 'unhappy', 'motor', 'duck', 'limb', 'apply', 'surveillance', 'cloud', 'standing', 'use', 'fee', 'from', 'stream', 'consumption', 'illegal', 'bush', 'next', 'few', 'camera', 'examination', 'vehicle', 'timing', 'toy', 'themselves', 'panel', 'historically', 'defendant', 'sort', 'pencil', 'comparison', 'tennis', 'patrol', 'musician', 'impress', 'about', 'infection', 'missile', 'trail', 'train', 'sharply', 'baby', 'central', 'harvest', 'courtroom', 'starter', 'charity', 'customer', 'topic', 'account', 'salad', 'this', 'ride', 'when', 'stroke', 'pour', 'recession', 'anywhere', 'obvious', 'thin', 'praise', 'of', 'industrial', 'meet', 'grin', 'scatter', 'bend', 'control', 'Israeli', 'tap', 'patron', 'plate', 'process', 'lock', 'pad', 'tax', 'high', 'effectively', 'tag', 'something', 'slip', 'parish', 'dominant', 'shrimp', 'skirt', 'Christmas', 'rape', 'native', 'sir', 'educational', 'syndrome', 'democracy', 'sit', 'six', 'poetry', 'arrangement', 'delay', 'lamp', 'swear', 'forest', 'animal', 'instead', 'comedy', 'establishment', 'sin', 'stock', 'profile', 'tension', 'administrator', 'attend', 'hesitate', 'farm', 'watch', 'franchise', 'philosophy', 'collection', 'Mrs', 'Roman', 'abuse', 'wrist', 'ethnic', 'engineering', 'tomato', 'light', 'counter', 'robot', 'element', 'chief', 'honestly', 'allow', 'swimming', 'transformation', 'Catholic', 'thigh', 'producer', 'institutional', 'ambition', 'move', 'meter', 'alliance', 'including', 'costume', 'cruise', 'agricultural', 'nasty', 'galaxy', 'bunch', 'perfect', 'chemical', 'outer', 'till', 'labor', 'meantime', 'owe', 'willing', 'firmly', 'choose', 'undergraduate', 'orange', 'criminal', 'theology', 'dad', 'crash', 'pure', 'auto', 'practice', 'bat', 'operator', 'invitation', 'mention', 'snake', 'kiss', 'mandate', 'front', 'republican', 'investor', 'day', 'presidential', 'Supreme', 'lid', 'sleeve', 'chronic', 'edit', 'intelligent', 'university', 'playoff', 'slide', 'magnitude', 'mode', 'truth', 'shortage', 'lung', 'beneath', 'tray', 'experienced', 'fare', 'homeless', 'soften', 'density', 'adventure', 'apologize', 'related', 'constitute', 'society', 'frequency', 'measure', 'our', 'differ', 'sexual', 'wander', 'special', 'out', 'category', 'sentiment', 'bag', 'CEO', 'defend', 'chaos', 'rival', 'upset', 'activist', 'freshman', 'lecture', 'electronics', 'cause', 'integrated', 'red', 'architecture', 'afford', 'shut', 'await', 'alleged', 'spot', 'withdrawal', 'toll', 'ban', 'completely', 'surely', 'British', 'regulatory', 'ridge', 'distinctive', 'likelihood', 'organ', 'scary', 'interaction', 'yard', 'her', 'spill', 'automatic', 'could', 'stereotype', 'keep', 'counterpart', 'conversation', 'length', 'birthday', 'hence', 'retain', 'retail', 'facilitate', 'housing', 'south', 'exam', 'powerful', 'scene', 'sack', 'strategic', 'embarrassed', 'submit', 'owner', 'equity', 'precisely', 'quality', 'disaster', 'legislative', 'bull', 'management', 'reference', 'privacy', 'festival', 'unknown', 'demonstration', 'accent', 'system', 'eighth', 'priority', 'their', 'attack', 'developer', 'apology', 'philosophical', 'perfectly', 'final', 'painful', 'beard', 'enforcement', 'shell', 'accompany', 'gear', 'ash', 'shorts', 'shelf', 'explanation', 'exactly', 'harassment', 'environmental', 'diminish', 'overwhelming', 'herself', 'institution', 'blind', 'neck', 'steel', 'photograph', 'liquid', 'employer', 'bother', 'conscious', 'beg', 'bed', 'bee', 'compensation', 'individual', 'bind', 'dumb', 'sculpture', 'bet', 'exhibit', 'lightly', 'viewer', 'partnership', 'gently', 'comfortable', 'tonight', 'gentle', 'unlikely', 'have', 'portrait', 'need', 'border', 'clearly', 'screw', 'afraid', 'angle', 'soak', 'agency', 'able', 'gravity', 'mechanism', 'mix', 'which', 'truck', 'soap', 'accuracy', 'worldwide', 'unless', 'agenda', 'combat', 'deploy', 'pitcher', 'collaboration', 'who', 'eight', 'palace', 'representation', 'device', 'sophisticated', 'segment', 'why', 'so-called', 'deny', 'placement', 'neighboring', 'gather', 'request', 'disease', 'face', 'pipe', 'hi', 'gesture', 'mechanical', 'occasion', 'Islam', 'painting', 'fact', 'son', 'affair', 'atmosphere', 'currently', 'super', 'cute', 'violation', 'text', 'stove', 'anyway', 'bring', 'planning', 'well-being', 'bedroom', 'portfolio', 'fear', 'economist', 'debate', 'decade', 'staff', 'pause', 'knowledge', 'tire', 'jaw', 'winner', 'jar', 'should', 'unable', 'tape', 'employee', 'piano', 'partially', 'local', 'achieve', 'regularly', 'insight', 'handle', 'beat', 'familiar', 'photography', 'overall', 'rush', 'bear', 'listener', 'joint', 'bean', 'fraud', 'legitimate', 'consecutive', 'buyer', 'entity', 'Italian', 'gray', 'evolution', 'glory', 'tobacco', 'course', 'shopping', 'shy', 'married', 'eyebrow', 'stuff', 'she', 'allegation', 'contain', 'grab', 'fixed', 'builder', 'view', 'toe', 'requirement', 'warrior', 'national', 'pitch', 'terrain', 'immigrant', 'edition', 'intensity', 'computer', 'powder', 'desperate', 'equip', 'stack', 'tumor', 'battery', 'closer', 'wire', 'nationwide', 'closet', 'reform', 'pattern', 'nightmare', 'bias', 'thumb', 'tend', 'favor', 'state', 'proof', 'identification', 'closed', 'routine', 'progress', 'neither', 'Republican', 'midnight', 'PM', 'comparable', 'handful', 'ability', 'opening', 'importance', 'joy', 'breeze', 'deliver', 'Democrat', 'vitamin', 'efficiency', 'job', 'hypothesis', 'PC', 'key', 'police', 'precious', 'outfit', 'lawsuit', 'distribution', 'monitor', 'constraint', 'politically', 'drum', 'career', 'joke', 'equal', 'drug', 'assure', 'admit', 'grain', 'swallow', 'safely', 'otherwise', 'comment', 'detailed', 'relevant', 'unfold', 'conclude', 'wall', 'removal', 'walk', 'invisible', 'laugh', 'cousin', 'disturbing', 'table', 'nomination', 'citizenship', 'blink', 'poem', 'outsider', 'attraction', 'addition', 'discrimination', 'shame', 'mathematics', 'genetic', 'permanent', 'slowly', 'treat', 'entitle', 'whisper', 'poet', 'curtain', 'proposal', 'kitchen', 'summary', 'define', 'diversity', 'league', 'preliminary', 'route', 'sufficient', 'climate', 'assert', 'planner', 'resort', 'ours', 'prescription', 'fellow', 'cruel', 'present', 'volunteer', 'carbohydrate', 'novel', 'laser', 'unlike', 'plain', 'notebook', 'appearance', 'as', 'value', 'will', 'influential', 'deem', 'fault', 'wild', 'balance', 'layer', 'barely', 'resident', 'thus', 'site', 'surface', 'hardware', 'vs', 'optimistic', 'lifetime', 'partner', 'inspector', 'miracle', 'portray', 'capture', 'shooting', 'productivity', 'perhaps', 'balloon', 'administration', 'cross', 'unite', 'campus', 'member', 'cope', 'strange', 'speaker', 'northwest', 'flour', 'shuttle', 'party', 'negotiate', 'cream', 'grasp', 'difficult', 'disc', 'tighten', 'ball', 'slave', 'emotionally', 'flesh', 'absorb', 'drink', 'upon', 'effect', 'beast', 'student', 'dust', 'frequently', 'whale', 'indigenous', 'judgment', 'lobby', 'identity', 'destruction', 'expense', 'off', 'center', 'lunch', 'Senate', 'approve', 'weapon', 'incredible', 'well', 'rack', 'fighting', 'thought', 'sheer', 'banker', 'command', 'personnel', 'hazard', 'position', 'muscle', 'drawing', 'usual', 'restore', 'excessive', 'rocket', 'less', 'increasingly', 'offensive', 'accurate', 'executive', 'domestic', 'obtain', 'dose', 'clinic', 'underlying', 'distant', 'fishing', 'tone', 'skill', 'graduate', 'rescue', 'mysterious', 'web', 'generous', 'rapid', 'tight', 'flee', 'supply', 'sky', 'lake', 'realize', 'arrest', 'add', 'other', 'combine', 'attractive', 'stimulate', 'citizen', 'hurry', 'smart', 'ski', 'unprecedented', 'ought', 'boom', 'identical', 'branch', 'fate', 'government', 'priest', 'execute', 'haul', 'historic', 'five', 'whatever', 'burden', 'desk', 'press', 'immediately', 'prominent', 'loss', 'incredibly', 'necessary', 'like', 'success', 'sofa', 'profound', 'journalism', 'candy', 'testimony', 'garage', 'journalist', 'lose', 'become', 'works', 'soft', 'page', 'heel', 'replacement', 'amendment', 'exceed', 'because', 'functional', 'habitat', 'sequence', 'municipal', 'scared', 'church', 'authority', 'hair', 'growth', 'export', 'convey', 'recommendation', 'proper', 'home', 'empire', 'blade', 'employment', 'recognition', 'shrug', 'cliff', 'leaf', 'lead', 'broad', 'avoid', 'lean', 'e-mail', 'leap', 'sustain', 'passion', 'leader', 'trap', 'locate', 'survivor', 'pepper', 'noise', 'schedule', 'journal', 'expansion', 'likewise', 'pressure', 'host', 'although', 'instinct', 'frustrate', 'refuge', 'stage', 'sister', 'actual', 'extension', 'column', 'freedom', 'consciousness', 'ecological', 'carrier', 'sweat', 'tongue', 'documentary', 'software', 'equally', 'own', 'bulb', 'wilderness', 'previously', 'creativity', 'commitment', 'articulate', 'assess', 'Iraqi', 'guard', 'weather', 'promise', 'brush', 'female', 'freeze', 'luck', 'mere', 'utility', 'judicial', 'processor', 'van', 'additional', 'adolescent', 'confident', 'transfer', 'museum', 'outline', 'reduction', 'intention', 'airline', 'inner', 'frustration', 'inspiration', 'chunk', 'naturally', 'function', 'ghost', 'buy', 'north', 'stadium', 'aircraft', 'bus', 'brand', 'mutter', 'but', 'volume', 'expected', 'neutral', 'construction', 'gain', 'technician', 'remote', 'ear', 'eat', 'he', 'count', 'wise', 'evident', 'whether', 'dangerous', 'official', 'smooth', 'particle', 'distract', 'excitement', 'record', 'below', 'convince', 'limit', 'ruling', 'monument', 'cake', 'demonstrate', 'problem', 'piece', 'display', 'marketplace', 'pin', 'recognize', 'universal', 'contribute', 'pie', 'pig', 'twist', 'sibling', 'partly', 'sneak', 'education', 'pit', 'happen', 'mutual', 'loud', 'variety', 'fascinating', 'corporation', 'boot', 'detail', 'virtual', 'lately', 'Islamic', 'book', 'witness', 'sick', 'outlet', 'wing', 'scandal', 'illusion', 'conclusion', 'repeat', 'star', 'class', 'ideological', 'canvas', 'variation', 'stay', 'chance', 'basement', 'gap', 'exposure', 'appoint', 'rope', 'oral', 'entertainment', 'earthquake', 'rule', 'portion', 'craft', 'compete', 'pension', 'diagnosis', 'Spanish', 'rural', 'yell', 'confess', 'representative']
| 5,156.333333 | 46,293 | 0.623462 | 4,389 | 46,407 | 6.592162 | 0.992709 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.000119 | 0.094188 | 46,407 | 8 | 46,294 | 5,800.875 | 0.688172 | 0.002133 | 0 | 0 | 0 | 0 | 0.623642 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | false | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 9 |
e714c7f5a1729beb41544a4d4ca2bd137c502b43 | 10,350 | py | Python | refresh.py | aruntemme/Covid-19-API | 456d986f5e38bf1fc42a6d7943518d2020ef4759 | [
"MIT"
] | null | null | null | refresh.py | aruntemme/Covid-19-API | 456d986f5e38bf1fc42a6d7943518d2020ef4759 | [
"MIT"
] | null | null | null | refresh.py | aruntemme/Covid-19-API | 456d986f5e38bf1fc42a6d7943518d2020ef4759 | [
"MIT"
] | null | null | null | import pandas as pd
import io
import requests
from country_codes import country_code
import json
import math
import sys
import os
url_confirmed="https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/csse_covid_19_data/csse_covid_19_time_series/time_series_19-covid-Confirmed.csv"
url_deaths="https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/csse_covid_19_data/csse_covid_19_time_series/time_series_19-covid-Deaths.csv"
url_recovered="https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/csse_covid_19_data/csse_covid_19_time_series/time_series_19-covid-Recovered.csv"
def update():
s=requests.get(url_confirmed).content
df_confirmed=pd.read_csv(io.StringIO(s.decode('utf-8')))
s=requests.get(url_deaths).content
df_deaths=pd.read_csv(io.StringIO(s.decode('utf-8')))
s=requests.get(url_recovered).content
df_recovered=pd.read_csv(io.StringIO(s.decode('utf-8')))
json_data_final = {}
json_data_final_countries = {}
#json_data_list = []
json_data_final['confirmed'] = {}
json_data_final['confirmed']['locations'] = []
json_data_final['deaths'] = {}
json_data_final['deaths']['locations'] = []
json_data_final['recovered'] = {}
json_data_final['recovered']['locations'] = []
# Confirmed
tmp_latest_confirmed = int(df_confirmed.sum(axis=0)[-1])
for index, row in df_confirmed.iterrows():
tmp_element = {}
tmp_province = str(row['Province/State'])
tmp_country_name = row['Country/Region']
if(tmp_country_name == 'Brunei'):
tmp_country_name = 'Brunei Darussalam'
elif(tmp_country_name == 'Taiwan*'):
tmp_country_name = 'Taiwan, Province of China'
elif(tmp_country_name == 'Moldova'):
tmp_country_name = 'Moldova, Republic of'
elif(tmp_country_name == 'Venezuala'):
tmp_country_name = 'Venezuela, Bolivarian Republic of'
elif(tmp_country_name == 'Bolivia'):
tmp_country_name = 'Bolivia, Plurinational State of'
elif(tmp_country_name == 'The Bahamas'):
tmp_country_name = 'Bahamas'
elif(tmp_country_name == 'Congo (Kinshasa)' or tmp_country_name == 'Congo (Brazzaville)' ):
tmp_country_name = 'Congo'
elif(tmp_country_name == 'Tanzania'):
tmp_country_name = 'Tanzania, United Republic of'
elif(tmp_country_name == 'Republic of the Congo'):
tmp_country_name = 'Congo, the Democratic Republic of the'
elif(tmp_country_name == "Cote d'Ivoire"):
tmp_country_name = "Côte d'Ivoire"
elif(tmp_country_name == "Holy See"):
tmp_country_name = "Holy See (Vatican City State)"
elif(tmp_country_name == "occupied Palestinian territory"):
tmp_country_name = "Palestine, State of"
tmp_country_code = country_code(tmp_country_name)
tmp_country_latest = row[-1]
tmp_country_position = {}
tmp_country_position['latitude'] = row['Lat']
tmp_country_position['longitude'] = row['Long']
tmp_country_history = {}
for i in range (4,df_confirmed.shape[1]):
tmp_country_history[list(df_confirmed.columns.values)[i]] = row[i]
tmp_element['coordinates'] = tmp_country_position
tmp_element['country'] = tmp_country_name
tmp_element['country_code'] = tmp_country_code
tmp_element['history'] = tmp_country_history
tmp_element['latest'] = tmp_country_latest
tmp_element['province'] = tmp_province
json_data_final['confirmed']['locations'].append(tmp_element)
############
json_data_final_countries[tmp_country_code] = {}
json_data_final_countries[tmp_country_code]['country'] = {}
json_data_final_countries[tmp_country_code]['country']['name'] = tmp_country_name
json_data_final_countries[tmp_country_code]['country']['position'] = tmp_country_position
json_data_final_countries[tmp_country_code]['confirmed'] = {}
json_data_final_countries[tmp_country_code]['confirmed']['history'] = tmp_country_history
json_data_final_countries[tmp_country_code]['confirmed']['latest'] = tmp_country_latest
############
json_data_final['confirmed']['latest'] = tmp_latest_confirmed
# Deaths
tmp_latest_deaths = int(df_deaths.sum(axis=0)[-1])
for index, row in df_deaths.iterrows():
tmp_element = {}
tmp_province = str(row['Province/State'])
tmp_country_name = row['Country/Region']
if(tmp_country_name == 'Brunei'):
tmp_country_name = 'Brunei Darussalam'
elif(tmp_country_name == 'Taiwan*'):
tmp_country_name = 'Taiwan, Province of China'
elif(tmp_country_name == 'Moldova'):
tmp_country_name = 'Moldova, Republic of'
elif(tmp_country_name == 'Venezuala'):
tmp_country_name = 'Venezuela, Bolivarian Republic of'
elif(tmp_country_name == 'Bolivia'):
tmp_country_name = 'Bolivia, Plurinational State of'
elif(tmp_country_name == 'The Bahamas'):
tmp_country_name = 'Bahamas'
elif(tmp_country_name == 'Congo (Kinshasa)' or tmp_country_name == 'Congo (Brazzaville)' ):
tmp_country_name = 'Congo'
elif(tmp_country_name == 'Tanzania'):
tmp_country_name = 'Tanzania, United Republic of'
elif(tmp_country_name == 'Republic of the Congo'):
tmp_country_name = 'Congo, the Democratic Republic of the'
elif(tmp_country_name == "Cote d'Ivoire"):
tmp_country_name = "Côte d'Ivoire"
elif(tmp_country_name == "Holy See"):
tmp_country_name = "Holy See (Vatican City State)"
elif(tmp_country_name == "occupied Palestinian territory"):
tmp_country_name = "Palestine, State of"
tmp_country_code = country_code(tmp_country_name)
tmp_country_latest = row[-1]
tmp_country_position = {}
tmp_country_position['latitude'] = row['Lat']
tmp_country_position['longitude'] = row['Long']
tmp_country_history = {}
for i in range (4,df_deaths.shape[1]):
tmp_country_history[list(df_deaths.columns.values)[i]] = row[i]
tmp_element['coordinates'] = tmp_country_position
tmp_element['country'] = tmp_country_name
tmp_element['country_code'] = tmp_country_code
tmp_element['history'] = tmp_country_history
tmp_element['latest'] = tmp_country_latest
tmp_element['province'] = tmp_province
json_data_final['deaths']['locations'].append(tmp_element)
############
json_data_final_countries[tmp_country_code]['deaths'] = {}
json_data_final_countries[tmp_country_code]['deaths']['history'] = tmp_country_history
json_data_final_countries[tmp_country_code]['deaths']['latest'] = tmp_country_latest
############
json_data_final['deaths']['latest'] = tmp_latest_deaths
# Recovered
tmp_latest_recovered = int(df_recovered.sum(axis=0)[-1])
for index, row in df_recovered.iterrows():
tmp_element = {}
tmp_province = str(row['Province/State'])
tmp_country_name = row['Country/Region']
if(tmp_country_name == 'Brunei'):
tmp_country_name = 'Brunei Darussalam'
elif(tmp_country_name == 'Taiwan*'):
tmp_country_name = 'Taiwan, Province of China'
elif(tmp_country_name == 'Moldova'):
tmp_country_name = 'Moldova, Republic of'
elif(tmp_country_name == 'Venezuala'):
tmp_country_name = 'Venezuela, Bolivarian Republic of'
elif(tmp_country_name == 'Bolivia'):
tmp_country_name = 'Bolivia, Plurinational State of'
elif(tmp_country_name == 'The Bahamas'):
tmp_country_name = 'Bahamas'
elif(tmp_country_name == 'Congo (Kinshasa)' or tmp_country_name == 'Congo (Brazzaville)' ):
tmp_country_name = 'Congo'
elif(tmp_country_name == 'Tanzania'):
tmp_country_name = 'Tanzania, United Republic of'
elif(tmp_country_name == 'Republic of the Congo'):
tmp_country_name = 'Congo, the Democratic Republic of the'
elif(tmp_country_name == "Cote d'Ivoire"):
tmp_country_name = "Côte d'Ivoire"
elif(tmp_country_name == "Holy See"):
tmp_country_name = "Holy See (Vatican City State)"
elif(tmp_country_name == "occupied Palestinian territory"):
tmp_country_name = "Palestine, State of"
tmp_country_code = country_code(tmp_country_name)
tmp_country_latest = row[-1]
tmp_country_position = {}
tmp_country_position['latitude'] = row['Lat']
tmp_country_position['longitude'] = row['Long']
tmp_country_history = {}
for i in range (4,df_recovered.shape[1]):
tmp_country_history[list(df_recovered.columns.values)[i]] = row[i]
tmp_element['coordinates'] = tmp_country_position
tmp_element['country'] = tmp_country_name
tmp_element['country_code'] = tmp_country_code
tmp_element['history'] = tmp_country_history
tmp_element['latest'] = tmp_country_latest
tmp_element['province'] = tmp_province
json_data_final['recovered']['locations'].append(tmp_element)
############
json_data_final_countries[tmp_country_code]['recovered'] = {}
json_data_final_countries[tmp_country_code]['recovered']['history'] = tmp_country_history
json_data_final_countries[tmp_country_code]['recovered']['latest'] = tmp_country_latest
############
json_data_final['recovered']['latest'] = tmp_latest_recovered
# Latest
json_data_final['latest'] = {}
json_data_final['latest']['confirmed'] = tmp_latest_confirmed
json_data_final['latest']['deaths'] = tmp_latest_deaths
json_data_final['latest']['recovered'] = tmp_latest_recovered
with open('data.json', 'w') as f:
json.dump(json_data_final, f)
sys.stdout.flush()
print("Data Updated !")
with open('dataCountry.json', 'w') as f:
json.dump(json_data_final_countries, f)
sys.stdout.flush()
print("Data Updated !")
return 0
| 41.566265 | 160 | 0.655845 | 1,253 | 10,350 | 5.045491 | 0.098962 | 0.218285 | 0.188232 | 0.093958 | 0.867131 | 0.838026 | 0.838026 | 0.797849 | 0.743277 | 0.717969 | 0 | 0.005336 | 0.221353 | 10,350 | 248 | 161 | 41.733871 | 0.779129 | 0.005121 | 0 | 0.647059 | 0 | 0.016043 | 0.239335 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.005348 | false | 0 | 0.042781 | 0 | 0.053476 | 0.010695 | 0 | 0 | 0 | null | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
e74b42c98acb8ca08270d4b754d984064e8518bd | 2,236 | py | Python | lotus_predict/rule_entertainment.py | BuiNgocHai/youtube-8m | 8aa922b02b81821655f9dbd78a575b732ed27b77 | [
"Apache-2.0"
] | null | null | null | lotus_predict/rule_entertainment.py | BuiNgocHai/youtube-8m | 8aa922b02b81821655f9dbd78a575b732ed27b77 | [
"Apache-2.0"
] | null | null | null | lotus_predict/rule_entertainment.py | BuiNgocHai/youtube-8m | 8aa922b02b81821655f9dbd78a575b732ed27b77 | [
"Apache-2.0"
] | null | null | null | import pandas as pd
labels = pd.read_csv('label_names.csv',sep=",")
data = pd.read_csv('solution_115.csv')
a = data.LabelConfidencePairs
i = 0
for item in a:
print(data.VideoId[i])
name_of_class = ''
s = item.split()
if s[0] == '342' and float(s[1]) > 0.99995:
print(labels.label_name[labels.label_id[labels.label_id == int(s[0])].index[0]])
name_of_class = labels.label_name[labels.label_id[labels.label_id == int(s[0])].index[0]]
elif s[0] == '342' and float(s[1]) > 0.997:
if float(s[3]) > 0.01 and s[2] != '0' and s[2] !='1':
print(labels.label_name[labels.label_id[labels.label_id == int(s[2])].index[0]])
name_of_class = labels.label_name[labels.label_id[labels.label_id == int(s[2])].index[0]]
elif float(s[3]) > 0.1:
print(labels.label_name[labels.label_id[labels.label_id == int(s[2])].index[0]])
name_of_class = labels.label_name[labels.label_id[labels.label_id == int(s[2])].index[0]]
else:
print(labels.label_name[labels.label_id[labels.label_id == int(s[0])].index[0]])
name_of_class = labels.label_name[labels.label_id[labels.label_id == int(s[0])].index[0]]
elif s[0] == '342' and float(s[1]) < 0.997:
# if s[2] != '0' and s[2] != '1':
# print(labels.label_name[labels.label_id[labels.label_id == int(s[2])].index[0]])
# name_of_class = labels.label_name[labels.label_id[labels.label_id == int(s[2])].index[0]]
# else:
# print(labels.label_name[labels.label_id[labels.label_id == int(s[0])].index[0]])
# name_of_class = labels.label_name[labels.label_id[labels.label_id == int(s[0])].index[0]]
name_of_class = labels.label_name[labels.label_id[labels.label_id == int(s[2])].index[0]]
else:
print(labels.label_name[labels.label_id[labels.label_id == int(s[0])].index[0]])
name_of_class = labels.label_name[labels.label_id[labels.label_id == int(s[0])].index[0]]
if name_of_class != 'News program':
data.LabelConfidencePairs[i] = 'Entertainment'
else:
data.LabelConfidencePairs[i] = name_of_class
i+=1
data.to_csv('solution_115_final.csv',index=False)
| 47.574468 | 103 | 0.624776 | 368 | 2,236 | 3.595109 | 0.130435 | 0.37415 | 0.294785 | 0.238095 | 0.755858 | 0.755858 | 0.755858 | 0.755858 | 0.743764 | 0.743764 | 0 | 0.045379 | 0.19186 | 2,236 | 46 | 104 | 48.608696 | 0.686774 | 0.176655 | 0 | 0.424242 | 0 | 0 | 0.0491 | 0.012002 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.030303 | 0 | 0.030303 | 0.181818 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
e79b03182d448edaf5425b081a6ad366506beb48 | 2,856 | py | Python | Sholat.py | dizart-y/Jadwal | 05583626fa8da299bddd11888751f2fcd6f95913 | [
"Apache-2.0"
] | 1 | 2019-07-30T21:42:22.000Z | 2019-07-30T21:42:22.000Z | Sholat.py | dizart-y/Jadwal | 05583626fa8da299bddd11888751f2fcd6f95913 | [
"Apache-2.0"
] | null | null | null | Sholat.py | dizart-y/Jadwal | 05583626fa8da299bddd11888751f2fcd6f95913 | [
"Apache-2.0"
] | null | null | null | #RecodeLah Biar Aku Dapet Uang :v
#Recode?Tidak Akan Menjadi Loe Programmer
#Recode Harus Ijin Sama Pembuat Script
#BelajarLah Menghargai" Orang
import marshal,zlib,base64
exec(marshal.loads(zlib.decompress(base64.b16decode("789C9556EB6F1347109FF53376E2040225A43C7A146853689C90144829A50D09058A9A567605952314ADBD27DFFA79DCEE2918251212F9E0AF7C693FF48FEAFFD07FA4CCCEDD39E7D0E3718FD1CCEC6F66E7B17B7B0D08AF14BE3FE2ABFE4622F061D001A88D78063516F129A8A5223E0DB574C467A0960181A35910A8CF81404D1E44166A132072502B80C843AD48FA491013509B0251805A0944116AD32026E100279D01FB18882968A7C0FB0BEC197805C07A344775A184F1C9FFF0DA5253C86E3E6CB407D606DF1295759547C599ED6BDF7D7BAD7BC8AFC4F8D518FF4D8CBF1EE36FC4F89B5D95217EB9DB30559AC4378BEF86A9543E4D21DD7F802923491B9231246B48CE90BC21134834408B995C5F316098AC4EC19C4EC31C0E4FC604A498F79CCEC09C28054A319D34309334702C69E0384D386BE2C99A519D43E58970F06400C600F7018CF527E4662C70633AF887E93CB4B0A3A7600F020306BA00AD22EC31684D1AE87E0AF6708DCCC1BC380DF3FBE991348F5266247D8A5276249D41293792CEA29427E91C4AE7519A1849F8EE1748FA0C250BA522C413BE10E6F47920EFBD55DC087031046492009742403609703904E492005F84807C12E0CB103091045808018524C05721A09804B81202B06057613E047D6DD04F7A7720A371AF15C14BA519639139AE19B1189A97C9EF34B466402CD14AD0C7824DC9E00FDCFCD58565DC105B6AEB70CFFDFBE7EBF1E7809E97C94F0018B3B2AC6497C3B766487A861FE4F2BDFE0EC6431C7E84CB23A6711FC318EC035C1E44A878A946257C3D3EC3300E7B7F942F897F07FDC82887EF6CFAC1382CD9E528E561643A8CB9F95F7E184FDFB224339FEBE348B6AD9FB9D8E59DC5AAD3EF706D3D55370F67BB2B3BDCAA363CE96AEB9ED7F7AC2A2A9C36EF591B0E621FF1AEB436A5F50405C5B96B69E3769BE8457509E9536BDDD70E1A5AD62D2B741A3F9FCE6CFF6AA93201ABBEEBF63D1D032E951FF73B9AEFFDE66B8F6FCA46BBBF774F4832592693FB523FF0EB71DF8ED6AEBAB5B4D494DAF1EBE546BFBB24E40BEEE9C54164F4D8F694B4E246CBE5552BE9D2E68CA384AEAA02D93FEA6B8EB66AD6A47A05E3E675EE59779B4A58E572599D47751485965DBBACA4DAE542522CAEC7074BDA9C9F826B4ECC4FBCE56973806E3ABEE3E1B705605D79DA9CB7BFF0A6E3C93AC11E2A27C01B9F91075B4F20D3E937B896FD1ED970213C5B298AF80AFD0E98CAD67D87120E159B2F7CC7F7628A75F4EEC511E1D4878A876AC0AD311F38FD98E2770C6C4CB11E8472A8A862CABE32083C6571F991A828F77ECF2E53CC17A8AEC69BE06D5C5BDAEEFAB8DECAEA36AA9F8BE662DFB57B565460EECAF22E2D3ED7A5022BBB277E701D74F7FD8D95E5B5B56BCB2B37D6D6AEAFACAE2E98C55E299A0A9A1955C7B65DE21CD9E23E71DAD7BE4BB5ED98229B767B7C7747F65C5F93BA2E3D9F4AEED9CF7C5B6945ED6ADAC1684B610FB2D4102E54D02CFBB9268347F6E09E871B88C6BBB6C71D6D7E2FFB2AA8C440619E64613F977AC1FC5391D4E09EA4B278449F05A19B76929DA03E92929B0E92B2CB9BD1A291A8AC4C4629EFF2366698A300DB5CC905B3B48928138B68924D97CB9E2995AAD13F70968DDDA952BAC44A6C3EBCB3CC620594B32CCD4EB059367DE4BE70E4BECC02EB12A3AAECEC98D9767682DE989D4305C57250A32A7386CC1BB5899AD397A472CE284F46CAB6DF93BD66E5545431BFD7F42BB39154EFF836E58E1D944EE58431BD6A0683D97BBC6BE3ECA33A8C15C3406E77FBC2EFD87728AAB3482EB1D32C97CAB12377AAB03AC5DE008076621D"))))
| 408 | 2,682 | 0.983894 | 33 | 2,856 | 85.151515 | 0.878788 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.571328 | 0.008403 | 2,856 | 6 | 2,683 | 476 | 0.420904 | 0.047969 | 0 | 0 | 0 | 0 | 0.966839 | 0.966839 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 11 |
e7a2cd2685e00d51970bb03558a89891dcc7a439 | 32,717 | py | Python | cluster_pval/tests/test_pvalmod.py | Cluster-Club/Cluster_Pval | 613b8a3292a960ac3e7856ff1a5aa90189be0d59 | [
"BSD-2-Clause"
] | null | null | null | cluster_pval/tests/test_pvalmod.py | Cluster-Club/Cluster_Pval | 613b8a3292a960ac3e7856ff1a5aa90189be0d59 | [
"BSD-2-Clause"
] | null | null | null | cluster_pval/tests/test_pvalmod.py | Cluster-Club/Cluster_Pval | 613b8a3292a960ac3e7856ff1a5aa90189be0d59 | [
"BSD-2-Clause"
] | 1 | 2022-02-18T17:23:27.000Z | 2022-02-18T17:23:27.000Z | """
Smoke, Edge, and One-Shot Tests for pval module
"""
import unittest
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from cluster_pval.pval_module.stattests import stattest_clusters_approx
from cluster_pval.pval_module.stattests import wald_test
class TestPvalModule(unittest.TestCase):
""" Unittest class holding tests for pval module
Args:
None in addition to those inherited from unittest.Testcase
Attributes:
None in addition to those inherited from unittest.Testcase
Functions:
test_smoke_gao(self)
test_smoke_wald(self)
test_penguin_gao_10000(self)
test_penguin_wald(self)
test_penguin_gao_200(self)
test_insig_cells(self)
test_sig_cells(self)
test_penguin_gao_isoFalse_sigNone_siginvNone_200(self)
test_penguin_gao_isoFalse_sigNone_siginvqxqndarray_200(self)
test_penguin_gao_isoTrue_sig5_200(self)
test_gao_survives0(self)
test_gao_ndraws_valueerror(self)
test_x(self)
test_clusterlabels(self)
test_iso_bool(self)
test_iso_sig_siginv(self)
test_k1k2(self)
"""
def test_smoke_gao(self):
"""
simple smoke test to make sure stattest_clusters_approx function runs
:return: nothing
"""
passing = True
x = np.array([[5, 3],
[10, 15],
[15, 12],
[24, 10],
[30, 30],
[85, 70],
[71, 80],
[60, 78],
[70, 55],
[80, 91], ])
k = 2
cl_fun = AgglomerativeClustering
positional_arguments = []
keyword_arguments = {'n_clusters': k, 'affinity': 'euclidean',
'linkage': 'average'}
cluster = cl_fun(*positional_arguments, **keyword_arguments)
cluster.fit_predict(x)
k1 = 0
k2 = 1
stattest_clusters_approx(x, k1, k2, cluster.labels_, cl_fun,
positional_arguments, keyword_arguments)
self.assertTrue(passing)
def test_smoke_wald(self):
"""
simple smoke test to make sure wald_test function runs
:return: nothing
"""
passing = True
x = np.array([[5, 3],
[10, 15],
[15, 12],
[24, 10],
[30, 30],
[85, 70],
[71, 80],
[60, 78],
[70, 55],
[80, 91], ])
k = 2
cl_fun = AgglomerativeClustering
positional_arguments = []
keyword_arguments = {'n_clusters': k, 'affinity': 'euclidean',
'linkage': 'average'}
cluster = cl_fun(*positional_arguments, **keyword_arguments)
cluster.fit_predict(x)
k1 = 0
k2 = 1
wald_test(x, k1, k2, cluster.labels_)
self.assertTrue(passing)
def test_penguin_gao_10000(self):
"""
One-shot test using Penguin data used in R tutorial with ndraws same
as shown in R tutorial.
:return: nothing so long as function yields same results as when
using R stattest_clusters_approx function:
stat = 10.11433, stderr ~ .01084133, pval > .5
"""
penguin_data = np.genfromtxt(
'cluster_pval/tests/data_for_tests/penguin_data_subset.txt',
delimiter=' ', skip_header=1)
k = 5
# set linkage to average to match R script
positional_arguments = []
keyword_arguments = {'n_clusters': k, 'affinity': 'euclidean',
'linkage': 'average'}
cluster = AgglomerativeClustering(**keyword_arguments)
cluster.fit_predict(penguin_data)
k1 = 0
k2 = 1
stat, pval, stderr = stattest_clusters_approx(penguin_data, k1, k2,
cluster.labels_,
AgglomerativeClustering,
positional_arguments,
keyword_arguments,
ndraws=10000)
passing = True
assert np.isclose(stat, 10.11433)
try:
assert np.isclose(stderr, 0.01084133, atol=.001)
except AssertionError:
passing = False
print("stderr is {}, should be within .001 of "
"0.01084133".format(stderr))
try:
assert pval > .5
except AssertionError:
passing = False
print("pval is {}, should be > .5".format(pval))
self.assertTrue(passing)
def test_penguin_wald(self):
"""
One-shot test using Penguin data used in R tutorial
:return: nothing so long as function yields same results as when
using R wald_test function:
stat = 10.11433; pval = 0.006226331
"""
passing = True
penguin_data = np.genfromtxt(
'cluster_pval/tests/data_for_tests/penguin_data_subset.txt',
delimiter=' ',
skip_header=1)
k = 5
positional_arguments = []
keyword_arguments = {'n_clusters': k, 'affinity': 'euclidean',
'linkage': 'average'}
cluster = AgglomerativeClustering(*positional_arguments,
**keyword_arguments)
cluster.fit_predict(penguin_data)
k1 = 0
k2 = 1
stat, pval = wald_test(penguin_data, k1, k2, cluster.labels_)
assert np.isclose(stat, 10.11433)
assert np.isclose(pval, 0.006226331)
self.assertTrue(passing)
def test_penguin_gao_200(self):
"""
One-shot test using Penguin data used in R tutorial with
consistent parameters except ndraws=200 (to expedite function running
while testing)
:return: nothing so long as function yields same results as when
using R stattest_clusters_approx function:
stat = 10.11433; stderr ~ .07; p > .3
"""
penguin_data = np.genfromtxt(
'cluster_pval/tests/data_for_tests/penguin_data_subset.txt',
delimiter=' ', skip_header=1)
k = 5
# set linkage to average to match R script
positional_arguments = []
keyword_arguments = {'n_clusters': k, 'affinity': 'euclidean',
'linkage': 'average'}
cluster = AgglomerativeClustering(**keyword_arguments)
cluster.fit_predict(penguin_data)
k1 = 0
k2 = 1
stat, pval, stderr = stattest_clusters_approx(penguin_data, k1, k2,
cluster.labels_,
AgglomerativeClustering,
positional_arguments,
keyword_arguments,
ndraws=200)
passing = True
assert np.isclose(stat, 10.11433)
try:
assert np.isclose(stderr, 0.07, atol=.05)
except AssertionError:
passing = False
print("stderr is {}, should be within .05 of "
"0.07".format(stderr))
try:
assert pval > .3
except AssertionError:
passing = False
print("pval is {}, should be >.3".format(pval))
self.assertTrue(passing)
def test_insig_cells(self):
"""
One shot test to see the wald test yield significant results for
clustering cells of the same type and the adjusted p value method
yield insignificant results
:return: Tests pass if functions find same stat values as those
calculated using R functions, significant p values for the wald tests,
and insignificant p values for the adjusted p value functions.
"""
insig_cell_data = np.genfromtxt(
'cluster_pval/tests/data_for_tests/600tcells.csv',
delimiter=',', skip_header=1)
k = 3
positional_arguments = []
keyword_arguments = {'n_clusters': k, 'affinity': 'euclidean',
'linkage': 'ward'}
insigcluster = AgglomerativeClustering(*positional_arguments,
**keyword_arguments)
insigcluster.fit_predict(insig_cell_data)
# Using same siginv matrix as was used in R package (importing here
# instead of recalculating)
siginv1 = np.genfromtxt(
'cluster_pval/tests/data_for_tests/SigInv1_600tcells.csv',
delimiter=',', skip_header=1)
# wald tests negative control
stat, pval = wald_test(insig_cell_data, 0, 1, insigcluster.labels_,
iso=False, siginv=siginv1)
assert np.isclose(stat, 4.054059) and np.isclose(pval, 0)
stat, pval = wald_test(insig_cell_data, 0, 2, insigcluster.labels_,
iso=False, siginv=siginv1)
assert np.isclose(stat, 2.961156) and np.isclose(pval, 9.282575e-13)
stat, pval = wald_test(insig_cell_data, 1, 2, insigcluster.labels_,
iso=False, siginv=siginv1)
assert np.isclose(stat, 4.760857) and np.isclose(pval, 0)
# stattest_clusters_approx negative controls
stat, pval, stderr = stattest_clusters_approx(insig_cell_data, 0, 1,
insigcluster.labels_,
AgglomerativeClustering,
positional_arguments,
keyword_arguments,
iso=False,
siginv=siginv1,
ndraws=200)
assert np.isclose(stat, 4.054059) and (pval > .05) and \
(stderr > .05)
stat, pval, stderr = stattest_clusters_approx(insig_cell_data, 0, 2,
insigcluster.labels_,
AgglomerativeClustering,
positional_arguments,
keyword_arguments,
iso=False,
siginv=siginv1,
ndraws=200)
assert np.isclose(stat, 2.961156) and (pval > .05) and \
(stderr > .05)
stat, pval, stderr = stattest_clusters_approx(insig_cell_data, 1, 2,
insigcluster.labels_,
AgglomerativeClustering,
positional_arguments,
keyword_arguments,
iso=False,
siginv=siginv1,
ndraws=200)
assert np.isclose(stat, 4.760857) and (pval > .05) and \
(stderr > .05)
def test_sig_cells(self):
"""
One shot test to see the wald test yield significant results for both
cell datasets.
:return: nothing so long as function yields same results as when
using R wald_test function:
"""
sig_cell_data = np.genfromtxt(
'cluster_pval/tests/data_for_tests'
'/200tcells_200bcells_200memorycells.csv',
delimiter=',', skip_header=1)
k = 3
positional_arguments = []
keyword_arguments = {'n_clusters': k, 'affinity': 'euclidean',
'linkage': 'ward'}
sigcluster = AgglomerativeClustering(*positional_arguments,
**keyword_arguments)
sigcluster.fit_predict(sig_cell_data)
# Using same siginv matrix as was used in R package (importing here
# instead of recalculating)
siginv1 = np.genfromtxt(
'cluster_pval/tests/data_for_tests/SigInv2_200t_200b_200mem.csv',
delimiter=',', skip_header=1)
# wald tests negative control
stat, pval = wald_test(sig_cell_data, 0, 1, sigcluster.labels_,
iso=False, siginv=siginv1)
assert np.isclose(stat, 4.274537) and np.isclose(pval, 0)
stat, pval = wald_test(sig_cell_data, 0, 2, sigcluster.labels_,
iso=False, siginv=siginv1)
assert np.isclose(stat, 4.3801) and np.isclose(pval, 0)
stat, pval = wald_test(sig_cell_data, 1, 2, sigcluster.labels_,
iso=False, siginv=siginv1)
assert np.isclose(stat, 3.042581) and np.isclose(pval, 0)
# stattest_clusters_approx negative controls
stat, pval, stderr = stattest_clusters_approx(sig_cell_data, 0, 1,
sigcluster.labels_,
AgglomerativeClustering,
positional_arguments,
keyword_arguments,
iso=False,
siginv=siginv1,
ndraws=200)
assert np.isclose(stat, 4.274537) and (pval < .05)
stat, pval, stderr = stattest_clusters_approx(sig_cell_data, 0, 2,
sigcluster.labels_,
AgglomerativeClustering,
positional_arguments,
keyword_arguments,
iso=False,
siginv=siginv1,
ndraws=200)
assert np.isclose(stat, 4.3801) and (pval < .05)
stat, pval, stderr = stattest_clusters_approx(sig_cell_data, 1, 2,
sigcluster.labels_,
AgglomerativeClustering,
positional_arguments,
keyword_arguments,
iso=False,
siginv=siginv1,
ndraws=200)
assert np.isclose(stat, 3.042581) and (pval < .05)
def test_penguin_gao_isoFalse_sigNone_siginvNone_200(self):
"""
One-shot test using Penguin data used in R tutorial with
consistent parameters except iso is False, and ndraws=200 (to expedite
function running while testing)
:return: Nothing so long as function returns same results as when
using R stattest_clusters_approx function:
stat = 1.223436; stderr ~ .07; p > .3 (with ndraw=200 there can be a
lot of variability here)
"""
penguin_data = np.genfromtxt(
'cluster_pval/tests/data_for_tests/penguin_data_subset.txt',
delimiter=' ', skip_header=1)
k = 5
# set linkage to average to match R script
positional_arguments = []
keyword_arguments = {'n_clusters': k, 'affinity': 'euclidean',
'linkage': 'average'}
cluster = AgglomerativeClustering(**keyword_arguments)
cluster.fit_predict(penguin_data)
k1 = 0
k2 = 1
stat, pval, stderr = stattest_clusters_approx(penguin_data, k1, k2,
cluster.labels_,
AgglomerativeClustering,
positional_arguments,
keyword_arguments,
iso=False,
ndraws=200)
passing = True
assert np.isclose(stat, 1.223436)
try:
assert np.isclose(stderr, 0.07, atol=.03)
except AssertionError:
passing = False
print("stderr is {}, should be within .03 of "
"0.07".format(stderr))
try:
assert pval > .3
except AssertionError:
passing = False
print("pval is {}, should be greater than .3".format(pval))
self.assertTrue(passing)
def test_penguin_gao_isoFalse_sigNone_siginvqxqndarray_200(self):
"""
One-shot test using Penguin data used in R tutorial with
consistent parameters except iso is False, ndraws=200 (to expedite
function running while testing), and siginv provided
:return: Nothing so long as function gets same results as when using R
stattest_clusters_approx function
with these parameters:
stat = 8.134167; stderr < .009; p < .05 (with ndraw=200 there can be a
lot of variability here, these may be a bad stderr and pval thresholds)
"""
penguin_data = np.genfromtxt(
'cluster_pval/tests/data_for_tests/penguin_data_subset.txt',
delimiter=' ', skip_header=1)
k = 5
# set linkage to average to match R script
positional_arguments = []
keyword_arguments = {'n_clusters': k, 'affinity': 'euclidean',
'linkage': 'average'}
cluster = AgglomerativeClustering(**keyword_arguments)
cluster.fit_predict(penguin_data)
k1 = 0
k2 = 1
siginv = np.array([[1, 1], [1, 1]])
stat, pval, stderr = stattest_clusters_approx(penguin_data, k1, k2,
cluster.labels_,
AgglomerativeClustering,
positional_arguments,
keyword_arguments,
iso=False,
siginv=siginv,
ndraws=2000)
passing = True
assert np.isclose(stat, 8.134167)
try:
assert stderr < .009
except AssertionError:
passing = False
print("stderr is {}, should be less than "
"0.009".format(stderr))
try:
assert pval < .05
except AssertionError:
passing = False
print("pval is {}, should be less than .05".format(pval))
self.assertTrue(passing)
def test_penguin_gao_isoTrue_sig5_200(self):
"""
One-shot test using Penguin data used in R tutorial with
consistent parameters except ndraws=200 (to expedite function running
while testing), and sig is 5
:return: Nothing so long as function gets same results as when using
R stattest_clusters_approx function with these parameters:
stat = 10.11433; stderr < .2; p > .1 (with ndraw=200 there can be a
lot of variability here, these may be a bad stderr and pval thresholds)
"""
penguin_data = np.genfromtxt(
'cluster_pval/tests/data_for_tests/penguin_data_subset.txt',
delimiter=' ', skip_header=1)
k = 5
# set linkage to average to match R script
positional_arguments = []
keyword_arguments = {'n_clusters': k, 'affinity': 'euclidean',
'linkage': 'average'}
cluster = AgglomerativeClustering(**keyword_arguments)
cluster.fit_predict(penguin_data)
k1 = 0
k2 = 1
stat, pval, stderr = stattest_clusters_approx(penguin_data, k1, k2,
cluster.labels_,
AgglomerativeClustering,
positional_arguments,
keyword_arguments,
iso=True,
sig=5,
ndraws=200)
passing = True
assert np.isclose(stat, 10.11433)
try:
assert stderr < .2
except AssertionError:
passing = False
print("stderr is {}, should be less than "
"0.2".format(stderr))
try:
assert pval > .1
except AssertionError:
passing = False
print("pval is {}, should be greater than .1".format(pval))
self.assertTrue(passing)
def test_gao_survives0(self):
"""
Edge test to make sure Runtime Error raised if survives == 0
Running same code as test_penguin_gao_200 but ndraws = 1, running
until it prompts runtime error
:return: Nothing so long as runtime error is raised
"""
penguin_data = np.genfromtxt(
'cluster_pval/tests/data_for_tests/penguin_data_subset.txt',
delimiter=' ', skip_header=1)
k = 5
# set linkage to average to match R script
positional_arguments = []
keyword_arguments = {'n_clusters': k, 'affinity': 'euclidean',
'linkage': 'average'}
cluster = AgglomerativeClustering(**keyword_arguments)
cluster.fit_predict(penguin_data)
k1 = 0
k2 = 1
passing = False
for _ in range(100):
try:
stattest_clusters_approx(penguin_data, k1, k2, cluster.labels_,
AgglomerativeClustering,
positional_arguments,
keyword_arguments, ndraws=1)
except RuntimeError:
passing = True
break
self.assertTrue(passing)
def test_gao_ndraws_valueerror(self):
"""
Edge test to make sure ValueError raised if ndraws < 0
Running same code as test_penguin_gao_200 but ndraws = -1
:return: Nothing so long as value error is raised
"""
penguin_data = np.genfromtxt(
'cluster_pval/tests/data_for_tests/penguin_data_subset.txt',
delimiter=' ', skip_header=1)
k = 5
# set linkage to average to match R script
positional_arguments = []
keyword_arguments = {'n_clusters': k, 'affinity': 'euclidean',
'linkage': 'average'}
cluster = AgglomerativeClustering(**keyword_arguments)
cluster.fit_predict(penguin_data)
k1 = 0
k2 = 1
with self.assertRaises(ValueError):
stattest_clusters_approx(penguin_data, k1, k2, cluster.labels_,
AgglomerativeClustering,
positional_arguments,
keyword_arguments,
ndraws=-1)
def test_x(self):
"""
Edge test to make sure ValueError raised when x is not 2d numpy array
Tests first with 3d numpy array, then 1d numpy array, than 2d list,
last checks if x contains any nans
:return: Nothing so long as value error is raised
"""
# 3d numpy array
x = np.array([[[1, 2], [3, 4]], [[5, 6], [7, 8]]])
k1 = 0
k2 = 1
cluster_labels = np.array([0, 1, 0, 1])
with self.assertRaises(ValueError):
wald_test(x, k1, k2, cluster_labels)
# 1d numpy array
x = np.array([1, 2, 3, 4, 5, 6, 7, 8])
cluster_labels = [0, 1, 0, 1, 0, 1, 0, 1]
with self.assertRaises(ValueError):
wald_test(x, k1, k2, cluster_labels)
# 2d list
x = [[5, 3],
[10, 15],
[15, 12],
[24, 10],
[30, 30],
[85, 70],
[71, 80],
[60, 78],
[70, 55],
[80, 91], ]
cluster_labels = np.array([1, 0, 1, 0, 1, 0, 1, 0, 1, 0])
with self.assertRaises(ValueError):
wald_test(x, k1, k2, cluster_labels)
# contains nans
x = np.array([[5., 3.],
[10., 15],
[15., 12],
[24., 10],
[30., 30],
[85, 70],
[71, 80],
[60, 78],
[70, 55],
[80, 91], ])
x[1, 1] = np.nan
with self.assertRaises(ValueError):
wald_test(x, k1, k2, cluster_labels)
def test_clusterlabels(self):
"""
Edge test to make sure ValueError raised when cluster_labels is wrong.
Checks that ValueError is thrown if cluster_labels is too short,
too long, not a numpy ndarray, not one-dimensional, or contains nans,
or has fewer than 2 clusters.
:return: Nothing so long as value error is raised
"""
x = np.array([[5, 3],
[10, 15],
[15, 12],
[24, 10],
[30, 30],
[85, 70],
[71, 80],
[60, 78],
[70, 55],
[80, 91], ])
k = 2
cl_fun = AgglomerativeClustering
positional_arguments = []
keyword_arguments = {'n_clusters': k, 'affinity': 'euclidean',
'linkage': 'average'}
cluster = cl_fun(*positional_arguments, **keyword_arguments)
cluster.fit_predict(x)
k1 = 0
k2 = 1
# cluster_labels too short
with self.assertRaises(ValueError):
wald_test(x, k1, k2, np.array([1, 0]))
# cluster_labels too long
with self.assertRaises(ValueError):
wald_test(x, k1, k2, np.zeros(100))
# cluster_labels list (not numpy array)
with self.assertRaises(ValueError):
wald_test(x, k1, k2, [1, 0, 1, 0, 1, 0, 1, 0, 1, 0])
# cluster_labels not one dimensional
with self.assertRaises(ValueError):
wald_test(x, k1, k2, np.zeros((5, 2)))
# cluster_labels contains nan values
cluster_labels = np.array([1., 0, 1, 0, 1, 0, 1, 0, 1, 0])
cluster_labels[0] = np.nan
with self.assertRaises(ValueError):
wald_test(x, k1, k2, cluster_labels)
# cluster_labesl not between 2 and n:
cluster_labels = np.zeros(10)
with self.assertRaises(ValueError):
wald_test(x, k1, k2, cluster_labels)
def test_iso_bool(self):
"""
Edge test to make sure ValueError raised when iso isn't boolean
:return: Nothing so long as value error is raised
"""
x = np.array([[5, 3],
[10, 15],
[15, 12],
[24, 10],
[30, 30],
[85, 70],
[71, 80],
[60, 78],
[70, 55],
[80, 91], ])
k = 2
cl_fun = AgglomerativeClustering
positional_arguments = []
keyword_arguments = {'n_clusters': k, 'affinity': 'euclidean',
'linkage': 'average'}
cluster = cl_fun(*positional_arguments, **keyword_arguments)
cluster.fit_predict(x)
k1 = 0
k2 = 1
with self.assertRaises(ValueError):
wald_test(x, k1, k2, cluster.labels_, iso="hello")
def test_iso_sig_siginv(self):
""" Edge test to make sure sig and siginv are correct.
Checks to make sure errors thrown if iso = True and siginv != None,
if iso = False and sig != None, if sig is not a float or int,
and if siginv is not a qxq ndarray.
:return: Nothing so long as value error is raised
"""
x = np.array([[5, 3],
[10, 15],
[15, 12],
[24, 10],
[30, 30],
[85, 70],
[71, 80],
[60, 78],
[70, 55],
[80, 91], ])
k = 2
cl_fun = AgglomerativeClustering
positional_arguments = []
keyword_arguments = {'n_clusters': k, 'affinity': 'euclidean',
'linkage': 'average'}
cluster = cl_fun(*positional_arguments, **keyword_arguments)
cluster.fit_predict(x)
k1 = 0
k2 = 1
siginv = np.array([[1, 1], [1, 1]])
# iso True siginv not None
with self.assertRaises(ValueError):
wald_test(x, k1, k2, cluster.labels_, iso=True, siginv=siginv)
# iso False, sig not None
sig = 5
with self.assertRaises(ValueError):
wald_test(x, k1, k2, cluster.labels_, iso=False, sig=sig)
# iso True, siginv None, sig not float or int
with self.assertRaises(ValueError):
wald_test(x, k1, k2, cluster.labels_, iso=True, sig="Hello")
# iso False, sig None, siginv 2x2 list
thissiginv = [[1, 1], [1, 1]]
with self.assertRaises(ValueError):
wald_test(x, k1, k2, cluster.labels_, iso=False, siginv=thissiginv)
# iso False, sig None, siginv 1x1 numpy array (incorrect dimensions)
thissiginv = np.array([1, 2])
with self.assertRaises(ValueError):
wald_test(x, k1, k2, cluster.labels_, iso=False, siginv=thissiginv)
# iso False, sig None, siginv 2x2x2 numpy array (incorrect dimensions)
thissiginv = np.array([[1, 1, 1], [1, 1, 1]])
with self.assertRaises(ValueError):
wald_test(x, k1, k2, cluster.labels_, iso=False, siginv=thissiginv)
def test_k1k2(self):
"""
Edge test to make sure k1 and k2 are ints
:return: Nothing so long as value error is raised
"""
x = np.array([[5, 3],
[10, 15],
[15, 12],
[24, 10],
[30, 30],
[85, 70],
[71, 80],
[60, 78],
[70, 55],
[80, 91], ])
k = 2
cl_fun = AgglomerativeClustering
positional_arguments = []
keyword_arguments = {'n_clusters': k, 'affinity': 'euclidean',
'linkage': 'average'}
cluster = cl_fun(*positional_arguments, **keyword_arguments)
cluster.fit_predict(x)
k1 = "hello"
k2 = True
# k1 not an int
with self.assertRaises(ValueError):
stattest_clusters_approx(x, k1, 2, cluster.labels_, cl_fun,
positional_arguments, keyword_arguments)
# k2 not an int
with self.assertRaises(ValueError):
stattest_clusters_approx(x, 1, k2, cluster.labels_, cl_fun,
positional_arguments, keyword_arguments)
k1 = -1
k2 = 2
# k1 not between 0 and k-1
with self.assertRaises(ValueError):
stattest_clusters_approx(x, k1, 1, cluster.labels_, cl_fun,
positional_arguments, keyword_arguments)
# k2 not between 0 and k-1
with self.assertRaises(ValueError):
stattest_clusters_approx(x, 0, k2, cluster.labels_, cl_fun,
positional_arguments, keyword_arguments)
| 43.219287 | 79 | 0.496317 | 3,257 | 32,717 | 4.829905 | 0.089039 | 0.050855 | 0.07107 | 0.095671 | 0.863009 | 0.834976 | 0.818257 | 0.789142 | 0.75507 | 0.704723 | 0 | 0.055908 | 0.422135 | 32,717 | 756 | 80 | 43.276455 | 0.776156 | 0.183483 | 0 | 0.759124 | 0 | 0 | 0.06759 | 0.026911 | 0 | 0 | 0 | 0 | 0.127737 | 1 | 0.031022 | false | 0.05292 | 0.009124 | 0 | 0.041971 | 0.018248 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
68ee0db09e18a9c6eb2ad4fe9c088d207ec2b474 | 8,452 | py | Python | tests/test_predict.py | wellcometrust/grants_tagger | b0701a0752d6196bf9aa64958843029390c82ef4 | [
"MIT"
] | 2 | 2021-06-15T10:20:01.000Z | 2022-02-23T16:13:42.000Z | tests/test_predict.py | wellcometrust/grants_tagger | b0701a0752d6196bf9aa64958843029390c82ef4 | [
"MIT"
] | 168 | 2020-07-01T10:06:36.000Z | 2022-03-31T13:47:42.000Z | tests/test_predict.py | wellcometrust/grants_tagger | b0701a0752d6196bf9aa64958843029390c82ef4 | [
"MIT"
] | null | null | null | from unittest.mock import patch
import tempfile
import shutil
import pickle
import json
import os
import numpy as np
import pytest
from grants_tagger.train import create_label_binarizer, train_and_evaluate
from grants_tagger.predict import predict_tags, predict
X = [
"all",
"one two",
"two",
"four",
"twenty four"
]
Y = [
[str(i) for i in range(24)],
["1", "2"],
["2"],
["4"],
["23"]
]
Y_mesh = [
[str(i) for i in range(5000)],
["1", "2"],
["2"],
["200"],
["1000"]
]
def create_data(X, Y, data_path):
with open(data_path, "w") as f:
for x, y in zip(X, Y):
f.write(json.dumps({"text": x, "tags": y, "meta": {}}))
f.write("\n")
@pytest.fixture
def tfidf_svm_path(tmp_path):
data_path = os.path.join(tmp_path, "data.jsonl")
create_data(X, Y, data_path)
label_binarizer_path = os.path.join(tmp_path, "label_binarizer.pkl")
label_binarizer = create_label_binarizer(data_path, label_binarizer_path)
# TODO: Replace approach with science-ensemble when fit implemented
tfidf_svm_path = os.path.join(tmp_path, "tfidf-svm.pkl")
parameters = {
'tfidf__min_df': 1,
'tfidf__stop_words': None
}
train_and_evaluate(data_path, label_binarizer_path,
approach="tfidf-svm", model_path=tfidf_svm_path,
parameters=str(parameters), verbose=False)
return tfidf_svm_path
@pytest.fixture
def scibert_path(tmp_path):
data_path = os.path.join(tmp_path, "data.jsonl")
create_data(X, Y, data_path)
label_binarizer_path = os.path.join(tmp_path, "label_binarizer.pkl")
label_binarizer = create_label_binarizer(data_path, label_binarizer_path)
scibert_path = os.path.join(tmp_path, "scibert")
parameters = {"epochs": 1}
train_and_evaluate(data_path, label_binarizer_path,
approach="scibert", model_path=scibert_path,
parameters=str(parameters), verbose=False)
return scibert_path
@pytest.fixture
def science_ensemble_path(tfidf_svm_path, scibert_path):
science_ensemble_path = f"{tfidf_svm_path},{scibert_path}"
return science_ensemble_path
@pytest.fixture
def mesh_tfidf_svm_path(tmp_path):
mesh_data_path = os.path.join(tmp_path, "mesh_data.jsonl")
create_data(X, Y_mesh, mesh_data_path)
label_binarizer_path = os.path.join(tmp_path, "label_binarizer.pkl")
model_path = os.path.join(tmp_path, "mesh_tfidf_svm")
parameters = {
'tfidf__min_df': 1,
'tfidf__stop_words': None,
'svm__estimator__loss': 'log',
'model_path': model_path
}
train_and_evaluate(mesh_data_path, label_binarizer_path,
approach="mesh-tfidf-svm", model_path=model_path,
parameters=str(parameters), sparse_labels=True,
verbose=False)
return model_path
@pytest.fixture
def mesh_cnn_path(tmp_path):
mesh_data_path = os.path.join(tmp_path, "mesh_data.jsonl")
create_data(X, Y_mesh, mesh_data_path)
label_binarizer_path = os.path.join(tmp_path, "label_binarizer.pkl")
model_path = os.path.join(tmp_path, "mesh_cnn")
train_and_evaluate(mesh_data_path, label_binarizer_path,
approach="mesh-cnn", model_path=model_path,
sparse_labels=True, verbose=False)
return model_path
@pytest.fixture
def label_binarizer_path(tmp_path):
data_path = os.path.join(tmp_path, "mesh_data.jsonl")
create_data(X, Y, data_path)
label_binarizer_path = os.path.join(tmp_path, "label_binarizer.pkl")
create_label_binarizer(data_path, label_binarizer_path)
return label_binarizer_path
@pytest.fixture
def mesh_label_binarizer_path(tmp_path):
mesh_data_path = os.path.join(tmp_path, "mesh_data.jsonl")
create_data(X, Y_mesh, mesh_data_path)
mesh_label_binarizer_path = os.path.join(tmp_path, "mesh_label_binarizer.pkl")
create_label_binarizer(mesh_data_path, mesh_label_binarizer_path)
return mesh_label_binarizer_path
def test_predict_tags_tfidf_svm(tfidf_svm_path, label_binarizer_path):
tags = predict_tags(
X, model_path=tfidf_svm_path,
label_binarizer_path=label_binarizer_path,
approach="tfidf-svm")
assert len(tags) == 5
tags = predict_tags(
X, model_path=tfidf_svm_path,
label_binarizer_path=label_binarizer_path,
approach="tfidf-svm", probabilities=True)
for tags_ in tags:
for tag, prob in tags_.items():
assert 0 <= prob <= 1.0
tags = predict_tags(
X, model_path=tfidf_svm_path,
label_binarizer_path=label_binarizer_path,
approach="tfidf-svm", threshold=0)
for tags_ in tags:
assert len(tags_) == 24
tags = predict_tags(
X, model_path=tfidf_svm_path,
label_binarizer_path=label_binarizer_path,
approach="tfidf-svm", threshold=1)
for tags_ in tags:
assert len(tags_) == 0
def test_predict_tags_scibert(scibert_path, label_binarizer_path):
tags = predict_tags(
X, model_path=scibert_path,
label_binarizer_path=label_binarizer_path,
approach="scibert")
assert len(tags) == 5
tags = predict_tags(
X, model_path=scibert_path,
label_binarizer_path=label_binarizer_path,
approach="scibert", probabilities=True)
for tags_ in tags:
for tag, prob in tags_.items():
assert 0 <= prob <= 1.0
tags = predict_tags(
X, model_path=scibert_path,
label_binarizer_path=label_binarizer_path,
approach="scibert", threshold=0)
for tags_ in tags:
assert len(tags_) == 24
tags = predict_tags(
X, model_path=scibert_path,
label_binarizer_path=label_binarizer_path,
approach="scibert", threshold=1)
for tags_ in tags:
assert len(tags_) == 0
def test_predict_tags_science_ensemble(science_ensemble_path, label_binarizer_path):
tags = predict_tags(
X, model_path=science_ensemble_path,
label_binarizer_path=label_binarizer_path,
approach="science-ensemble")
assert len(tags) == 5
tags = predict_tags(
X, model_path=science_ensemble_path,
label_binarizer_path=label_binarizer_path,
approach="science-ensemble", probabilities=True)
for tags_ in tags:
for tag, prob in tags_.items():
assert 0 <= prob <= 1.0
tags = predict_tags(
X, model_path=science_ensemble_path,
label_binarizer_path=label_binarizer_path,
approach="science-ensemble", threshold=0)
for tags_ in tags:
assert len(tags_) == 24
tags = predict_tags(
X, model_path=science_ensemble_path,
label_binarizer_path=label_binarizer_path,
approach="science-ensemble", threshold=1)
for tags_ in tags:
assert len(tags_) == 0
def test_predict_tags_mesh_tfidf_svm(mesh_tfidf_svm_path, mesh_label_binarizer_path):
tags = predict_tags(
X, mesh_tfidf_svm_path, mesh_label_binarizer_path,
approach="mesh-tfidf-svm")
assert len(tags) == 5
tags = predict_tags(
X, mesh_tfidf_svm_path, mesh_label_binarizer_path,
approach="mesh-tfidf-svm", probabilities=True)
for tags_ in tags:
for tag, prob in tags_.items():
assert 0 <= prob <= 1.0
tags = predict_tags(
X, mesh_tfidf_svm_path, mesh_label_binarizer_path,
approach="mesh-tfidf-svm", threshold=0)
for tags_ in tags:
assert len(tags_) == 5000
tags = predict_tags(
X, mesh_tfidf_svm_path, mesh_label_binarizer_path,
approach="mesh-tfidf-svm", threshold=1)
for tags_ in tags:
assert len(tags_) == 0
def test_predict_tags_mesh_cnn(mesh_cnn_path, mesh_label_binarizer_path):
tags = predict_tags(
X, mesh_cnn_path, mesh_label_binarizer_path,
approach="mesh-cnn")
assert len(tags) == 5
tags = predict_tags(
X, mesh_cnn_path, mesh_label_binarizer_path,
approach="mesh-cnn", probabilities=True)
for tags_ in tags:
for tag, prob in tags_.items():
assert 0 <= prob <= 1.0
tags = predict_tags(
X, mesh_cnn_path, mesh_label_binarizer_path,
approach="mesh-cnn", threshold=0)
for tags_ in tags:
assert len(tags_) == 5000
tags = predict_tags(
X, mesh_cnn_path, mesh_label_binarizer_path,
approach="mesh-cnn", threshold=1)
for tags_ in tags:
assert len(tags_) == 0
| 32.259542 | 85 | 0.679839 | 1,169 | 8,452 | 4.557742 | 0.087254 | 0.178679 | 0.185811 | 0.161036 | 0.845533 | 0.814377 | 0.783971 | 0.75488 | 0.737425 | 0.701389 | 0 | 0.01121 | 0.219001 | 8,452 | 261 | 86 | 32.383142 | 0.79594 | 0.00769 | 0 | 0.591111 | 0 | 0 | 0.081584 | 0.00656 | 0 | 0 | 0 | 0.003831 | 0.088889 | 1 | 0.057778 | false | 0 | 0.044444 | 0 | 0.133333 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
68f00cf893cf0a076516eb2e764fd4a409554117 | 108 | py | Python | pwnlib/adb/__init__.py | cclauss/pwntools | 899baec7048559db65a2e9ad784be4cb60c181da | [
"MIT"
] | 7 | 2017-07-11T01:12:02.000Z | 2017-09-21T23:39:54.000Z | pwnlib/adb/__init__.py | cclauss/pwntools | 899baec7048559db65a2e9ad784be4cb60c181da | [
"MIT"
] | null | null | null | pwnlib/adb/__init__.py | cclauss/pwntools | 899baec7048559db65a2e9ad784be4cb60c181da | [
"MIT"
] | 3 | 2018-03-21T11:48:05.000Z | 2021-10-16T15:38:01.000Z | from __future__ import absolute_import
from pwnlib.adb.adb import *
from pwnlib.adb.protocol import Client
| 21.6 | 38 | 0.833333 | 16 | 108 | 5.3125 | 0.5 | 0.235294 | 0.376471 | 0.447059 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12037 | 108 | 4 | 39 | 27 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
ec2c3a7c346dec7cc17d8f94980003f60cc33822 | 358 | py | Python | textformer/models/decoders/__init__.py | gugarosa/textformer | cccc670d48995fa0bfbdf9fc8013d13a90ea5e84 | [
"Apache-2.0"
] | 3 | 2020-07-26T03:51:56.000Z | 2020-10-04T18:42:18.000Z | textformer/models/decoders/__init__.py | gugarosa/textformer | cccc670d48995fa0bfbdf9fc8013d13a90ea5e84 | [
"Apache-2.0"
] | null | null | null | textformer/models/decoders/__init__.py | gugarosa/textformer | cccc670d48995fa0bfbdf9fc8013d13a90ea5e84 | [
"Apache-2.0"
] | null | null | null | """A package for already-implemented decoder models.
"""
from textformer.models.decoders.bi_gru import BiGRUDecoder
from textformer.models.decoders.conv import ConvDecoder
from textformer.models.decoders.gru import GRUDecoder
from textformer.models.decoders.lstm import LSTMDecoder
from textformer.models.decoders.self_attention import SelfAttentionDecoder
| 39.777778 | 74 | 0.857542 | 44 | 358 | 6.931818 | 0.5 | 0.229508 | 0.327869 | 0.459016 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078212 | 358 | 8 | 75 | 44.75 | 0.924242 | 0.136872 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
ec45f9d83d7e663054f1e8200ae4b95ce5230ed6 | 5,587 | py | Python | argopy/tests/test_fetchers_index_erddap.py | mnichol3/argopy | ca33bc266b2df71f4e4d7524980909262648e265 | [
"Apache-2.0"
] | 111 | 2020-03-20T12:04:21.000Z | 2022-03-30T10:33:50.000Z | argopy/tests/test_fetchers_index_erddap.py | mnichol3/argopy | ca33bc266b2df71f4e4d7524980909262648e265 | [
"Apache-2.0"
] | 123 | 2020-03-18T08:42:30.000Z | 2022-03-31T10:50:34.000Z | argopy/tests/test_fetchers_index_erddap.py | mnichol3/argopy | ca33bc266b2df71f4e4d7524980909262648e265 | [
"Apache-2.0"
] | 31 | 2020-03-21T23:34:59.000Z | 2022-03-24T15:14:56.000Z | import pandas as pd
import pytest
import tempfile
import argopy
from argopy import IndexFetcher as ArgoIndexFetcher
from argopy.errors import (
FileSystemHasNoCache,
CacheFileNotFound
)
from . import requires_connected_erddap_index, safe_to_server_errors, ci_erddap_index
ERDDAP_TIMEOUT = 3 * 60
safe_to_no_cache = pytest.mark.skipif(True, reason="Cache disabled for erddap index fetcher")
@ci_erddap_index
@requires_connected_erddap_index
class Test_Backend_WMO:
""" Test ERDDAP index fetching backend for WMO access point"""
# caplog.set_level(logging.DEBUG)
src = "erddap"
requests = {
"float": [[2901623], [2901623, 6901929]]
}
def test_nocache(self):
with tempfile.TemporaryDirectory() as testcachedir:
with argopy.set_options(cachedir=testcachedir):
fetcher = ArgoIndexFetcher(src=self.src, cache=False).float(self.requests['float'][0]).fetcher
with pytest.raises(FileSystemHasNoCache):
fetcher.cachepath
@safe_to_no_cache
def test_cachepath_notfound(self):
with tempfile.TemporaryDirectory() as testcachedir:
with argopy.set_options(cachedir=testcachedir):
fetcher = ArgoIndexFetcher(src=self.src, cache=True).float(self.requests['float'][0]).fetcher
with pytest.raises(CacheFileNotFound):
fetcher.cachepath
@safe_to_no_cache
@safe_to_server_errors
def test_cached(self):
with tempfile.TemporaryDirectory() as testcachedir:
with argopy.set_options(cachedir=testcachedir, api_timeout=ERDDAP_TIMEOUT):
fetcher = ArgoIndexFetcher(src=self.src, cache=True).float(self.requests['float'][0]).fetcher
df = fetcher.to_dataframe()
assert isinstance(df, pd.core.frame.DataFrame)
assert isinstance(fetcher.cachepath, str)
@safe_to_no_cache
@safe_to_server_errors
def test_clearcache(self):
with tempfile.TemporaryDirectory() as testcachedir:
with argopy.set_options(cachedir=testcachedir, api_timeout=ERDDAP_TIMEOUT):
fetcher = ArgoIndexFetcher(src=self.src, cache=True).float(self.requests['float'][0]).fetcher
fetcher.to_dataframe()
fetcher.clear_cache()
with pytest.raises(CacheFileNotFound):
fetcher.cachepath
def test_url(self):
for arg in self.requests["float"]:
fetcher = ArgoIndexFetcher(src=self.src).float(arg).fetcher
assert isinstance(fetcher.url, str)
@safe_to_server_errors
def test_phy_float(self):
for arg in self.requests["float"]:
with argopy.set_options(api_timeout=ERDDAP_TIMEOUT):
fetcher = ArgoIndexFetcher(src=self.src).float(arg).fetcher
df = fetcher.to_dataframe()
assert isinstance(df, pd.core.frame.DataFrame)
@ci_erddap_index
@requires_connected_erddap_index
class Test_Backend_BOX:
""" Test ERDDAP index fetching backend for the BOX access point """
src = "erddap"
requests = {
"region": [
[-60, -50, 40.0, 50.0],
[-60, -55, 40.0, 45.0, "2007-08-01", "2007-09-01"],
],
}
def test_nocache(self):
with tempfile.TemporaryDirectory() as testcachedir:
with argopy.set_options(cachedir=testcachedir):
fetcher = ArgoIndexFetcher(src=self.src, cache=False).region(self.requests['region'][-1]).fetcher
with pytest.raises(FileSystemHasNoCache):
fetcher.cachepath
@safe_to_no_cache
def test_cachepath_notfound(self):
with tempfile.TemporaryDirectory() as testcachedir:
with argopy.set_options(cachedir=testcachedir):
fetcher = ArgoIndexFetcher(src=self.src, cache=True).region(self.requests['region'][-1]).fetcher
with pytest.raises(CacheFileNotFound):
fetcher.cachepath
@safe_to_no_cache
@safe_to_server_errors
def test_cached(self):
with tempfile.TemporaryDirectory() as testcachedir:
with argopy.set_options(cachedir=testcachedir, api_timeout=ERDDAP_TIMEOUT):
fetcher = ArgoIndexFetcher(src=self.src, cache=True).region(self.requests['region'][-1]).fetcher
df = fetcher.to_dataframe()
assert isinstance(df, pd.core.frame.DataFrame)
assert isinstance(fetcher.cachepath, str)
@safe_to_no_cache
@safe_to_server_errors
def test_clearcache(self):
with tempfile.TemporaryDirectory() as testcachedir:
with argopy.set_options(cachedir=testcachedir, api_timeout=ERDDAP_TIMEOUT):
fetcher = ArgoIndexFetcher(src=self.src, cache=True).region(self.requests['region'][-1]).fetcher
fetcher.to_dataframe()
fetcher.clear_cache()
with pytest.raises(CacheFileNotFound):
fetcher.cachepath
def test_url(self):
for arg in self.requests["region"]:
fetcher = ArgoIndexFetcher(src=self.src).region(arg).fetcher
assert isinstance(fetcher.url, str)
@safe_to_server_errors
def test_phy_region(self):
for arg in self.requests["region"]:
with argopy.set_options(api_timeout=ERDDAP_TIMEOUT):
fetcher = ArgoIndexFetcher(src=self.src).region(arg).fetcher
df = fetcher.to_dataframe()
assert isinstance(df, pd.core.frame.DataFrame)
| 39.907143 | 113 | 0.654018 | 621 | 5,587 | 5.708535 | 0.144928 | 0.023695 | 0.088011 | 0.101551 | 0.860931 | 0.860931 | 0.842313 | 0.822567 | 0.797743 | 0.784203 | 0 | 0.016198 | 0.248613 | 5,587 | 139 | 114 | 40.194245 | 0.828252 | 0.026669 | 0 | 0.782609 | 0 | 0 | 0.027281 | 0 | 0 | 0 | 0 | 0 | 0.069565 | 1 | 0.104348 | false | 0 | 0.06087 | 0 | 0.217391 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6b61f4bd320ff6c2e27fdb506a192f9b4e756e8b | 211,057 | py | Python | spacy/lang/pl/tag_map.py | tdzienniak/spaCy | 2791ce96dcf276b4bfe0a0a05ce1474abfd0783d | [
"MIT"
] | null | null | null | spacy/lang/pl/tag_map.py | tdzienniak/spaCy | 2791ce96dcf276b4bfe0a0a05ce1474abfd0783d | [
"MIT"
] | null | null | null | spacy/lang/pl/tag_map.py | tdzienniak/spaCy | 2791ce96dcf276b4bfe0a0a05ce1474abfd0783d | [
"MIT"
] | null | null | null | from ...symbols import POS, PUNCT, SYM, ADJ, CCONJ, NUM, DET, ADV, ADP, X, VERB
from ...symbols import NOUN, PROPN, PART, INTJ, SPACE, PRON, SCONJ
TAG_MAP = {
"num:pl:acc:m3:congr__Case=Acc|Gender=Masc|Number=Plur|NumType=Card|SubGender=Masc3": {POS: NUM, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "NumType": "Card"},
"subst:pl:acc:m3__Case=Acc|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: NOUN, "Case": "Acc", "Gender": "Masc", "Number": "Plur"},
"praet:pl:m3:perf__Aspect=Perf|Gender=Masc|Mood=Ind|Number=Plur|SubGender=Masc3|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Gender": "Masc", "Mood": "Ind", "Number": "Plur", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"qub__PronType=Prs|Reflex=Yes": {POS: PRON, "PronType": "Prs", "Reflex": "Yes"},
"num:pl:nom:m3:congr__Case=Nom|Gender=Masc|Number=Plur|NumType=Card|SubGender=Masc3": {POS: NUM, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "NumType": "Card"},
"adj:pl:nom:m3:pos__Case=Nom|Degree=Pos|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Degree": "Pos", "Gender": "Masc", "Number": "Plur"},
"subst:pl:nom:m3__Case=Nom|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: NOUN, "Case": "Nom", "Gender": "Masc", "Number": "Plur"},
"interp__PunctType=Peri": {POS: PUNCT, "PunctType": "Peri"},
"interp__PunctType=Dash": {POS: PUNCT, "PunctType": "Dash"},
"conj___": {POS: CCONJ},
"subst:sg:acc:n__Case=Acc|Gender=Neut|Number=Sing|PronType=Int": {POS: NOUN, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "PronType": "Int"},
"fin:sg:pri:imperf__Aspect=Imp|Mood=Ind|Number=Sing|Person=1|Tense=Pres|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Mood": "Ind", "Number": "Sing", "Person": "1", "Tense": "Pres", "VerbForm": "Fin", "Voice": "Act"},
"interp__PunctType=Qest": {POS: PUNCT, "PunctType": "Qest"},
"adv__PronType=Int": {POS: ADV, "PronType": "Int"},
"praet:sg:m2:imperf__Aspect=Imp|Gender=Masc|Mood=Ind|Number=Sing|SubGender=Masc2|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"inf:perf__Aspect=Perf|VerbForm=Inf|Voice=Act": {POS: VERB, "Aspect": "Perf", "VerbForm": "Inf", "Voice": "Act"},
"fin:sg:ter:imperf__Aspect=Imp|Mood=Ind|Number=Sing|Person=3|Tense=Pres|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Mood": "Ind", "Number": "Sing", "Person": "3", "Tense": "Pres", "VerbForm": "Fin", "Voice": "Act"},
"adv:pos__Degree=Pos|PronType=Int": {POS: ADV, "Degree": "Pos", "PronType": "Int"},
"fin:sg:sec:imperf__Aspect=Imp|Mood=Ind|Number=Sing|Person=2|Tense=Pres|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Mood": "Ind", "Number": "Sing", "Person": "2", "Tense": "Pres", "VerbForm": "Fin", "Voice": "Act"},
"subst:sg:nom:m1__Case=Nom|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: NOUN, "Case": "Nom", "Gender": "Masc", "Number": "Sing"},
"adv___": {POS: ADV},
"praet:sg:m1:imperf__Aspect=Imp|Gender=Masc|Mood=Ind|Number=Sing|SubGender=Masc1|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"prep:loc:nwok__AdpType=Prep|Variant=Short": {POS: ADP, "AdpType": "Prep", "Variant": "Short"},
"adj:sg:loc:m3:pos__Case=Loc|Gender=Masc|Number=Sing|Poss=Yes|PronType=Prs|Reflex=Yes|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Sing", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"subst:sg:loc:m3__Case=Loc|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: NOUN, "Case": "Loc", "Gender": "Masc", "Number": "Sing"},
"qub___": {POS: PRON},
"aglt:sg:pri:imperf:wok__Aspect=Imp|Number=Sing|Person=1|Variant=Long": {POS: VERB, "Aspect": "Imp", "Number": "Sing", "Person": "1", "Variant": "Long"},
"prep:acc__AdpType=Prep": {POS: ADP, "AdpType": "Prep"},
"subst:sg:acc:n__Case=Acc|Gender=Neut|Number=Sing": {POS: NOUN, "Case": "Acc", "Gender": "Neut", "Number": "Sing"},
"interp__PunctType=Comm": {POS: PUNCT, "PunctType": "Comm"},
"pred__Mood=Ind|Tense=Pres|VerbForm=Fin|VerbType=Quasi": {POS: VERB, "Mood": "Ind", "Tense": "Pres", "VerbForm": "Fin", "VerbType": "Quasi"},
"fin:sg:ter:imperf__Aspect=Imp|Mood=Ind|Number=Sing|Person=3|Tense=Pres|VerbForm=Fin": {POS: VERB, "Aspect": "Imp", "Mood": "Ind", "Number": "Sing", "Person": "3", "Tense": "Pres", "VerbForm": "Fin"},
"adj:sg:nom:f:pos__Case=Nom|Gender=Fem|Number=Sing|PronType=Dem": {POS: ADJ, "Case": "Nom", "Gender": "Fem", "Number": "Sing", "PronType": "Dem"},
"adj:sg:nom:f:pos__Case=Nom|Degree=Pos|Gender=Fem|Number=Sing": {POS: ADJ, "Case": "Nom", "Degree": "Pos", "Gender": "Fem", "Number": "Sing"},
"subst:sg:nom:f__Case=Nom|Gender=Fem|Number=Sing": {POS: NOUN, "Case": "Nom", "Gender": "Fem", "Number": "Sing"},
"subst:sg:nom:m1__Case=Nom|Gender=Masc|Number=Sing|PronType=Int|SubGender=Masc1": {POS: NOUN, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Int"},
"ppron3:sg:acc:m1:ter:nakc:npraep__Case=Acc|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc1|Variant=Short": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Short"},
"adv__PronType=Dem": {POS: ADV, "PronType": "Dem"},
"ppron3:sg:dat:m2:ter:akc:npraep__Case=Dat|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc2|Variant=Long": {POS: PRON, "Case": "Dat", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"subst:sg:gen:f__Case=Gen|Gender=Fem|Number=Sing": {POS: NOUN, "Case": "Gen", "Gender": "Fem", "Number": "Sing"},
"pant:perf__Aspect=Perf|Tense=Past|VerbForm=Conv|Voice=Act": {POS: VERB, "Aspect": "Perf", "Tense": "Past", "VerbForm": "Conv", "Voice": "Act"},
"interp__PunctType=Excl": {POS: PUNCT, "PunctType": "Excl"},
"ppron12:pl:acc:m1:pri__Case=Acc|Gender=Masc|Number=Plur|Person=1|PronType=Prs|SubGender=Masc1": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Person": "1", "PronType": "Prs"},
"subst:sg:nom:n__Case=Nom|Gender=Neut|Number=Sing|PronType=Dem": {POS: NOUN, "Case": "Nom", "Gender": "Neut", "Number": "Sing", "PronType": "Dem"},
"adj:sg:acc:n:pos__Case=Acc|Degree=Pos|Gender=Neut|Number=Sing": {POS: ADJ, "Case": "Acc", "Degree": "Pos", "Gender": "Neut", "Number": "Sing"},
"praet:pl:m1:perf__Aspect=Perf|Gender=Masc|Mood=Ind|Number=Plur|SubGender=Masc1|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Gender": "Masc", "Mood": "Ind", "Number": "Plur", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"subst:pl:nom:m1__Case=Nom|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: NOUN, "Case": "Nom", "Gender": "Masc", "Number": "Plur"},
"praet:sg:m1:perf__Aspect=Perf|Gender=Masc|Mood=Ind|Number=Sing|SubGender=Masc1|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"comp___": {POS: SCONJ},
"qub__Polarity=Neg": {POS: PRON, "Polarity": "Neg"},
"adj:sg:gen:f:pos__Case=Gen|Degree=Pos|Gender=Fem|Number=Sing": {POS: ADJ, "Case": "Gen", "Degree": "Pos", "Gender": "Fem", "Number": "Sing"},
"subst:sg:acc:f__Case=Acc|Gender=Fem|Number=Sing": {POS: NOUN, "Case": "Acc", "Gender": "Fem", "Number": "Sing"},
"subst:sg:nom:m3__Case=Nom|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: NOUN, "Case": "Nom", "Gender": "Masc", "Number": "Sing"},
"fin:sg:ter:perf__Aspect=Perf|Mood=Ind|Number=Sing|Person=3|Tense=Fut|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Mood": "Ind", "Number": "Sing", "Person": "3", "Tense": "Fut", "VerbForm": "Fin", "Voice": "Act"},
"adj:pl:acc:f:pos__Case=Acc|Degree=Pos|Gender=Fem|Number=Plur": {POS: ADJ, "Case": "Acc", "Degree": "Pos", "Gender": "Fem", "Number": "Plur"},
"subst:pl:acc:f__Case=Acc|Gender=Fem|Number=Plur": {POS: NOUN, "Case": "Acc", "Gender": "Fem", "Number": "Plur"},
"fin:pl:pri:imperf__Aspect=Imp|Mood=Ind|Number=Plur|Person=1|Tense=Pres|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Mood": "Ind", "Number": "Plur", "Person": "1", "Tense": "Pres", "VerbForm": "Fin", "Voice": "Act"},
"ppron3:pl:acc:n:ter:akc:npraep__Case=Acc|Gender=Neut|Number=Plur|Person=3|PrepCase=Npr|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Neut", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"prep:loc__AdpType=Prep": {POS: ADP, "AdpType": "Prep"},
"adj:sg:loc:m3:pos__Case=Loc|Gender=Masc|Number=Sing|PronType=Tot|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Sing", "PronType": "Tot"},
"subst:pl:acc:m2__Case=Acc|Gender=Masc|Number=Plur|SubGender=Masc2": {POS: NOUN, "Case": "Acc", "Gender": "Masc", "Number": "Plur"},
"praet:sg:n:perf__Aspect=Perf|Gender=Neut|Mood=Ind|Number=Sing|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Gender": "Neut", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"adj:sg:gen:m2:sup__Case=Gen|Degree=Sup|Gender=Masc|Number=Sing|SubGender=Masc2": {POS: ADJ, "Case": "Gen", "Degree": "Sup", "Gender": "Masc", "Number": "Sing"},
"subst:sg:gen:m2__Case=Gen|Gender=Masc|Number=Sing|SubGender=Masc2": {POS: NOUN, "Case": "Gen", "Gender": "Masc", "Number": "Sing"},
"aglt:sg:sec:imperf:wok__Aspect=Imp|Number=Sing|Person=2|Variant=Long": {POS: VERB, "Aspect": "Imp", "Number": "Sing", "Person": "2", "Variant": "Long"},
"adj:sg:nom:m3:pos__Case=Nom|Gender=Masc|Number=Sing|PronType=Dem|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Dem"},
"adv:pos__Degree=Pos|PronType=Ind": {POS: ADV, "Degree": "Pos", "PronType": "Ind"},
"subst:sg:acc:m3__Case=Acc|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: NOUN, "Case": "Acc", "Gender": "Masc", "Number": "Sing"},
"subst:sg:inst:f__Case=Ins|Gender=Fem|Number=Sing": {POS: NOUN, "Case": "Ins", "Gender": "Fem", "Number": "Sing"},
"praet:pl:f:imperf__Aspect=Imp|Gender=Fem|Mood=Ind|Number=Plur|Tense=Past|VerbForm=Fin": {POS: VERB, "Aspect": "Imp", "Gender": "Fem", "Mood": "Ind", "Number": "Plur", "Tense": "Past", "VerbForm": "Fin"},
"adj:pl:nom:f:pos__Case=Nom|Degree=Pos|Gender=Fem|Number=Plur": {POS: ADJ, "Case": "Nom", "Degree": "Pos", "Gender": "Fem", "Number": "Plur"},
"subst:pl:nom:f__Case=Nom|Gender=Fem|Number=Plur": {POS: NOUN, "Case": "Nom", "Gender": "Fem", "Number": "Plur"},
"prep:gen:nwok__AdpType=Prep|Variant=Short": {POS: ADP, "AdpType": "Prep", "Variant": "Short"},
"ppron3:sg:gen:m1:ter:akc:npraep__Case=Gen|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"subst:sg:nom:n__Case=Nom|Gender=Neut|Number=Sing": {POS: NOUN, "Case": "Nom", "Gender": "Neut", "Number": "Sing"},
"subst:sg:inst:n__Case=Ins|Gender=Neut|Number=Sing": {POS: NOUN, "Case": "Ins", "Gender": "Neut", "Number": "Sing"},
"num:pl:acc:m2:rec__Case=Acc|Gender=Masc|Number=Plur|NumType=Card|PronType=Int|SubGender=Masc2": {POS: NUM, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "NumType": "Card", "PronType": "Int"},
"praet:sg:f:imperf__Aspect=Imp|Gender=Fem|Mood=Ind|Number=Sing|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Fem", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"ppron12:sg:acc:m1:pri:akc__Case=Acc|Gender=Masc|Number=Sing|Person=1|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "1", "PronType": "Prs", "Variant": "Long"},
"qub__Mood=Cnd": {POS: PRON, "Mood": "Cnd"},
"num:pl:acc:m1:rec__Case=Acc|Gender=Masc|Number=Plur|NumType=Card|SubGender=Masc1": {POS: NUM, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "NumType": "Card"},
"subst:pl:gen:m1__Case=Gen|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: NOUN, "Case": "Gen", "Gender": "Masc", "Number": "Plur"},
"adv:pos__Degree=Pos": {POS: ADV, "Degree": "Pos"},
"subst:pl:acc:m1__Case=Acc|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: NOUN, "Case": "Acc", "Gender": "Masc", "Number": "Plur"},
"subst:sg:gen:m3__Case=Gen|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: NOUN, "Case": "Gen", "Gender": "Masc", "Number": "Sing"},
"adj:pl:acc:m3:pos__Case=Acc|Degree=Pos|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Degree": "Pos", "Gender": "Masc", "Number": "Plur"},
"adj:sg:acc:m3:pos__Case=Acc|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"qub__PartType=Int": {POS: PRON, "PartType": "Int"},
"bedzie:sg:pri:imperf__Aspect=Imp|Mood=Ind|Number=Sing|Person=1|Tense=Fut|VerbForm=Fin": {POS: VERB, "Aspect": "Imp", "Mood": "Ind", "Number": "Sing", "Person": "1", "Tense": "Fut", "VerbForm": "Fin"},
"adj:sg:acc:f:pos__Case=Acc|Gender=Fem|Number=Sing|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"fin:pl:ter:imperf__Aspect=Imp|Mood=Ind|Number=Plur|Person=3|Tense=Pres|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Mood": "Ind", "Number": "Plur", "Person": "3", "Tense": "Pres", "VerbForm": "Fin", "Voice": "Act"},
"prep:gen__AdpType=Prep": {POS: ADP, "AdpType": "Prep"},
"adj:sg:gen:m3:pos__Case=Gen|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"adj:sg:loc:m3:pos__Case=Loc|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"adj:pl:acc:m1:pos__Case=Acc|Gender=Masc|Number=Plur|PronType=Dem|SubGender=Masc1": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "PronType": "Dem"},
"praet:sg:f:imperf__Aspect=Imp|Gender=Fem|Mood=Ind|Number=Sing|Tense=Past|VerbForm=Fin": {POS: VERB, "Aspect": "Imp", "Gender": "Fem", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin"},
"praet:sg:n:imperf__Aspect=Imp|Gender=Neut|Mood=Ind|Number=Sing|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Neut", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"ppron3:sg:dat:m1:ter:nakc:npraep__Case=Dat|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc1|Variant=Short": {POS: PRON, "Case": "Dat", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Short"},
"prep:inst:nwok__AdpType=Prep|Variant=Short": {POS: ADP, "AdpType": "Prep", "Variant": "Short"},
"subst:sg:inst:n__Case=Ins|Gender=Neut|Number=Sing|PronType=Dem": {POS: NOUN, "Case": "Ins", "Gender": "Neut", "Number": "Sing", "PronType": "Dem"},
"fin:pl:pri:perf__Aspect=Perf|Mood=Ind|Number=Plur|Person=1|Tense=Fut|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Mood": "Ind", "Number": "Plur", "Person": "1", "Tense": "Fut", "VerbForm": "Fin", "Voice": "Act"},
"subst:sg:nom:m2__Case=Nom|Gender=Masc|Number=Sing|SubGender=Masc2": {POS: NOUN, "Case": "Nom", "Gender": "Masc", "Number": "Sing"},
"subst:sg:loc:f__Case=Loc|Gender=Fem|Number=Sing": {POS: NOUN, "Case": "Loc", "Gender": "Fem", "Number": "Sing"},
"ppron12:sg:dat:m1:pri:nakc__Case=Dat|Gender=Masc|Number=Sing|Person=1|PronType=Prs|SubGender=Masc1|Variant=Short": {POS: PRON, "Case": "Dat", "Gender": "Masc", "Number": "Sing", "Person": "1", "PronType": "Prs", "Variant": "Short"},
"aglt:sg:pri:imperf:nwok__Aspect=Imp|Number=Sing|Person=1|Variant=Short": {POS: VERB, "Aspect": "Imp", "Number": "Sing", "Person": "1", "Variant": "Short"},
"adj:sg:nom:m1:pos__Case=Nom|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"subst:pl:loc:n__Case=Loc|Gender=Neut|Number=Plur": {POS: NOUN, "Case": "Loc", "Gender": "Neut", "Number": "Plur"},
"adj:pl:gen:m1:pos__Case=Gen|Gender=Masc|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"praet:pl:m1:imperf__Aspect=Imp|Gender=Masc|Mood=Ind|Number=Plur|SubGender=Masc1|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Plur", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"ppron12:sg:nom:m1:pri__Case=Nom|Gender=Masc|Number=Sing|Person=1|PronType=Prs|SubGender=Masc1": {POS: PRON, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Person": "1", "PronType": "Prs"},
"subst:sg:acc:n__Case=Acc|Gender=Neut|Number=Sing|PronType=Dem": {POS: NOUN, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "PronType": "Dem"},
"fin:sg:pri:perf__Aspect=Perf|Mood=Ind|Number=Sing|Person=1|Tense=Fut|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Mood": "Ind", "Number": "Sing", "Person": "1", "Tense": "Fut", "VerbForm": "Fin", "Voice": "Act"},
"adj:sg:nom:n:pos__Case=Nom|Degree=Pos|Gender=Neut|Number=Sing": {POS: ADJ, "Case": "Nom", "Degree": "Pos", "Gender": "Neut", "Number": "Sing"},
"ppron12:sg:dat:f:pri:nakc__Case=Dat|Gender=Fem|Number=Sing|Person=1|PronType=Prs|Variant=Short": {POS: PRON, "Case": "Dat", "Gender": "Fem", "Number": "Sing", "Person": "1", "PronType": "Prs", "Variant": "Short"},
"num:pl:gen:n:congr__Case=Gen|Gender=Neut|Number=Plur|NumType=Card|PronType=Ind": {POS: NUM, "Case": "Gen", "Gender": "Neut", "Number": "Plur", "NumType": "Card", "PronType": "Ind"},
"subst:sg:gen:m1__Case=Gen|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: NOUN, "Case": "Gen", "Gender": "Masc", "Number": "Sing"},
"subst:pl:gen:m2__Case=Gen|Gender=Masc|Number=Plur|SubGender=Masc2": {POS: NOUN, "Case": "Gen", "Gender": "Masc", "Number": "Plur"},
"adv__PronType=Ind": {POS: ADV, "PronType": "Ind"},
"adj:pl:nom:n:pos__Case=Nom|Degree=Pos|Gender=Neut|Number=Plur": {POS: ADJ, "Case": "Nom", "Degree": "Pos", "Gender": "Neut", "Number": "Plur"},
"adj:sg:gen:m3:pos__Case=Gen|Gender=Masc|Number=Sing|PronType=Dem|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "PronType": "Dem"},
"ppron3:sg:nom:m1:ter:akc:npraep__Case=Nom|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:sg:loc:n:pos__Case=Loc|Gender=Neut|Number=Sing|PronType=Int": {POS: ADJ, "Case": "Loc", "Gender": "Neut", "Number": "Sing", "PronType": "Int"},
"subst:sg:loc:n__Case=Loc|Gender=Neut|Number=Sing": {POS: NOUN, "Case": "Loc", "Gender": "Neut", "Number": "Sing"},
"adj:sg:nom:m3:pos__Case=Nom|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"adj:sg:nom:m1:pos__Case=Nom|Gender=Masc|Number=Sing|PronType=Dem|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Dem"},
"praet:sg:m3:imperf__Aspect=Imp|Gender=Masc|Mood=Ind|Number=Sing|SubGender=Masc3|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"adj:pl:nom:m1:pos__Case=Nom|Degree=Pos|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Degree": "Pos", "Gender": "Masc", "Number": "Plur"},
"subst:sg:gen:n__Case=Gen|Gender=Neut|Number=Sing|PronType=Neg": {POS: NOUN, "Case": "Gen", "Gender": "Neut", "Number": "Sing", "PronType": "Neg"},
"praet:sg:m3:imperf__Aspect=Imp|Gender=Masc|Mood=Ind|Number=Sing|SubGender=Masc3|Tense=Past|VerbForm=Fin": {POS: VERB, "Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin"},
"adj:sg:gen:m1:pos__Case=Gen|Gender=Masc|Number=Sing|PronType=Neg|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "PronType": "Neg"},
"ppas:pl:gen:m1:perf:aff__Aspect=Perf|Case=Gen|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:nom:m1:pos__Case=Nom|Gender=Masc|Number=Sing|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:loc:n:pos__Case=Loc|Degree=Pos|Gender=Neut|Number=Sing": {POS: ADJ, "Case": "Loc", "Degree": "Pos", "Gender": "Neut", "Number": "Sing"},
"praet:sg:f:perf__Aspect=Perf|Gender=Fem|Mood=Ind|Number=Sing|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Gender": "Fem", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"num:pl:acc:f:congr__Case=Acc|Gender=Fem|Number=Plur|NumType=Card": {POS: NUM, "Case": "Acc", "Gender": "Fem", "Number": "Plur", "NumType": "Card"},
"subst:sg:acc:n__Case=Acc|Gender=Neut|Number=Sing|PronType=Ind": {POS: NOUN, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "PronType": "Ind"},
"impt:pl:pri:perf__Aspect=Perf|Mood=Imp|Number=Plur|Person=1|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Mood": "Imp", "Number": "Plur", "Person": "1", "VerbForm": "Fin", "Voice": "Act"},
"aglt:pl:pri:imperf:nwok__Aspect=Imp|Number=Plur|Person=1|Variant=Short": {POS: VERB, "Aspect": "Imp", "Number": "Plur", "Person": "1", "Variant": "Short"},
"subst:sg:acc:m1__Case=Acc|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: NOUN, "Case": "Acc", "Gender": "Masc", "Number": "Sing"},
"adv__PronType=Tot": {POS: ADV, "PronType": "Tot"},
"ppron3:sg:dat:f:ter:akc:npraep__Case=Dat|Gender=Fem|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Dat", "Gender": "Fem", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"subst:pl:acc:n__Case=Acc|Gender=Neut|Number=Plur": {POS: NOUN, "Case": "Acc", "Gender": "Neut", "Number": "Plur"},
"subst:pl:inst:n__Case=Ins|Gender=Neut|Number=Plur": {POS: NOUN, "Case": "Ins", "Gender": "Neut", "Number": "Plur"},
"subst:pl:loc:f__Case=Loc|Gender=Fem|Number=Plur": {POS: NOUN, "Case": "Loc", "Gender": "Fem", "Number": "Plur"},
"adj:pl:loc:f:pos__Case=Loc|Degree=Pos|Gender=Fem|Number=Plur": {POS: ADJ, "Case": "Loc", "Degree": "Pos", "Gender": "Fem", "Number": "Plur"},
"interp__PunctSide=Ini|PunctType=Quot": {POS: PUNCT, "PunctSide": "Ini", "PunctType": "Quot"},
"subst:sg:inst:m1__Case=Ins|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: NOUN, "Case": "Ins", "Gender": "Masc", "Number": "Sing"},
"adj:sg:inst:m1:pos__Case=Ins|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: ADJ, "Case": "Ins", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"adj:pl:acc:m3:pos__Case=Acc|Gender=Masc|Number=Plur|PronType=Dem|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "PronType": "Dem"},
"ppas:sg:nom:f:perf:aff__Aspect=Perf|Case=Nom|Gender=Fem|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Nom", "Gender": "Fem", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"num:pl:acc:m3:rec__Case=Acc|Gender=Masc|Number=Plur|NumType=Card|PronType=Ind|SubGender=Masc3": {POS: NUM, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "NumType": "Card", "PronType": "Ind"},
"subst:pl:gen:m3__Case=Gen|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: NOUN, "Case": "Gen", "Gender": "Masc", "Number": "Plur"},
"praet:sg:m3:perf__Aspect=Perf|Gender=Masc|Mood=Ind|Number=Sing|SubGender=Masc3|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"ppas:sg:nom:m3:perf:aff__Aspect=Perf|Case=Nom|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"fin:sg:sec:perf__Aspect=Perf|Mood=Ind|Number=Sing|Person=2|Tense=Fut|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Mood": "Ind", "Number": "Sing", "Person": "2", "Tense": "Fut", "VerbForm": "Fin", "Voice": "Act"},
"ppron12:sg:nom:f:sec__Case=Nom|Gender=Fem|Number=Sing|Person=2|PronType=Prs": {POS: PRON, "Case": "Nom", "Gender": "Fem", "Number": "Sing", "Person": "2", "PronType": "Prs"},
"ppron3:pl:acc:m1:ter:akc:npraep__Case=Acc|Gender=Masc|Number=Plur|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppron3:sg:acc:f:ter:akc:npraep__Case=Acc|Gender=Fem|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Fem", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppron12:sg:acc:m1:sec:nakc__Case=Acc|Gender=Masc|Number=Sing|Person=2|PronType=Prs|SubGender=Masc1|Variant=Short": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "2", "PronType": "Prs", "Variant": "Short"},
"adj:pl:nom:f:pos__Case=Nom|Gender=Fem|Number=Plur|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Nom", "Gender": "Fem", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"praet:pl:f:imperf__Aspect=Imp|Gender=Fem|Mood=Ind|Number=Plur|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Fem", "Mood": "Ind", "Number": "Plur", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"inf:imperf__Aspect=Imp|VerbForm=Inf|Voice=Act": {POS: VERB, "Aspect": "Imp", "VerbForm": "Inf", "Voice": "Act"},
"subst:pl:gen:n__Case=Gen|Gender=Neut|Number=Plur": {POS: NOUN, "Case": "Gen", "Gender": "Neut", "Number": "Plur"},
"ppron3:pl:dat:f:ter:akc:npraep__Case=Dat|Gender=Fem|Number=Plur|Person=3|PrepCase=Npr|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Dat", "Gender": "Fem", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppas:pl:nom:m1:perf:aff__Aspect=Perf|Case=Nom|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"impt:sg:sec:imperf__Aspect=Imp|Mood=Imp|Number=Sing|Person=2|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Mood": "Imp", "Number": "Sing", "Person": "2", "VerbForm": "Fin", "Voice": "Act"},
"adj:pl:acc:f:pos__Case=Acc|Gender=Fem|Number=Plur|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Plur", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"interp__PunctSide=Fin|PunctType=Quot": {POS: PUNCT, "PunctSide": "Fin", "PunctType": "Quot"},
"imps:imperf__Aspect=Imp|Mood=Ind|Person=0|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Mood": "Ind", "Person": "0", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"adj:sg:acc:n:pos__Case=Acc|Gender=Neut|Number=Sing|PronType=Dem": {POS: ADJ, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "PronType": "Dem"},
"num:pl:acc:m2:congr__Case=Acc|Gender=Masc|Number=Plur|NumType=Card|SubGender=Masc2": {POS: NUM, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "NumType": "Card"},
"subst:pl:dat:m1__Case=Dat|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: NOUN, "Case": "Dat", "Gender": "Masc", "Number": "Plur"},
"ppron3:sg:acc:m2:ter:nakc:npraep__Case=Acc|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc2|Variant=Short": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Short"},
"subst:sg:loc:n__Case=Loc|Gender=Neut|Number=Sing|PronType=Dem": {POS: NOUN, "Case": "Loc", "Gender": "Neut", "Number": "Sing", "PronType": "Dem"},
"prep:acc:nwok__AdpType=Prep|Variant=Short": {POS: ADP, "AdpType": "Prep", "Variant": "Short"},
"adj:sg:acc:f:pos__Case=Acc|Gender=Fem|Number=Sing|PronType=Dem": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Sing", "PronType": "Dem"},
"adj:sg:loc:n:pos__Case=Loc|Gender=Neut|Number=Sing|PronType=Dem": {POS: ADJ, "Case": "Loc", "Gender": "Neut", "Number": "Sing", "PronType": "Dem"},
"subst:sg:loc:n__Case=Loc|Gender=Neut|Number=Sing|PronType=Tot": {POS: NOUN, "Case": "Loc", "Gender": "Neut", "Number": "Sing", "PronType": "Tot"},
"subst:sg:loc:n__Case=Loc|Gender=Neut|Number=Sing|PronType=Int": {POS: NOUN, "Case": "Loc", "Gender": "Neut", "Number": "Sing", "PronType": "Int"},
"ppron3:sg:nom:f:ter:akc:npraep__Case=Nom|Gender=Fem|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Nom", "Gender": "Fem", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppron3:pl:gen:m1:ter:akc:npraep__Case=Gen|Gender=Masc|Number=Plur|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"subst:sg:dat:m3__Case=Dat|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: NOUN, "Case": "Dat", "Gender": "Masc", "Number": "Sing"},
"pcon:imperf__Aspect=Imp|Tense=Pres|VerbForm=Conv|Voice=Act": {POS: VERB, "Aspect": "Imp", "Tense": "Pres", "VerbForm": "Conv", "Voice": "Act"},
"adj:pl:gen:f:pos__Case=Gen|Degree=Pos|Gender=Fem|Number=Plur": {POS: ADJ, "Case": "Gen", "Degree": "Pos", "Gender": "Fem", "Number": "Plur"},
"subst:pl:gen:f__Case=Gen|Gender=Fem|Number=Plur": {POS: NOUN, "Case": "Gen", "Gender": "Fem", "Number": "Plur"},
"num:pl:acc:m1:rec__Case=Acc|Gender=Masc|Number=Plur|NumType=Card|PronType=Ind|SubGender=Masc1": {POS: NUM, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "NumType": "Card", "PronType": "Ind"},
"adj:pl:gen:m1:pos__Case=Gen|Degree=Pos|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Degree": "Pos", "Gender": "Masc", "Number": "Plur"},
"ppron3:sg:loc:f:ter:akc:praep__Case=Loc|Gender=Fem|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Loc", "Gender": "Fem", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppas:sg:nom:m1:perf:aff__Aspect=Perf|Case=Nom|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"prep:nom__AdpType=Prep": {POS: ADP, "AdpType": "Prep"},
"ger:sg:loc:n:perf:aff__Aspect=Perf|Case=Loc|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Vnoun": {POS: VERB, "Aspect": "Perf", "Case": "Loc", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Vnoun"},
"praet:sg:n:imperf__Aspect=Imp|Gender=Neut|Mood=Ind|Number=Sing|Tense=Past|VerbForm=Fin": {POS: VERB, "Aspect": "Imp", "Gender": "Neut", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin"},
"pact:sg:gen:m3:imperf:aff__Aspect=Imp|Case=Gen|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"ger:sg:acc:n:perf:aff__Aspect=Perf|Case=Acc|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Vnoun": {POS: VERB, "Aspect": "Perf", "Case": "Acc", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Vnoun"},
"bedzie:sg:ter:imperf__Aspect=Imp|Mood=Ind|Number=Sing|Person=3|Tense=Fut|VerbForm=Fin": {POS: VERB, "Aspect": "Imp", "Mood": "Ind", "Number": "Sing", "Person": "3", "Tense": "Fut", "VerbForm": "Fin"},
"bedzie:pl:pri:imperf__Aspect=Imp|Mood=Ind|Number=Plur|Person=1|Tense=Fut|VerbForm=Fin": {POS: VERB, "Aspect": "Imp", "Mood": "Ind", "Number": "Plur", "Person": "1", "Tense": "Fut", "VerbForm": "Fin"},
"adj:pl:gen:m3:pos__Case=Gen|Gender=Masc|Number=Plur|PronType=Dem|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "PronType": "Dem"},
"adj:pl:gen:m3:pos__Case=Gen|Degree=Pos|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Degree": "Pos", "Gender": "Masc", "Number": "Plur"},
"adj:pl:inst:m1:pos__Case=Ins|Gender=Masc|Number=Plur|PronType=Dem|SubGender=Masc1": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "PronType": "Dem"},
"subst:pl:inst:m1__Case=Ins|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: NOUN, "Case": "Ins", "Gender": "Masc", "Number": "Plur"},
"adj:pl:acc:f:pos__Case=Acc|Gender=Fem|Number=Plur|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"subst:pl:inst:m2__Case=Ins|Gender=Masc|Number=Plur|SubGender=Masc2": {POS: NOUN, "Case": "Ins", "Gender": "Masc", "Number": "Plur"},
"adj:sg:acc:m3:pos__Case=Acc|Gender=Masc|Number=Sing|PronType=Dem|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "PronType": "Dem"},
"prep:inst__AdpType=Prep": {POS: ADP, "AdpType": "Prep"},
"subst:pl:inst:f__Case=Ins|Gender=Fem|Number=Plur": {POS: NOUN, "Case": "Ins", "Gender": "Fem", "Number": "Plur"},
"subst:sg:inst:m3__Case=Ins|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: NOUN, "Case": "Ins", "Gender": "Masc", "Number": "Sing"},
"subst:sg:voc:m1__Case=Voc|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: NOUN, "Case": "Voc", "Gender": "Masc", "Number": "Sing"},
"ppas:sg:nom:n:perf:aff__Aspect=Perf|Case=Nom|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Nom", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"subst:sg:dat:f__Case=Dat|Gender=Fem|Number=Sing": {POS: NOUN, "Case": "Dat", "Gender": "Fem", "Number": "Sing"},
"adj:sg:dat:f:pos__Case=Dat|Degree=Pos|Gender=Fem|Number=Sing": {POS: ADJ, "Case": "Dat", "Degree": "Pos", "Gender": "Fem", "Number": "Sing"},
"ppron12:sg:inst:m1:sec__Case=Ins|Gender=Masc|Number=Sing|Person=2|PronType=Prs|SubGender=Masc1": {POS: PRON, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "Person": "2", "PronType": "Prs"},
"adj:sg:inst:f:pos__Case=Ins|Gender=Fem|Number=Sing|PronType=Dem": {POS: ADJ, "Case": "Ins", "Gender": "Fem", "Number": "Sing", "PronType": "Dem"},
"ppron3:sg:inst:m1:ter:akc:praep__Case=Ins|Gender=Masc|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ger:sg:acc:n:imperf:aff__Aspect=Imp|Case=Acc|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Vnoun": {POS: VERB, "Aspect": "Imp", "Case": "Acc", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Vnoun"},
"ppron12:sg:acc:f:pri:akc__Case=Acc|Gender=Fem|Number=Sing|Person=1|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Fem", "Number": "Sing", "Person": "1", "PronType": "Prs", "Variant": "Long"},
"ppron12:pl:dat:m1:pri__Case=Dat|Gender=Masc|Number=Plur|Person=1|PronType=Prs|SubGender=Masc1": {POS: PRON, "Case": "Dat", "Gender": "Masc", "Number": "Plur", "Person": "1", "PronType": "Prs"},
"prep:inst:wok__AdpType=Prep|Variant=Long": {POS: ADP, "AdpType": "Prep", "Variant": "Long"},
"adj:sg:acc:f:com__Case=Acc|Degree=Cmp|Gender=Fem|Number=Sing": {POS: ADJ, "Case": "Acc", "Degree": "Cmp", "Gender": "Fem", "Number": "Sing"},
"impt:pl:pri:imperf__Aspect=Imp|Mood=Imp|Number=Plur|Person=1|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Mood": "Imp", "Number": "Plur", "Person": "1", "VerbForm": "Fin", "Voice": "Act"},
"subst:sg:gen:n__Case=Gen|Gender=Neut|Number=Sing": {POS: NOUN, "Case": "Gen", "Gender": "Neut", "Number": "Sing"},
"adj:sg:acc:f:pos__Case=Acc|Degree=Pos|Gender=Fem|Number=Sing": {POS: ADJ, "Case": "Acc", "Degree": "Pos", "Gender": "Fem", "Number": "Sing"},
"adj:sg:acc:m2:pos__Case=Acc|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc2": {POS: ADJ, "Case": "Acc", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"subst:sg:acc:m2__Case=Acc|Gender=Masc|Number=Sing|SubGender=Masc2": {POS: NOUN, "Case": "Acc", "Gender": "Masc", "Number": "Sing"},
"subst:sg:acc:n__Case=Acc|Gender=Neut|Number=Sing|PronType=Tot": {POS: NOUN, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "PronType": "Tot"},
"ppron3:sg:acc:m1:ter:akc:praep__Case=Acc|Gender=Masc|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppas:sg:inst:f:imperf:neg__Aspect=Imp|Case=Ins|Gender=Fem|Number=Sing|Polarity=Neg|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Ins", "Gender": "Fem", "Number": "Sing", "Polarity": "Neg", "VerbForm": "Part", "Voice": "Pass"},
"adj:pl:nom:m1:pos__Case=Nom|Gender=Masc|Number=Plur|PronType=Dem|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "PronType": "Dem"},
"num:pl:nom:m1:congr__Case=Nom|Gender=Masc|Number=Plur|NumType=Card|SubGender=Masc1": {POS: NUM, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "NumType": "Card"},
"siebie:acc__Case=Acc|PronType=Prs|Reflex=Yes": {POS: PRON, "Case": "Acc", "PronType": "Prs", "Reflex": "Yes"},
"adj:sg:acc:f:pos__Case=Acc|Gender=Fem|Number=Sing|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Sing", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"bedzie:sg:sec:imperf__Aspect=Imp|Mood=Ind|Number=Sing|Person=2|Tense=Fut|VerbForm=Fin": {POS: VERB, "Aspect": "Imp", "Mood": "Ind", "Number": "Sing", "Person": "2", "Tense": "Fut", "VerbForm": "Fin"},
"subst:sg:nom:n__Case=Nom|Gender=Neut|Number=Sing|PronType=Int": {POS: NOUN, "Case": "Nom", "Gender": "Neut", "Number": "Sing", "PronType": "Int"},
"ppron3:sg:acc:m3:ter:nakc:npraep__Case=Acc|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc3|Variant=Short": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Short"},
"ppron3:pl:acc:f:ter:akc:npraep__Case=Acc|Gender=Fem|Number=Plur|Person=3|PrepCase=Npr|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Fem", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppron12:pl:dat:m1:sec__Case=Dat|Gender=Masc|Number=Plur|Person=2|PronType=Prs|SubGender=Masc1": {POS: PRON, "Case": "Dat", "Gender": "Masc", "Number": "Plur", "Person": "2", "PronType": "Prs"},
"subst:sg:nom:n__Case=Nom|Gender=Neut|Number=Sing|PronType=Ind": {POS: NOUN, "Case": "Nom", "Gender": "Neut", "Number": "Sing", "PronType": "Ind"},
"adj:pl:gen:n:pos__Case=Gen|Degree=Pos|Gender=Neut|Number=Plur": {POS: ADJ, "Case": "Gen", "Degree": "Pos", "Gender": "Neut", "Number": "Plur"},
"adv:com__Degree=Cmp": {POS: ADV, "Degree": "Cmp"},
"fin:pl:sec:imperf__Aspect=Imp|Mood=Ind|Number=Plur|Person=2|Tense=Pres|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Mood": "Ind", "Number": "Plur", "Person": "2", "Tense": "Pres", "VerbForm": "Fin", "Voice": "Act"},
"ppron3:pl:nom:m1:ter:akc:npraep__Case=Nom|Gender=Masc|Number=Plur|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"num:pl:nom:f:congr__Case=Nom|Gender=Fem|Number=Plur|NumType=Card": {POS: NUM, "Case": "Nom", "Gender": "Fem", "Number": "Plur", "NumType": "Card"},
"adj:sg:acc:n:pos__Case=Acc|Gender=Neut|Number=Sing|Poss=Yes|PronType=Ind": {POS: ADJ, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "Poss": "Yes", "PronType": "Ind"},
"siebie:loc__Case=Loc|PronType=Prs|Reflex=Yes": {POS: PRON, "Case": "Loc", "PronType": "Prs", "Reflex": "Yes"},
"fin:pl:ter:perf__Aspect=Perf|Mood=Ind|Number=Plur|Person=3|Tense=Fut|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Mood": "Ind", "Number": "Plur", "Person": "3", "Tense": "Fut", "VerbForm": "Fin", "Voice": "Act"},
"prep:gen:wok__AdpType=Prep|Variant=Long": {POS: ADP, "AdpType": "Prep", "Variant": "Long"},
"subst:sg:gen:n__Case=Gen|Gender=Neut|Number=Sing|PronType=Dem": {POS: NOUN, "Case": "Gen", "Gender": "Neut", "Number": "Sing", "PronType": "Dem"},
"subst:sg:nom:m1__Case=Nom|Gender=Masc|Number=Sing|PronType=Ind|SubGender=Masc1": {POS: NOUN, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Ind"},
"ger:sg:nom:n:imperf:aff__Aspect=Imp|Case=Nom|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Vnoun": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Vnoun"},
"adj:sg:acc:n:pos__Case=Acc|Gender=Neut|Number=Sing|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"ppron12:pl:acc:f:pri__Case=Acc|Gender=Fem|Number=Plur|Person=1|PronType=Prs": {POS: PRON, "Case": "Acc", "Gender": "Fem", "Number": "Plur", "Person": "1", "PronType": "Prs"},
"adv:pos__Degree=Pos|PronType=Dem": {POS: ADV, "Degree": "Pos", "PronType": "Dem"},
"subst:sg:inst:n__Case=Ins|Gender=Neut|Number=Sing|PronType=Int": {POS: NOUN, "Case": "Ins", "Gender": "Neut", "Number": "Sing", "PronType": "Int"},
"praet:sg:m2:perf__Aspect=Perf|Gender=Masc|Mood=Ind|Number=Sing|SubGender=Masc2|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"ppron3:pl:gen:m1:ter:akc:praep__Case=Gen|Gender=Masc|Number=Plur|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"subst:sg:acc:n__Case=Acc|Emphatic=Yes|Gender=Neut|Number=Sing|PronType=Int": {POS: NOUN, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "PronType": "Int"},
"impt:sg:sec:perf__Aspect=Perf|Mood=Imp|Number=Sing|Person=2|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Mood": "Imp", "Number": "Sing", "Person": "2", "VerbForm": "Fin", "Voice": "Act"},
"subst:sg:dat:m1__Case=Dat|Gender=Masc|Number=Sing|PronType=Neg|SubGender=Masc1": {POS: NOUN, "Case": "Dat", "Gender": "Masc", "Number": "Sing", "PronType": "Neg"},
"impt:pl:sec:perf__Aspect=Perf|Mood=Imp|Number=Plur|Person=2|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Mood": "Imp", "Number": "Plur", "Person": "2", "VerbForm": "Fin", "Voice": "Act"},
"adj:sg:acc:f:pos__Case=Acc|Gender=Fem|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"ppas:sg:acc:m3:imperf:aff__Aspect=Imp|Case=Acc|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"imps:perf__Aspect=Perf|Mood=Ind|Person=0|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Mood": "Ind", "Person": "0", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"adj:sg:loc:n:pos__Case=Loc|Gender=Neut|Number=Sing|PronType=Rel": {POS: ADJ, "Case": "Loc", "Gender": "Neut", "Number": "Sing", "PronType": "Rel"},
"interp__PunctSide=Ini|PunctType=Brck": {POS: PUNCT, "PunctSide": "Ini", "PunctType": "Brck"},
"interp__PunctSide=Fin|PunctType=Brck": {POS: PUNCT, "PunctSide": "Fin", "PunctType": "Brck"},
"ppron12:sg:acc:f:sec:nakc__Case=Acc|Gender=Fem|Number=Sing|Person=2|PronType=Prs|Variant=Short": {POS: PRON, "Case": "Acc", "Gender": "Fem", "Number": "Sing", "Person": "2", "PronType": "Prs", "Variant": "Short"},
"ppron12:pl:gen:m1:pri__Case=Gen|Gender=Masc|Number=Plur|Person=1|PronType=Prs|SubGender=Masc1": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Person": "1", "PronType": "Prs"},
"adj:pl:gen:m3:pos__Case=Gen|Gender=Masc|Number=Plur|PronType=Tot|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "PronType": "Tot"},
"adj:sg:acc:f:pos__Case=Acc|Gender=Fem|Number=Sing|PronType=Ind": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Sing", "PronType": "Ind"},
"ppron3:sg:gen:f:ter:akc:npraep__Case=Gen|Gender=Fem|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Fem", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"subst:sg:gen:n__Case=Gen|Gender=Neut|Number=Sing|PronType=Int": {POS: NOUN, "Case": "Gen", "Gender": "Neut", "Number": "Sing", "PronType": "Int"},
"subst:pl:inst:m3__Case=Ins|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: NOUN, "Case": "Ins", "Gender": "Masc", "Number": "Plur"},
"praet:pl:n:imperf__Aspect=Imp|Gender=Neut|Mood=Ind|Number=Plur|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Neut", "Mood": "Ind", "Number": "Plur", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"subst:pl:nom:n__Case=Nom|Gender=Neut|Number=Plur": {POS: NOUN, "Case": "Nom", "Gender": "Neut", "Number": "Plur"},
"ger:sg:gen:n:perf:aff__Aspect=Perf|Case=Gen|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Vnoun": {POS: VERB, "Aspect": "Perf", "Case": "Gen", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Vnoun"},
"num:pl:gen:m3:congr__Case=Gen|Gender=Masc|Number=Plur|NumType=Card|SubGender=Masc3": {POS: NUM, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "NumType": "Card"},
"siebie:gen__Case=Gen|PronType=Prs|Reflex=Yes": {POS: PRON, "Case": "Gen", "PronType": "Prs", "Reflex": "Yes"},
"adj:sg:inst:n:pos__Case=Ins|Degree=Pos|Gender=Neut|Number=Sing": {POS: ADJ, "Case": "Ins", "Degree": "Pos", "Gender": "Neut", "Number": "Sing"},
"adj:sg:loc:f:com__Case=Loc|Degree=Cmp|Gender=Fem|Number=Sing": {POS: ADJ, "Case": "Loc", "Degree": "Cmp", "Gender": "Fem", "Number": "Sing"},
"adj:sg:acc:m1:pos__Case=Acc|Gender=Masc|Number=Sing|Poss=Yes|PronType=Prs|Reflex=Yes|SubGender=Masc1": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"praet:pl:f:perf__Aspect=Perf|Gender=Fem|Mood=Ind|Number=Plur|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Gender": "Fem", "Mood": "Ind", "Number": "Plur", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"subst:pl:nom:m1__Case=Nom|Gender=Masc|Number=Plur|PronType=Tot|SubGender=Masc1": {POS: NOUN, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "PronType": "Tot"},
"adj:sg:acc:m3:com__Case=Acc|Degree=Cmp|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Degree": "Cmp", "Gender": "Masc", "Number": "Sing"},
"praet:pl:n:perf__Aspect=Perf|Gender=Neut|Mood=Ind|Number=Plur|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Gender": "Neut", "Mood": "Ind", "Number": "Plur", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"adj:sg:nom:m3:pos__Case=Nom|Gender=Masc|Number=Sing|PronType=Neg|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Neg"},
"adj:sg:gen:n:pos__Case=Gen|Degree=Pos|Gender=Neut|Number=Sing": {POS: ADJ, "Case": "Gen", "Degree": "Pos", "Gender": "Neut", "Number": "Sing"},
"adj:pl:nom:n:com__Case=Nom|Degree=Cmp|Gender=Neut|Number=Plur": {POS: ADJ, "Case": "Nom", "Degree": "Cmp", "Gender": "Neut", "Number": "Plur"},
"aglt:sg:sec:imperf:nwok__Aspect=Imp|Number=Sing|Person=2|Variant=Short": {POS: VERB, "Aspect": "Imp", "Number": "Sing", "Person": "2", "Variant": "Short"},
"num:pl:nom:n:congr__Case=Nom|Gender=Neut|Number=Plur|NumType=Card": {POS: NUM, "Case": "Nom", "Gender": "Neut", "Number": "Plur", "NumType": "Card"},
"ger:sg:inst:n:perf:aff__Aspect=Perf|Case=Ins|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Vnoun": {POS: VERB, "Aspect": "Perf", "Case": "Ins", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Vnoun"},
"adj:pl:acc:m1:pos__Case=Acc|Degree=Pos|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: ADJ, "Case": "Acc", "Degree": "Pos", "Gender": "Masc", "Number": "Plur"},
"num:pl:acc:n:rec__Case=Acc|Gender=Neut|Number=Plur|NumType=Card|PronType=Ind": {POS: NUM, "Case": "Acc", "Gender": "Neut", "Number": "Plur", "NumType": "Card", "PronType": "Ind"},
"prep:dat__AdpType=Prep": {POS: ADP, "AdpType": "Prep"},
"ppron3:sg:dat:f:ter:akc:praep__Case=Dat|Gender=Fem|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Dat", "Gender": "Fem", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"subst:sg:dat:m1__Case=Dat|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: NOUN, "Case": "Dat", "Gender": "Masc", "Number": "Sing"},
"praet:pl:m3:imperf__Aspect=Imp|Gender=Masc|Mood=Ind|Number=Plur|SubGender=Masc3|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Plur", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"num:pl:loc:m3:congr__Case=Loc|Gender=Masc|Number=Plur|NumType=Card|PronType=Ind|SubGender=Masc3": {POS: NUM, "Case": "Loc", "Gender": "Masc", "Number": "Plur", "NumType": "Card", "PronType": "Ind"},
"subst:pl:loc:m3__Case=Loc|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: NOUN, "Case": "Loc", "Gender": "Masc", "Number": "Plur"},
"adj:sg:loc:f:pos__Case=Loc|Degree=Pos|Gender=Fem|Number=Sing": {POS: ADJ, "Case": "Loc", "Degree": "Pos", "Gender": "Fem", "Number": "Sing"},
"ppron12:sg:dat:f:sec:nakc__Case=Dat|Gender=Fem|Number=Sing|Person=2|PronType=Prs|Variant=Short": {POS: PRON, "Case": "Dat", "Gender": "Fem", "Number": "Sing", "Person": "2", "PronType": "Prs", "Variant": "Short"},
"praet:sg:m1:imperf:nagl__Agglutination=Nagl|Aspect=Imp|Gender=Masc|Mood=Ind|Number=Sing|SubGender=Masc1|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"adj:sg:acc:m3:pos__Case=Acc|Gender=Masc|Number=Sing|PronType=Tot|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "PronType": "Tot"},
"ppron12:sg:acc:m1:sec:akc__Case=Acc|Gender=Masc|Number=Sing|Person=2|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "2", "PronType": "Prs", "Variant": "Long"},
"adj:sg:inst:m3:pos__Case=Ins|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: ADJ, "Case": "Ins", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"subst:sg:inst:m2__Case=Ins|Gender=Masc|Number=Sing|SubGender=Masc2": {POS: NOUN, "Case": "Ins", "Gender": "Masc", "Number": "Sing"},
"interj___": {POS: INTJ},
"adj:pl:inst:f:pos__Case=Ins|Degree=Pos|Gender=Fem|Number=Plur": {POS: ADJ, "Case": "Ins", "Degree": "Pos", "Gender": "Fem", "Number": "Plur"},
"ppron3:pl:acc:m3:ter:akc:npraep__Case=Acc|Gender=Masc|Number=Plur|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc3|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:pl:loc:m3:pos__Case=Loc|Degree=Pos|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Degree": "Pos", "Gender": "Masc", "Number": "Plur"},
"adj:pl:dat:m1:pos__Case=Dat|Gender=Masc|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes|SubGender=Masc1": {POS: ADJ, "Case": "Dat", "Gender": "Masc", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"adj:sg:inst:f:pos__Case=Ins|Degree=Pos|Gender=Fem|Number=Sing": {POS: ADJ, "Case": "Ins", "Degree": "Pos", "Gender": "Fem", "Number": "Sing"},
"adjp__PrepCase=Pre": {POS: ADJ, },
"siebie:dat__Case=Dat|PronType=Prs|Reflex=Yes": {POS: PRON, "Case": "Dat", "PronType": "Prs", "Reflex": "Yes"},
"ppron3:sg:acc:f:ter:akc:praep__Case=Acc|Gender=Fem|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Fem", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppron12:pl:inst:m1:sec__Case=Ins|Gender=Masc|Number=Plur|Person=2|PronType=Prs|SubGender=Masc1": {POS: PRON, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "Person": "2", "PronType": "Prs"},
"praet:sg:m1:perf:nagl__Agglutination=Nagl|Aspect=Perf|Gender=Masc|Mood=Ind|Number=Sing|SubGender=Masc1|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"adj:sg:acc:m1:pos__Case=Acc|Gender=Masc|Number=Sing|PronType=Tot|SubGender=Masc1": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "PronType": "Tot"},
"adj:sg:inst:n:pos__Case=Ins|Gender=Neut|Number=Sing|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Ins", "Gender": "Neut", "Number": "Sing", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"adj:sg:gen:f:pos__Case=Gen|Gender=Fem|Number=Sing|PronType=Dem": {POS: ADJ, "Case": "Gen", "Gender": "Fem", "Number": "Sing", "PronType": "Dem"},
"adj:sg:inst:f:pos__Case=Ins|Gender=Fem|Number=Sing|PronType=Ind": {POS: ADJ, "Case": "Ins", "Gender": "Fem", "Number": "Sing", "PronType": "Ind"},
"adv__PronType=Neg": {POS: ADV, "PronType": "Neg"},
"subst:pl:loc:m1__Case=Loc|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: NOUN, "Case": "Loc", "Gender": "Masc", "Number": "Plur"},
"adj:sg:nom:n:pos__Case=Nom|Gender=Neut|Number=Sing|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Nom", "Gender": "Neut", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"ppron3:sg:gen:n:ter:akc:npraep__Case=Gen|Gender=Neut|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Neut", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppron12:sg:nom:f:pri__Case=Nom|Gender=Fem|Number=Sing|Person=1|PronType=Prs": {POS: PRON, "Case": "Nom", "Gender": "Fem", "Number": "Sing", "Person": "1", "PronType": "Prs"},
"winien:sg:m1:imperf__Aspect=Imp|Gender=Masc|Mood=Ind|Number=Sing|SubGender=Masc1|Tense=Pres|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Pres", "VerbForm": "Fin", "Voice": "Act"},
"ppron12:pl:nom:m1:pri__Case=Nom|Gender=Masc|Number=Plur|Person=1|PronType=Prs|SubGender=Masc1": {POS: PRON, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Person": "1", "PronType": "Prs"},
"adj:pl:nom:m1:pos__Case=Nom|Gender=Masc|Number=Plur|PronType=Tot|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "PronType": "Tot"},
"ppron3:sg:acc:n:ter:akc:npraep__Case=Acc|Gender=Neut|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:sg:acc:m3:pos__Case=Acc|Gender=Masc|Number=Sing|PronType=Ind|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "PronType": "Ind"},
"praet:sg:m1:imperf:agl__Agglutination=Agl|Aspect=Imp|Gender=Masc|Mood=Ind|Number=Sing|SubGender=Masc1|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"num:pl:acc:f:rec__Case=Acc|Gender=Fem|Number=Plur|NumType=Frac": {POS: NUM, "Case": "Acc", "Gender": "Fem", "Number": "Plur", "NumType": "Frac"},
"adj:sg:nom:n:pos__Case=Nom|Gender=Neut|Number=Sing|PronType=Dem": {POS: ADJ, "Case": "Nom", "Gender": "Neut", "Number": "Sing", "PronType": "Dem"},
"adj:sg:nom:m1:pos__Case=Nom|Gender=Masc|Number=Sing|PronType=Int|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Int"},
"adj:sg:nom:m3:pos__Case=Nom|Gender=Masc|Number=Sing|PronType=Int|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Int"},
"adj:pl:nom:m3:pos__Case=Nom|Gender=Masc|Number=Plur|PronType=Int|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "PronType": "Int"},
"interp___": {POS: PUNCT},
"adj:sg:nom:n:pos__Case=Nom|Gender=Neut|Number=Sing|PronType=Int": {POS: ADJ, "Case": "Nom", "Gender": "Neut", "Number": "Sing", "PronType": "Int"},
"adj:sg:nom:m1:pos__Case=Nom|Gender=Masc|Number=Sing|PronType=Ind|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Ind"},
"prep:loc:wok__AdpType=Prep|Variant=Long": {POS: ADP, "AdpType": "Prep", "Variant": "Long"},
"ppron12:sg:loc:m1:pri__Case=Loc|Gender=Masc|Number=Sing|Person=1|PronType=Prs|SubGender=Masc1": {POS: PRON, "Case": "Loc", "Gender": "Masc", "Number": "Sing", "Person": "1", "PronType": "Prs"},
"adj:sg:inst:m3:pos__Case=Ins|Gender=Masc|Number=Sing|PronType=Dem|SubGender=Masc3": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "PronType": "Dem"},
"adj:pl:nom:f:pos__Case=Nom|Gender=Fem|Number=Plur|PronType=Rel": {POS: ADJ, "Case": "Nom", "Gender": "Fem", "Number": "Plur", "PronType": "Rel"},
"adv__Emphatic=Yes|PronType=Int": {POS: ADV, "PronType": "Int"},
"adj:pl:acc:m3:pos__Case=Acc|Gender=Masc|Number=Plur|PronType=Tot|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "PronType": "Tot"},
"adv:sup__Degree=Sup": {POS: ADV, "Degree": "Sup"},
"adj:pl:gen:n:pos__Case=Gen|Gender=Neut|Number=Plur|PronType=Neg": {POS: ADJ, "Case": "Gen", "Gender": "Neut", "Number": "Plur", "PronType": "Neg"},
"adj:sg:nom:f:pos__Case=Nom|Gender=Fem|Number=Sing|PronType=Ind": {POS: ADJ, "Case": "Nom", "Gender": "Fem", "Number": "Sing", "PronType": "Ind"},
"adj:sg:gen:m1:pos__Case=Gen|Gender=Masc|Number=Sing|PronType=Dem|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "PronType": "Dem"},
"adj:sg:gen:m1:pos__Case=Gen|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"adj:sg:nom:m3:pos__Case=Nom|Gender=Masc|Number=Sing|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:nom:f:pos__Case=Nom|Gender=Fem|Number=Plur|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Nom", "Gender": "Fem", "Number": "Plur", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"winien:pl:f:imperf__Aspect=Imp|Gender=Fem|Mood=Ind|Number=Plur|Tense=Pres|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Fem", "Mood": "Ind", "Number": "Plur", "Tense": "Pres", "VerbForm": "Fin", "Voice": "Act"},
"ppas:pl:nom:f:perf:aff__Aspect=Perf|Case=Nom|Gender=Fem|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Nom", "Gender": "Fem", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"num:pl:acc:f:rec__Case=Acc|Gender=Fem|Number=Plur|NumType=Card": {POS: NUM, "Case": "Acc", "Gender": "Fem", "Number": "Plur", "NumType": "Card"},
"ppron3:sg:gen:m3:ter:akc:npraep__Case=Gen|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc3|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:pl:acc:f:pos__Case=Acc|Gender=Fem|Number=Plur|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"bedzie:pl:ter:imperf__Aspect=Imp|Mood=Ind|Number=Plur|Person=3|Tense=Fut|VerbForm=Fin": {POS: VERB, "Aspect": "Imp", "Mood": "Ind", "Number": "Plur", "Person": "3", "Tense": "Fut", "VerbForm": "Fin"},
"adj:pl:gen:m1:sup__Case=Gen|Degree=Sup|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Degree": "Sup", "Gender": "Masc", "Number": "Plur"},
"adj:pl:loc:n:pos__Case=Loc|Degree=Pos|Gender=Neut|Number=Plur": {POS: ADJ, "Case": "Loc", "Degree": "Pos", "Gender": "Neut", "Number": "Plur"},
"adj:sg:acc:m3:sup__Case=Acc|Degree=Sup|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Degree": "Sup", "Gender": "Masc", "Number": "Sing"},
"subst:sg:gen:m1__Case=Gen|Gender=Masc|Number=Sing|PronType=Int|SubGender=Masc1": {POS: NOUN, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "PronType": "Int"},
"pact:pl:acc:m1:imperf:aff__Aspect=Imp|Case=Acc|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"adj:sg:dat:m1:pos__Case=Dat|Gender=Masc|Number=Sing|PronType=Tot|SubGender=Masc1": {POS: ADJ, "Case": "Dat", "Gender": "Masc", "Number": "Sing", "PronType": "Tot"},
"adj:sg:nom:m3:pos__Case=Nom|Gender=Masc|Number=Sing|PronType=Tot|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Tot"},
"ppron3:pl:gen:m3:ter:akc:praep__Case=Gen|Gender=Masc|Number=Plur|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc3|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"subst:sg:nom:n__Case=Nom|Gender=Neut|Number=Sing|PronType=Tot": {POS: NOUN, "Case": "Nom", "Gender": "Neut", "Number": "Sing", "PronType": "Tot"},
"ppron12:sg:gen:m1:pri:akc__Case=Gen|Gender=Masc|Number=Sing|Person=1|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Person": "1", "PronType": "Prs", "Variant": "Long"},
"adj:sg:loc:m3:pos__Case=Loc|Gender=Masc|Number=Sing|PronType=Dem|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Sing", "PronType": "Dem"},
"adj:sg:nom:m2:pos__Case=Nom|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc2": {POS: ADJ, "Case": "Nom", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"praet:pl:m2:perf__Aspect=Perf|Gender=Masc|Mood=Ind|Number=Plur|SubGender=Masc2|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Gender": "Masc", "Mood": "Ind", "Number": "Plur", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"ppron3:pl:gen:m2:ter:akc:npraep__Case=Gen|Gender=Masc|Number=Plur|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc2|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"subst:sg:acc:m1__Case=Acc|Gender=Masc|Number=Sing|PronType=Int|SubGender=Masc1": {POS: NOUN, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "PronType": "Int"},
"ppas:sg:nom:m1:imperf:aff__Aspect=Imp|Case=Nom|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"interp__PunctType=Semi": {POS: PUNCT, "PunctType": "Semi"},
"ppron12:pl:loc:m3:sec__Case=Loc|Gender=Masc|Number=Plur|Person=2|PronType=Prs|SubGender=Masc3": {POS: PRON, "Case": "Loc", "Gender": "Masc", "Number": "Plur", "Person": "2", "PronType": "Prs"},
"praet:pl:m2:imperf__Aspect=Imp|Gender=Masc|Mood=Ind|Number=Plur|SubGender=Masc2|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Plur", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"pact:pl:gen:n:imperf:aff__Aspect=Imp|Case=Gen|Gender=Neut|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Neut", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"pact:sg:acc:m3:imperf:aff__Aspect=Imp|Case=Acc|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"adj:sg:nom:f:pos__Case=Nom|Gender=Fem|Number=Sing|PronType=Int": {POS: ADJ, "Case": "Nom", "Gender": "Fem", "Number": "Sing", "PronType": "Int"},
"subst:sg:nom:m1__Case=Nom|Emphatic=Yes|Gender=Masc|Number=Sing|PronType=Int|SubGender=Masc1": {POS: NOUN, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Int"},
"adj:sg:gen:f:pos__Case=Gen|Gender=Fem|Number=Sing|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Gen", "Gender": "Fem", "Number": "Sing", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"ppas:pl:nom:n:perf:aff__Aspect=Perf|Case=Nom|Gender=Neut|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Nom", "Gender": "Neut", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:pl:inst:f:pos__Case=Ins|Gender=Fem|Number=Plur|PronType=Int": {POS: ADJ, "Case": "Ins", "Gender": "Fem", "Number": "Plur", "PronType": "Int"},
"subst:sg:nom:m1__Case=Nom|Gender=Masc|Number=Sing|PronType=Neg|SubGender=Masc1": {POS: NOUN, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Neg"},
"adj:pl:nom:m2:pos__Case=Nom|Degree=Pos|Gender=Masc|Number=Plur|SubGender=Masc2": {POS: ADJ, "Case": "Nom", "Degree": "Pos", "Gender": "Masc", "Number": "Plur"},
"subst:pl:nom:m2__Case=Nom|Gender=Masc|Number=Plur|SubGender=Masc2": {POS: NOUN, "Case": "Nom", "Gender": "Masc", "Number": "Plur"},
"adj:pl:inst:m2:pos__Case=Ins|Gender=Masc|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes|SubGender=Masc2": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"ger:pl:nom:n:perf:aff__Aspect=Perf|Case=Nom|Gender=Neut|Number=Plur|Polarity=Pos|VerbForm=Vnoun": {POS: VERB, "Aspect": "Perf", "Case": "Nom", "Gender": "Neut", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Vnoun"},
"ppas:pl:inst:f:perf:aff__Aspect=Perf|Case=Ins|Gender=Fem|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Ins", "Gender": "Fem", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"ger:sg:inst:n:imperf:aff__Aspect=Imp|Case=Ins|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Vnoun": {POS: VERB, "Aspect": "Imp", "Case": "Ins", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Vnoun"},
"num:pl:acc:m3:rec__Case=Acc|Gender=Masc|Number=Plur|NumType=Card|SubGender=Masc3": {POS: NUM, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "NumType": "Card"},
"ppas:pl:acc:m3:perf:aff__Aspect=Perf|Case=Acc|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"subst:sg:dat:n__Case=Dat|Gender=Neut|Number=Sing|PronType=Tot": {POS: NOUN, "Case": "Dat", "Gender": "Neut", "Number": "Sing", "PronType": "Tot"},
"adj:sg:acc:m1:pos__Case=Acc|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: ADJ, "Case": "Acc", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"adj:sg:loc:m3:pos__Case=Loc|Gender=Masc|Number=Sing|PronType=Ind|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Sing", "PronType": "Ind"},
"adj:sg:acc:m3:pos__Case=Acc|Gender=Masc|Number=Sing|Poss=Yes|PronType=Prs|Reflex=Yes|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"adj:sg:gen:m3:pos__Case=Gen|Gender=Masc|Number=Sing|Poss=Yes|PronType=Prs|Reflex=Yes|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"siebie:inst__Case=Ins|PronType=Prs|Reflex=Yes": {POS: PRON, "Case": "Ins", "PronType": "Prs", "Reflex": "Yes"},
"ppas:sg:acc:m3:perf:aff__Aspect=Perf|Case=Acc|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:pl:acc:n:pos__Case=Acc|Degree=Pos|Gender=Neut|Number=Plur": {POS: ADJ, "Case": "Acc", "Degree": "Pos", "Gender": "Neut", "Number": "Plur"},
"adj:sg:nom:m2:pos__Case=Nom|Gender=Masc|Number=Sing|PronType=Dem|SubGender=Masc2": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Dem"},
"ppron12:sg:dat:m1:pri:akc__Case=Dat|Gender=Masc|Number=Sing|Person=1|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Dat", "Gender": "Masc", "Number": "Sing", "Person": "1", "PronType": "Prs", "Variant": "Long"},
"adj:pl:inst:n:pos__Case=Ins|Degree=Pos|Gender=Neut|Number=Plur": {POS: ADJ, "Case": "Ins", "Degree": "Pos", "Gender": "Neut", "Number": "Plur"},
"adj:sg:nom:f:pos__Case=Nom|Gender=Fem|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Nom", "Gender": "Fem", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"ppron3:sg:inst:f:ter:akc:praep__Case=Ins|Gender=Fem|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Ins", "Gender": "Fem", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:sg:loc:n:pos__Case=Loc|Gender=Neut|Number=Sing|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Loc", "Gender": "Neut", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:acc:n:pos__Case=Acc|Gender=Neut|Number=Plur|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Acc", "Gender": "Neut", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"ppron3:sg:gen:n:ter:akc:praep__Case=Gen|Gender=Neut|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Neut", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:sg:loc:f:pos__Case=Loc|Gender=Fem|Number=Sing|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Loc", "Gender": "Fem", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"subst:sg:acc:m1__Case=Acc|Gender=Masc|Number=Sing|PronType=Ind|SubGender=Masc1": {POS: NOUN, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "PronType": "Ind"},
"adj:sg:acc:n:pos__Case=Acc|Gender=Neut|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:gen:n:pos__Case=Gen|Gender=Neut|Number=Sing|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Gen", "Gender": "Neut", "Number": "Sing", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"adj:sg:acc:f:pos__Case=Acc|Gender=Fem|Number=Sing|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"num:pl:inst:f:congr__Case=Ins|Gender=Fem|Number=Plur|NumType=Card": {POS: NUM, "Case": "Ins", "Gender": "Fem", "Number": "Plur", "NumType": "Card"},
"ppron3:sg:dat:m2:ter:nakc:npraep__Case=Dat|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc2|Variant=Short": {POS: PRON, "Case": "Dat", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Short"},
"adj:sg:voc:m1:pos__Case=Voc|Gender=Masc|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Voc", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:inst:n:pos__Case=Ins|Gender=Neut|Number=Sing|PronType=Tot": {POS: ADJ, "Case": "Ins", "Gender": "Neut", "Number": "Sing", "PronType": "Tot"},
"adj:sg:loc:m3:pos__Case=Loc|Gender=Masc|Number=Sing|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:loc:m2:pos__Case=Loc|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc2": {POS: ADJ, "Case": "Loc", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"subst:sg:loc:m2__Case=Loc|Gender=Masc|Number=Sing|SubGender=Masc2": {POS: NOUN, "Case": "Loc", "Gender": "Masc", "Number": "Sing"},
"ppron12:sg:dat:m1:sec:nakc__Case=Dat|Gender=Masc|Number=Sing|Person=2|PronType=Prs|SubGender=Masc1|Variant=Short": {POS: PRON, "Case": "Dat", "Gender": "Masc", "Number": "Sing", "Person": "2", "PronType": "Prs", "Variant": "Short"},
"pact:sg:acc:f:imperf:aff__Aspect=Imp|Case=Acc|Gender=Fem|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Acc", "Gender": "Fem", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"ppron3:sg:loc:m1:ter:akc:praep__Case=Loc|Gender=Masc|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Loc", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppron12:sg:gen:f:pri:akc__Case=Gen|Gender=Fem|Number=Sing|Person=1|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Fem", "Number": "Sing", "Person": "1", "PronType": "Prs", "Variant": "Long"},
"adj:sg:gen:m1:pos__Case=Gen|Gender=Masc|Number=Sing|Poss=Yes|PronType=Prs|Reflex=Yes|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"adj:sg:nom:m1:com__Case=Nom|Degree=Cmp|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Degree": "Cmp", "Gender": "Masc", "Number": "Sing"},
"adj:sg:inst:m3:pos__Case=Ins|Gender=Masc|Number=Sing|Poss=Yes|PronType=Prs|Reflex=Yes|SubGender=Masc3": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"ppas:sg:acc:f:perf:aff__Aspect=Perf|Case=Acc|Gender=Fem|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Acc", "Gender": "Fem", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:pl:inst:m3:pos__Case=Ins|Gender=Masc|Number=Plur|PronType=Ind|SubGender=Masc3": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "PronType": "Ind"},
"ppas:sg:acc:n:imperf:aff__Aspect=Imp|Case=Acc|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Acc", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adja__Hyph=Yes": {POS: ADJ, "Hyph": "Yes"},
"adj:pl:gen:m3:pos__Case=Gen|Gender=Masc|Number=Plur|PronType=Rel|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "PronType": "Rel"},
"adj:pl:acc:f:pos__Case=Acc|Gender=Fem|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"ppas:sg:loc:f:perf:aff__Aspect=Perf|Case=Loc|Gender=Fem|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Loc", "Gender": "Fem", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"num:pl:gen:m1:congr__Case=Gen|Gender=Masc|Number=Plur|NumType=Card|PronType=Ind|SubGender=Masc1": {POS: NUM, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "NumType": "Card", "PronType": "Ind"},
"adj:sg:acc:n:sup__Case=Acc|Degree=Sup|Gender=Neut|Number=Sing": {POS: ADJ, "Case": "Acc", "Degree": "Sup", "Gender": "Neut", "Number": "Sing"},
"adj:pl:nom:f:sup__Case=Nom|Degree=Sup|Gender=Fem|Number=Plur": {POS: ADJ, "Case": "Nom", "Degree": "Sup", "Gender": "Fem", "Number": "Plur"},
"adj:sg:nom:f:sup__Case=Nom|Degree=Sup|Gender=Fem|Number=Sing": {POS: ADJ, "Case": "Nom", "Degree": "Sup", "Gender": "Fem", "Number": "Sing"},
"adj:sg:nom:m3:sup__Case=Nom|Degree=Sup|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Degree": "Sup", "Gender": "Masc", "Number": "Sing"},
"pact:sg:nom:m3:imperf:aff__Aspect=Imp|Case=Nom|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"ppron3:pl:inst:m1:ter:akc:praep__Case=Ins|Gender=Masc|Number=Plur|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:sg:gen:n:pos__Case=Gen|Gender=Neut|Number=Sing|PronType=Dem": {POS: ADJ, "Case": "Gen", "Gender": "Neut", "Number": "Sing", "PronType": "Dem"},
"subst:sg:loc:m1__Case=Loc|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: NOUN, "Case": "Loc", "Gender": "Masc", "Number": "Sing"},
"adj:sg:acc:n:com__Case=Acc|Degree=Cmp|Gender=Neut|Number=Sing": {POS: ADJ, "Case": "Acc", "Degree": "Cmp", "Gender": "Neut", "Number": "Sing"},
"aglt:pl:sec:imperf:nwok__Aspect=Imp|Number=Plur|Person=2|Variant=Short": {POS: VERB, "Aspect": "Imp", "Number": "Plur", "Person": "2", "Variant": "Short"},
"ppron3:pl:gen:f:ter:akc:npraep__Case=Gen|Gender=Fem|Number=Plur|Person=3|PrepCase=Npr|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Fem", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppron12:sg:inst:f:pri__Case=Ins|Gender=Fem|Number=Sing|Person=1|PronType=Prs": {POS: PRON, "Case": "Ins", "Gender": "Fem", "Number": "Sing", "Person": "1", "PronType": "Prs"},
"num:pl:acc:f:rec__Case=Acc|Gender=Fem|Number=Plur|NumType=Card|PronType=Ind": {POS: NUM, "Case": "Acc", "Gender": "Fem", "Number": "Plur", "NumType": "Card", "PronType": "Ind"},
"subst:sg:nom:n__Case=Nom|Gender=Neut|Number=Sing|PronType=Neg": {POS: NOUN, "Case": "Nom", "Gender": "Neut", "Number": "Sing", "PronType": "Neg"},
"ppron3:sg:gen:m1:ter:nakc:npraep__Case=Gen|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc1|Variant=Short": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Short"},
"num:pl:loc:n:congr__Case=Loc|Gender=Neut|Number=Plur|NumType=Card": {POS: NUM, "Case": "Loc", "Gender": "Neut", "Number": "Plur", "NumType": "Card"},
"subst:sg:gen:n__Case=Gen|Gender=Neut|Number=Sing|PronType=Ind": {POS: NOUN, "Case": "Gen", "Gender": "Neut", "Number": "Sing", "PronType": "Ind"},
"adj:sg:gen:n:com__Case=Gen|Degree=Cmp|Gender=Neut|Number=Sing": {POS: ADJ, "Case": "Gen", "Degree": "Cmp", "Gender": "Neut", "Number": "Sing"},
"adj:pl:gen:f:pos__Case=Gen|Gender=Fem|Number=Plur|PronType=Neg": {POS: ADJ, "Case": "Gen", "Gender": "Fem", "Number": "Plur", "PronType": "Neg"},
"ppas:pl:gen:f:perf:aff__Aspect=Perf|Case=Gen|Gender=Fem|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Gen", "Gender": "Fem", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:loc:m1:pos__Case=Loc|Gender=Masc|Number=Sing|PronType=Dem|SubGender=Masc1": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Sing", "PronType": "Dem"},
"adj:sg:gen:n:pos__Case=Gen|Gender=Neut|Number=Sing|PronType=Neg": {POS: ADJ, "Case": "Gen", "Gender": "Neut", "Number": "Sing", "PronType": "Neg"},
"adj:pl:nom:f:pos__Case=Nom|Gender=Fem|Number=Plur|PronType=Dem": {POS: ADJ, "Case": "Nom", "Gender": "Fem", "Number": "Plur", "PronType": "Dem"},
"impt:pl:sec:imperf__Aspect=Imp|Mood=Imp|Number=Plur|Person=2|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Mood": "Imp", "Number": "Plur", "Person": "2", "VerbForm": "Fin", "Voice": "Act"},
"subst:sg:voc:f__Case=Voc|Gender=Fem|Number=Sing": {POS: NOUN, "Case": "Voc", "Gender": "Fem", "Number": "Sing"},
"adj:pl:nom:f:pos__Case=Nom|Gender=Fem|Number=Plur|PronType=Tot": {POS: ADJ, "Case": "Nom", "Gender": "Fem", "Number": "Plur", "PronType": "Tot"},
"subst:sg:inst:m1__Case=Ins|Gender=Masc|Number=Sing|PronType=Int|SubGender=Masc1": {POS: NOUN, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "PronType": "Int"},
"qub__Mood=Imp": {POS: PRON, "Mood": "Imp"},
"ppron12:pl:acc:m1:sec__Case=Acc|Gender=Masc|Number=Plur|Person=2|PronType=Prs|SubGender=Masc1": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Person": "2", "PronType": "Prs"},
"adj:pl:nom:m3:pos__Case=Nom|Gender=Masc|Number=Plur|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"ppron3:pl:dat:m1:ter:akc:npraep__Case=Dat|Gender=Masc|Number=Plur|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Dat", "Gender": "Masc", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"subst:sg:gen:m1__Case=Gen|Gender=Masc|Number=Sing|PronType=Neg|SubGender=Masc1": {POS: NOUN, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "PronType": "Neg"},
"ppron3:sg:loc:m3:ter:akc:praep__Case=Loc|Gender=Masc|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc3|Variant=Long": {POS: PRON, "Case": "Loc", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppas:sg:nom:m3:imperf:aff__Aspect=Imp|Case=Nom|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:gen:f:sup__Case=Gen|Degree=Sup|Gender=Fem|Number=Sing": {POS: ADJ, "Case": "Gen", "Degree": "Sup", "Gender": "Fem", "Number": "Sing"},
"num:pl:gen:f:congr__Case=Gen|Gender=Fem|Number=Plur|NumType=Card": {POS: NUM, "Case": "Gen", "Gender": "Fem", "Number": "Plur", "NumType": "Card"},
"ppron3:pl:nom:f:ter:akc:npraep__Case=Nom|Gender=Fem|Number=Plur|Person=3|PrepCase=Npr|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Nom", "Gender": "Fem", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:sg:gen:m3:pos__Case=Gen|Gender=Masc|Number=Sing|PronType=Ind|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "PronType": "Ind"},
"ger:sg:gen:n:imperf:aff__Aspect=Imp|Case=Gen|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Vnoun": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Vnoun"},
"adj:sg:acc:n:pos__Case=Acc|Gender=Neut|Number=Sing|PronType=Tot": {POS: ADJ, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "PronType": "Tot"},
"ppas:sg:gen:m3:imperf:aff__Aspect=Imp|Case=Gen|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:acc:m3:pos__Case=Acc|Gender=Masc|Number=Sing|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"ppron3:sg:nom:m3:ter:akc:npraep__Case=Nom|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc3|Variant=Long": {POS: PRON, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:sg:loc:m3:sup__Case=Loc|Degree=Sup|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Degree": "Sup", "Gender": "Masc", "Number": "Sing"},
"adj:pl:dat:m3:pos__Case=Dat|Degree=Pos|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: ADJ, "Case": "Dat", "Degree": "Pos", "Gender": "Masc", "Number": "Plur"},
"subst:pl:dat:m3__Case=Dat|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: NOUN, "Case": "Dat", "Gender": "Masc", "Number": "Plur"},
"adj:pl:nom:m3:pos__Case=Nom|Gender=Masc|Number=Plur|PronType=Rel|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "PronType": "Rel"},
"adj:sg:acc:n:pos__Case=Acc|Gender=Neut|Number=Sing|PronType=Ind": {POS: ADJ, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "PronType": "Ind"},
"pact:sg:gen:m1:imperf:aff__Aspect=Imp|Case=Gen|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"adj:pl:acc:n:pos__Case=Acc|Gender=Neut|Number=Plur|PronType=Dem": {POS: ADJ, "Case": "Acc", "Gender": "Neut", "Number": "Plur", "PronType": "Dem"},
"ppas:sg:loc:m3:perf:aff__Aspect=Perf|Case=Loc|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Loc", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"subst:sg:gen:n__Case=Gen|Gender=Neut|Number=Sing|PronType=Tot": {POS: NOUN, "Case": "Gen", "Gender": "Neut", "Number": "Sing", "PronType": "Tot"},
"num:pl:loc:m3:congr__Case=Loc|Gender=Masc|Number=Plur|NumType=Card|SubGender=Masc3": {POS: NUM, "Case": "Loc", "Gender": "Masc", "Number": "Plur", "NumType": "Card"},
"ger:sg:loc:n:imperf:aff__Aspect=Imp|Case=Loc|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Vnoun": {POS: VERB, "Aspect": "Imp", "Case": "Loc", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Vnoun"},
"ppron12:pl:dat:f:sec__Case=Dat|Gender=Fem|Number=Plur|Person=2|PronType=Prs": {POS: PRON, "Case": "Dat", "Gender": "Fem", "Number": "Plur", "Person": "2", "PronType": "Prs"},
"adj:sg:inst:f:pos__Case=Ins|Gender=Fem|Number=Sing|PronType=Tot": {POS: ADJ, "Case": "Ins", "Gender": "Fem", "Number": "Sing", "PronType": "Tot"},
"adj:sg:inst:f:pos__Case=Ins|Gender=Fem|Number=Sing|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Ins", "Gender": "Fem", "Number": "Sing", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"adj:sg:loc:f:pos__Case=Loc|Gender=Fem|Number=Sing|PronType=Dem": {POS: ADJ, "Case": "Loc", "Gender": "Fem", "Number": "Sing", "PronType": "Dem"},
"adj:sg:gen:m2:pos__Case=Gen|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc2": {POS: ADJ, "Case": "Gen", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"ppron3:sg:dat:m1:ter:akc:praep__Case=Dat|Gender=Masc|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Dat", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppron3:pl:gen:m2:ter:akc:praep__Case=Gen|Gender=Masc|Number=Plur|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc2|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"num:pl:acc:m2:rec__Case=Acc|Gender=Masc|Number=Plur|NumType=Card|SubGender=Masc2": {POS: NUM, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "NumType": "Card"},
"fin:pl:sec:perf__Aspect=Perf|Mood=Ind|Number=Plur|Person=2|Tense=Fut|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Mood": "Ind", "Number": "Plur", "Person": "2", "Tense": "Fut", "VerbForm": "Fin", "Voice": "Act"},
"ppron12:pl:inst:m1:pri__Case=Ins|Gender=Masc|Number=Plur|Person=1|PronType=Prs|SubGender=Masc1": {POS: PRON, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "Person": "1", "PronType": "Prs"},
"num:pl:acc:m3:rec__Case=Acc|Gender=Masc|Number=Plur|NumType=Frac|SubGender=Masc3": {POS: NUM, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "NumType": "Frac"},
"ppas:sg:nom:f:imperf:aff__Aspect=Imp|Case=Nom|Gender=Fem|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Fem", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"subst:sg:dat:m2__Case=Dat|Gender=Masc|Number=Sing|SubGender=Masc2": {POS: NOUN, "Case": "Dat", "Gender": "Masc", "Number": "Sing"},
"ppron3:sg:nom:n:ter:akc:npraep__Case=Nom|Gender=Neut|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Nom", "Gender": "Neut", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:pl:acc:m3:pos__Case=Acc|Gender=Masc|Number=Plur|PronType=Ind|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "PronType": "Ind"},
"ppas:pl:nom:m3:perf:aff__Aspect=Perf|Case=Nom|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"pact:sg:nom:n:imperf:aff__Aspect=Imp|Case=Nom|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"ppron12:sg:nom:m1:sec__Case=Nom|Gender=Masc|Number=Sing|Person=2|PronType=Prs|SubGender=Masc1": {POS: PRON, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Person": "2", "PronType": "Prs"},
"adj:pl:gen:m1:pos__Case=Gen|Gender=Masc|Number=Plur|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:acc:m3:pos__Case=Acc|Gender=Masc|Number=Sing|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:loc:m3:pos__Case=Loc|Gender=Masc|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"pact:sg:nom:f:imperf:aff__Aspect=Imp|Case=Nom|Gender=Fem|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Fem", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"ppron3:sg:gen:f:ter:akc:praep__Case=Gen|Gender=Fem|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Fem", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:sg:nom:n:com__Case=Nom|Degree=Cmp|Gender=Neut|Number=Sing": {POS: ADJ, "Case": "Nom", "Degree": "Cmp", "Gender": "Neut", "Number": "Sing"},
"adj:pl:acc:m3:pos__Case=Acc|Gender=Masc|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"adj:sg:gen:n:pos__Case=Gen|Gender=Neut|Number=Sing|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Gen", "Gender": "Neut", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:nom:f:com__Case=Nom|Degree=Cmp|Gender=Fem|Number=Sing": {POS: ADJ, "Case": "Nom", "Degree": "Cmp", "Gender": "Fem", "Number": "Sing"},
"adj:pl:acc:n:pos__Case=Acc|Gender=Neut|Number=Plur|PronType=Tot": {POS: ADJ, "Case": "Acc", "Gender": "Neut", "Number": "Plur", "PronType": "Tot"},
"adj:pl:dat:m1:com__Case=Dat|Degree=Cmp|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: ADJ, "Case": "Dat", "Degree": "Cmp", "Gender": "Masc", "Number": "Plur"},
"adj:pl:gen:m2:pos__Case=Gen|Gender=Masc|Number=Plur|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs|SubGender=Masc2": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:loc:f:pos__Case=Loc|Gender=Fem|Number=Sing|PronType=Tot": {POS: ADJ, "Case": "Loc", "Gender": "Fem", "Number": "Sing", "PronType": "Tot"},
"adv__PronType=Rel": {POS: ADV, "PronType": "Rel"},
"ppas:pl:nom:f:imperf:aff__Aspect=Imp|Case=Nom|Gender=Fem|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Fem", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"subst:sg:inst:m1__Case=Ins|Gender=Masc|Number=Sing|PronType=Neg|SubGender=Masc1": {POS: NOUN, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "PronType": "Neg"},
"adj:sg:gen:m2:pos__Case=Gen|Gender=Masc|Number=Sing|PronType=Dem|SubGender=Masc2": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "PronType": "Dem"},
"adj:sg:nom:m1:pos__Case=Nom|Gender=Masc|Number=Sing|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:nom:f:pos__Case=Nom|Gender=Fem|Number=Sing|PronType=Rel": {POS: ADJ, "Case": "Nom", "Gender": "Fem", "Number": "Sing", "PronType": "Rel"},
"adj:pl:nom:m3:pos__Case=Nom|Gender=Masc|Number=Plur|PronType=Dem|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "PronType": "Dem"},
"adj:sg:acc:m3:pos__Case=Acc|Gender=Masc|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:nom:n:pos__Case=Nom|Gender=Neut|Number=Plur|PronType=Dem": {POS: ADJ, "Case": "Nom", "Gender": "Neut", "Number": "Plur", "PronType": "Dem"},
"prep:acc:wok__AdpType=Prep|Variant=Long": {POS: ADP, "AdpType": "Prep", "Variant": "Long"},
"adj:sg:nom:n:pos__Case=Nom|Gender=Neut|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Nom", "Gender": "Neut", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:nom:f:pos__Case=Nom|Gender=Fem|Number=Sing|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Nom", "Gender": "Fem", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"winien:pl:n:imperf__Aspect=Imp|Gender=Neut|Mood=Ind|Number=Plur|Tense=Pres|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Neut", "Mood": "Ind", "Number": "Plur", "Tense": "Pres", "VerbForm": "Fin", "Voice": "Act"},
"adj:sg:gen:m3:pos__Case=Gen|Gender=Masc|Number=Sing|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"praet:sg:m1:imperf__Aspect=Imp|Gender=Masc|Mood=Ind|Number=Sing|SubGender=Masc1|Tense=Past|VerbForm=Fin": {POS: VERB, "Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin"},
"ppas:pl:inst:n:perf:aff__Aspect=Perf|Case=Ins|Gender=Neut|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Ins", "Gender": "Neut", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:nom:m3:pos__Case=Nom|Gender=Masc|Number=Sing|PronType=Rel|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Rel"},
"ppron3:sg:gen:m1:ter:akc:praep__Case=Gen|Gender=Masc|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:pl:nom:m3:pos__Case=Nom|Gender=Masc|Number=Plur|PronType=Ind|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "PronType": "Ind"},
"adj:pl:gen:m1:pos__Case=Gen|Gender=Masc|Number=Plur|PronType=Dem|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "PronType": "Dem"},
"fin:pl:ter:imperf__Aspect=Imp|Mood=Ind|Number=Plur|Person=3|Tense=Pres|VerbForm=Fin": {POS: VERB, "Aspect": "Imp", "Mood": "Ind", "Number": "Plur", "Person": "3", "Tense": "Pres", "VerbForm": "Fin"},
"ppron12:sg:dat:m1:sec:akc__Case=Dat|Gender=Masc|Number=Sing|Person=2|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Dat", "Gender": "Masc", "Number": "Sing", "Person": "2", "PronType": "Prs", "Variant": "Long"},
"adj:pl:nom:m1:sup__Case=Nom|Degree=Sup|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Degree": "Sup", "Gender": "Masc", "Number": "Plur"},
"adj:pl:inst:m1:pos__Case=Ins|Degree=Pos|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: ADJ, "Case": "Ins", "Degree": "Pos", "Gender": "Masc", "Number": "Plur"},
"adj:sg:gen:f:pos__Case=Gen|Gender=Fem|Number=Sing|PronType=Ind": {POS: ADJ, "Case": "Gen", "Gender": "Fem", "Number": "Sing", "PronType": "Ind"},
"ppas:sg:nom:m2:perf:aff__Aspect=Perf|Case=Nom|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc2|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:pl:loc:m3:pos__Case=Loc|Gender=Masc|Number=Plur|PronType=Dem|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Plur", "PronType": "Dem"},
"adj:pl:gen:m3:pos__Case=Gen|Gender=Masc|Number=Plur|PronType=Neg|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "PronType": "Neg"},
"praet:sg:m3:imperf:nagl__Agglutination=Nagl|Aspect=Imp|Gender=Masc|Mood=Ind|Number=Sing|SubGender=Masc3|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"ppas:sg:acc:n:perf:aff__Aspect=Perf|Case=Acc|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Acc", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"ppas:sg:gen:m2:perf:aff__Aspect=Perf|Case=Gen|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc2|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"ppron3:sg:gen:m3:ter:akc:praep__Case=Gen|Gender=Masc|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc3|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"num:pl:gen:f:congr__Case=Gen|Gender=Fem|Number=Plur|NumType=Card|PronType=Ind": {POS: NUM, "Case": "Gen", "Gender": "Fem", "Number": "Plur", "NumType": "Card", "PronType": "Ind"},
"adj:sg:acc:m3:pos__Case=Acc|Gender=Masc|Number=Sing|PronType=Int|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "PronType": "Int"},
"adj:sg:loc:m1:pos__Case=Loc|Gender=Masc|Number=Sing|PronType=Tot|SubGender=Masc1": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Sing", "PronType": "Tot"},
"ppron3:sg:dat:m3:ter:nakc:npraep__Case=Dat|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc3|Variant=Short": {POS: PRON, "Case": "Dat", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Short"},
"subst:sg:inst:n__Case=Ins|Gender=Neut|Number=Sing|PronType=Ind": {POS: NOUN, "Case": "Ins", "Gender": "Neut", "Number": "Sing", "PronType": "Ind"},
"adj:pl:loc:m3:pos__Case=Loc|Gender=Masc|Number=Plur|PronType=Ind|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Plur", "PronType": "Ind"},
"adj:sg:loc:m3:pos__Case=Loc|Gender=Masc|Number=Sing|PronType=Rel|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Sing", "PronType": "Rel"},
"adj:sg:loc:f:pos__Case=Loc|Gender=Fem|Number=Sing|PronType=Ind": {POS: ADJ, "Case": "Loc", "Gender": "Fem", "Number": "Sing", "PronType": "Ind"},
"ger:pl:nom:n:imperf:aff__Aspect=Imp|Case=Nom|Gender=Neut|Number=Plur|Polarity=Pos|VerbForm=Vnoun": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Neut", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Vnoun"},
"num:pl:loc:f:congr__Case=Loc|Gender=Fem|Number=Plur|NumType=Card": {POS: NUM, "Case": "Loc", "Gender": "Fem", "Number": "Plur", "NumType": "Card"},
"ppron3:pl:gen:f:ter:akc:praep__Case=Gen|Gender=Fem|Number=Plur|Person=3|PrepCase=Pre|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Fem", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:sg:loc:n:pos__Case=Loc|Gender=Neut|Number=Sing|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Loc", "Gender": "Neut", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"ppas:pl:gen:f:imperf:aff__Aspect=Imp|Case=Gen|Gender=Fem|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Fem", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"ppas:pl:acc:f:perf:aff__Aspect=Perf|Case=Acc|Gender=Fem|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Acc", "Gender": "Fem", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"num:pl:acc:n:congr__Case=Acc|Gender=Neut|Number=Plur|NumType=Card": {POS: NUM, "Case": "Acc", "Gender": "Neut", "Number": "Plur", "NumType": "Card"},
"pact:pl:acc:f:imperf:aff__Aspect=Imp|Case=Acc|Gender=Fem|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Acc", "Gender": "Fem", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"ppron3:pl:acc:n:ter:akc:praep__Case=Acc|Gender=Neut|Number=Plur|Person=3|PrepCase=Pre|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Neut", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppron12:sg:loc:m1:sec__Case=Loc|Gender=Masc|Number=Sing|Person=2|PronType=Prs|SubGender=Masc1": {POS: PRON, "Case": "Loc", "Gender": "Masc", "Number": "Sing", "Person": "2", "PronType": "Prs"},
"subst:sg:acc:n__Case=Acc|Gender=Neut|Number=Sing|PronType=Rel": {POS: NOUN, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "PronType": "Rel"},
"ppas:pl:inst:m1:perf:aff__Aspect=Perf|Case=Ins|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Ins", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"ppas:sg:acc:m2:imperf:aff__Aspect=Imp|Case=Acc|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc2|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"subst:sg:dat:n__Case=Dat|Gender=Neut|Number=Sing": {POS: NOUN, "Case": "Dat", "Gender": "Neut", "Number": "Sing"},
"adj:sg:loc:n:pos__Case=Loc|Gender=Neut|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Loc", "Gender": "Neut", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:gen:m3:sup__Case=Gen|Degree=Sup|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Degree": "Sup", "Gender": "Masc", "Number": "Plur"},
"adj:pl:acc:m3:sup__Case=Acc|Degree=Sup|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Degree": "Sup", "Gender": "Masc", "Number": "Plur"},
"num:pl:acc:f:rec__Case=Acc|Gender=Fem|Number=Plur|NumType=Card|PronType=Int": {POS: NUM, "Case": "Acc", "Gender": "Fem", "Number": "Plur", "NumType": "Card", "PronType": "Int"},
"adj:pl:gen:f:pos__Case=Gen|Gender=Fem|Number=Plur|PronType=Tot": {POS: ADJ, "Case": "Gen", "Gender": "Fem", "Number": "Plur", "PronType": "Tot"},
"ppas:pl:acc:m2:perf:aff__Aspect=Perf|Case=Acc|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc2|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:pl:acc:n:pos__Case=Acc|Gender=Neut|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Acc", "Gender": "Neut", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"ppas:sg:gen:f:perf:aff__Aspect=Perf|Case=Gen|Gender=Fem|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Gen", "Gender": "Fem", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"pact:pl:gen:m1:imperf:aff__Aspect=Imp|Case=Gen|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"adj:pl:gen:f:pos__Case=Gen|Gender=Fem|Number=Plur|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Gen", "Gender": "Fem", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"ppas:sg:gen:n:perf:aff__Aspect=Perf|Case=Gen|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Gen", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:nom:m1:pos__Case=Nom|Gender=Masc|Number=Sing|PronType=Rel|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Rel"},
"adj:pl:acc:m1:pos__Case=Acc|Gender=Masc|Number=Plur|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:acc:f:pos__Case=Acc|Gender=Fem|Number=Plur|PronType=Rel": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Plur", "PronType": "Rel"},
"adj:sg:loc:f:pos__Case=Loc|Gender=Fem|Number=Sing|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Loc", "Gender": "Fem", "Number": "Sing", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"ppron3:pl:gen:m3:ter:akc:npraep__Case=Gen|Gender=Masc|Number=Plur|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc3|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppas:sg:inst:f:perf:aff__Aspect=Perf|Case=Ins|Gender=Fem|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Ins", "Gender": "Fem", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"ppron3:pl:gen:n:ter:akc:praep__Case=Gen|Gender=Neut|Number=Plur|Person=3|PrepCase=Pre|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Neut", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:sg:acc:m1:pos__Case=Acc|Gender=Masc|Number=Sing|PronType=Dem|SubGender=Masc1": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "PronType": "Dem"},
"adj:sg:nom:m1:pos__Case=Nom|Gender=Masc|Number=Sing|PronType=Neg|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Neg"},
"adj:sg:nom:f:pos__Case=Nom|Gender=Fem|Number=Sing|PronType=Neg": {POS: ADJ, "Case": "Nom", "Gender": "Fem", "Number": "Sing", "PronType": "Neg"},
"adj:pl:acc:m1:pos__Case=Acc|Gender=Masc|Number=Plur|PronType=Tot|SubGender=Masc1": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "PronType": "Tot"},
"subst:pl:dat:n__Case=Dat|Gender=Neut|Number=Plur": {POS: NOUN, "Case": "Dat", "Gender": "Neut", "Number": "Plur"},
"adj:pl:dat:n:pos__Case=Dat|Degree=Pos|Gender=Neut|Number=Plur": {POS: ADJ, "Case": "Dat", "Degree": "Pos", "Gender": "Neut", "Number": "Plur"},
"subst:pl:voc:n__Case=Voc|Gender=Neut|Number=Plur": {POS: NOUN, "Case": "Voc", "Gender": "Neut", "Number": "Plur"},
"ppron3:pl:acc:f:ter:akc:praep__Case=Acc|Gender=Fem|Number=Plur|Person=3|PrepCase=Pre|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Fem", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"winien:pl:m1:imperf__Aspect=Imp|Gender=Masc|Mood=Ind|Number=Plur|SubGender=Masc1|Tense=Pres|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Plur", "Tense": "Pres", "VerbForm": "Fin", "Voice": "Act"},
"adj:pl:inst:f:com__Case=Ins|Degree=Cmp|Gender=Fem|Number=Plur": {POS: ADJ, "Case": "Ins", "Degree": "Cmp", "Gender": "Fem", "Number": "Plur"},
"num:pl:acc:n:rec__Case=Acc|Gender=Neut|Number=Plur|NumType=Card": {POS: NUM, "Case": "Acc", "Gender": "Neut", "Number": "Plur", "NumType": "Card"},
"ppas:pl:gen:n:imperf:aff__Aspect=Imp|Case=Gen|Gender=Neut|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Neut", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"prep:acc__AdpType=Post": {POS: ADP, "AdpType": "Post"},
"pact:pl:gen:f:imperf:aff__Aspect=Imp|Case=Gen|Gender=Fem|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Fem", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"adj:pl:gen:m3:pos__Case=Gen|Gender=Masc|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"adj:pl:inst:m3:pos__Case=Ins|Gender=Masc|Number=Plur|PronType=Dem|SubGender=Masc3": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "PronType": "Dem"},
"adj:pl:nom:m1:pos__Case=Nom|Gender=Masc|Number=Plur|PronType=Rel|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "PronType": "Rel"},
"ppron12:pl:nom:m1:sec__Case=Nom|Gender=Masc|Number=Plur|Person=2|PronType=Prs|SubGender=Masc1": {POS: PRON, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Person": "2", "PronType": "Prs"},
"num:pl:acc:m3:rec__Case=Acc|Gender=Masc|Number=Plur|NumType=Card|PronType=Int|SubGender=Masc3": {POS: NUM, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "NumType": "Card", "PronType": "Int"},
"subst:pl:dat:f__Case=Dat|Gender=Fem|Number=Plur": {POS: NOUN, "Case": "Dat", "Gender": "Fem", "Number": "Plur"},
"ppron12:sg:gen:m1:sec:akc__Case=Gen|Gender=Masc|Number=Sing|Person=2|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Person": "2", "PronType": "Prs", "Variant": "Long"},
"ppron12:sg:inst:f:sec__Case=Ins|Gender=Fem|Number=Sing|Person=2|PronType=Prs": {POS: PRON, "Case": "Ins", "Gender": "Fem", "Number": "Sing", "Person": "2", "PronType": "Prs"},
"adj:sg:nom:m2:pos__Case=Nom|Gender=Masc|Number=Sing|PronType=Int|SubGender=Masc2": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Int"},
"adj:pl:nom:m2:pos__Case=Nom|Gender=Masc|Number=Plur|PronType=Int|SubGender=Masc2": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "PronType": "Int"},
"ppron3:sg:dat:m1:ter:akc:npraep__Case=Dat|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Dat", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:sg:acc:f:pos__Case=Acc|Gender=Fem|Number=Sing|PronType=Tot": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Sing", "PronType": "Tot"},
"ppron3:pl:loc:m3:ter:akc:praep__Case=Loc|Gender=Masc|Number=Plur|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc3|Variant=Long": {POS: PRON, "Case": "Loc", "Gender": "Masc", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppron12:pl:gen:m1:sec__Case=Gen|Gender=Masc|Number=Plur|Person=2|PronType=Prs|SubGender=Masc1": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Person": "2", "PronType": "Prs"},
"ppron12:pl:nom:f:pri__Case=Nom|Gender=Fem|Number=Plur|Person=1|PronType=Prs": {POS: PRON, "Case": "Nom", "Gender": "Fem", "Number": "Plur", "Person": "1", "PronType": "Prs"},
"adj:sg:nom:m3:pos__Case=Nom|Gender=Masc|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:nom:f:pos__Case=Nom|Gender=Fem|Number=Sing|Number[psor]=Plur|Person=2|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Nom", "Gender": "Fem", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"num:pl:acc:n:rec__Case=Acc|Gender=Neut|Number=Plur|NumType=Card|PronType=Int": {POS: NUM, "Case": "Acc", "Gender": "Neut", "Number": "Plur", "NumType": "Card", "PronType": "Int"},
"subst:sg:dat:n__Case=Dat|Gender=Neut|Number=Sing|PronType=Int": {POS: NOUN, "Case": "Dat", "Gender": "Neut", "Number": "Sing", "PronType": "Int"},
"ppas:pl:dat:m3:perf:aff__Aspect=Perf|Case=Dat|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Dat", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:nom:m1:sup__Case=Nom|Degree=Sup|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Degree": "Sup", "Gender": "Masc", "Number": "Sing"},
"depr:pl:nom:m2__Case=Nom|Gender=Masc|Number=Plur|Polite=Depr|SubGender=Masc2": {POS: NOUN, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Polite": "Depr"},
"adj:pl:acc:m3:com__Case=Acc|Degree=Cmp|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Degree": "Cmp", "Gender": "Masc", "Number": "Plur"},
"adj:pl:gen:f:pos__Case=Gen|Gender=Fem|Number=Plur|PronType=Dem": {POS: ADJ, "Case": "Gen", "Gender": "Fem", "Number": "Plur", "PronType": "Dem"},
"ppas:pl:nom:m3:imperf:aff__Aspect=Imp|Case=Nom|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:dat:f:pos__Case=Dat|Gender=Fem|Number=Sing|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Dat", "Gender": "Fem", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:nom:n:sup__Case=Nom|Degree=Sup|Gender=Neut|Number=Sing": {POS: ADJ, "Case": "Nom", "Degree": "Sup", "Gender": "Neut", "Number": "Sing"},
"ppron12:sg:dat:f:sec:akc__Case=Dat|Gender=Fem|Number=Sing|Person=2|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Dat", "Gender": "Fem", "Number": "Sing", "Person": "2", "PronType": "Prs", "Variant": "Long"},
"ppron12:sg:nom:m2:sec__Case=Nom|Gender=Masc|Number=Sing|Person=2|PronType=Prs|SubGender=Masc2": {POS: PRON, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Person": "2", "PronType": "Prs"},
"adj:pl:acc:f:com__Case=Acc|Degree=Cmp|Gender=Fem|Number=Plur": {POS: ADJ, "Case": "Acc", "Degree": "Cmp", "Gender": "Fem", "Number": "Plur"},
"subst:sg:gen:n__Case=Gen|Gender=Neut|Number=Sing|PronType=Rel": {POS: NOUN, "Case": "Gen", "Gender": "Neut", "Number": "Sing", "PronType": "Rel"},
"adj:pl:acc:f:pos__Case=Acc|Gender=Fem|Number=Plur|PronType=Ind": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Plur", "PronType": "Ind"},
"adj:pl:loc:f:pos__Case=Loc|Gender=Fem|Number=Plur|PronType=Tot": {POS: ADJ, "Case": "Loc", "Gender": "Fem", "Number": "Plur", "PronType": "Tot"},
"adj:pl:nom:m1:com__Case=Nom|Degree=Cmp|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Degree": "Cmp", "Gender": "Masc", "Number": "Plur"},
"winien:sg:f:imperf__Aspect=Imp|Gender=Fem|Mood=Ind|Number=Sing|Tense=Pres|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Fem", "Mood": "Ind", "Number": "Sing", "Tense": "Pres", "VerbForm": "Fin", "Voice": "Act"},
"subst:sg:dat:n__Case=Dat|Gender=Neut|Number=Sing|PronType=Dem": {POS: NOUN, "Case": "Dat", "Gender": "Neut", "Number": "Sing", "PronType": "Dem"},
"subst:sg:dat:m1__Case=Dat|Gender=Masc|Number=Sing|PronType=Ind|SubGender=Masc1": {POS: NOUN, "Case": "Dat", "Gender": "Masc", "Number": "Sing", "PronType": "Ind"},
"ppron12:sg:gen:m1:sec:nakc__Case=Gen|Gender=Masc|Number=Sing|Person=2|PronType=Prs|SubGender=Masc1|Variant=Short": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Person": "2", "PronType": "Prs", "Variant": "Short"},
"adj:pl:acc:m1:sup__Case=Acc|Degree=Sup|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: ADJ, "Case": "Acc", "Degree": "Sup", "Gender": "Masc", "Number": "Plur"},
"adj:sg:nom:m1:pos__Case=Nom|Gender=Masc|Number=Sing|PronType=Tot|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Tot"},
"adj:sg:inst:m2:pos__Case=Ins|Gender=Masc|Number=Sing|PronType=Dem|SubGender=Masc2": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "PronType": "Dem"},
"adj:sg:acc:n:pos__Case=Acc|Gender=Neut|Number=Sing|Number[psor]=Plur|Person=2|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"ppron12:sg:inst:m1:pri__Case=Ins|Gender=Masc|Number=Sing|Person=1|PronType=Prs|SubGender=Masc1": {POS: PRON, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "Person": "1", "PronType": "Prs"},
"adj:pl:acc:n:pos__Case=Acc|Gender=Neut|Number=Plur|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Acc", "Gender": "Neut", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"ppas:sg:acc:m1:perf:aff__Aspect=Perf|Case=Acc|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"praet:sg:m2:perf:nagl__Agglutination=Nagl|Aspect=Perf|Gender=Masc|Mood=Ind|Number=Sing|SubGender=Masc2|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"ppron12:sg:acc:m2:pri:akc__Case=Acc|Gender=Masc|Number=Sing|Person=1|PronType=Prs|SubGender=Masc2|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "1", "PronType": "Prs", "Variant": "Long"},
"adj:sg:nom:m3:pos__Case=Nom|Gender=Masc|Number=Sing|PronType=Ind|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Ind"},
"adj:sg:nom:m3:pos__Case=Nom|Gender=Masc|Number=Sing|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:acc:f:pos__Case=Acc|Gender=Fem|Number=Plur|PronType=Dem": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Plur", "PronType": "Dem"},
"praet:pl:m3:imperf__Aspect=Imp|Gender=Masc|Mood=Ind|Number=Plur|SubGender=Masc3|Tense=Past|VerbForm=Fin": {POS: VERB, "Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Plur", "Tense": "Past", "VerbForm": "Fin"},
"ppron3:sg:acc:m3:ter:akc:praep__Case=Acc|Gender=Masc|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc3|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"pact:sg:inst:m3:imperf:aff__Aspect=Imp|Case=Ins|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Ins", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"ppron12:sg:dat:f:pri:akc__Case=Dat|Gender=Fem|Number=Sing|Person=1|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Dat", "Gender": "Fem", "Number": "Sing", "Person": "1", "PronType": "Prs", "Variant": "Long"},
"ppron3:sg:inst:m1:ter:akc:npraep__Case=Ins|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"subst:sg:loc:n__Case=Loc|Gender=Neut|Number=Sing|PronType=Ind": {POS: NOUN, "Case": "Loc", "Gender": "Neut", "Number": "Sing", "PronType": "Ind"},
"adj:sg:loc:n:com__Case=Loc|Degree=Cmp|Gender=Neut|Number=Sing": {POS: ADJ, "Case": "Loc", "Degree": "Cmp", "Gender": "Neut", "Number": "Sing"},
"adj:pl:gen:n:pos__Case=Gen|Gender=Neut|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Gen", "Gender": "Neut", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"adj:pl:nom:m1:pos__Case=Nom|Gender=Masc|Number=Plur|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"pact:pl:nom:m1:imperf:aff__Aspect=Imp|Case=Nom|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"num:pl:inst:m1:congr__Case=Ins|Gender=Masc|Number=Plur|NumType=Card|SubGender=Masc1": {POS: NUM, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "NumType": "Card"},
"adj:sg:loc:n:pos__Case=Loc|Gender=Neut|Number=Sing|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Loc", "Gender": "Neut", "Number": "Sing", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"adj:sg:nom:f:pos__Case=Nom|Gender=Fem|Number=Sing|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Nom", "Gender": "Fem", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:nom:m1:pos__Case=Nom|Gender=Masc|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:nom:m2:pos__Case=Nom|Gender=Masc|Number=Plur|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc2": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:inst:m1:pos__Case=Ins|Gender=Masc|Number=Sing|PronType=Dem|SubGender=Masc1": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "PronType": "Dem"},
"comp__PronType=Rel": {POS: SCONJ, "PronType": "Rel"},
"subst:pl:acc:m1__Case=Acc|Gender=Masc|Number=Plur|PronType=Tot|SubGender=Masc1": {POS: NOUN, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "PronType": "Tot"},
"ppas:sg:inst:m3:perf:aff__Aspect=Perf|Case=Ins|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Ins", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"subst:sg:inst:n__Case=Ins|Gender=Neut|Number=Sing|PronType=Tot": {POS: NOUN, "Case": "Ins", "Gender": "Neut", "Number": "Sing", "PronType": "Tot"},
"adj:sg:inst:m2:pos__Case=Ins|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc2": {POS: ADJ, "Case": "Ins", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"adj:sg:inst:f:pos__Case=Ins|Gender=Fem|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Ins", "Gender": "Fem", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"pact:sg:gen:f:imperf:aff__Aspect=Imp|Case=Gen|Gender=Fem|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Fem", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"subst:sg:acc:n__Case=Acc|Gender=Neut|Number=Sing|PronType=Neg": {POS: NOUN, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "PronType": "Neg"},
"adj:sg:nom:m2:pos__Case=Nom|Gender=Masc|Number=Sing|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc2": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"ppron12:sg:loc:f:pri__Case=Loc|Gender=Fem|Number=Sing|Person=1|PronType=Prs": {POS: PRON, "Case": "Loc", "Gender": "Fem", "Number": "Sing", "Person": "1", "PronType": "Prs"},
"ger:sg:nom:n:perf:aff__Aspect=Perf|Case=Nom|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Vnoun": {POS: VERB, "Aspect": "Perf", "Case": "Nom", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Vnoun"},
"adj:sg:acc:f:pos__Case=Acc|Gender=Fem|Number=Sing|PronType=Rel": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Sing", "PronType": "Rel"},
"adj:sg:nom:m2:com__Case=Nom|Degree=Cmp|Gender=Masc|Number=Sing|SubGender=Masc2": {POS: ADJ, "Case": "Nom", "Degree": "Cmp", "Gender": "Masc", "Number": "Sing"},
"adj:pl:loc:m3:pos__Case=Loc|Gender=Masc|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"ppas:pl:gen:n:perf:aff__Aspect=Perf|Case=Gen|Gender=Neut|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Gen", "Gender": "Neut", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:pl:acc:f:pos__Case=Acc|Gender=Fem|Number=Plur|PronType=Tot": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Plur", "PronType": "Tot"},
"adj:pl:inst:m3:pos__Case=Ins|Degree=Pos|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: ADJ, "Case": "Ins", "Degree": "Pos", "Gender": "Masc", "Number": "Plur"},
"praet:pl:m1:imperf__Aspect=Imp|Gender=Masc|Mood=Ind|Number=Plur|SubGender=Masc1|Tense=Past|VerbForm=Fin": {POS: VERB, "Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Plur", "Tense": "Past", "VerbForm": "Fin"},
"adj:sg:inst:m1:pos__Case=Ins|Gender=Masc|Number=Sing|PronType=Ind|SubGender=Masc1": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "PronType": "Ind"},
"ppas:sg:inst:m1:imperf:aff__Aspect=Imp|Case=Ins|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Ins", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:pl:gen:m1:com__Case=Gen|Degree=Cmp|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Degree": "Cmp", "Gender": "Masc", "Number": "Plur"},
"adj:pl:nom:m3:sup__Case=Nom|Degree=Sup|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Degree": "Sup", "Gender": "Masc", "Number": "Plur"},
"praet:pl:n:imperf__Aspect=Imp|Gender=Neut|Mood=Ind|Number=Plur|Tense=Past|VerbForm=Fin": {POS: VERB, "Aspect": "Imp", "Gender": "Neut", "Mood": "Ind", "Number": "Plur", "Tense": "Past", "VerbForm": "Fin"},
"adj:pl:nom:n:pos__Case=Nom|Gender=Neut|Number=Plur|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Nom", "Gender": "Neut", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"bedzie:pl:sec:imperf__Aspect=Imp|Mood=Ind|Number=Plur|Person=2|Tense=Fut|VerbForm=Fin": {POS: VERB, "Aspect": "Imp", "Mood": "Ind", "Number": "Plur", "Person": "2", "Tense": "Fut", "VerbForm": "Fin"},
"adj:pl:dat:n:pos__Case=Dat|Gender=Neut|Number=Plur|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Dat", "Gender": "Neut", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"ppron3:sg:gen:m3:ter:nakc:npraep__Case=Gen|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc3|Variant=Short": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Short"},
"adj:sg:gen:m3:com__Case=Gen|Degree=Cmp|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Degree": "Cmp", "Gender": "Masc", "Number": "Sing"},
"ppron3:pl:gen:n:ter:akc:npraep__Case=Gen|Gender=Neut|Number=Plur|Person=3|PrepCase=Npr|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Neut", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppas:sg:gen:f:imperf:aff__Aspect=Imp|Case=Gen|Gender=Fem|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Fem", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:pl:acc:m2:pos__Case=Acc|Gender=Masc|Number=Plur|PronType=Dem|SubGender=Masc2": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "PronType": "Dem"},
"adj:pl:nom:m1:pos__Case=Nom|Gender=Masc|Number=Plur|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:loc:f:sup__Case=Loc|Degree=Sup|Gender=Fem|Number=Sing": {POS: ADJ, "Case": "Loc", "Degree": "Sup", "Gender": "Fem", "Number": "Sing"},
"ppas:pl:nom:m1:imperf:aff__Aspect=Imp|Case=Nom|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"ppron3:pl:acc:m2:ter:akc:npraep__Case=Acc|Gender=Masc|Number=Plur|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc2|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"num:pl:inst:n:congr__Case=Ins|Gender=Neut|Number=Plur|NumType=Card|PronType=Ind": {POS: NUM, "Case": "Ins", "Gender": "Neut", "Number": "Plur", "NumType": "Card", "PronType": "Ind"},
"adj:sg:loc:f:pos__Case=Loc|Gender=Fem|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Loc", "Gender": "Fem", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adjc__Variant=Short": {POS: ADJ, "Variant": "Short"},
"adj:sg:nom:n:pos__Case=Nom|Gender=Neut|Number=Sing|PronType=Ind": {POS: ADJ, "Case": "Nom", "Gender": "Neut", "Number": "Sing", "PronType": "Ind"},
"subst:sg:inst:n__Case=Ins|Gender=Neut|Number=Sing|PronType=Rel": {POS: NOUN, "Case": "Ins", "Gender": "Neut", "Number": "Sing", "PronType": "Rel"},
"ppas:sg:acc:m2:perf:aff__Aspect=Perf|Case=Acc|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc2|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:pl:dat:f:pos__Case=Dat|Gender=Fem|Number=Plur|PronType=Dem": {POS: ADJ, "Case": "Dat", "Gender": "Fem", "Number": "Plur", "PronType": "Dem"},
"adj:pl:gen:m3:pos__Case=Gen|Gender=Masc|Number=Plur|PronType=Ind|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "PronType": "Ind"},
"adj:pl:gen:f:pos__Case=Gen|Gender=Fem|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Gen", "Gender": "Fem", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"ppron3:pl:nom:m3:ter:akc:npraep__Case=Nom|Gender=Masc|Number=Plur|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc3|Variant=Long": {POS: PRON, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppas:sg:loc:m3:imperf:aff__Aspect=Imp|Case=Loc|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Loc", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"num:pl:gen:m1:congr__Case=Gen|Gender=Masc|Number=Plur|NumType=Card|SubGender=Masc1": {POS: NUM, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "NumType": "Card"},
"adj:sg:loc:m3:pos__Case=Loc|Gender=Masc|Number=Sing|Number[psor]=Plur|Person=2|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"pact:pl:acc:n:imperf:aff__Aspect=Imp|Case=Acc|Gender=Neut|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Acc", "Gender": "Neut", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"ppron3:pl:loc:f:ter:akc:praep__Case=Loc|Gender=Fem|Number=Plur|Person=3|PrepCase=Pre|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Loc", "Gender": "Fem", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:pl:inst:n:pos__Case=Ins|Gender=Neut|Number=Plur|PronType=Dem": {POS: ADJ, "Case": "Ins", "Gender": "Neut", "Number": "Plur", "PronType": "Dem"},
"adj:sg:loc:m3:pos__Case=Loc|Gender=Masc|Number=Sing|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"subst:sg:dat:m1__Case=Dat|Gender=Masc|Number=Sing|PronType=Int|SubGender=Masc1": {POS: NOUN, "Case": "Dat", "Gender": "Masc", "Number": "Sing", "PronType": "Int"},
"ppas:sg:inst:n:perf:aff__Aspect=Perf|Case=Ins|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Ins", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:pl:gen:n:pos__Case=Gen|Gender=Neut|Number=Plur|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Gen", "Gender": "Neut", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:loc:f:pos__Case=Loc|Gender=Fem|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Loc", "Gender": "Fem", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"num:pl:acc:m1:rec__Case=Acc|Gender=Masc|Number=Plur|NumType=Card|PronType=Dem|SubGender=Masc1": {POS: NUM, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "NumType": "Card", "PronType": "Dem"},
"adj:sg:inst:n:com__Case=Ins|Degree=Cmp|Gender=Neut|Number=Sing": {POS: ADJ, "Case": "Ins", "Degree": "Cmp", "Gender": "Neut", "Number": "Sing"},
"pact:pl:nom:m3:imperf:aff__Aspect=Imp|Case=Nom|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"subst:sg:voc:m3__Case=Voc|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: NOUN, "Case": "Voc", "Gender": "Masc", "Number": "Sing"},
"adj:sg:voc:m3:pos__Case=Voc|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: ADJ, "Case": "Voc", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"subst:sg:inst:n__Case=Ins|Emphatic=Yes|Gender=Neut|Number=Sing|PronType=Int": {POS: NOUN, "Case": "Ins", "Gender": "Neut", "Number": "Sing", "PronType": "Int"},
"ppas:sg:nom:m2:imperf:aff__Aspect=Imp|Case=Nom|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc2|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"num:pl:loc:f:congr__Case=Loc|Gender=Fem|Number=Plur|NumType=Card|PronType=Ind": {POS: NUM, "Case": "Loc", "Gender": "Fem", "Number": "Plur", "NumType": "Card", "PronType": "Ind"},
"ppron12:sg:acc:f:sec:akc__Case=Acc|Gender=Fem|Number=Sing|Person=2|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Fem", "Number": "Sing", "Person": "2", "PronType": "Prs", "Variant": "Long"},
"ppas:sg:nom:f:perf:neg__Aspect=Perf|Case=Nom|Gender=Fem|Number=Sing|Polarity=Neg|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Nom", "Gender": "Fem", "Number": "Sing", "Polarity": "Neg", "VerbForm": "Part", "Voice": "Pass"},
"ppron3:sg:gen:m2:ter:akc:npraep__Case=Gen|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc2|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppas:pl:acc:n:perf:aff__Aspect=Perf|Case=Acc|Gender=Neut|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Acc", "Gender": "Neut", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"num:pl:gen:m3:rec__Case=Gen|Gender=Masc|Number=Plur|NumType=Card|SubGender=Masc3": {POS: NUM, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "NumType": "Card"},
"adj:sg:acc:f:sup__Case=Acc|Degree=Sup|Gender=Fem|Number=Sing": {POS: ADJ, "Case": "Acc", "Degree": "Sup", "Gender": "Fem", "Number": "Sing"},
"adj:sg:inst:n:pos__Case=Ins|Gender=Neut|Number=Sing|PronType=Dem": {POS: ADJ, "Case": "Ins", "Gender": "Neut", "Number": "Sing", "PronType": "Dem"},
"adj:pl:inst:m1:pos__Case=Ins|Gender=Masc|Number=Plur|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:acc:n:pos__Case=Acc|Gender=Neut|Number=Sing|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"ppron12:sg:gen:f:sec:akc__Case=Gen|Gender=Fem|Number=Sing|Person=2|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Fem", "Number": "Sing", "Person": "2", "PronType": "Prs", "Variant": "Long"},
"ppas:sg:loc:n:imperf:aff__Aspect=Imp|Case=Loc|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Loc", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:inst:m1:pos__Case=Ins|Gender=Masc|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:gen:m1:pos__Case=Gen|Gender=Masc|Number=Plur|PronType=Tot|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "PronType": "Tot"},
"adj:sg:inst:m3:pos__Case=Ins|Gender=Masc|Number=Sing|PronType=Ind|SubGender=Masc3": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "PronType": "Ind"},
"adj:sg:dat:m1:pos__Case=Dat|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: ADJ, "Case": "Dat", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"adj:pl:nom:n:pos__Case=Nom|Gender=Neut|Number=Plur|PronType=Ind": {POS: ADJ, "Case": "Nom", "Gender": "Neut", "Number": "Plur", "PronType": "Ind"},
"adj:pl:loc:m3:pos__Case=Loc|Gender=Masc|Number=Plur|PronType=Tot|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Plur", "PronType": "Tot"},
"ger:sg:dat:n:perf:aff__Aspect=Perf|Case=Dat|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Vnoun": {POS: VERB, "Aspect": "Perf", "Case": "Dat", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Vnoun"},
"adj:pl:inst:m2:pos__Case=Ins|Degree=Pos|Gender=Masc|Number=Plur|SubGender=Masc2": {POS: ADJ, "Case": "Ins", "Degree": "Pos", "Gender": "Masc", "Number": "Plur"},
"adj:sg:nom:m1:pos__Case=Nom|Gender=Masc|Number=Sing|Poss=Yes|PronType=Int|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Poss": "Yes", "PronType": "Int"},
"qub__Emphatic=Yes|PartType=Int": {POS: PRON, "PartType": "Int"},
"ppron3:pl:acc:m1:ter:akc:praep__Case=Acc|Gender=Masc|Number=Plur|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:pl:nom:f:pos__Case=Nom|Gender=Fem|Number=Plur|PronType=Ind": {POS: ADJ, "Case": "Nom", "Gender": "Fem", "Number": "Plur", "PronType": "Ind"},
"adj:sg:nom:m3:com__Case=Nom|Degree=Cmp|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Degree": "Cmp", "Gender": "Masc", "Number": "Sing"},
"adj:pl:nom:m3:com__Case=Nom|Degree=Cmp|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Degree": "Cmp", "Gender": "Masc", "Number": "Plur"},
"subst:sg:nom:n__Case=Nom|Emphatic=Yes|Gender=Neut|Number=Sing|PronType=Int": {POS: NOUN, "Case": "Nom", "Gender": "Neut", "Number": "Sing", "PronType": "Int"},
"adj:pl:dat:f:pos__Case=Dat|Degree=Pos|Gender=Fem|Number=Plur": {POS: ADJ, "Case": "Dat", "Degree": "Pos", "Gender": "Fem", "Number": "Plur"},
"adj:pl:loc:m1:pos__Case=Loc|Degree=Pos|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: ADJ, "Case": "Loc", "Degree": "Pos", "Gender": "Masc", "Number": "Plur"},
"adj:sg:acc:m1:sup__Case=Acc|Degree=Sup|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: ADJ, "Case": "Acc", "Degree": "Sup", "Gender": "Masc", "Number": "Sing"},
"adj:sg:dat:m3:pos__Case=Dat|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: ADJ, "Case": "Dat", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"adj:pl:acc:f:pos__Case=Acc|Gender=Fem|Number=Plur|PronType=Int": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Plur", "PronType": "Int"},
"adj:sg:gen:f:com__Case=Gen|Degree=Cmp|Gender=Fem|Number=Sing": {POS: ADJ, "Case": "Gen", "Degree": "Cmp", "Gender": "Fem", "Number": "Sing"},
"subst:pl:dat:m1__Case=Dat|Gender=Masc|Number=Plur|PronType=Tot|SubGender=Masc1": {POS: NOUN, "Case": "Dat", "Gender": "Masc", "Number": "Plur", "PronType": "Tot"},
"adj:pl:gen:m1:pos__Case=Gen|Gender=Masc|Number=Plur|PronType=Ind|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "PronType": "Ind"},
"pact:pl:inst:m1:imperf:aff__Aspect=Imp|Case=Ins|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Ins", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"ppas:pl:gen:m3:imperf:aff__Aspect=Imp|Case=Gen|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"num:pl:inst:m1:congr__Case=Ins|Gender=Masc|Number=Plur|NumType=Card|PronType=Ind|SubGender=Masc1": {POS: NUM, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "NumType": "Card", "PronType": "Ind"},
"num:pl:gen:m3:congr__Case=Gen|Gender=Masc|Number=Plur|NumType=Card|PronType=Ind|SubGender=Masc3": {POS: NUM, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "NumType": "Card", "PronType": "Ind"},
"adj:pl:gen:n:pos__Case=Gen|Gender=Neut|Number=Plur|PronType=Int": {POS: ADJ, "Case": "Gen", "Gender": "Neut", "Number": "Plur", "PronType": "Int"},
"adj:sg:gen:n:pos__Case=Gen|Gender=Neut|Number=Sing|PronType=Int": {POS: ADJ, "Case": "Gen", "Gender": "Neut", "Number": "Sing", "PronType": "Int"},
"adj:sg:gen:f:pos__Case=Gen|Gender=Fem|Number=Sing|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Gen", "Gender": "Fem", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"ppas:pl:nom:n:imperf:aff__Aspect=Imp|Case=Nom|Gender=Neut|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Neut", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"ppas:sg:acc:f:imperf:aff__Aspect=Imp|Case=Acc|Gender=Fem|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Acc", "Gender": "Fem", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:voc:m1:pos__Case=Voc|Gender=Masc|Number=Sing|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Voc", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:voc:m1:pos__Case=Voc|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: ADJ, "Case": "Voc", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"adj:pl:nom:f:com__Case=Nom|Degree=Cmp|Gender=Fem|Number=Plur": {POS: ADJ, "Case": "Nom", "Degree": "Cmp", "Gender": "Fem", "Number": "Plur"},
"adj:sg:loc:f:pos__Case=Loc|Gender=Fem|Number=Sing|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Loc", "Gender": "Fem", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:acc:f:sup__Case=Acc|Degree=Sup|Gender=Fem|Number=Plur": {POS: ADJ, "Case": "Acc", "Degree": "Sup", "Gender": "Fem", "Number": "Plur"},
"adj:pl:nom:n:pos__Case=Nom|Gender=Neut|Number=Plur|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Nom", "Gender": "Neut", "Number": "Plur", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"pact:sg:acc:m2:imperf:aff__Aspect=Imp|Case=Acc|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc2|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"adj:sg:inst:m3:pos__Case=Ins|Gender=Masc|Number=Sing|PronType=Rel|SubGender=Masc3": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "PronType": "Rel"},
"ppas:pl:gen:m3:perf:aff__Aspect=Perf|Case=Gen|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"pact:pl:nom:f:imperf:aff__Aspect=Imp|Case=Nom|Gender=Fem|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Fem", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"ppron3:sg:acc:n:ter:akc:praep__Case=Acc|Gender=Neut|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:sg:loc:f:pos__Case=Loc|Gender=Fem|Number=Sing|PronType=Neg": {POS: ADJ, "Case": "Loc", "Gender": "Fem", "Number": "Sing", "PronType": "Neg"},
"adj:sg:gen:f:pos__Case=Gen|Gender=Fem|Number=Sing|PronType=Neg": {POS: ADJ, "Case": "Gen", "Gender": "Fem", "Number": "Sing", "PronType": "Neg"},
"ppron3:pl:nom:n:ter:akc:npraep__Case=Nom|Gender=Neut|Number=Plur|Person=3|PrepCase=Npr|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Nom", "Gender": "Neut", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"num:pl:inst:m3:congr__Case=Ins|Gender=Masc|Number=Plur|NumType=Card|SubGender=Masc3": {POS: NUM, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "NumType": "Card"},
"ppas:pl:acc:m3:imperf:aff__Aspect=Imp|Case=Acc|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:pl:nom:m1:pos__Case=Nom|Gender=Masc|Number=Plur|PronType=Ind|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "PronType": "Ind"},
"adj:pl:loc:n:pos__Case=Loc|Gender=Neut|Number=Plur|PronType=Dem": {POS: ADJ, "Case": "Loc", "Gender": "Neut", "Number": "Plur", "PronType": "Dem"},
"adj:sg:loc:m3:com__Case=Loc|Degree=Cmp|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Degree": "Cmp", "Gender": "Masc", "Number": "Sing"},
"adj:pl:acc:m1:pos__Case=Acc|Gender=Masc|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes|SubGender=Masc1": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"ger:sg:dat:n:imperf:aff__Aspect=Imp|Case=Dat|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Vnoun": {POS: VERB, "Aspect": "Imp", "Case": "Dat", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Vnoun"},
"ppron3:pl:dat:m3:ter:akc:praep__Case=Dat|Gender=Masc|Number=Plur|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc3|Variant=Long": {POS: PRON, "Case": "Dat", "Gender": "Masc", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:sg:inst:m3:pos__Case=Ins|Gender=Masc|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:inst:f:pos__Case=Ins|Gender=Fem|Number=Sing|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Ins", "Gender": "Fem", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:gen:n:pos__Case=Gen|Gender=Neut|Number=Sing|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Gen", "Gender": "Neut", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"pact:sg:dat:m1:imperf:aff__Aspect=Imp|Case=Dat|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Dat", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"adj:pl:acc:m1:com__Case=Acc|Degree=Cmp|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: ADJ, "Case": "Acc", "Degree": "Cmp", "Gender": "Masc", "Number": "Plur"},
"adj:pl:gen:n:pos__Case=Gen|Gender=Neut|Number=Plur|PronType=Dem": {POS: ADJ, "Case": "Gen", "Gender": "Neut", "Number": "Plur", "PronType": "Dem"},
"adj:pl:inst:m1:pos__Case=Ins|Gender=Masc|Number=Plur|PronType=Rel|SubGender=Masc1": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "PronType": "Rel"},
"pact:sg:nom:m1:imperf:aff__Aspect=Imp|Case=Nom|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"ppas:pl:inst:m3:perf:aff__Aspect=Perf|Case=Ins|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Ins", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:nom:m2:pos__Case=Nom|Gender=Masc|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc2": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:nom:m3:pos__Case=Nom|Gender=Masc|Number=Plur|Poss=Yes|PronType=Ind|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Poss": "Yes", "PronType": "Ind"},
"num:pl:acc:m3:rec__Case=Acc|Gender=Masc|Number=Plur|NumType=Card|PronType=Dem|SubGender=Masc3": {POS: NUM, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "NumType": "Card", "PronType": "Dem"},
"num:pl:inst:m2:congr__Case=Ins|Gender=Masc|Number=Plur|NumType=Card|SubGender=Masc2": {POS: NUM, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "NumType": "Card"},
"ppas:pl:loc:m3:perf:aff__Aspect=Perf|Case=Loc|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Loc", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:pl:acc:m2:pos__Case=Acc|Degree=Pos|Gender=Masc|Number=Plur|SubGender=Masc2": {POS: ADJ, "Case": "Acc", "Degree": "Pos", "Gender": "Masc", "Number": "Plur"},
"winien:pl:m3:imperf__Aspect=Imp|Gender=Masc|Mood=Ind|Number=Plur|SubGender=Masc3|Tense=Pres|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Plur", "Tense": "Pres", "VerbForm": "Fin", "Voice": "Act"},
"ppas:sg:nom:n:imperf:aff__Aspect=Imp|Case=Nom|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:dat:f:pos__Case=Dat|Gender=Fem|Number=Sing|PronType=Int": {POS: ADJ, "Case": "Dat", "Gender": "Fem", "Number": "Sing", "PronType": "Int"},
"adj:pl:loc:n:pos__Case=Loc|Gender=Neut|Number=Plur|Number[psor]=Plur|Person=2|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Loc", "Gender": "Neut", "Number": "Plur", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:dat:n:pos__Case=Dat|Gender=Neut|Number=Sing|PronType=Dem": {POS: ADJ, "Case": "Dat", "Gender": "Neut", "Number": "Sing", "PronType": "Dem"},
"adj:sg:inst:m3:pos__Case=Ins|Gender=Masc|Number=Sing|PronType=Tot|SubGender=Masc3": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "PronType": "Tot"},
"adj:pl:nom:m3:pos__Case=Nom|Gender=Masc|Number=Plur|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:dat:m3:pos__Case=Dat|Gender=Masc|Number=Plur|PronType=Tot|SubGender=Masc3": {POS: ADJ, "Case": "Dat", "Gender": "Masc", "Number": "Plur", "PronType": "Tot"},
"adj:pl:acc:n:pos__Case=Acc|Gender=Neut|Number=Plur|PronType=Ind": {POS: ADJ, "Case": "Acc", "Gender": "Neut", "Number": "Plur", "PronType": "Ind"},
"ppron3:pl:inst:f:ter:akc:praep__Case=Ins|Gender=Fem|Number=Plur|Person=3|PrepCase=Pre|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Ins", "Gender": "Fem", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"num:pl:acc:m1:rec__Case=Acc|Gender=Masc|Number=Plur|NumType=Card|PronType=Int|SubGender=Masc1": {POS: NUM, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "NumType": "Card", "PronType": "Int"},
"subst:pl:gen:m1__Case=Gen|Gender=Masc|Number=Plur|PronType=Tot|SubGender=Masc1": {POS: NOUN, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "PronType": "Tot"},
"pact:pl:gen:m3:imperf:aff__Aspect=Imp|Case=Gen|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"adj:pl:gen:n:pos__Case=Gen|Gender=Neut|Number=Plur|PronType=Rel": {POS: ADJ, "Case": "Gen", "Gender": "Neut", "Number": "Plur", "PronType": "Rel"},
"adj:pl:acc:m3:pos__Case=Acc|Gender=Masc|Number=Plur|PronType=Neg|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "PronType": "Neg"},
"praet:sg:m3:perf:nagl__Agglutination=Nagl|Aspect=Perf|Gender=Masc|Mood=Ind|Number=Sing|SubGender=Masc3|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"ppron3:pl:dat:n:ter:akc:npraep__Case=Dat|Gender=Neut|Number=Plur|Person=3|PrepCase=Npr|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Dat", "Gender": "Neut", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:pl:inst:f:pos__Case=Ins|Gender=Fem|Number=Plur|PronType=Dem": {POS: ADJ, "Case": "Ins", "Gender": "Fem", "Number": "Plur", "PronType": "Dem"},
"adj:pl:loc:f:pos__Case=Loc|Gender=Fem|Number=Plur|PronType=Dem": {POS: ADJ, "Case": "Loc", "Gender": "Fem", "Number": "Plur", "PronType": "Dem"},
"num:pl:acc:m2:rec__Case=Acc|Gender=Masc|Number=Plur|NumType=Card|PronType=Ind|SubGender=Masc2": {POS: NUM, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "NumType": "Card", "PronType": "Ind"},
"ppron3:pl:inst:m2:ter:akc:praep__Case=Ins|Gender=Masc|Number=Plur|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc2|Variant=Long": {POS: PRON, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:pl:nom:n:pos__Case=Nom|Gender=Neut|Number=Plur|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Nom", "Gender": "Neut", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:nom:m3:pos__Case=Nom|Gender=Masc|Number=Sing|Number[psor]=Plur|Person=2|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"pact:pl:acc:m3:imperf:aff__Aspect=Imp|Case=Acc|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"adj:pl:gen:m3:pos__Case=Gen|Gender=Masc|Number=Plur|PronType=Int|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "PronType": "Int"},
"adj:pl:nom:n:pos__Case=Nom|Gender=Neut|Number=Plur|PronType=Int": {POS: ADJ, "Case": "Nom", "Gender": "Neut", "Number": "Plur", "PronType": "Int"},
"adj:pl:nom:f:pos__Case=Nom|Gender=Fem|Number=Plur|PronType=Int": {POS: ADJ, "Case": "Nom", "Gender": "Fem", "Number": "Plur", "PronType": "Int"},
"adj:pl:acc:m3:pos__Case=Acc|Gender=Masc|Number=Plur|PronType=Int|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "PronType": "Int"},
"adj:sg:gen:m3:pos__Case=Gen|Gender=Masc|Number=Sing|PronType=Int|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "PronType": "Int"},
"adj:sg:gen:f:pos__Case=Gen|Gender=Fem|Number=Sing|PronType=Int": {POS: ADJ, "Case": "Gen", "Gender": "Fem", "Number": "Sing", "PronType": "Int"},
"adj:sg:inst:n:pos__Case=Ins|Gender=Neut|Number=Sing|PronType=Int": {POS: ADJ, "Case": "Ins", "Gender": "Neut", "Number": "Sing", "PronType": "Int"},
"adj:sg:nom:m3:pos__Case=Nom|Emphatic=Yes|Gender=Masc|Number=Sing|PronType=Int|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Int"},
"ppron12:sg:gen:m2:sec:akc__Case=Gen|Gender=Masc|Number=Sing|Person=2|PronType=Prs|SubGender=Masc2|Variant=Long": {POS: PRON, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Person": "2", "PronType": "Prs", "Variant": "Long"},
"adj:sg:acc:f:pos__Case=Acc|Gender=Fem|Number=Sing|PronType=Int": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Sing", "PronType": "Int"},
"pact:sg:inst:m1:imperf:aff__Aspect=Imp|Case=Ins|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Ins", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"adj:pl:gen:m2:pos__Case=Gen|Degree=Pos|Gender=Masc|Number=Plur|SubGender=Masc2": {POS: ADJ, "Case": "Gen", "Degree": "Pos", "Gender": "Masc", "Number": "Plur"},
"num:pl:gen:m2:congr__Case=Gen|Gender=Masc|Number=Plur|NumType=Card|SubGender=Masc2": {POS: NUM, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "NumType": "Card"},
"ppron3:sg:acc:m1:ter:akc:npraep__Case=Acc|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppron12:sg:loc:f:sec__Case=Loc|Gender=Fem|Number=Sing|Person=2|PronType=Prs": {POS: PRON, "Case": "Loc", "Gender": "Fem", "Number": "Sing", "Person": "2", "PronType": "Prs"},
"ppas:sg:inst:m1:perf:aff__Aspect=Perf|Case=Ins|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Ins", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"ppas:pl:gen:f:imperf:neg__Aspect=Imp|Case=Gen|Gender=Fem|Number=Plur|Polarity=Neg|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Fem", "Number": "Plur", "Polarity": "Neg", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:loc:f:pos__Case=Loc|Gender=Fem|Number=Sing|PronType=Rel": {POS: ADJ, "Case": "Loc", "Gender": "Fem", "Number": "Sing", "PronType": "Rel"},
"adj:sg:gen:n:pos__Case=Gen|Gender=Neut|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Gen", "Gender": "Neut", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"ppron12:sg:gen:f:sec:nakc__Case=Gen|Gender=Fem|Number=Sing|Person=2|PronType=Prs|Variant=Short": {POS: PRON, "Case": "Gen", "Gender": "Fem", "Number": "Sing", "Person": "2", "PronType": "Prs", "Variant": "Short"},
"subst:sg:gen:m1__Case=Gen|Gender=Masc|Number=Sing|PronType=Ind|SubGender=Masc1": {POS: NOUN, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "PronType": "Ind"},
"qub:nwok__Variant=Short": {POS: PRON, "Variant": "Short"},
"adj:sg:nom:m2:pos__Case=Nom|Gender=Masc|Number=Sing|PronType=Ind|SubGender=Masc2": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Ind"},
"ppron3:sg:inst:m3:ter:akc:praep__Case=Ins|Gender=Masc|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc3|Variant=Long": {POS: PRON, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"num:pl:dat:f:congr__Case=Dat|Gender=Fem|Number=Plur|NumType=Card|PronType=Ind": {POS: NUM, "Case": "Dat", "Gender": "Fem", "Number": "Plur", "NumType": "Card", "PronType": "Ind"},
"praet:sg:m1:perf:agl__Agglutination=Agl|Aspect=Perf|Gender=Masc|Mood=Ind|Number=Sing|SubGender=Masc1|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Perf", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"adj:pl:loc:n:pos__Case=Loc|Gender=Neut|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Loc", "Gender": "Neut", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"adj:sg:nom:f:pos__Case=Nom|Gender=Fem|Number=Sing|PronType=Tot": {POS: ADJ, "Case": "Nom", "Gender": "Fem", "Number": "Sing", "PronType": "Tot"},
"adj:sg:nom:n:pos__Case=Nom|Gender=Neut|Number=Sing|PronType=Tot": {POS: ADJ, "Case": "Nom", "Gender": "Neut", "Number": "Sing", "PronType": "Tot"},
"adj:sg:inst:m1:sup__Case=Ins|Degree=Sup|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: ADJ, "Case": "Ins", "Degree": "Sup", "Gender": "Masc", "Number": "Sing"},
"adj:pl:gen:f:pos__Case=Gen|Gender=Fem|Number=Plur|PronType=Ind": {POS: ADJ, "Case": "Gen", "Gender": "Fem", "Number": "Plur", "PronType": "Ind"},
"adj:pl:inst:m1:pos__Case=Ins|Gender=Masc|Number=Plur|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:acc:n:com__Case=Acc|Degree=Cmp|Gender=Neut|Number=Plur": {POS: ADJ, "Case": "Acc", "Degree": "Cmp", "Gender": "Neut", "Number": "Plur"},
"subst:sg:inst:m1__Case=Ins|Emphatic=Yes|Gender=Masc|Number=Sing|PronType=Int|SubGender=Masc1": {POS: NOUN, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "PronType": "Int"},
"ppas:sg:gen:m1:perf:aff__Aspect=Perf|Case=Gen|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"subst:sg:acc:m1__Case=Acc|Emphatic=Yes|Gender=Masc|Number=Sing|PronType=Int|SubGender=Masc1": {POS: NOUN, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "PronType": "Int"},
"adj:sg:inst:m1:com__Case=Ins|Degree=Cmp|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: ADJ, "Case": "Ins", "Degree": "Cmp", "Gender": "Masc", "Number": "Sing"},
"adj:pl:dat:f:pos__Case=Dat|Gender=Fem|Number=Plur|PronType=Tot": {POS: ADJ, "Case": "Dat", "Gender": "Fem", "Number": "Plur", "PronType": "Tot"},
"adj:pl:dat:m1:pos__Case=Dat|Gender=Masc|Number=Plur|PronType=Dem|SubGender=Masc1": {POS: ADJ, "Case": "Dat", "Gender": "Masc", "Number": "Plur", "PronType": "Dem"},
"pact:sg:dat:f:imperf:aff__Aspect=Imp|Case=Dat|Gender=Fem|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Dat", "Gender": "Fem", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"adj:pl:dat:m1:pos__Case=Dat|Degree=Pos|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: ADJ, "Case": "Dat", "Degree": "Pos", "Gender": "Masc", "Number": "Plur"},
"pact:sg:acc:m1:imperf:aff__Aspect=Imp|Case=Acc|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"adj:sg:loc:m1:pos__Case=Loc|Degree=Pos|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: ADJ, "Case": "Loc", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"},
"adj:sg:acc:f:pos__Case=Acc|Gender=Fem|Number=Sing|PronType=Neg": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Sing", "PronType": "Neg"},
"ppas:pl:acc:f:imperf:aff__Aspect=Imp|Case=Acc|Gender=Fem|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Acc", "Gender": "Fem", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:acc:n:pos__Case=Acc|Gender=Neut|Number=Sing|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:gen:m3:pos__Case=Gen|Gender=Masc|Number=Sing|PronType=Rel|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "PronType": "Rel"},
"num:pl:gen:f:congr__Case=Gen|Gender=Fem|Number=Plur|NumType=Card|PronType=Dem": {POS: NUM, "Case": "Gen", "Gender": "Fem", "Number": "Plur", "NumType": "Card", "PronType": "Dem"},
"adj:sg:acc:m1:com__Case=Acc|Degree=Cmp|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: ADJ, "Case": "Acc", "Degree": "Cmp", "Gender": "Masc", "Number": "Sing"},
"adj:pl:inst:n:pos__Case=Ins|Gender=Neut|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Ins", "Gender": "Neut", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"adj:pl:loc:f:sup__Case=Loc|Degree=Sup|Gender=Fem|Number=Plur": {POS: ADJ, "Case": "Loc", "Degree": "Sup", "Gender": "Fem", "Number": "Plur"},
"ppas:sg:gen:m3:perf:aff__Aspect=Perf|Case=Gen|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"ppron12:sg:acc:m1:pri:nakc__Case=Acc|Gender=Masc|Number=Sing|Person=1|PronType=Prs|SubGender=Masc1|Variant=Short": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "1", "PronType": "Prs", "Variant": "Short"},
"adj:sg:gen:m1:pos__Case=Gen|Gender=Masc|Number=Sing|PronType=Tot|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "PronType": "Tot"},
"adj:pl:inst:f:pos__Case=Ins|Gender=Fem|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Ins", "Gender": "Fem", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"pact:pl:gen:m1:imperf:neg__Aspect=Imp|Case=Gen|Gender=Masc|Number=Plur|Polarity=Neg|SubGender=Masc1|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Polarity": "Neg", "VerbForm": "Part", "Voice": "Act"},
"ppron3:sg:gen:n:ter:nakc:npraep__Case=Gen|Gender=Neut|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|Variant=Short": {POS: PRON, "Case": "Gen", "Gender": "Neut", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Short"},
"num:pl:acc:f:rec__Case=Acc|Gender=Fem|Number=Plur|NumType=Card|PronType=Dem": {POS: NUM, "Case": "Acc", "Gender": "Fem", "Number": "Plur", "NumType": "Card", "PronType": "Dem"},
"adj:sg:gen:f:pos__Case=Gen|Gender=Fem|Number=Sing|PronType=Rel": {POS: ADJ, "Case": "Gen", "Gender": "Fem", "Number": "Sing", "PronType": "Rel"},
"subst:sg:nom:n__Case=Nom|Gender=Neut|Number=Sing|NumType=Card|PronType=Ind": {POS: NOUN, "Case": "Nom", "Gender": "Neut", "Number": "Sing", "NumType": "Card", "PronType": "Ind"},
"ppron3:pl:loc:n:ter:akc:praep__Case=Loc|Gender=Neut|Number=Plur|Person=3|PrepCase=Pre|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Loc", "Gender": "Neut", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:sg:inst:n:pos__Case=Ins|Gender=Neut|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Ins", "Gender": "Neut", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"subst:pl:voc:m1__Case=Voc|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: NOUN, "Case": "Voc", "Gender": "Masc", "Number": "Plur"},
"adj:pl:nom:m2:pos__Case=Nom|Gender=Masc|Number=Plur|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc2": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:gen:f:pos__Case=Gen|Gender=Fem|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Gen", "Gender": "Fem", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:loc:m3:pos__Case=Loc|Gender=Masc|Number=Plur|PronType=Rel|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Plur", "PronType": "Rel"},
"ppron12:pl:gen:f:pri__Case=Gen|Gender=Fem|Number=Plur|Person=1|PronType=Prs": {POS: PRON, "Case": "Gen", "Gender": "Fem", "Number": "Plur", "Person": "1", "PronType": "Prs"},
"adj:pl:inst:m1:pos__Case=Ins|Gender=Masc|Number=Plur|PronType=Ind|SubGender=Masc1": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "PronType": "Ind"},
"ppron3:sg:inst:n:ter:akc:praep__Case=Ins|Gender=Neut|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Ins", "Gender": "Neut", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:pl:loc:f:pos__Case=Loc|Gender=Fem|Number=Plur|PronType=Ind": {POS: ADJ, "Case": "Loc", "Gender": "Fem", "Number": "Plur", "PronType": "Ind"},
"adj:sg:gen:m1:pos__Case=Gen|Gender=Masc|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"pact:sg:loc:m3:imperf:aff__Aspect=Imp|Case=Loc|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Loc", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"adj:sg:loc:m1:pos__Case=Loc|Gender=Masc|Number=Sing|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"ppas:pl:loc:f:perf:aff__Aspect=Perf|Case=Loc|Gender=Fem|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Loc", "Gender": "Fem", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:pl:gen:f:pos__Case=Gen|Gender=Fem|Number=Plur|PronType=Rel": {POS: ADJ, "Case": "Gen", "Gender": "Fem", "Number": "Plur", "PronType": "Rel"},
"adj:sg:gen:m3:pos__Case=Gen|Gender=Masc|Number=Sing|Number[psor]=Plur|Person=2|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"ppas:pl:nom:m2:perf:aff__Aspect=Perf|Case=Nom|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc2|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:pl:loc:n:pos__Case=Loc|Gender=Neut|Number=Plur|PronType=Ind": {POS: ADJ, "Case": "Loc", "Gender": "Neut", "Number": "Plur", "PronType": "Ind"},
"adj:sg:gen:m3:pos__Case=Gen|Gender=Masc|Number=Sing|PronType=Neg|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "PronType": "Neg"},
"adj:sg:acc:m3:pos__Case=Acc|Gender=Masc|Number=Sing|PronType=Rel|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "PronType": "Rel"},
"adj:pl:nom:m3:pos__Case=Nom|Gender=Masc|Number=Plur|PronType=Tot|SubGender=Masc3": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "PronType": "Tot"},
"adj:sg:gen:m1:pos__Case=Gen|Gender=Masc|Number=Sing|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"ppas:pl:loc:m3:imperf:aff__Aspect=Imp|Case=Loc|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Loc", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:acc:m1:pos__Case=Acc|Gender=Masc|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:inst:m3:sup__Case=Ins|Degree=Sup|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: ADJ, "Case": "Ins", "Degree": "Sup", "Gender": "Masc", "Number": "Sing"},
"adj:sg:dat:n:pos__Case=Dat|Gender=Neut|Number=Sing|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Dat", "Gender": "Neut", "Number": "Sing", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"adj:sg:dat:n:pos__Case=Dat|Degree=Pos|Gender=Neut|Number=Sing": {POS: ADJ, "Case": "Dat", "Degree": "Pos", "Gender": "Neut", "Number": "Sing"},
"adj:pl:inst:n:pos__Case=Ins|Gender=Neut|Number=Plur|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Ins", "Gender": "Neut", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"num:pl:nom:m2:congr__Case=Nom|Gender=Masc|Number=Plur|NumType=Card|SubGender=Masc2": {POS: NUM, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "NumType": "Card"},
"adj:pl:gen:f:sup__Case=Gen|Degree=Sup|Gender=Fem|Number=Plur": {POS: ADJ, "Case": "Gen", "Degree": "Sup", "Gender": "Fem", "Number": "Plur"},
"adj:sg:acc:m2:pos__Case=Acc|Gender=Masc|Number=Sing|PronType=Tot|SubGender=Masc2": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "PronType": "Tot"},
"num:pl:dat:m1:congr__Case=Dat|Gender=Masc|Number=Plur|NumType=Card|SubGender=Masc1": {POS: NUM, "Case": "Dat", "Gender": "Masc", "Number": "Plur", "NumType": "Card"},
"ppas:sg:gen:n:imperf:aff__Aspect=Imp|Case=Gen|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:pl:acc:m3:pos__Case=Acc|Gender=Masc|Number=Plur|Poss=Yes|PronType=Ind|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Poss": "Yes", "PronType": "Ind"},
"adj:sg:inst:f:com__Case=Ins|Degree=Cmp|Gender=Fem|Number=Sing": {POS: ADJ, "Case": "Ins", "Degree": "Cmp", "Gender": "Fem", "Number": "Sing"},
"adj:pl:nom:f:pos__Case=Nom|Gender=Fem|Number=Plur|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Nom", "Gender": "Fem", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:nom:n:pos__Case=Nom|Gender=Neut|Number=Sing|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Nom", "Gender": "Neut", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:dat:m1:pos__Case=Dat|Gender=Masc|Number=Plur|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Dat", "Gender": "Masc", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:inst:m3:pos__Case=Ins|Gender=Masc|Number=Sing|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"ppron3:sg:acc:m1:ter:nakc:praep__Case=Acc|Gender=Masc|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc1|Variant=Short": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Short"},
"pact:pl:nom:n:imperf:aff__Aspect=Imp|Case=Nom|Gender=Neut|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Neut", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"ppron3:sg:dat:n:ter:nakc:npraep__Case=Dat|Gender=Neut|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|Variant=Short": {POS: PRON, "Case": "Dat", "Gender": "Neut", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Short"},
"ppron3:sg:inst:m2:ter:akc:praep__Case=Ins|Gender=Masc|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc2|Variant=Long": {POS: PRON, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"subst:sg:inst:n__Case=Ins|Gender=Neut|Number=Sing|PronType=Neg": {POS: NOUN, "Case": "Ins", "Gender": "Neut", "Number": "Sing", "PronType": "Neg"},
"adj:pl:dat:m1:pos__Case=Dat|Gender=Masc|Number=Plur|PronType=Tot|SubGender=Masc1": {POS: ADJ, "Case": "Dat", "Gender": "Masc", "Number": "Plur", "PronType": "Tot"},
"adj:pl:gen:m1:pos__Case=Gen|Gender=Masc|Number=Plur|PronType=Neg|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "PronType": "Neg"},
"adj:sg:gen:m3:pos__Case=Gen|Gender=Masc|Number=Sing|PronType=Tot|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "PronType": "Tot"},
"subst:sg:loc:n__Case=Loc|Gender=Neut|Number=Sing|PronType=Neg": {POS: NOUN, "Case": "Loc", "Gender": "Neut", "Number": "Sing", "PronType": "Neg"},
"adj:sg:dat:f:pos__Case=Dat|Gender=Fem|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Dat", "Gender": "Fem", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:loc:f:pos__Case=Loc|Gender=Fem|Number=Plur|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Loc", "Gender": "Fem", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:dat:f:pos__Case=Dat|Gender=Fem|Number=Sing|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Dat", "Gender": "Fem", "Number": "Sing", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"winien:sg:n:imperf__Aspect=Imp|Gender=Neut|Mood=Ind|Number=Sing|Tense=Pres|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Neut", "Mood": "Ind", "Number": "Sing", "Tense": "Pres", "VerbForm": "Fin", "Voice": "Act"},
"adj:pl:inst:m3:pos__Case=Ins|Gender=Masc|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes|SubGender=Masc3": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"ppas:sg:inst:n:perf:neg__Aspect=Perf|Case=Ins|Gender=Neut|Number=Sing|Polarity=Neg|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Ins", "Gender": "Neut", "Number": "Sing", "Polarity": "Neg", "VerbForm": "Part", "Voice": "Pass"},
"ppron12:pl:loc:m1:sec__Case=Loc|Gender=Masc|Number=Plur|Person=2|PronType=Prs|SubGender=Masc1": {POS: PRON, "Case": "Loc", "Gender": "Masc", "Number": "Plur", "Person": "2", "PronType": "Prs"},
"adj:pl:gen:n:com__Case=Gen|Degree=Cmp|Gender=Neut|Number=Plur": {POS: ADJ, "Case": "Gen", "Degree": "Cmp", "Gender": "Neut", "Number": "Plur"},
"adj:sg:loc:n:pos__Case=Loc|Gender=Neut|Number=Sing|PronType=Neg": {POS: ADJ, "Case": "Loc", "Gender": "Neut", "Number": "Sing", "PronType": "Neg"},
"adj:pl:dat:n:pos__Case=Dat|Gender=Neut|Number=Plur|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Dat", "Gender": "Neut", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:gen:n:pos__Case=Gen|Gender=Neut|Number=Plur|Poss=Yes|PronType=Ind": {POS: ADJ, "Case": "Gen", "Gender": "Neut", "Number": "Plur", "Poss": "Yes", "PronType": "Ind"},
"adj:pl:nom:n:pos__Case=Nom|Gender=Neut|Number=Plur|PronType=Tot": {POS: ADJ, "Case": "Nom", "Gender": "Neut", "Number": "Plur", "PronType": "Tot"},
"adj:pl:gen:m1:pos__Case=Gen|Gender=Masc|Number=Plur|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:nom:m1:pos__Case=Nom|Gender=Masc|Number=Plur|PronType=Int|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "PronType": "Int"},
"adj:pl:gen:m3:pos__Case=Gen|Gender=Masc|Number=Plur|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:nom:m2:pos__Case=Nom|Gender=Masc|Number=Sing|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs|SubGender=Masc2": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:inst:m3:com__Case=Ins|Degree=Cmp|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: ADJ, "Case": "Ins", "Degree": "Cmp", "Gender": "Masc", "Number": "Sing"},
"ppas:sg:gen:m1:imperf:aff__Aspect=Imp|Case=Gen|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:pl:dat:m1:pos__Case=Dat|Gender=Masc|Number=Plur|PronType=Ind|SubGender=Masc1": {POS: ADJ, "Case": "Dat", "Gender": "Masc", "Number": "Plur", "PronType": "Ind"},
"ppas:sg:nom:f:imperf:neg__Aspect=Imp|Case=Nom|Gender=Fem|Number=Sing|Polarity=Neg|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Fem", "Number": "Sing", "Polarity": "Neg", "VerbForm": "Part", "Voice": "Pass"},
"ppron3:sg:inst:m2:ter:akc:npraep__Case=Ins|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc2|Variant=Long": {POS: PRON, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:sg:inst:m1:pos__Case=Ins|Gender=Masc|Number=Sing|PronType=Rel|SubGender=Masc1": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "PronType": "Rel"},
"num:pl:acc:n:rec__Case=Acc|Gender=Neut|Number=Plur|NumType=Card|PronType=Dem": {POS: NUM, "Case": "Acc", "Gender": "Neut", "Number": "Plur", "NumType": "Card", "PronType": "Dem"},
"ppas:sg:loc:n:perf:aff__Aspect=Perf|Case=Loc|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Loc", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:acc:f:pos__Case=Acc|Gender=Fem|Number=Sing|Poss=Yes|PronType=Int": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Sing", "Poss": "Yes", "PronType": "Int"},
"num:pl:acc:n:rec__Case=Acc|Emphatic=Yes|Gender=Neut|Number=Plur|NumType=Card|PronType=Int": {POS: NUM, "Case": "Acc", "Gender": "Neut", "Number": "Plur", "NumType": "Card", "PronType": "Int"},
"adj:pl:loc:n:pos__Case=Loc|Gender=Neut|Number=Plur|PronType=Tot": {POS: ADJ, "Case": "Loc", "Gender": "Neut", "Number": "Plur", "PronType": "Tot"},
"adj:pl:gen:f:com__Case=Gen|Degree=Cmp|Gender=Fem|Number=Plur": {POS: ADJ, "Case": "Gen", "Degree": "Cmp", "Gender": "Fem", "Number": "Plur"},
"adj:sg:dat:m1:pos__Case=Dat|Gender=Masc|Number=Sing|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Dat", "Gender": "Masc", "Number": "Sing", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:dat:m1:com__Case=Dat|Degree=Cmp|Gender=Masc|Number=Sing|SubGender=Masc1": {POS: ADJ, "Case": "Dat", "Degree": "Cmp", "Gender": "Masc", "Number": "Sing"},
"ppron3:sg:inst:n:ter:akc:npraep__Case=Ins|Gender=Neut|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Ins", "Gender": "Neut", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"subst:sg:nom:n__Case=Nom|Gender=Neut|Number=Sing|PronType=Rel": {POS: NOUN, "Case": "Nom", "Gender": "Neut", "Number": "Sing", "PronType": "Rel"},
"adj:sg:inst:f:pos__Case=Ins|Gender=Fem|Number=Sing|PronType=Rel": {POS: ADJ, "Case": "Ins", "Gender": "Fem", "Number": "Sing", "PronType": "Rel"},
"ppas:pl:nom:f:imperf:neg__Aspect=Imp|Case=Nom|Gender=Fem|Number=Plur|Polarity=Neg|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Fem", "Number": "Plur", "Polarity": "Neg", "VerbForm": "Part", "Voice": "Pass"},
"pact:pl:dat:m1:imperf:aff__Aspect=Imp|Case=Dat|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Dat", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"adj:sg:gen:n:pos__Case=Gen|Gender=Neut|Number=Sing|PronType=Tot": {POS: ADJ, "Case": "Gen", "Gender": "Neut", "Number": "Sing", "PronType": "Tot"},
"ppas:pl:gen:m1:imperf:aff__Aspect=Imp|Case=Gen|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"ppron3:sg:nom:m2:ter:akc:npraep__Case=Nom|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc2|Variant=Long": {POS: PRON, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppron12:pl:dat:f:pri__Case=Dat|Gender=Fem|Number=Plur|Person=1|PronType=Prs": {POS: PRON, "Case": "Dat", "Gender": "Fem", "Number": "Plur", "Person": "1", "PronType": "Prs"},
"adj:sg:loc:n:sup__Case=Loc|Degree=Sup|Gender=Neut|Number=Sing": {POS: ADJ, "Case": "Loc", "Degree": "Sup", "Gender": "Neut", "Number": "Sing"},
"adj:pl:acc:m2:pos__Case=Acc|Gender=Masc|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes|SubGender=Masc2": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"adj:sg:acc:m2:pos__Case=Acc|Gender=Masc|Number=Sing|Poss=Yes|PronType=Prs|Reflex=Yes|SubGender=Masc2": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"adj:pl:gen:n:pos__Case=Gen|Gender=Neut|Number=Plur|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Gen", "Gender": "Neut", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"ppas:pl:loc:n:perf:aff__Aspect=Perf|Case=Loc|Gender=Neut|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Loc", "Gender": "Neut", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"num:pl:gen:n:congr__Case=Gen|Gender=Neut|Number=Plur|NumType=Card": {POS: NUM, "Case": "Gen", "Gender": "Neut", "Number": "Plur", "NumType": "Card"},
"ppas:sg:loc:f:imperf:aff__Aspect=Imp|Case=Loc|Gender=Fem|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Loc", "Gender": "Fem", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"ppron3:pl:loc:m1:ter:akc:praep__Case=Loc|Gender=Masc|Number=Plur|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc1|Variant=Long": {POS: PRON, "Case": "Loc", "Gender": "Masc", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"pact:pl:acc:m2:imperf:aff__Aspect=Imp|Case=Acc|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc2|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"subst:sg:loc:n__Case=Loc|Gender=Neut|Number=Sing|PronType=Rel": {POS: NOUN, "Case": "Loc", "Gender": "Neut", "Number": "Sing", "PronType": "Rel"},
"adj:pl:nom:m2:sup__Case=Nom|Degree=Sup|Gender=Masc|Number=Plur|SubGender=Masc2": {POS: ADJ, "Case": "Nom", "Degree": "Sup", "Gender": "Masc", "Number": "Plur"},
"adj:sg:inst:m3:pos__Case=Ins|Gender=Masc|Number=Sing|PronType=Neg|SubGender=Masc3": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "PronType": "Neg"},
"praet:sg:m2:imperf:nagl__Agglutination=Nagl|Aspect=Imp|Gender=Masc|Mood=Ind|Number=Sing|SubGender=Masc2|Tense=Past|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"},
"ppron3:pl:inst:n:ter:akc:praep__Case=Ins|Gender=Neut|Number=Plur|Person=3|PrepCase=Pre|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Ins", "Gender": "Neut", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppas:pl:loc:n:imperf:aff__Aspect=Imp|Case=Loc|Gender=Neut|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Loc", "Gender": "Neut", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"ppron3:sg:loc:n:ter:akc:praep__Case=Loc|Gender=Neut|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Loc", "Gender": "Neut", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:sg:nom:n:pos__Case=Nom|Gender=Neut|Number=Sing|PronType=Neg": {POS: ADJ, "Case": "Nom", "Gender": "Neut", "Number": "Sing", "PronType": "Neg"},
"adj:sg:nom:n:pos__Case=Nom|Gender=Neut|Number=Sing|PronType=Rel": {POS: ADJ, "Case": "Nom", "Gender": "Neut", "Number": "Sing", "PronType": "Rel"},
"qub:wok__Variant=Long": {POS: PRON, "Variant": "Long"},
"adj:pl:loc:n:pos__Case=Loc|Gender=Neut|Number=Plur|PronType=Rel": {POS: ADJ, "Case": "Loc", "Gender": "Neut", "Number": "Plur", "PronType": "Rel"},
"adj:pl:gen:f:pos__Case=Gen|Gender=Fem|Number=Plur|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Gen", "Gender": "Fem", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"ppas:sg:dat:f:perf:aff__Aspect=Perf|Case=Dat|Gender=Fem|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Dat", "Gender": "Fem", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:inst:f:sup__Case=Ins|Degree=Sup|Gender=Fem|Number=Sing": {POS: ADJ, "Case": "Ins", "Degree": "Sup", "Gender": "Fem", "Number": "Sing"},
"adj:sg:inst:m2:sup__Case=Ins|Degree=Sup|Gender=Masc|Number=Sing|SubGender=Masc2": {POS: ADJ, "Case": "Ins", "Degree": "Sup", "Gender": "Masc", "Number": "Sing"},
"ppron12:sg:acc:m2:sec:nakc__Case=Acc|Gender=Masc|Number=Sing|Person=2|PronType=Prs|SubGender=Masc2|Variant=Short": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "2", "PronType": "Prs", "Variant": "Short"},
"adj:sg:gen:n:pos__Case=Gen|Gender=Neut|Number=Sing|PronType=Ind": {POS: ADJ, "Case": "Gen", "Gender": "Neut", "Number": "Sing", "PronType": "Ind"},
"ppron3:sg:inst:m3:ter:akc:npraep__Case=Ins|Gender=Masc|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|SubGender=Masc3|Variant=Long": {POS: PRON, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppron12:pl:acc:n:sec__Case=Acc|Gender=Neut|Number=Plur|Person=2|PronType=Prs": {POS: PRON, "Case": "Acc", "Gender": "Neut", "Number": "Plur", "Person": "2", "PronType": "Prs"},
"adj:pl:acc:n:sup__Case=Acc|Degree=Sup|Gender=Neut|Number=Plur": {POS: ADJ, "Case": "Acc", "Degree": "Sup", "Gender": "Neut", "Number": "Plur"},
"adj:sg:nom:m2:pos__Case=Nom|Gender=Masc|Number=Sing|PronType=Rel|SubGender=Masc2": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Rel"},
"pact:sg:loc:m2:imperf:aff__Aspect=Imp|Case=Loc|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc2|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Loc", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"ppas:sg:dat:m1:perf:aff__Aspect=Perf|Case=Dat|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Dat", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:voc:f:sup__Case=Voc|Degree=Sup|Gender=Fem|Number=Sing": {POS: ADJ, "Case": "Voc", "Degree": "Sup", "Gender": "Fem", "Number": "Sing"},
"adj:sg:acc:m1:pos__Case=Acc|Gender=Masc|Number=Sing|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"pact:sg:inst:f:imperf:aff__Aspect=Imp|Case=Ins|Gender=Fem|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Ins", "Gender": "Fem", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"adj:pl:loc:m1:com__Case=Loc|Degree=Cmp|Gender=Masc|Number=Plur|SubGender=Masc1": {POS: ADJ, "Case": "Loc", "Degree": "Cmp", "Gender": "Masc", "Number": "Plur"},
"adj:sg:gen:m3:sup__Case=Gen|Degree=Sup|Gender=Masc|Number=Sing|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Degree": "Sup", "Gender": "Masc", "Number": "Sing"},
"adj:pl:nom:m2:pos__Case=Nom|Gender=Masc|Number=Plur|PronType=Dem|SubGender=Masc2": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "PronType": "Dem"},
"adj:sg:acc:m1:pos__Case=Acc|Gender=Masc|Number=Sing|PronType=Ind|SubGender=Masc1": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "PronType": "Ind"},
"adj:sg:acc:n:pos__Case=Acc|Gender=Neut|Number=Sing|PronType=Int": {POS: ADJ, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "PronType": "Int"},
"adj:pl:dat:m3:pos__Case=Dat|Gender=Masc|Number=Plur|PronType=Dem|SubGender=Masc3": {POS: ADJ, "Case": "Dat", "Gender": "Masc", "Number": "Plur", "PronType": "Dem"},
"adj:pl:dat:f:pos__Case=Dat|Gender=Fem|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes": {POS: ADJ, "Case": "Dat", "Gender": "Fem", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"ppron3:pl:inst:f:ter:akc:npraep__Case=Ins|Gender=Fem|Number=Plur|Person=3|PrepCase=Npr|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Ins", "Gender": "Fem", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:pl:acc:m3:pos__Case=Acc|Gender=Masc|Number=Plur|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"pact:pl:nom:m2:imperf:aff__Aspect=Imp|Case=Nom|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc2|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"ppas:pl:dat:m1:perf:aff__Aspect=Perf|Case=Dat|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Dat", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"num:pl:dat:m1:congr__Case=Dat|Gender=Masc|Number=Plur|NumType=Card|PronType=Ind|SubGender=Masc1": {POS: NUM, "Case": "Dat", "Gender": "Masc", "Number": "Plur", "NumType": "Card", "PronType": "Ind"},
"adj:pl:acc:m3:pos__Case=Acc|Gender=Masc|Number=Plur|Number[psor]=Sing|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:gen:n:pos__Case=Gen|Gender=Neut|Number=Plur|PronType=Tot": {POS: ADJ, "Case": "Gen", "Gender": "Neut", "Number": "Plur", "PronType": "Tot"},
"ppron3:sg:loc:m2:ter:akc:praep__Case=Loc|Gender=Masc|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc2|Variant=Long": {POS: PRON, "Case": "Loc", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"adj:pl:loc:m3:sup__Case=Loc|Degree=Sup|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Degree": "Sup", "Gender": "Masc", "Number": "Plur"},
"ppron3:sg:acc:m3:ter:nakc:praep__Case=Acc|Gender=Masc|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc3|Variant=Short": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Short"},
"adj:pl:loc:m1:pos__Case=Loc|Gender=Masc|Number=Plur|PronType=Tot|SubGender=Masc1": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Plur", "PronType": "Tot"},
"adj:pl:inst:m3:pos__Case=Ins|Gender=Masc|Number=Plur|PronType=Tot|SubGender=Masc3": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "PronType": "Tot"},
"pact:pl:loc:f:imperf:aff__Aspect=Imp|Case=Loc|Gender=Fem|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Loc", "Gender": "Fem", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"adj:pl:gen:m3:pos__Case=Gen|Gender=Masc|Number=Plur|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:inst:n:pos__Case=Ins|Gender=Neut|Number=Plur|PronType=Ind": {POS: ADJ, "Case": "Ins", "Gender": "Neut", "Number": "Plur", "PronType": "Ind"},
"adj:sg:dat:n:pos__Case=Dat|Gender=Neut|Number=Sing|PronType=Tot": {POS: ADJ, "Case": "Dat", "Gender": "Neut", "Number": "Sing", "PronType": "Tot"},
"depr:pl:voc:m2__Case=Voc|Gender=Masc|Number=Plur|Polite=Depr|SubGender=Masc2": {POS: NOUN, "Case": "Voc", "Gender": "Masc", "Number": "Plur", "Polite": "Depr"},
"subst:sg:inst:m1__Case=Ins|Gender=Masc|Number=Sing|PronType=Ind|SubGender=Masc1": {POS: NOUN, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "PronType": "Ind"},
"pact:sg:inst:n:imperf:aff__Aspect=Imp|Case=Ins|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Ins", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"num:pl:dat:f:congr__Case=Dat|Gender=Fem|Number=Plur|NumType=Card": {POS: NUM, "Case": "Dat", "Gender": "Fem", "Number": "Plur", "NumType": "Card"},
"ppron12:pl:loc:m1:pri__Case=Loc|Gender=Masc|Number=Plur|Person=1|PronType=Prs|SubGender=Masc1": {POS: PRON, "Case": "Loc", "Gender": "Masc", "Number": "Plur", "Person": "1", "PronType": "Prs"},
"adj:pl:nom:n:pos__Case=Nom|Gender=Neut|Number=Plur|PronType=Rel": {POS: ADJ, "Case": "Nom", "Gender": "Neut", "Number": "Plur", "PronType": "Rel"},
"winien:sg:m3:imperf__Aspect=Imp|Gender=Masc|Mood=Ind|Number=Sing|SubGender=Masc3|Tense=Pres|VerbForm=Fin|Voice=Act": {POS: VERB, "Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Pres", "VerbForm": "Fin", "Voice": "Act"},
"adj:pl:acc:m3:pos__Case=Acc|Gender=Masc|Number=Plur|PronType=Rel|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Plur", "PronType": "Rel"},
"ppron12:sg:dat:m2:sec:nakc__Case=Dat|Gender=Masc|Number=Sing|Person=2|PronType=Prs|SubGender=Masc2|Variant=Short": {POS: PRON, "Case": "Dat", "Gender": "Masc", "Number": "Sing", "Person": "2", "PronType": "Prs", "Variant": "Short"},
"adj:sg:gen:f:pos__Case=Gen|Gender=Fem|Number=Sing|Poss=Yes|PronType=Ind": {POS: ADJ, "Case": "Gen", "Gender": "Fem", "Number": "Sing", "Poss": "Yes", "PronType": "Ind"},
"ppas:sg:gen:f:perf:neg__Aspect=Perf|Case=Gen|Gender=Fem|Number=Sing|Polarity=Neg|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Gen", "Gender": "Fem", "Number": "Sing", "Polarity": "Neg", "VerbForm": "Part", "Voice": "Pass"},
"pact:sg:gen:n:imperf:aff__Aspect=Imp|Case=Gen|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"adj:sg:dat:m2:pos__Case=Dat|Gender=Masc|Number=Sing|PronType=Dem|SubGender=Masc2": {POS: ADJ, "Case": "Dat", "Gender": "Masc", "Number": "Sing", "PronType": "Dem"},
"ppron3:sg:acc:m2:ter:akc:praep__Case=Acc|Gender=Masc|Number=Sing|Person=3|PrepCase=Pre|PronType=Prs|SubGender=Masc2|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"ppas:pl:loc:m1:imperf:aff__Aspect=Imp|Case=Loc|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Loc", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:dat:f:pos__Case=Dat|Gender=Fem|Number=Sing|PronType=Dem": {POS: ADJ, "Case": "Dat", "Gender": "Fem", "Number": "Sing", "PronType": "Dem"},
"num:pl:loc:n:congr__Case=Loc|Gender=Neut|Number=Plur|NumType=Card|PronType=Ind": {POS: NUM, "Case": "Loc", "Gender": "Neut", "Number": "Plur", "NumType": "Card", "PronType": "Ind"},
"ppas:sg:nom:m3:perf:neg__Aspect=Perf|Case=Nom|Gender=Masc|Number=Sing|Polarity=Neg|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Polarity": "Neg", "VerbForm": "Part", "Voice": "Pass"},
"adj:sg:acc:n:pos__Case=Acc|Gender=Neut|Number=Sing|PronType=Rel": {POS: ADJ, "Case": "Acc", "Gender": "Neut", "Number": "Sing", "PronType": "Rel"},
"adj:pl:gen:m2:sup__Case=Gen|Degree=Sup|Gender=Masc|Number=Plur|SubGender=Masc2": {POS: ADJ, "Case": "Gen", "Degree": "Sup", "Gender": "Masc", "Number": "Plur"},
"subst:sg:dat:n__Case=Dat|Gender=Neut|Number=Sing|PronType=Neg": {POS: NOUN, "Case": "Dat", "Gender": "Neut", "Number": "Sing", "PronType": "Neg"},
"adj:sg:dat:m1:pos__Case=Dat|Gender=Masc|Number=Sing|Number[psor]=Plur|Person=2|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Dat", "Gender": "Masc", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"pact:pl:inst:n:imperf:aff__Aspect=Imp|Case=Ins|Gender=Neut|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Ins", "Gender": "Neut", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"adj:pl:loc:m3:pos__Case=Loc|Gender=Masc|Number=Plur|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:nom:m1:pos__Case=Nom|Gender=Masc|Number=Plur|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"num:pl:gen:n:congr__Case=Gen|Gender=Neut|Number=Plur|NumType=Card|PronType=Dem": {POS: NUM, "Case": "Gen", "Gender": "Neut", "Number": "Plur", "NumType": "Card", "PronType": "Dem"},
"adj:pl:nom:m1:pos__Case=Nom|Gender=Masc|Number=Plur|Poss=Yes|PronType=Prs|Reflex=Yes|SubGender=Masc1": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"adj:pl:loc:n:com__Case=Loc|Degree=Cmp|Gender=Neut|Number=Plur": {POS: ADJ, "Case": "Loc", "Degree": "Cmp", "Gender": "Neut", "Number": "Plur"},
"ger:sg:acc:n:imperf:neg__Aspect=Imp|Case=Acc|Gender=Neut|Number=Sing|Polarity=Neg|VerbForm=Vnoun": {POS: VERB, "Aspect": "Imp", "Case": "Acc", "Gender": "Neut", "Number": "Sing", "Polarity": "Neg", "VerbForm": "Vnoun"},
"pact:sg:loc:n:imperf:aff__Aspect=Imp|Case=Loc|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Loc", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"ppron12:sg:dat:n:pri:nakc__Case=Dat|Gender=Neut|Number=Sing|Person=1|PronType=Prs|Variant=Short": {POS: PRON, "Case": "Dat", "Gender": "Neut", "Number": "Sing", "Person": "1", "PronType": "Prs", "Variant": "Short"},
"num:pl:loc:n:congr__Case=Loc|Emphatic=Yes|Gender=Neut|Number=Plur|NumType=Card|PronType=Int": {POS: NUM, "Case": "Loc", "Gender": "Neut", "Number": "Plur", "NumType": "Card", "PronType": "Int"},
"adj:pl:loc:f:pos__Case=Loc|Gender=Fem|Number=Plur|PronType=Int": {POS: ADJ, "Case": "Loc", "Gender": "Fem", "Number": "Plur", "PronType": "Int"},
"adj:pl:loc:n:pos__Case=Loc|Gender=Neut|Number=Plur|PronType=Int": {POS: ADJ, "Case": "Loc", "Gender": "Neut", "Number": "Plur", "PronType": "Int"},
"adj:sg:loc:m3:pos__Case=Loc|Gender=Masc|Number=Sing|PronType=Int|SubGender=Masc3": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Sing", "PronType": "Int"},
"adj:sg:loc:n:pos__Case=Loc|Gender=Neut|Number=Sing|PronType=Tot": {POS: ADJ, "Case": "Loc", "Gender": "Neut", "Number": "Sing", "PronType": "Tot"},
"ppas:pl:gen:m2:perf:aff__Aspect=Perf|Case=Gen|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc2|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"ppron3:pl:dat:f:ter:akc:praep__Case=Dat|Gender=Fem|Number=Plur|Person=3|PrepCase=Pre|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Dat", "Gender": "Fem", "Number": "Plur", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"pact:sg:loc:f:imperf:aff__Aspect=Imp|Case=Loc|Gender=Fem|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Loc", "Gender": "Fem", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"ppas:pl:loc:f:imperf:neg__Aspect=Imp|Case=Loc|Gender=Fem|Number=Plur|Polarity=Neg|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Loc", "Gender": "Fem", "Number": "Plur", "Polarity": "Neg", "VerbForm": "Part", "Voice": "Pass"},
"pact:sg:acc:n:imperf:aff__Aspect=Imp|Case=Acc|Gender=Neut|Number=Sing|Polarity=Pos|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Acc", "Gender": "Neut", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"num:pl:loc:m1:congr__Case=Loc|Gender=Masc|Number=Plur|NumType=Card|SubGender=Masc1": {POS: NUM, "Case": "Loc", "Gender": "Masc", "Number": "Plur", "NumType": "Card"},
"adj:pl:loc:m1:pos__Case=Loc|Gender=Masc|Number=Plur|PronType=Dem|SubGender=Masc1": {POS: ADJ, "Case": "Loc", "Gender": "Masc", "Number": "Plur", "PronType": "Dem"},
"pact:sg:inst:f:imperf:neg__Aspect=Imp|Case=Ins|Gender=Fem|Number=Sing|Polarity=Neg|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Ins", "Gender": "Fem", "Number": "Sing", "Polarity": "Neg", "VerbForm": "Part", "Voice": "Act"},
"adj:sg:acc:m3:pos__Case=Acc|Gender=Masc|Number=Sing|Number[psor]=Plur|Person=2|Poss=Yes|PronType=Prs|SubGender=Masc3": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:gen:m3:com__Case=Gen|Degree=Cmp|Gender=Masc|Number=Plur|SubGender=Masc3": {POS: ADJ, "Case": "Gen", "Degree": "Cmp", "Gender": "Masc", "Number": "Plur"},
"adj:sg:acc:m2:pos__Case=Acc|Gender=Masc|Number=Sing|PronType=Dem|SubGender=Masc2": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "PronType": "Dem"},
"adj:pl:inst:m1:pos__Case=Ins|Gender=Masc|Number=Plur|Number[psor]=Plur|Person=1|Poss=Yes|PronType=Prs|SubGender=Masc1": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "Person": "1", "Poss": "Yes", "PronType": "Prs"},
"adj:pl:nom:m2:pos__Case=Nom|Gender=Masc|Number=Plur|PronType=Tot|SubGender=Masc2": {POS: ADJ, "Case": "Nom", "Gender": "Masc", "Number": "Plur", "PronType": "Tot"},
"adj:sg:gen:f:pos__Case=Gen|Gender=Fem|Number=Sing|Number[psor]=Sing|Person=2|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Gen", "Gender": "Fem", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
"adj:sg:dat:n:com__Case=Dat|Degree=Cmp|Gender=Neut|Number=Sing": {POS: ADJ, "Case": "Dat", "Degree": "Cmp", "Gender": "Neut", "Number": "Sing"},
"adj:pl:dat:m1:pos__Case=Dat|Gender=Masc|Number=Plur|PronType=Rel|SubGender=Masc1": {POS: ADJ, "Case": "Dat", "Gender": "Masc", "Number": "Plur", "PronType": "Rel"},
"num:pl:loc:f:congr__Case=Loc|Gender=Fem|Number=Plur|NumType=Card|PronType=Dem": {POS: NUM, "Case": "Loc", "Gender": "Fem", "Number": "Plur", "NumType": "Card", "PronType": "Dem"},
"pact:pl:gen:m2:imperf:aff__Aspect=Imp|Case=Gen|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc2|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Gen", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"num:pl:dat:n:congr__Case=Dat|Gender=Neut|Number=Plur|NumType=Card|PronType=Ind": {POS: NUM, "Case": "Dat", "Gender": "Neut", "Number": "Plur", "NumType": "Card", "PronType": "Ind"},
"adj:pl:gen:m2:pos__Case=Gen|Gender=Masc|Number=Plur|PronType=Dem|SubGender=Masc2": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Plur", "PronType": "Dem"},
"subst:pl:loc:m2__Case=Loc|Gender=Masc|Number=Plur|SubGender=Masc2": {POS: NOUN, "Case": "Loc", "Gender": "Masc", "Number": "Plur"},
"adj:sg:gen:m1:pos__Case=Gen|Gender=Masc|Number=Sing|PronType=Rel|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "PronType": "Rel"},
"pact:sg:nom:m2:imperf:aff__Aspect=Imp|Case=Nom|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc2|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Nom", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"ppas:pl:acc:m3:perf:neg__Aspect=Perf|Case=Acc|Gender=Masc|Number=Plur|Polarity=Neg|SubGender=Masc3|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Polarity": "Neg", "VerbForm": "Part", "Voice": "Pass"},
"ppron12:sg:acc:m3:pri:akc__Case=Acc|Gender=Masc|Number=Sing|Person=1|PronType=Prs|SubGender=Masc3|Variant=Long": {POS: PRON, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "Person": "1", "PronType": "Prs", "Variant": "Long"},
"adj:sg:gen:m1:pos__Case=Gen|Gender=Masc|Number=Sing|PronType=Ind|SubGender=Masc1": {POS: ADJ, "Case": "Gen", "Gender": "Masc", "Number": "Sing", "PronType": "Ind"},
"pact:pl:loc:m3:imperf:aff__Aspect=Imp|Case=Loc|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc3|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Loc", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"pact:sg:loc:m1:imperf:aff__Aspect=Imp|Case=Loc|Gender=Masc|Number=Sing|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Act": {POS: VERB, "Aspect": "Imp", "Case": "Loc", "Gender": "Masc", "Number": "Sing", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Act"},
"ppas:pl:dat:f:perf:aff__Aspect=Perf|Case=Dat|Gender=Fem|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Dat", "Gender": "Fem", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"ppas:pl:dat:f:imperf:aff__Aspect=Imp|Case=Dat|Gender=Fem|Number=Plur|Polarity=Pos|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Imp", "Case": "Dat", "Gender": "Fem", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"ppron3:sg:inst:f:ter:akc:npraep__Case=Ins|Gender=Fem|Number=Sing|Person=3|PrepCase=Npr|PronType=Prs|Variant=Long": {POS: PRON, "Case": "Ins", "Gender": "Fem", "Number": "Sing", "Person": "3", "PronType": "Prs", "Variant": "Long"},
"num:pl:inst:f:congr__Case=Ins|Gender=Fem|Number=Plur|NumType=Card|PronType=Ind": {POS: NUM, "Case": "Ins", "Gender": "Fem", "Number": "Plur", "NumType": "Card", "PronType": "Ind"},
"adj:sg:inst:f:pos__Case=Ins|Emphatic=Yes|Gender=Fem|Number=Sing|PronType=Dem": {POS: ADJ, "Case": "Ins", "Gender": "Fem", "Number": "Sing", "PronType": "Dem"},
"adj:pl:inst:m2:pos__Case=Ins|Gender=Masc|Number=Plur|PronType=Tot|SubGender=Masc2": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Plur", "PronType": "Tot"},
"ppas:pl:acc:m1:perf:aff__Aspect=Perf|Case=Acc|Gender=Masc|Number=Plur|Polarity=Pos|SubGender=Masc1|VerbForm=Part|Voice=Pass": {POS: VERB, "Aspect": "Perf", "Case": "Acc", "Gender": "Masc", "Number": "Plur", "Polarity": "Pos", "VerbForm": "Part", "Voice": "Pass"},
"adj:pl:inst:f:pos__Case=Ins|Gender=Fem|Number=Plur|PronType=Ind": {POS: ADJ, "Case": "Ins", "Gender": "Fem", "Number": "Plur", "PronType": "Ind"},
"adj:sg:inst:n:pos__Case=Ins|Gender=Neut|Number=Sing|PronType=Ind": {POS: ADJ, "Case": "Ins", "Gender": "Neut", "Number": "Sing", "PronType": "Ind"},
"adj:sg:acc:m2:pos__Case=Acc|Gender=Masc|Number=Sing|PronType=Int|SubGender=Masc2": {POS: ADJ, "Case": "Acc", "Gender": "Masc", "Number": "Sing", "PronType": "Int"},
"adj:sg:inst:m1:pos__Case=Ins|Gender=Masc|Number=Sing|Poss=Yes|PronType=Prs|Reflex=Yes|SubGender=Masc1": {POS: ADJ, "Case": "Ins", "Gender": "Masc", "Number": "Sing", "Poss": "Yes", "PronType": "Prs", "Reflex": "Yes"},
"subst:sg:nom:m1__Case=Nom|Gender=Masc|Number=Sing|PronType=Rel|SubGender=Masc1": {POS: NOUN, "Case": "Nom", "Gender": "Masc", "Number": "Sing", "PronType": "Rel"},
"adj:sg:acc:f:pos__Case=Acc|Gender=Fem|Number=Sing|Number[psor]=Plur|Person=2|Poss=Yes|PronType=Prs": {POS: ADJ, "Case": "Acc", "Gender": "Fem", "Number": "Sing", "Person": "2", "Poss": "Yes", "PronType": "Prs"},
}
| 187.10727 | 275 | 0.652535 | 31,345 | 211,057 | 4.321933 | 0.005073 | 0.085922 | 0.120232 | 0.079722 | 0.985672 | 0.979612 | 0.970842 | 0.953008 | 0.936134 | 0.862148 | 0 | 0.009544 | 0.078642 | 211,057 | 1,127 | 276 | 187.273292 | 0.687111 | 0 | 0 | 0 | 0 | 0.914742 | 0.693235 | 0.450892 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.079929 | 0.001776 | 0 | 0.001776 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 11 |
6b626f743111c88f30530aac2e09a665237f7894 | 31 | py | Python | project1/subpackage/module.py | stroxler/pyre-check-scratch-projects | 0da938da1fbf84437f5d37fc1685e2c561a44e10 | [
"MIT"
] | null | null | null | project1/subpackage/module.py | stroxler/pyre-check-scratch-projects | 0da938da1fbf84437f5d37fc1685e2c561a44e10 | [
"MIT"
] | null | null | null | project1/subpackage/module.py | stroxler/pyre-check-scratch-projects | 0da938da1fbf84437f5d37fc1685e2c561a44e10 | [
"MIT"
] | null | null | null | def f():
return "a string"
| 10.333333 | 21 | 0.548387 | 5 | 31 | 3.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.290323 | 31 | 2 | 22 | 15.5 | 0.772727 | 0 | 0 | 0 | 0 | 0 | 0.258065 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
6b9fd6fce64b1b73ad691bc0c9c4eb923402538f | 1,059 | py | Python | vscode/extensions/magicstack.magicpython-1.0.12/test/strings/format12.py | nlimpid/dotfiles | b78d08707992f742f984f556fa58349c2ccd095d | [
"MIT"
] | 5 | 2017-02-22T10:17:39.000Z | 2021-04-06T16:36:13.000Z | test/strings/format12.py | Setonas/MagicSetonas | ef76da5f27a0506b194c58072b81424e3ce985d7 | [
"MIT"
] | 4 | 2019-06-16T09:52:03.000Z | 2019-08-18T02:11:35.000Z | vscode/extensions/magicstack.magicpython-1.0.12/test/strings/format12.py | nlimpid/dotfiles | b78d08707992f742f984f556fa58349c2ccd095d | [
"MIT"
] | 1 | 2020-08-29T02:30:52.000Z | 2020-08-29T02:30:52.000Z | a = R'$\frac{m_{j \%srightarrow i}(\mathrm{%sgood})}{\su%m{m_{j \rightarrow i}}}$'
a : source.python
: source.python
= : keyword.operator.assignment.python, source.python
: source.python
R : source.python, storage.type.string.python, string.quoted.raw.single.python
' : punctuation.definition.string.begin.python, source.python, string.quoted.raw.single.python
$\frac : source.python, string.quoted.raw.single.python
{m_{j \ : source.python, string.quoted.raw.single.python
%s : constant.character.format.placeholder.other.python, source.python, string.quoted.raw.single.python
rightarrow i}(\mathrm{ : source.python, string.quoted.raw.single.python
%s : constant.character.format.placeholder.other.python, source.python, string.quoted.raw.single.python
good})}{\su%m{m_{j \rightarrow i}}}$ : source.python, string.quoted.raw.single.python
' : punctuation.definition.string.end.python, source.python, string.quoted.raw.single.python
| 58.833333 | 114 | 0.675165 | 133 | 1,059 | 5.345865 | 0.24812 | 0.219409 | 0.227848 | 0.265823 | 0.752461 | 0.752461 | 0.707454 | 0.644163 | 0.517581 | 0.348805 | 0 | 0 | 0.17847 | 1,059 | 17 | 115 | 62.294118 | 0.817241 | 0 | 0 | 0.285714 | 0 | 0.357143 | 0.070822 | 0.029273 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6ba03fdfb20c20c5978f15203528a5a9b5a16bb7 | 1,611 | py | Python | home/models.py | Kshitij-Kumar-Singh-Chauhan/docon | bff0547e7bbd030e027217a2ca7800a8da529b56 | [
"MIT"
] | null | null | null | home/models.py | Kshitij-Kumar-Singh-Chauhan/docon | bff0547e7bbd030e027217a2ca7800a8da529b56 | [
"MIT"
] | null | null | null | home/models.py | Kshitij-Kumar-Singh-Chauhan/docon | bff0547e7bbd030e027217a2ca7800a8da529b56 | [
"MIT"
] | 2 | 2021-06-17T05:35:07.000Z | 2021-06-17T06:01:23.000Z | from django.db import models
# Create your models here.
class Contact(models.Model):
email = models.CharField(max_length=122)
name = models.CharField(max_length=122)
phone = models.CharField(max_length=12)
address = models.TextField()
date = models.DateField()
def __str__(self):
return self.name
class UserDetails(models.Model):
name = models.CharField(max_length=100, default=" ")
email = models.CharField(max_length=100)
password = models.CharField(max_length=100)
key = models.CharField(max_length=100, default=" ")
profession = models.CharField(max_length=100, default="PATIENT" )
data = models.CharField(max_length=100, default=" ")
def __str__(self):
return self.email
class Book(models.Model):
email = models.CharField(max_length=122)
name = models.CharField(max_length=122)
phone = models.CharField(max_length=12)
problem = models.TextField()
date = models.DateField()
def __str__(self):
return self.name+" "+self.problem
class Report(models.Model):
email = models.CharField(max_length=122)
name = models.CharField(max_length=122)
phone = models.CharField(max_length=12)
message = models.TextField()
date = models.DateField()
def __str__(self):
return self.name
class Diagnostic(models.Model):
email = models.CharField(max_length=122)
name = models.CharField(max_length=122)
phone = models.CharField(max_length=12)
tests = models.TextField()
date = models.DateField()
def __str__(self):
return self.name | 29.833333 | 69 | 0.683426 | 198 | 1,611 | 5.368687 | 0.207071 | 0.253998 | 0.304798 | 0.406397 | 0.828786 | 0.750706 | 0.622766 | 0.622766 | 0.622766 | 0.622766 | 0 | 0.03885 | 0.201117 | 1,611 | 54 | 70 | 29.833333 | 0.787102 | 0.014898 | 0 | 0.571429 | 0 | 0 | 0.006936 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.119048 | false | 0.02381 | 0.02381 | 0.119048 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
6becf706ff15ad980f8cec97364a33a05135114f | 96,826 | py | Python | test/end/test_ending.py | reputage/bluepea | ab77548b76bfd410c60fd98c2b35a0ca84715ff0 | [
"Apache-2.0"
] | null | null | null | test/end/test_ending.py | reputage/bluepea | ab77548b76bfd410c60fd98c2b35a0ca84715ff0 | [
"Apache-2.0"
] | null | null | null | test/end/test_ending.py | reputage/bluepea | ab77548b76bfd410c60fd98c2b35a0ca84715ff0 | [
"Apache-2.0"
] | null | null | null | from __future__ import generator_stop
import os
import tempfile
import shutil
import binascii
import base64
import datetime
import time
import copy
from collections import OrderedDict as ODict
try:
import simplejson as json
except ImportError:
import json
import arrow
import libnacl
from ioflo.base import storing
from ioflo.aid import timing
from ioflo.aio.http import Valet, Patron
from ioflo.aid import odict
import falcon
import pytest
from pytest import approx
import pytest_falcon # declares client fixture
"""
PyTest fixtures are registered globally in the pytest package
So any test function can accept a fixture as a parameter supplied by
the pytest runner
pytest_falcon assumes there is a fixture named 'app'
"""
from bluepea import bluepeaing
from bluepea.bluepeaing import SEPARATOR_BYTES, PROPAGATION_DELAY
from bluepea.help.helping import (key64uToKey, keyToKey64u, makeDid,
verify, verify64u, parseSignatureHeader,
setupTmpBaseDir, cleanupTmpBaseDir,
makeSignedAgentReg, makeSignedThingReg,
extractDidSignerParts,
)
from bluepea.db.dbing import setupTestDbEnv, setupTestDbAgentsThings
import bluepea.db.dbing as dbing
import bluepea.keep.keeping as keeping
import bluepea.prime.priming as priming
import bluepea.end.ending as ending
store = storing.Store(stamp=0.0)
exapp = falcon.API() # falcon.API instances are callable WSGI apps
ending.loadEnds(exapp, store=store)
@pytest.fixture
def app():
return exapp
def test_get_StaticSink(client): # client is a fixture in pytest_falcon
"""
Test GET to static files
"""
print("Testing GET /static")
# get default /static
rep = client.get('/static')
assert rep.status == falcon.HTTP_OK
assert rep.headers['content-type'] == 'text/html; charset=UTF-8'
assert len(rep.body) > 0
# get default /
rep = client.get('/')
assert rep.status == falcon.HTTP_OK
assert rep.headers['content-type'] == 'text/html; charset=UTF-8'
assert len(rep.body) > 0
# get default trailing /
rep = client.get('/static/')
assert rep.status == falcon.HTTP_OK
assert rep.headers['content-type'] == 'text/html; charset=UTF-8'
assert len(rep.body) > 0
# get main.html
rep = client.get('/static/main.html')
assert rep.status == falcon.HTTP_OK
assert rep.headers['content-type'] == 'text/html; charset=UTF-8'
assert len(rep.body) > 0
# get main.html
rep = client.get('/main.html')
assert rep.status == falcon.HTTP_OK
assert rep.headers['content-type'] == 'text/html; charset=UTF-8'
assert len(rep.body) > 0
# attempt missing file
rep = client.get('/static/missing.txt')
assert rep.status == falcon.HTTP_NOT_FOUND
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
assert rep.json == {
'description': 'File '
'"/Data/Code/private/indigo/bluepea/src/bluepea/static/missing.txt" '
'not found or forbidden',
'title': 'Missing Resource'}
# get robots.txt
rep = client.get('/static/robots.txt')
assert rep.status == falcon.HTTP_OK
assert rep.headers['content-type'] == 'text/plain; charset=UTF-8'
assert rep.body == '# robotstxt.org\n\nUser-agent: *\n'
# get trial.js
rep = client.get('/static/trial.js')
assert rep.status == falcon.HTTP_OK
assert rep.headers['content-type'] == 'application/javascript; charset=UTF-8'
assert len(rep.body) > 0
print("Done Test")
def test_post_AgentRegisterSigned():
"""
Use libnacl and Base64 to generate compliant signed Agent Registration
Test both POST to create resource and subsequent GET to retrieve it.
"""
global store # use Global store
print("Testing Issuer creation POST Demo /agent/register with signature ")
priming.setupTest()
dbEnv = dbing.gDbEnv
keeper = keeping.gKeeper
kdid = keeper.did
# create local test server
valet = Valet(port=8101,
bufsize=131072,
store=store,
app=exapp,)
result = valet.open()
assert result
assert valet.servant.ha == ('0.0.0.0', 8101)
assert valet.servant.eha == ('127.0.0.1', 8101)
# Create registration for issuer Ike
#seed = libnacl.randombytes(libnacl.crypto_sign_SEEDBYTES)
# Ann's seed
seed = (b'PTi\x15\xd5\xd3`\xf1u\x15}^r\x9bfH\x02l\xc6\x1b\x1d\x1c\x0b9\xd7{\xc0_'
b'\xf2K\x93`')
# creates signing/verification key pair
vk, sk = libnacl.crypto_sign_seed_keypair(seed)
dt = datetime.datetime(2000, 1, 1, tzinfo=datetime.timezone.utc)
date = timing.iso8601(dt, aware=True)
assert date == "2000-01-01T00:00:00+00:00"
assert arrow.get(date).datetime == dt
sig, ser = makeSignedAgentReg(vk, sk, changed=date)
assert sig == ('AeYbsHot0pmdWAcgTo5sD8iAuSQAfnH5U6wiIGpVNJQQoYKBYrPPx'
'AoIc1i5SHCIDS8KFFgf8i0tDq8XGizaCg==')
assert ser == (
'{\n'
' "did": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=",\n'
' "signer": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=#0",\n'
' "changed": "2000-01-01T00:00:00+00:00",\n'
' "keys": [\n'
' {\n'
' "key": "Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=",\n'
' "kind": "EdDSA"\n'
' }\n'
' ]\n'
'}')
dat = json.loads(ser, object_pairs_hook=ODict)
did = dat['did']
headers = ODict()
headers["Content-Type"] = "application/json; charset=UTF-8"
headers["Signature"] = 'signer="{}"'.format(sig)
assert headers['Signature'] == ('signer="AeYbsHot0pmdWAcgTo5sD8iAuSQAfnH5U6wiIGpVNJQQoYKBYrPPxAoIc1i5SHCIDS8KFFgf8i0tDq8XGizaCg=="')
body = ser # client.post encodes the body
path = "http://{}:{}{}".format('localhost', valet.servant.eha[1], '/agent')
# instantiate Patron client
patron = Patron(bufsize=131072,
store=store,
method = 'POST',
path=path,
headers=headers,
body=body,
reconnectable=True,)
patron.transmit()
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
assert len(patron.responses) == 1
rep = patron.responses.popleft()
assert rep['status'] == 201
assert rep['reason'] == 'Created'
assert rep['data'] == dat == {
'changed': '2000-01-01T00:00:00+00:00',
'did': 'did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=',
'keys':
[
{
'key': 'Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=',
'kind': 'EdDSA'
}
],
'signer': 'did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=#0'
}
assert rep['body'].decode() == (
'{\n'
' "did": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=",\n'
' "signer": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=#0",\n'
' "changed": "2000-01-01T00:00:00+00:00",\n'
' "keys": [\n'
' {\n'
' "key": "Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=",\n'
' "kind": "EdDSA"\n'
' }\n'
' ]\n'
'}')
location = falcon.uri.decode(rep['headers']['location'])
assert location == "/agent?did=did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE="
path, query = location.rsplit("?", maxsplit=1)
assert query == "did=did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE="
query = falcon.uri.parse_query_string(query)
assert query['did'] == "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE="
assert rep['headers']['content-type'] == "application/json; charset=UTF-8"
# make sure in database
rdat, rser, rsig = dbing.getSelfSigned(did)
assert rdat == dat
assert rser == ser
assert rsig == sig
# now get from location
headers = odict([('Accept', 'application/json'),
('Content-Length', 0)])
patron.request(method='GET', path=location, headers=headers)
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
assert len(patron.responses) == 1
rep = patron.responses.popleft()
assert rep['status'] == 200
assert rep['headers']['content-type'] == 'application/json; charset=UTF-8'
sigs = parseSignatureHeader(rep['headers']['signature'])
assert sigs['signer'] == sig
assert rep['data'] == dat
ser = rep['body'].decode()
assert verify64u(sig, ser, dat['keys'][0]['key'])
# now get all
headers = odict([('Accept', 'application/json'),
('Content-Length', 0)])
path = "/agent"
qargs = odict(all="true")
patron.request(method='GET', path=path, qargs=qargs, headers=headers)
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
assert len(patron.responses) == 1
rep = patron.responses.popleft()
assert rep['status'] == 200
assert rep['headers']['content-type'] == 'application/json; charset=UTF-8'
assert len(rep['data']) == 2 # server and new agent
assert did in rep["data"]
cleanupTmpBaseDir(dbEnv.path())
valet.close()
patron.close()
print("Done Test")
def test_post_IssuerRegisterSigned():
"""
Use libnacl and Base64 to generate compliant signed Agent Registration
Test both POST to create resource and subsequent GET to retrieve it.
"""
global store # use Global store
print("Testing Issuer creation POST Demo /agent/register with signature ")
#store = Store(stamp=0.0) # create store use global above
priming.setupTest()
dbEnv = dbing.gDbEnv
keeper = keeping.gKeeper
kdid = keeper.did
# create local test server
valet = Valet(port=8101,
bufsize=131072,
store=store,
app=exapp,)
result = valet.open()
assert result
assert valet.servant.ha == ('0.0.0.0', 8101)
assert valet.servant.eha == ('127.0.0.1', 8101)
# Create registration for issuer Ike
#seed = libnacl.randombytes(libnacl.crypto_sign_SEEDBYTES)
# ike's seed
seed = (b'!\x85\xaa\x8bq\xc3\xf8n\x93]\x8c\xb18w\xb9\xd8\xd7\xc3\xcf\x8a\x1dP\xa9m'
b'\x89\xb6h\xfe\x10\x80\xa6S')
# creates signing/verification key pair
vk, sk = libnacl.crypto_sign_seed_keypair(seed)
dt = datetime.datetime(2000, 1, 1, tzinfo=datetime.timezone.utc)
stamp = timing.iso8601(dt, aware=True)
assert stamp == "2000-01-01T00:00:00+00:00"
assert arrow.get(stamp).datetime == dt
issuant = ODict(kind="dns",
issuer="localhost",
registered=stamp,
validationURL="http://localhost:8101/demo/check")
issuants = [issuant] # list of hids
sig, ser = makeSignedAgentReg(vk, sk, changed=stamp, issuants=issuants)
assert sig == ('YbLIOlRx8xh5taCxW-_aCBoPboLAZjK5-d1DP4OZ9PWn13BpPCe12ZFVZfFlSsM3Pv-zljbsJnR6Adz7iE5ZAw==')
assert ser == (
'{\n'
' "did": "did:igo:3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA=",\n'
' "signer": "did:igo:3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA=#0",\n'
' "changed": "2000-01-01T00:00:00+00:00",\n'
' "keys": [\n'
' {\n'
' "key": "3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA=",\n'
' "kind": "EdDSA"\n'
' }\n'
' ],\n'
' "issuants": [\n'
' {\n'
' "kind": "dns",\n'
' "issuer": "localhost",\n'
' "registered": "2000-01-01T00:00:00+00:00",\n'
' "validationURL": "http://localhost:8101/demo/check"\n'
' }\n'
' ]\n'
'}')
dat = json.loads(ser, object_pairs_hook=ODict)
did = dat['did']
headers = ODict()
headers["Content-Type"] = "application/json; charset=UTF-8"
headers["Signature"] = 'signer="{}"'.format(sig)
assert headers['Signature'] == ('signer="YbLIOlRx8xh5taCxW-_aCBoPboLAZjK5-d1DP4OZ9PWn13BpPCe12ZFVZfFlSsM3Pv-zljbsJnR6Adz7iE5ZAw=="')
body = ser # client.post encodes the body
path = "http://{}:{}{}".format('localhost', valet.servant.eha[1], '/agent')
# instantiate Patron client
patron = Patron(bufsize=131072,
store=store,
method = 'POST',
path=path,
headers=headers,
body=body,
reconnectable=True,)
patron.transmit()
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
assert len(patron.responses) == 1
rep = patron.responses.popleft()
assert rep['status'] == 201
assert rep['reason'] == 'Created'
body = rep['body'].decode()
assert body == (
'{\n'
' "did": "did:igo:3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA=",\n'
' "signer": "did:igo:3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA=#0",\n'
' "changed": "2000-01-01T00:00:00+00:00",\n'
' "keys": [\n'
' {\n'
' "key": "3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA=",\n'
' "kind": "EdDSA"\n'
' }\n'
' ],\n'
' "issuants": [\n'
' {\n'
' "kind": "dns",\n'
' "issuer": "localhost",\n'
' "registered": "2000-01-01T00:00:00+00:00",\n'
' "validationURL": "http://localhost:8101/demo/check"\n'
' }\n'
' ]\n'
'}')
assert rep['data'] == {'changed': '2000-01-01T00:00:00+00:00',
'did': 'did:igo:3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA=',
'issuants': [{'issuer': 'localhost',
'kind': 'dns',
'registered': '2000-01-01T00:00:00+00:00',
'validationURL': 'http://localhost:8101/demo/check'}],
'keys': [{'key': '3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA=',
'kind': 'EdDSA'}],
'signer': 'did:igo:3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA=#0'}
location = falcon.uri.decode(rep['headers']['location'])
assert location == "/agent?did=did:igo:3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA="
assert rep['headers']['content-type'] == "application/json; charset=UTF-8"
reg = rep['data']
assert reg == dat
# make sure in database
rdat, rser, rsig = dbing.getSelfSigned(did)
assert rdat == dat
assert rser == ser
assert rsig == sig
# now get from location
headers = odict([('Accept', 'application/json'),
('Content-Length', 0)])
#request = odict([('method', 'GET'),
#('path', location),
#('headers', headers) ])
#patron.requests.append(request)
patron.request(method='GET', path=location, headers=headers)
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
assert len(patron.responses) == 1
rep = patron.responses.popleft()
assert rep['status'] == 200
assert rep['headers']['content-type'] == 'application/json; charset=UTF-8'
sigs = parseSignatureHeader(rep['headers']['signature'])
assert sigs['signer'] == sig
assert rep['data'] == dat
ser = rep['body'].decode()
assert verify64u(sig, ser, dat['keys'][0]['key'])
# now get all issuers
headers = odict([('Accept', 'application/json'),
('Content-Length', 0)])
path = "/agent"
qargs = odict(all="true", issuer="true")
patron.request(method='GET', path=path, qargs=qargs, headers=headers)
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
assert len(patron.responses) == 1
rep = patron.responses.popleft()
assert rep['status'] == 200
assert rep['headers']['content-type'] == 'application/json; charset=UTF-8'
assert len(rep['data']) == 1 # new agent
assert did in rep["data"]
cleanupTmpBaseDir(dbEnv.path())
valet.close()
patron.close()
print("Done Test")
def test_post_IssuerRegisterSignedFakeHid():
"""
Use fake hid kind
Use libnacl and Base64 to generate compliant signed Agent Registration
Test both POST to create resource and subsequent GET to retrieve it.
"""
global store # use Global store
print("Testing Issuer creation POST Demo /agent/register with signature Fake hid")
bluepeaing.fakeHidKind = True
#store = Store(stamp=0.0) # create store use global above
priming.setupTest()
dbEnv = dbing.gDbEnv
keeper = keeping.gKeeper
kdid = keeper.did
# create local test server
valet = Valet(port=8101,
bufsize=131072,
store=store,
app=exapp,)
result = valet.open()
assert result
assert valet.servant.ha == ('0.0.0.0', 8101)
assert valet.servant.eha == ('127.0.0.1', 8101)
# Create registration for issuer Ike
#seed = libnacl.randombytes(libnacl.crypto_sign_SEEDBYTES)
# ike's seed
seed = (b'!\x85\xaa\x8bq\xc3\xf8n\x93]\x8c\xb18w\xb9\xd8\xd7\xc3\xcf\x8a\x1dP\xa9m'
b'\x89\xb6h\xfe\x10\x80\xa6S')
# creates signing/verification key pair
vk, sk = libnacl.crypto_sign_seed_keypair(seed)
dt = datetime.datetime(2000, 1, 1, tzinfo=datetime.timezone.utc)
stamp = timing.iso8601(dt, aware=True)
assert stamp == "2000-01-01T00:00:00+00:00"
assert arrow.get(stamp).datetime == dt
issuant = ODict(kind="fake",
issuer="generic",
registered=stamp)
issuants = [issuant] # list of hids
sig, ser = makeSignedAgentReg(vk, sk, changed=stamp, issuants=issuants)
assert sig == 'tnvn8jrwUxtoiCItTjFrCUYgHk3L4Q32INHTD9lFTBEFpJdgBCjsxHIN2e5eu4oICdi9b2tvtJmDbTi2U3PkBA=='
assert ser == (
'{\n'
' "did": "did:igo:3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA=",\n'
' "signer": "did:igo:3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA=#0",\n'
' "changed": "2000-01-01T00:00:00+00:00",\n'
' "keys": [\n'
' {\n'
' "key": "3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA=",\n'
' "kind": "EdDSA"\n'
' }\n'
' ],\n'
' "issuants": [\n'
' {\n'
' "kind": "fake",\n'
' "issuer": "generic",\n'
' "registered": "2000-01-01T00:00:00+00:00"\n'
' }\n'
' ]\n'
'}')
dat = json.loads(ser, object_pairs_hook=ODict)
did = dat['did']
headers = ODict()
headers["Content-Type"] = "application/json; charset=UTF-8"
headers["Signature"] = 'signer="{}"'.format(sig)
body = ser # client.post encodes the body
path = "http://{}:{}{}".format('localhost', valet.servant.eha[1], '/agent')
# instantiate Patron client
patron = Patron(bufsize=131072,
store=store,
method = 'POST',
path=path,
headers=headers,
body=body,
reconnectable=True,)
patron.transmit()
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
assert len(patron.responses) == 1
rep = patron.responses.popleft()
assert rep['status'] == 201
assert rep['reason'] == 'Created'
assert rep['data'] == {'changed': '2000-01-01T00:00:00+00:00',
'did': 'did:igo:3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA=',
'issuants': [{'issuer': 'generic',
'kind': 'fake',
'registered': '2000-01-01T00:00:00+00:00'}],
'keys': [{'key': '3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA=',
'kind': 'EdDSA'}],
'signer': 'did:igo:3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA=#0'}
location = falcon.uri.decode(rep['headers']['location'])
assert location == "/agent?did=did:igo:3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA="
assert rep['headers']['content-type'] == "application/json; charset=UTF-8"
reg = rep['data']
assert reg == dat
# make sure in database
rdat, rser, rsig = dbing.getSelfSigned(did)
assert rdat == dat
assert rser == ser
assert rsig == sig
# now get from location
headers = odict([('Accept', 'application/json'),
('Content-Length', 0)])
#request = odict([('method', 'GET'),
#('path', location),
#('headers', headers) ])
#patron.requests.append(request)
patron.request(method='GET', path=location, headers=headers)
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
assert len(patron.responses) == 1
rep = patron.responses.popleft()
assert rep['status'] == 200
assert rep['headers']['content-type'] == 'application/json; charset=UTF-8'
sigs = parseSignatureHeader(rep['headers']['signature'])
assert sigs['signer'] == sig
assert rep['data'] == dat
ser = rep['body'].decode()
assert verify64u(sig, ser, dat['keys'][0]['key'])
cleanupTmpBaseDir(dbEnv.path())
valet.close()
patron.close()
bluepeaing.fakeHidKind = False
print("Done Test")
def test_post_ThingRegisterSigned(): # client is a fixture in pytest_falcon
"""
Use libnacl and Base64 to generate compliant signed Thing Registration
Test both POST to create resource and subsequent GET to retrieve it.
Does an Agent registration to setup database
"""
global store # use Global store
print("Testing Thing creation POST /thing with signature ")
priming.setupTest()
dbEnv = dbing.gDbEnv
keeper = keeping.gKeeper
kdid = keeper.did
# create local test server
valet = Valet(port=8101,
bufsize=131072,
store=store,
app=exapp,)
result = valet.open()
assert result
assert valet.servant.ha == ('0.0.0.0', 8101)
assert valet.servant.eha == ('127.0.0.1', 8101)
dt = datetime.datetime(2000, 1, 1, tzinfo=datetime.timezone.utc)
changed = timing.iso8601(dt, aware=True)
# First create the controlling agent for thing
# random seed used to generate private signing key
#seed = libnacl.randombytes(libnacl.crypto_sign_SEEDBYTES)
# make "ivy" the issurer
seed = seed = (b"\xb2PK\xad\x9b\x92\xa4\x07\xc6\xfa\x0f\x13\xd7\xe4\x08\xaf\xc7'~\x86"
b'\xd2\x92\x93rA|&9\x16Bdi')
# creates signing/verification key pair
ivk, isk = libnacl.crypto_sign_seed_keypair(seed)
issuant = ODict(kind="dns",
issuer="localhost",
registered=changed,
validationURL="http://localhost:8101/demo/check")
issuants = [issuant] # list of issuants hid name spaces
sig, ser = makeSignedAgentReg(ivk, isk, changed=changed, issuants=issuants)
idat = json.loads(ser, object_pairs_hook=ODict)
idid = idat['did']
dbing.putSigned(key=idid, ser=ser, sig=sig, clobber=False)
assert idat == {
'did': 'did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=',
'signer': 'did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=#0',
'changed': '2000-01-01T00:00:00+00:00',
'keys':
[
{
'key': 'dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=',
'kind': 'EdDSA'
}
],
'issuants':
[
{
'issuer': 'localhost',
'kind': 'dns',
'registered': '2000-01-01T00:00:00+00:00',
'validationURL': 'http://localhost:8101/demo/check'
}
]
}
# Now create the thing
# creates signing/verification key pair thing DID
#seed = libnacl.randombytes(libnacl.crypto_sign_SEEDBYTES)
seed = (b'\xba^\xe4\xdd\x81\xeb\x8b\xfa\xb1k\xe2\xfd6~^\x86tC\x9c\xa7\xe3\x1d2\x9d'
b'P\xdd&R <\x97\x01')
dvk, dsk = libnacl.crypto_sign_seed_keypair(seed)
assert dvk == (b'\xe0\x90\x8c\xf1\xd2V\xc3\xf3\xb9\xee\xf38\x90\x0bS\xb7L\x96\xa9('
b'\x01\xbb\x08\x87\xa5X\x1d\xe7\x90b\xa0#')
assert dsk == (b'\xba^\xe4\xdd\x81\xeb\x8b\xfa\xb1k\xe2\xfd6~^\x86tC\x9c\xa7\xe3\x1d2\x9d'
b'P\xdd&R <\x97\x01\xe0\x90\x8c\xf1\xd2V\xc3\xf3\xb9\xee\xf38\x90\x0bS\xb7'
b'L\x96\xa9(\x01\xbb\x08\x87\xa5X\x1d\xe7\x90b\xa0#')
signer = idat['signer']
hid = "hid:dns:localhost#02"
data = ODict(keywords=["Canon", "EOS Rebel T6", "251440"],
message="If found please return.")
dsignature, ssignature, tregistration = makeSignedThingReg(dvk,
dsk,
isk,
signer,
changed=changed,
hid=hid,
data=data)
assert tregistration == (
'{\n'
' "did": "did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM=",\n'
' "hid": "hid:dns:localhost#02",\n'
' "signer": "did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=#0",\n'
' "changed": "2000-01-01T00:00:00+00:00",\n'
' "data": {\n'
' "keywords": [\n'
' "Canon",\n'
' "EOS Rebel T6",\n'
' "251440"\n'
' ],\n'
' "message": "If found please return."\n'
' }\n'
'}')
assert dsignature == 'bzJDEvEprraZc9aOLYS7WaPi5UB_px0EH9wu76rFPrbRgjAUO9JJ4roMpQrD31v3WlbHHTG8WzB5L8PE6v3BCg=='
assert ssignature == 'FGRHzSNS70LIjwcSTAxHx5RahDwAet090fYSnsReMco_WvpTVpvfEygWDXslCBh0TqBoEOMLQ78-kN8fj6NFAg=='
treg = json.loads(tregistration, object_pairs_hook=ODict)
assert treg == {
"did": "did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM=",
"hid": "hid:dns:localhost#02",
"signer": "did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=#0",
"changed": "2000-01-01T00:00:00+00:00",
"data":
{
"keywords": ["Canon", "EOS Rebel T6", "251440"],
"message": "If found please return.",
}
}
headers = {
"Content-Type": "text/html; charset=utf-8",
"Signature": 'signer="{}";did="{}"'.format(ssignature, dsignature),
}
body = tregistration # client.post encodes the body
#rep = client.post('/thing', body=body, headers=headers)
path = "http://{}:{}{}".format('localhost', valet.servant.eha[1], '/thing')
# instantiate Patron client
patron = Patron(bufsize=131072,
store=store,
method = 'POST',
path=path,
headers=headers,
body=body,
reconnectable=True,)
patron.transmit()
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
assert len(patron.responses) == 1
rep = patron.responses.popleft()
assert rep['status'] == 201
assert rep['reason'] == 'Created'
location = falcon.uri.decode(rep['headers']['location'])
assert location == "/thing?did=did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM="
path, query = location.rsplit("?", maxsplit=1)
assert query == "did=did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM="
query = falcon.uri.parse_query_string(query)
tdid = query['did']
assert tdid == "did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM="
assert rep["headers"]['content-type'] == "application/json; charset=UTF-8"
assert rep['data'] == treg
assert treg["did"] == tdid
# make sure in database
tdat, tser, tsig = dbing.getSigned(tdid)
assert tdat == treg
assert tser == tregistration
assert tsig == ssignature
sverkey = keyToKey64u(ivk)
assert sverkey == 'dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY='
assert verify64u(signature=tsig, message=tser, verkey=sverkey)
# verify hid table entry
assert hid == treg['hid']
htdid = dbing.getHid(treg['hid'])
assert htdid == tdid
# get thing by did
print("Testing GET /thing?did=....")
headers = odict([('Accept', 'application/json'),
('Content-Length', 0)])
patron.request(method='GET', path=location, headers=headers)
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
assert len(patron.responses) == 1
rep = patron.responses.popleft()
assert rep['status'] == 200
assert rep['headers']['content-type'] == 'application/json; charset=UTF-8'
sigs = parseSignatureHeader(rep['headers']['signature'])
assert sigs['signer'] == ssignature
assert rep['data'] == treg
ser = rep['body'].decode()
assert ser == tregistration
assert verify64u(ssignature, ser, sverkey)
# get thing by hid
print("Testing GET /thing?hid=....")
headers = odict([('Accept', 'application/json'),
('Content-Length', 0)])
qargs = odict(hid=hid)
path = '/thing'
patron.request(method='GET', path=path, qargs=qargs, headers=headers)
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
assert len(patron.responses) == 1
rep = patron.responses.popleft()
assert rep['status'] == 200
assert rep['headers']['content-type'] == 'application/json; charset=UTF-8'
sigs = parseSignatureHeader(rep['headers']['signature'])
assert sigs['signer'] == ssignature
assert rep['data'] == treg
ser = rep['body'].decode()
assert ser == tregistration
assert verify64u(ssignature, ser, sverkey)
# now get all
headers = odict([('Accept', 'application/json'),
('Content-Length', 0)])
path = "/thing"
qargs = odict(all="true")
patron.request(method='GET', path=path, qargs=qargs, headers=headers)
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
assert len(patron.responses) == 1
rep = patron.responses.popleft()
assert rep['status'] == 200
assert rep['headers']['content-type'] == 'application/json; charset=UTF-8'
assert len(rep['data']) == 1 # new thing
assert tdid in rep["data"]
cleanupTmpBaseDir(dbEnv.path())
valet.close()
patron.close()
print("Done Test")
def test_get_AgentServer(client): # client is a fixture in pytest_falcon
"""
Test GET to retrieve precreated server agent.
"""
print("Testing GET /server with signature")
priming.setupTest()
dbEnv = dbing.gDbEnv
keeper = keeping.gKeeper
kdid = keeper.did
print("Testing GET /server")
rep = client.get('/server')
assert rep.status == falcon.HTTP_OK
assert int(rep.headers['content-length']) == 291
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
assert rep.headers['signature'] == ('signer="u72j9aKHgz99f0K8pSkMnyqwvEr_3r'
'pS_z2034L99sTWrMIIJGQPbVuIJ1cupo6cfIf_KCB5ecVRYoFRzAPnAQ=="')
sigs = parseSignatureHeader(rep.headers['signature'])
assert sigs['signer'] == ('u72j9aKHgz99f0K8pSkMnyqwvEr_3rpS_z2034L99sTWrMII'
'JGQPbVuIJ1cupo6cfIf_KCB5ecVRYoFRzAPnAQ==')
assert rep.body == (
'{\n'
' "did": "did:igo:Xq5YqaL6L48pf0fu7IUhL0JRaU2_RxFP0AL43wYn148=",\n'
' "signer": "did:igo:Xq5YqaL6L48pf0fu7IUhL0JRaU2_RxFP0AL43wYn148=#0",\n'
' "changed": "2000-01-01T00:00:00+00:00",\n'
' "keys": [\n'
' {\n'
' "key": "Xq5YqaL6L48pf0fu7IUhL0JRaU2_RxFP0AL43wYn148=",\n'
' "kind": "EdDSA"\n'
' }\n'
' ]\n'
'}')
assert rep.json == {
'did': 'did:igo:Xq5YqaL6L48pf0fu7IUhL0JRaU2_RxFP0AL43wYn148=',
'signer': 'did:igo:Xq5YqaL6L48pf0fu7IUhL0JRaU2_RxFP0AL43wYn148=#0',
'changed': '2000-01-01T00:00:00+00:00',
'keys':
[
{
'key': 'Xq5YqaL6L48pf0fu7IUhL0JRaU2_RxFP0AL43wYn148=',
'kind': 'EdDSA'
}
],
}
assert rep.json['did'] == kdid
assert verify64u(sigs['signer'], rep.body, rep.json['keys'][0]['key'])
dat, ser, sig = dbing.getSigned(kdid)
assert dat == rep.json
assert ser == rep.body
assert sig == sigs['signer']
print("Testing get server using GET /agent/registration?did=")
didURI = falcon.uri.encode_value(kdid)
rep = client.get('/agent?did={}'.format(didURI))
assert rep.status == falcon.HTTP_OK
assert int(rep.headers['content-length']) == 291
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
assert rep.headers['signature'] == ('signer="u72j9aKHgz99f0K8pSkMnyqwvEr_3r'
'pS_z2034L99sTWrMIIJGQPbVuIJ1cupo6cfIf_KCB5ecVRYoFRzAPnAQ=="')
sigs = parseSignatureHeader(rep.headers['signature'])
assert sigs['signer'] == sig
assert rep.body == ser
assert rep.json == dat
cleanupTmpBaseDir(dbEnv.path())
print("Done Test")
def test_get_AgentDid(client): # client is a fixture in pytest_falcon
"""
Test GET to agent at did.
"""
print("Testing GET /agent/{did}")
priming.setupTest()
dbEnv = dbing.gDbEnv
keeper = keeping.gKeeper
kdid = keeper.did
# put agent into database
# random seed used to generate private signing key
#seed = libnacl.randombytes(libnacl.crypto_sign_SEEDBYTES)
seed = (b'PTi\x15\xd5\xd3`\xf1u\x15}^r\x9bfH\x02l\xc6\x1b\x1d\x1c\x0b9\xd7{\xc0_'
b'\xf2K\x93`')
# creates signing/verification key pair
vk, sk = libnacl.crypto_sign_seed_keypair(seed)
dt = datetime.datetime(2000, 1, 1, tzinfo=datetime.timezone.utc)
stamp = timing.iso8601(dt, aware=True)
sig, res = makeSignedAgentReg(vk, sk, changed=stamp)
reg = json.loads(res, object_pairs_hook=ODict)
did = reg['did']
dbing.putSigned(key=did, ser=res, sig=sig, clobber=False)
didURI = falcon.uri.encode_value(did)
rep = client.get('/agent/{}'.format(didURI))
assert rep.status == falcon.HTTP_OK
assert int(rep.headers['content-length']) == 291
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
assert rep.headers['signature'] == ('signer="{}"'.format(sig))
sigs = parseSignatureHeader(rep.headers['signature'])
assert sigs['signer'] == ('AeYbsHot0pmdWAcgTo5sD8iAuSQAfnH5U6wiIGpVNJQQoYKB'
'YrPPxAoIc1i5SHCIDS8KFFgf8i0tDq8XGizaCg==')
assert rep.body == (
'{\n'
' "did": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=",\n'
' "signer": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=#0",\n'
' "changed": "2000-01-01T00:00:00+00:00",\n'
' "keys": [\n'
' {\n'
' "key": "Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=",\n'
' "kind": "EdDSA"\n'
' }\n'
' ]\n'
'}')
assert rep.json['did'] == did
assert verify64u(sigs['signer'], rep.body, rep.json['keys'][0]['key'])
cleanupTmpBaseDir(dbEnv.path())
print("Done Test")
def test_put_AgentDid():
"""
Test PUT to agent at did.
Overwrites existing agent data resource with new data
"""
global store # use Global store
print("Testing put /agent/{did}")
priming.setupTest()
dbEnv = dbing.gDbEnv
keeper = keeping.gKeeper
kdid = keeper.did
# create local test server
valet = Valet(port=8101,
bufsize=131072,
store=store,
app=exapp,)
result = valet.open()
assert result
assert valet.servant.ha == ('0.0.0.0', 8101)
assert valet.servant.eha == ('127.0.0.1', 8101)
# put an agent into database so we can update it
# random seed used to generate private signing key
#seed = libnacl.randombytes(libnacl.crypto_sign_SEEDBYTES)
seed = (b'PTi\x15\xd5\xd3`\xf1u\x15}^r\x9bfH\x02l\xc6\x1b\x1d\x1c\x0b9\xd7{\xc0_'
b'\xf2K\x93`')
# creates signing/verification key pair
vk, sk = libnacl.crypto_sign_seed_keypair(seed)
dt = datetime.datetime(2000, 1, 1, tzinfo=datetime.timezone.utc)
date = timing.iso8601(dt, aware=True)
sig, res = makeSignedAgentReg(vk, sk, changed=date)
reg = json.loads(res, object_pairs_hook=ODict)
did = reg['did']
dbing.putSigned(key=did, ser=res, sig=sig, clobber=False)
rdat, rser, rsig = dbing.getSelfSigned(did)
assert rdat == reg
assert rser == res
assert rsig == sig
# change signer and key fields
#seed = libnacl.randombytes(libnacl.crypto_sign_SEEDBYTES)
seed = (b'\xd2\'\x87\x0fs\xd7\xf5\xba\xc1\xff!\x85j\xf5\xe4"\x87\x1c8\n[\xe9\x8e\x0b'
b'\x11\xf55\x8b\xb8\x0c\x19\x13')
# creates signing/verification key pair
nvk, nsk = libnacl.crypto_sign_seed_keypair(seed)
ndt = datetime.datetime(2000, 1, 2, tzinfo=datetime.timezone.utc)
nstamp = timing.iso8601(ndt, aware=True)
index = 1
signer = "{}#{}".format(did, index) # signer field value key at index
nverkey = keyToKey64u(nvk) # make key index field value
assert nverkey == 'FsSQTQnp_W-6RPkuvULH8h8G5u_4qYl61ec9-k-2hKc='
kind = "EdDSA"
reg["signer"] = signer
reg["changed"] = nstamp
reg["keys"].append(ODict(key=nverkey, kind=kind))
assert reg["keys"][1] == {'key': 'FsSQTQnp_W-6RPkuvULH8h8G5u_4qYl61ec9-k-2hKc=',
'kind': 'EdDSA'}
assert reg['signer'] == "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=#1"
nres = json.dumps(reg, indent=2)
nsig = keyToKey64u(libnacl.crypto_sign(nres.encode("utf-8"), nsk)[:libnacl.crypto_sign_BYTES])
csig = keyToKey64u(libnacl.crypto_sign(nres.encode("utf-8"), sk)[:libnacl.crypto_sign_BYTES])
assert nsig == 'Y5xTb0_jTzZYrf5SSEK2f3LSLwIwhOX7GEj6YfRWmGViKAesa08UkNWukUkPGuKuu-EAH5U-sdFPPboBAsjRBw=='
assert csig == "Xhh6WWGJGgjU5V-e57gj4HcJ87LLOhQr2Sqg5VToTSg-SI1W3A8lgISxOjAI5pa2qnonyz3tpGvC2cmf1VTpBg=="
# now overwrite with new one
headers = {"Content-Type": "text/html; charset=utf-8",
"Signature": 'signer="{}";current="{}"'.format(nsig, csig)}
body = nres # client.post encodes the body
#didURI = falcon.uri.encode_value(did)
# patron url quotes path for us so don't quote before
path = "http://{}:{}/agent/{}".format('localhost', valet.servant.eha[1], did)
# instantiate Patron client
patron = Patron(bufsize=131072,
store=store,
method = 'PUT',
path=path,
headers=headers,
body=body,
reconnectable=True,)
patron.transmit()
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
assert len(patron.responses) == 1
rep = patron.responses.popleft()
assert rep['status'] == 200
assert rep['reason'] == 'OK'
assert rep['headers']['content-type'] == 'application/json; charset=UTF-8'
sigs = parseSignatureHeader(rep['headers']['signature'])
assert sigs['signer'] == nsig
assert sigs['signer'] == ('Y5xTb0_jTzZYrf5SSEK2f3LSLwIwhOX7GEj6YfRWmGViKAes'
'a08UkNWukUkPGuKuu-EAH5U-sdFPPboBAsjRBw==')
assert rep['data'] == reg
assert rep['body'].decode() == nres == (
'{\n'
' "did": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=",\n'
' "signer": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=#1",\n'
' "changed": "2000-01-02T00:00:00+00:00",\n'
' "keys": [\n'
' {\n'
' "key": "Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=",\n'
' "kind": "EdDSA"\n'
' },\n'
' {\n'
' "key": "FsSQTQnp_W-6RPkuvULH8h8G5u_4qYl61ec9-k-2hKc=",\n'
' "kind": "EdDSA"\n'
' }\n'
' ]\n'
'}')
# verify that its in database
ddat, dser, dsig = dbing.getSigned(did)
assert ddat == reg
assert dser == nres
assert dsig == nsig
# now get it
# patron url encodes path for us
patron.request(path='/agent/{}'.format(did), method='GET' )
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
assert len(patron.responses) == 1
rep = patron.responses.popleft()
assert rep['status'] == 200
assert rep['reason'] == 'OK'
assert rep['headers']['content-type'] == 'application/json; charset=UTF-8'
sigs = parseSignatureHeader(rep['headers']['signature'])
assert sigs['signer'] == nsig
assert sigs['signer'] == ('Y5xTb0_jTzZYrf5SSEK2f3LSLwIwhOX7GEj6YfRWmGViKAes'
'a08UkNWukUkPGuKuu-EAH5U-sdFPPboBAsjRBw==')
assert rep['data']['did'] == did
assert rep['body'].decode() == (
'{\n'
' "did": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=",\n'
' "signer": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=#1",\n'
' "changed": "2000-01-02T00:00:00+00:00",\n'
' "keys": [\n'
' {\n'
' "key": "Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=",\n'
' "kind": "EdDSA"\n'
' },\n'
' {\n'
' "key": "FsSQTQnp_W-6RPkuvULH8h8G5u_4qYl61ec9-k-2hKc=",\n'
' "kind": "EdDSA"\n'
' }\n'
' ]\n'
'}')
assert verify64u(sigs['signer'], rep['body'].decode(), rep['data']['keys'][1]['key'])
cleanupTmpBaseDir(dbEnv.path())
valet.close()
patron.close()
print("Done Test")
def test_put_IssuerDid(client): # client is a fixture in pytest_falcon
"""
Test PUT to issuer agent at did.
Overwrites existing agent data resource with new data
"""
global store # use Global store
print("Testing put /agent/{did} for issuer")
priming.setupTest()
dbEnv = dbing.gDbEnv
keeper = keeping.gKeeper
kdid = keeper.did
# create local test server
valet = Valet(port=8101,
bufsize=131072,
store=store,
app=exapp,)
result = valet.open()
assert result
assert valet.servant.ha == ('0.0.0.0', 8101)
assert valet.servant.eha == ('127.0.0.1', 8101)
# put an agent into database so we can update it
# random seed used to generate private signing key
#seed = libnacl.randombytes(libnacl.crypto_sign_SEEDBYTES)
# ivy's seed
seed = (b"\xb2PK\xad\x9b\x92\xa4\x07\xc6\xfa\x0f\x13\xd7\xe4\x08\xaf\xc7'~\x86"
b'\xd2\x92\x93rA|&9\x16Bdi')
# creates signing/verification key pair
vk, sk = libnacl.crypto_sign_seed_keypair(seed)
dt = datetime.datetime(2000, 1, 1, tzinfo=datetime.timezone.utc)
date = timing.iso8601(dt, aware=True)
issuant = ODict(kind="dns",
issuer="localhost",
registered=date,
validationURL="http://localhost:8101/demo/check")
issuants = [issuant] # list of hid issuants
sig, res = makeSignedAgentReg(vk, sk, changed=date, issuants=issuants)
dat = json.loads(res, object_pairs_hook=ODict)
did = dat['did']
dbing.putSigned(key=did, ser=res, sig=sig, clobber=False)
vdat, vser, vsig = dbing.getSelfSigned(did)
assert vdat == dat
assert vser == res
assert vsig == sig
# change signer and key fields
#seed = libnacl.randombytes(libnacl.crypto_sign_SEEDBYTES)
seed = (b'Z\xda?\x93M\xf8|\xe2!d\x16{s\x9d\x07\xd2\x98\xf2!\xff\xb8\xb6\xf9Z'
b'\xe5I\xbc\x97}IFV')
# creates signing/verification key pair
nvk, nsk = libnacl.crypto_sign_seed_keypair(seed)
ndt = datetime.datetime(2000, 1, 2, tzinfo=datetime.timezone.utc)
ndate = timing.iso8601(ndt, aware=True)
index = 1
signer = "{}#{}".format(did, index) # signer field value key at index
nverkey = keyToKey64u(nvk) # make key index field value
assert nverkey == '0UX5tP24WPEmAbROdXdygGAM3oDcvrqb3foX4EyayYI='
kind = "EdDSA"
dat["signer"] = signer
dat["changed"] = ndate
dat["keys"].append(ODict(key=nverkey, kind=kind))
assert dat["keys"][1] == {'key': '0UX5tP24WPEmAbROdXdygGAM3oDcvrqb3foX4EyayYI=',
'kind': 'EdDSA'}
assert dat['signer'] == "did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=#1"
nres = json.dumps(dat, indent=2)
nsig = keyToKey64u(libnacl.crypto_sign(nres.encode("utf-8"), nsk)[:libnacl.crypto_sign_BYTES])
csig = keyToKey64u(libnacl.crypto_sign(nres.encode("utf-8"), sk)[:libnacl.crypto_sign_BYTES])
assert nsig == 'n6Rpwa17V7_mjROO4ZAZYfJ7IejuL8XjaMHx6ylFgMvaa9AxNJ9KAcfXYe8PTSIdws81yvUSpWzQtqPFi2tHBQ=='
assert csig == 'mKH2K-EHbARadyDufNnu-_YB8LoHjh3pL6NpJk4Z6Cn3MCdCAvGtyncuUssF-6e8DxFnmo0um-vBb3-Hr2UkAA=='
# now overwrite with new one using web service
path = "http://{}:{}/agent/{}".format('localhost', valet.servant.eha[1], did)
headers = {"Content-Type": "text/html; charset=utf-8",
"Signature": 'signer="{}";current="{}"'.format(nsig, csig)}
body = nres.encode()
#didURI = falcon.uri.encode_value(did)
#rep = client.put('/agent/{}'.format(didURI), body=body, headers=headers)
# instantiate Patron client
patron = Patron(bufsize=131072,
store=store,
method = 'PUT',
path=path,
headers=headers,
body=body,
reconnectable=True,)
patron.transmit()
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
assert len(patron.responses) == 1
rep = patron.responses.popleft()
assert rep['status'] == 200
assert rep['reason'] == 'OK'
assert rep['headers']['content-type'] == 'application/json; charset=UTF-8'
sigs = parseSignatureHeader(rep["headers"]['signature'])
assert sigs['signer'] == nsig
assert rep['data'] == dat == {
'changed': '2000-01-02T00:00:00+00:00',
'did': 'did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=',
'issuants': [{'issuer': 'localhost',
'kind': 'dns',
'registered': '2000-01-01T00:00:00+00:00',
'validationURL': 'http://localhost:8101/demo/check'}],
'keys': [{'key': 'dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=',
'kind': 'EdDSA'},
{'key': '0UX5tP24WPEmAbROdXdygGAM3oDcvrqb3foX4EyayYI=',
'kind': 'EdDSA'}],
'signer': 'did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=#1'
}
assert rep['body'].decode() == nres == (
'{\n'
' "did": "did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=",\n'
' "signer": "did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=#1",\n'
' "changed": "2000-01-02T00:00:00+00:00",\n'
' "keys": [\n'
' {\n'
' "key": "dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=",\n'
' "kind": "EdDSA"\n'
' },\n'
' {\n'
' "key": "0UX5tP24WPEmAbROdXdygGAM3oDcvrqb3foX4EyayYI=",\n'
' "kind": "EdDSA"\n'
' }\n'
' ],\n'
' "issuants": [\n'
' {\n'
' "kind": "dns",\n'
' "issuer": "localhost",\n'
' "registered": "2000-01-01T00:00:00+00:00",\n'
' "validationURL": "http://localhost:8101/demo/check"\n'
' }\n'
' ]\n'
'}')
# verify that its in database
vdat, vser, vsig = dbing.getSigned(did)
assert vdat == dat
assert vser == nres
assert vsig == nsig
# now get it from web service
patron.request(path='/agent/{}'.format(did), method='GET' )
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
assert len(patron.responses) == 1
rep = patron.responses.popleft()
assert rep['status'] == 200
assert rep['reason'] == 'OK'
assert rep['headers']['content-type'] == 'application/json; charset=UTF-8'
sigs = parseSignatureHeader(rep['headers']['signature'])
assert sigs['signer'] == nsig
assert rep['data']['did'] == did
assert rep['body'].decode() == nres
assert verify64u(sigs['signer'], rep['body'].decode(), rep['data']['keys'][1]['key'])
cleanupTmpBaseDir(dbEnv.path())
valet.close()
patron.close()
print("Done Test")
def test_get_ThingDid(client): # client is a fixture in pytest_falcon
"""
Test GET to thing at did.
"""
print("Testing GET /thing/{did}")
priming.setupTest()
dbEnv = dbing.gDbEnv
keeper = keeping.gKeeper
kdid = keeper.did
# To put thing into database first need to put owning agent and then thing
# put agent into database
# random seed used to generate private signing key
#seed = libnacl.randombytes(libnacl.crypto_sign_SEEDBYTES)
seed = (b"\xb2PK\xad\x9b\x92\xa4\x07\xc6\xfa\x0f\x13\xd7\xe4\x08\xaf\xc7'~\x86"
b'\xd2\x92\x93rA|&9\x16Bdi')
# creates signing/verification key pair
svk, ssk = libnacl.crypto_sign_seed_keypair(seed)
dt = datetime.datetime(2000, 1, 1, tzinfo=datetime.timezone.utc)
stamp = timing.iso8601(dt, aware=True)
asig, aser = makeSignedAgentReg(svk, ssk, changed=stamp)
adat = json.loads(aser, object_pairs_hook=ODict)
adid = adat['did']
dbing.putSigned(key=adid, ser=aser, sig=asig, clobber=False)
# verify that its in database
vdat, vser, vsig = dbing.getSigned(adid)
assert vdat == adat
assert vser == aser
assert vsig == asig
# create thing signed by agent and put into database
# creates signing/verification key pair thing DID
#seed = libnacl.randombytes(libnacl.crypto_sign_SEEDBYTES)
seed = (b'\xba^\xe4\xdd\x81\xeb\x8b\xfa\xb1k\xe2\xfd6~^\x86tC\x9c\xa7\xe3\x1d2\x9d'
b'P\xdd&R <\x97\x01')
dvk, dsk = libnacl.crypto_sign_seed_keypair(seed)
signer = adat['signer'] # use same signer key fragment reference as agent
hid = "hid:dns:localhost#02"
data = ODict(keywords=["Canon", "EOS Rebel T6", "251440"],
message="If found please return.")
dsig, ssig, tser = makeSignedThingReg(dvk,
dsk,
ssk,
signer,
changed=stamp,
hid=hid,
data=data)
assert ssig == 'FGRHzSNS70LIjwcSTAxHx5RahDwAet090fYSnsReMco_WvpTVpvfEygWDXslCBh0TqBoEOMLQ78-kN8fj6NFAg=='
tdat = json.loads(tser, object_pairs_hook=ODict)
tdid = tdat['did']
assert tdid == "did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM="
assert tser == (
'{\n'
' "did": "did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM=",\n'
' "hid": "hid:dns:localhost#02",\n'
' "signer": "did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=#0",\n'
' "changed": "2000-01-01T00:00:00+00:00",\n'
' "data": {\n'
' "keywords": [\n'
' "Canon",\n'
' "EOS Rebel T6",\n'
' "251440"\n'
' ],\n'
' "message": "If found please return."\n'
' }\n'
'}')
dbing.putSigned(key=tdid, ser=tser, sig=ssig, clobber=False)
# verify that its in database
vdat, vser, vsig = dbing.getSigned(tdid)
assert vdat == tdat
assert vser == tser
assert vsig == ssig
# now get it from web service
didURI = falcon.uri.encode_value(tdid)
rep = client.get('/thing/{}'.format(didURI))
assert rep.status == falcon.HTTP_OK
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
assert rep.headers['signature'] == ('signer="{}"'.format(ssig))
sigs = parseSignatureHeader(rep.headers['signature'])
assert sigs['signer'] == 'FGRHzSNS70LIjwcSTAxHx5RahDwAet090fYSnsReMco_WvpTVpvfEygWDXslCBh0TqBoEOMLQ78-kN8fj6NFAg=='
assert rep.body == (
'{\n'
' "did": "did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM=",\n'
' "hid": "hid:dns:localhost#02",\n'
' "signer": "did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=#0",\n'
' "changed": "2000-01-01T00:00:00+00:00",\n'
' "data": {\n'
' "keywords": [\n'
' "Canon",\n'
' "EOS Rebel T6",\n'
' "251440"\n'
' ],\n'
' "message": "If found please return."\n'
' }\n'
'}')
assert rep.json['did'] == tdid
assert verify64u(sigs['signer'], rep.body, adat['keys'][0]['key'])
cleanupTmpBaseDir(dbEnv.path())
print("Done Test")
def test_put_ThingDid(client): # client is a fixture in pytest_falcon
"""
Test PUT to thing at did.
"""
global store # use Global store
print("Testing PUT /thing/{did}")
priming.setupTest()
dbEnv = dbing.gDbEnv
keeper = keeping.gKeeper
kdid = keeper.did
# create local test server
valet = Valet(port=8101,
bufsize=131072,
store=store,
app=exapp,)
result = valet.open()
assert result
assert valet.servant.ha == ('0.0.0.0', 8101)
assert valet.servant.eha == ('127.0.0.1', 8101)
# To put thing into database first need to put owning agent and then thing
# put agent into database
# random seed used to generate private signing key
#seed = libnacl.randombytes(libnacl.crypto_sign_SEEDBYTES)
seed = (b"\xb2PK\xad\x9b\x92\xa4\x07\xc6\xfa\x0f\x13\xd7\xe4\x08\xaf\xc7'~\x86"
b'\xd2\x92\x93rA|&9\x16Bdi')
# creates signing/verification key pair
svk, ssk = libnacl.crypto_sign_seed_keypair(seed)
dt = datetime.datetime(2000, 1, 1, tzinfo=datetime.timezone.utc)
stamp = timing.iso8601(dt, aware=True)
issuant = ODict(kind="dns",
issuer="localhost",
registered=stamp,
validationURL="http://localhost:8101/demo/check")
issuants = [issuant] # list of hid issuants
asig, aser = makeSignedAgentReg(svk, ssk, changed=stamp, issuants=issuants)
adat = json.loads(aser, object_pairs_hook=ODict)
adid = adat['did']
# modify agent so has another key in keys
#seed = libnacl.randombytes(libnacl.crypto_sign_SEEDBYTES)
seed = (b'Z\xda?\x93M\xf8|\xe2!d\x16{s\x9d\x07\xd2\x98\xf2!\xff\xb8\xb6\xf9Z'
b'\xe5I\xbc\x97}IFV')
# creates signing/verification key pair
nvk, nsk = libnacl.crypto_sign_seed_keypair(seed)
nverkey = keyToKey64u(nvk) # make key index field value
assert nverkey == '0UX5tP24WPEmAbROdXdygGAM3oDcvrqb3foX4EyayYI='
kind = "EdDSA"
adat["keys"].append(ODict(key=nverkey, kind=kind))
assert adat["keys"][1] == {'key': '0UX5tP24WPEmAbROdXdygGAM3oDcvrqb3foX4EyayYI=',
'kind': 'EdDSA'}
nser = json.dumps(adat, indent=2)
# did not change signer so sign with prior signer
nsig = keyToKey64u(libnacl.crypto_sign(nser.encode("utf-8"), ssk)[:libnacl.crypto_sign_BYTES])
assert nsig == 'EH_KoQBJU7u8gWuheKQyfpj1rP17cDycOsmn5X_ZXQfkRtORrsBc0bUK3G_MSZ80zM5AaghKqAJXQbLquBZPAw=='
dbing.putSigned(key=adid, ser=nser, sig=nsig, clobber=False)
# verify that its in database
vdat, vser, vsig = dbing.getSelfSigned(adid)
assert vdat == adat
assert vser == nser
assert vsig == nsig
# create thing signed by agent and put into database
# creates signing/verification key pair thing DID
#seed = libnacl.randombytes(libnacl.crypto_sign_SEEDBYTES)
seed = (b'\xba^\xe4\xdd\x81\xeb\x8b\xfa\xb1k\xe2\xfd6~^\x86tC\x9c\xa7\xe3\x1d2\x9d'
b'P\xdd&R <\x97\x01')
dvk, dsk = libnacl.crypto_sign_seed_keypair(seed)
signer = adat['signer'] # use same signer key fragment reference as agent
hid = "hid:dns:localhost#02"
data = ODict(keywords=["Canon", "EOS Rebel T6", "251440"],
message="If found please return.")
dsig, ssig, tser = makeSignedThingReg(dvk,
dsk,
ssk,
signer,
changed=stamp,
hid=hid,
data=data)
assert ssig == 'FGRHzSNS70LIjwcSTAxHx5RahDwAet090fYSnsReMco_WvpTVpvfEygWDXslCBh0TqBoEOMLQ78-kN8fj6NFAg=='
tdat = json.loads(tser, object_pairs_hook=ODict)
tdid = tdat['did']
assert tdid == "did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM="
assert tser == (
'{\n'
' "did": "did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM=",\n'
' "hid": "hid:dns:localhost#02",\n'
' "signer": "did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=#0",\n'
' "changed": "2000-01-01T00:00:00+00:00",\n'
' "data": {\n'
' "keywords": [\n'
' "Canon",\n'
' "EOS Rebel T6",\n'
' "251440"\n'
' ],\n'
' "message": "If found please return."\n'
' }\n'
'}')
dbing.putSigned(key=tdid, ser=tser, sig=ssig, clobber=False)
# verify that its in database
vdat, vser, vsig = dbing.getSigned(tdid)
assert vdat == tdat
assert vser == tser
assert vsig == ssig
# put entry in hid table
dbing.putHid(tdat['hid'], tdid)
htdid = dbing.getHid(tdat['hid'])
assert htdid == tdid
# now change signer field, hid field, and changed field
odat = copy.copy(tdat) # make copy before change
index = 1
signer = "{}#{}".format(adid, index) # signer field value key at index
assert signer == 'did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=#1'
tdat['signer'] = signer
tdat["hid"] = "hid:dns:localhost#03"
dt = datetime.datetime(2000, 1, 2, tzinfo=datetime.timezone.utc)
stamp = timing.iso8601(dt, aware=True)
tdat['changed'] = stamp
# now double sign and put to web service
ntser = json.dumps(tdat, indent=2)
ntsig = keyToKey64u(libnacl.crypto_sign(ntser.encode("utf-8"), nsk)[:libnacl.crypto_sign_BYTES])
ctsig = keyToKey64u(libnacl.crypto_sign(ntser.encode("utf-8"), ssk)[:libnacl.crypto_sign_BYTES])
assert ntsig == 'rsUNXdD5-gIgfTPkJNsXtF2ZEMJpUFOKn2EVsSlKWtG7EfyzdqM4iHYQw5pviPGd7EPqBKvafGDvmuHBNI0wBg=='
assert ctsig == 'CEbDp3n-ZKMIQZr9f4O2fjHqnXTkwPDd87Mgx7Cphql1m54_YqmGEvKZC9tVw3nWjNq3LTTwwH6OFDL25be7CQ=='
# now overwrite with new one using web service
headers = {"Content-Type": "text/html; charset=utf-8",
"Signature": 'signer="{}";current="{}"'.format(ntsig, ctsig)}
body = ntser # client.post encodes the body
# patron url quotes path for us so don't quote before
path = "http://{}:{}/thing/{}".format('localhost', valet.servant.eha[1], tdid)
# instantiate Patron client
patron = Patron(bufsize=131072,
store=store,
method = 'PUT',
path=path,
headers=headers,
body=body,
reconnectable=True,)
patron.transmit()
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
assert len(patron.responses) == 1
rep = patron.responses.popleft()
assert rep['status'] == 200
assert rep['reason'] == 'OK'
assert rep['headers']['content-type'] == 'application/json; charset=UTF-8'
sigs = parseSignatureHeader(rep['headers']['signature'])
assert sigs['signer'] == ntsig
assert rep['data'] == tdat
ser = rep['body'].decode()
assert ser == (
'{\n'
' "did": "did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM=",\n'
' "hid": "hid:dns:localhost#03",\n'
' "signer": "did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=#1",\n'
' "changed": "2000-01-02T00:00:00+00:00",\n'
' "data": {\n'
' "keywords": [\n'
' "Canon",\n'
' "EOS Rebel T6",\n'
' "251440"\n'
' ],\n'
' "message": "If found please return."\n'
' }\n'
'}')
assert verify64u(sigs['signer'], ser, adat['keys'][1]['key'])
# verify that its in database
vdat, vser, vsig = dbing.getSigned(tdid)
assert vdat == tdat
assert vser == ntser
assert vsig == ntsig
# verify hid table changes
htdid = dbing.getHid(tdat['hid'])
assert htdid == tdid
otdid = dbing.getHid(odat['hid'])
assert otdid == ""
# now get it from web service
headers = odict([('Accept', 'application/json'),
('Content-Length', 0)])
path = "http://{}:{}/thing/{}".format('localhost', valet.servant.eha[1], tdid)
patron.request(method='GET', path=path, headers=headers)
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
assert len(patron.responses) == 1
rep = patron.responses.popleft()
assert rep['status'] == 200
assert rep['headers']['content-type'] == 'application/json; charset=UTF-8'
sigs = parseSignatureHeader(rep['headers']['signature'])
assert sigs['signer'] == ntsig
assert rep['data'] == tdat
ser = rep['body'].decode()
assert ser == (
'{\n'
' "did": "did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM=",\n'
' "hid": "hid:dns:localhost#03",\n'
' "signer": "did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=#1",\n'
' "changed": "2000-01-02T00:00:00+00:00",\n'
' "data": {\n'
' "keywords": [\n'
' "Canon",\n'
' "EOS Rebel T6",\n'
' "251440"\n'
' ],\n'
' "message": "If found please return."\n'
' }\n'
'}')
assert verify64u(sigs['signer'], ser, adat['keys'][1]['key'])
cleanupTmpBaseDir(dbEnv.path())
valet.close()
patron.close()
print("Done Test")
def test_post_AgentDidDrop(client): # client is a fixture in pytest_falcon
"""
Test POST drop message to agent .
{
"uid": "m_00035d2976e6a000_26ace93",
"kind": "found",
"signer": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=#0",
"date": "2000-01-03T00:00:00+00:00",
"to": "did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=",
"from": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=",
"thing": "did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM=",
"subject": "Lose something?",
"content": "Look what I found"
}
"""
print("Testing POST /agent/{adid}/drop")
priming.setupTest()
dbEnv = dbing.gDbEnv
keeper = keeping.gKeeper
kdid = keeper.did
agents, things = setupTestDbAgentsThings()
agents['sam'] = (kdid, keeper.verkey, keeper.sigkey) # sam the server
for did, vk, sk in agents.values():
dat, ser, sig = dbing.getSelfSigned(did)
assert dat is not None
assert dat['did'] == did
for did, vk, sk in things.values():
dat, ser, sig = dbing.getSigned(did)
assert dat is not None
assert dat['did'] == did
# post message from Ann to Ivy
dt = datetime.datetime(2000, 1, 3, tzinfo=datetime.timezone.utc)
changed = timing.iso8601(dt, aware=True)
assert changed == "2000-01-03T00:00:00+00:00"
stamp = dt.timestamp() # make time.time value
#muid = timing.tuuid(stamp=stamp, prefix="m")
muid = "m_00035d2976e6a000_26ace93"
assert muid == "m_00035d2976e6a000_26ace93"
srcDid, srcVk, srcSk = agents['ann']
dstDid, dstVk, dskSk = agents['ivy']
thingDid, thingVk, thingSk = things['cam']
assert dstDid == "did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY="
signer = "{}#0".format(srcDid)
assert signer == "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=#0"
msg = ODict()
msg['uid'] = muid
msg['kind'] = "found"
msg['signer'] = "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=#0"
msg['date'] = changed
msg['to'] = dstDid
msg['from'] = srcDid
msg['thing'] = thingDid
msg['subject'] = "Lose something?"
msg['content'] = "Look what I found"
mser = json.dumps(msg, indent=2)
msig = keyToKey64u(libnacl.crypto_sign(mser.encode("utf-8"), srcSk)[:libnacl.crypto_sign_BYTES])
assert msig == "07u1OcQI8FUeWPqeiga3A9k4MPJGSFmC4vShiJNpv2Rke9ssnW7aLx857HC5ZaJ973WSKkLAwPzkl399d01HBA=="
dstDidUri = falcon.uri.encode_value(dstDid)
headers = {"Content-Type": "text/html; charset=utf-8",
"Signature": 'signer="{}"'.format(msig)}
body = mser # client.post encodes the body
rep = client.post('/agent/{}/drop'.format(dstDidUri),
body=body,
headers=headers)
assert rep.status == falcon.HTTP_201
assert msg == rep.json
location = falcon.uri.decode(rep.headers['location'])
assert location == "/agent/{}/drop?from={}&uid={}".format(dstDid, srcDid, muid)
# now get it from web service
# need to use uri encode version of location header
assert rep.headers['location'] == ('/agent/did%3Aigo%3AdZ74MLZXD-1QHoa73w9pQ'
'9GroAvxqFi2RTZWlkC0raY%3D/drop'
'?from=did%3Aigo%3AQt27fThWoNZsa88VrTkep'
'6H-4HA8tr54sHON1vWl6FE%3D&'
'uid=m_00035d2976e6a000_26ace93')
rep = client.get(rep.headers['location'])
assert rep.status == falcon.HTTP_OK
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
sigs = parseSignatureHeader(rep.headers['signature'])
assert sigs['signer'] == msig
assert rep.body == (
'{\n'
' "uid": "m_00035d2976e6a000_26ace93",\n'
' "kind": "found",\n'
' "signer": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=#0",\n'
' "date": "2000-01-03T00:00:00+00:00",\n'
' "to": "did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=",\n'
' "from": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=",\n'
' "thing": "did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM=",\n'
' "subject": "Lose something?",\n'
' "content": "Look what I found"\n'
'}')
assert verify64u(msig, rep.body, keyToKey64u(srcVk))
# get message list
path = "/agent/{}/drop?all=true".format(dstDidUri)
assert path == "/agent/did%3Aigo%3AdZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY%3D/drop?all=true"
rep = client.get(path)
assert rep.status == falcon.HTTP_OK
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
assert rep.json == [{'from': 'did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=',
'uid': 'm_00035d2976e6a000_26ace93'}]
cleanupTmpBaseDir(dbEnv.path())
print("Done Test")
def test_get_AgentDidDrop(client): # client is a fixture in pytest_falcon
"""
Test GET drop message to agent
{
"uid": "m_00035d2976e6a000_26ace93",
"kind": "found",
"signer": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=#0",
"date": "2000-01-03T00:00:00+00:00",
"to": "did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=",
"from": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=",
"thing": "did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM=",
"subject": "Lose something?",
"content": "Look what I found"
}
"""
print("Testing GET /agent/{adid}/drop/{cdid}")
priming.setupTest()
dbEnv = dbing.gDbEnv
keeper = keeping.gKeeper
kdid = keeper.did
agents, things = setupTestDbAgentsThings()
agents['sam'] = (kdid, keeper.verkey, keeper.sigkey) # sam the server
for did, vk, sk in agents.values():
dat, ser, sig = dbing.getSelfSigned(did)
assert dat is not None
assert dat['did'] == did
for did, vk, sk in things.values():
dat, ser, sig = dbing.getSigned(did)
assert dat is not None
assert dat['did'] == did
# Insert message from Ann to Ivy into database
dt = datetime.datetime(2000, 1, 3, tzinfo=datetime.timezone.utc)
changed = timing.iso8601(dt, aware=True)
assert changed == "2000-01-03T00:00:00+00:00"
stamp = dt.timestamp() # make time.time value
#muid = timing.tuuid(stamp=stamp, prefix="m")
muid = "m_00035d2976e6a000_26ace93"
assert muid == "m_00035d2976e6a000_26ace93"
srcDid, srcVk, srcSk = agents['ann']
dstDid, dstVk, dskSk = agents['ivy']
thingDid, thingVk, thingSk = things['cam']
signer = "{}#0".format(srcDid)
assert signer == "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=#0"
msg = ODict()
msg['uid'] = muid
msg['kind'] = "found"
msg['signer'] = "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=#0"
msg['date'] = changed
msg['to'] = dstDid
msg['from'] = srcDid
msg['thing'] = thingDid
msg['subject'] = "Lose something?"
msg['content'] = "Look what I found"
mser = json.dumps(msg, indent=2)
msig = keyToKey64u(libnacl.crypto_sign(mser.encode("utf-8"), srcSk)[:libnacl.crypto_sign_BYTES])
key = "{}/drop/{}/{}".format(dstDid, srcDid, muid)
assert key == ("did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY="
"/drop/did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE="
"/m_00035d2976e6a000_26ace93")
dbing.putSigned(key=key, ser=mser, sig=msig)
# now get it from web service
dstDidUri = falcon.uri.encode_value(dstDid)
srcDidUri = falcon.uri.encode_value(srcDid)
rep = client.get("/agent/{}/drop?from={}&uid={}".format(dstDidUri, srcDidUri, muid))
assert rep.status == falcon.HTTP_OK
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
sigs = parseSignatureHeader(rep.headers['signature'])
assert sigs['signer'] == msig
assert rep.body == (
'{\n'
' "uid": "m_00035d2976e6a000_26ace93",\n'
' "kind": "found",\n'
' "signer": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=#0",\n'
' "date": "2000-01-03T00:00:00+00:00",\n'
' "to": "did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=",\n'
' "from": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=",\n'
' "thing": "did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM=",\n'
' "subject": "Lose something?",\n'
' "content": "Look what I found"\n'
'}')
assert verify64u(msig, rep.body, keyToKey64u(srcVk))
cleanupTmpBaseDir(dbEnv.path())
print("Done Test")
def test_post_ThingDidOffer(client): # client is a fixture in pytest_falcon
"""
Test POST offer of thing.
offer request fields
{
"thing": thingDID,
"aspirant": AgentDID,
"duration": timeinsecondsofferisopen,
}
offer response fields
{
"thing": thingDID,
"aspirant": AgentDID,
"duration": timeinsecondsofferisopen,
"expiration": datetimeofexpiration,
"signer": serverkeydid,
"offerer": ownerkeydid,
"offer": Base64serrequest
}
"""
print("Testing POST /thing/{did}/offer")
priming.setupTest()
dbEnv = dbing.gDbEnv
keeper = keeping.gKeeper
kdid = keeper.did
agents, things = setupTestDbAgentsThings()
agents['sam'] = (kdid, keeper.verkey, keeper.sigkey) # sam the server
for did, vk, sk in agents.values():
dat, ser, sig = dbing.getSelfSigned(did)
assert dat is not None
assert dat['did'] == did
for did, vk, sk in things.values():
dat, ser, sig = dbing.getSigned(did)
assert dat is not None
assert dat['did'] == did
setupTestPriorOffer(agents=agents, things=things, ago=600.0) # to test that it checks for priors
sDid, sVk, sSk = agents['sam'] # server keys
# post offer Ivy to Ann
hDid, hVk, hSk = agents['ivy']
aDid, aVk, aSk = agents['ann']
tDid, tVk, tSk = things['cam']
assert tDid == "did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM="
signer = "{}#0".format(hDid)
assert signer == "did:igo:dZ74MLZXD-1QHoa73w9pQ9GroAvxqFi2RTZWlkC0raY=#0"
#dt = datetime.datetime(2000, 1, 3, tzinfo=datetime.timezone.utc)
#stamp = dt.timestamp() # make time.time value
#ouid = timing.tuuid(stamp=stamp, prefix="o")
ouid = "o_00035d2976e6a000_26ace93"
offer = ODict()
offer['uid'] = ouid
offer['thing'] = tDid
offer['aspirant'] = aDid
offer['duration'] = PROPAGATION_DELAY * 2.0
oser = json.dumps(offer, indent=2)
assert oser == (
'{\n'
' "uid": "o_00035d2976e6a000_26ace93",\n'
' "thing": "did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM=",\n'
' "aspirant": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=",\n'
' "duration": 120.0\n'
'}')
osig = keyToKey64u(libnacl.crypto_sign(oser.encode("utf-8"), hSk)[:libnacl.crypto_sign_BYTES])
assert osig == 'EhsfS2_4LSVjDMo_QShvciNr6aYf5ut8NuFkBugxL748vlOs1YF971aPIckmtRRAFzby07hY0Ny-7xs27-wXCw=='
dt = datetime.datetime.now(tz=datetime.timezone.utc)
tDidUri = falcon.uri.encode_value(tDid)
headers = {"Content-Type": "text/html; charset=utf-8",
"Signature": 'signer="{}"'.format(osig)}
body = oser # client.post encodes the body
rep = client.post('/thing/{}/offer'.format(tDidUri),
body=body,
headers=headers)
assert rep.status == falcon.HTTP_201
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
location = falcon.uri.decode(rep.headers['location'])
assert location == "/thing/{}/offer?uid={}".format(tDid, ouid)
expiration = rep.json['expiration']
edt = arrow.get(expiration)
assert edt > dt
offer = rep.json
offser = rep.body
assert offer['offerer'].startswith(hDid)
assert offer['signer'].startswith(sDid)
# now get it from web service
rep = client.get(rep.headers['location'])
assert rep.status == falcon.HTTP_OK
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
sigs = parseSignatureHeader(rep.headers['signature'])
ssig = sigs['signer'] # signature changes everytime because expiration changes
assert rep.json == offer
assert rep.body == offser
assert verify64u(ssig, rep.body, keyToKey64u(sVk))
cleanupTmpBaseDir(dbEnv.path())
print("Done Test")
def test_get_ThingDidOffer(client): # client is a fixture in pytest_falcon
"""
Test GET offer of thing.
offer request fields
{
"thing": thingDID,
"aspirant": AgentDID,
"duration": timeinsecondsofferisopen,
}
offer response fields
{
"thing": thingDID,
"aspirant": AgentDID,
"duration": timeinsecondsofferisopen,
"expiration": datetimeofexpiration,
"signer": serverkeydid,
"offerer": ownerkeydid,
"offer": Base64serrequest
}
"""
print("Testing GET /thing/{did}/offer?uid=")
priming.setupTest()
dbEnv = dbing.gDbEnv
keeper = keeping.gKeeper
kdid = keeper.did
agents, things = setupTestDbAgentsThings()
agents['sam'] = (kdid, keeper.verkey, keeper.sigkey) # sam the server
for did, vk, sk in agents.values():
dat, ser, sig = dbing.getSelfSigned(did)
assert dat is not None
assert dat['did'] == did
for did, vk, sk in things.values():
dat, ser, sig = dbing.getSigned(did)
assert dat is not None
assert dat['did'] == did
sDid, sVk, sSk = agents['sam'] # server keys
# post offer Ivy to Ann
hDid, hVk, hSk = agents['ivy']
aDid, aVk, aSk = agents['ann']
tDid, tVk, tSk = things['cam']
#dt = datetime.datetime(2000, 1, 3, tzinfo=datetime.timezone.utc)
#stamp = dt.timestamp() # make time.time value
#ouid = timing.tuuid(stamp=stamp, prefix="o")
ouid = "o_00035d2976e6a000_26ace93"
duration = PROPAGATION_DELAY * 2.0
offerer = "{}#0".format(hDid) # ivy is offerer
# build prior request offer for saved offer
poffer = ODict()
poffer['uid'] = ouid
poffer['thing'] = tDid
poffer['aspirant'] = aDid
poffer['duration'] = duration
poser = json.dumps(poffer, indent=2)
# now build offer in database
odat = ODict()
odat['uid'] = ouid
odat['thing'] = tDid
odat['aspirant'] = aDid
odat['duration'] = duration
dt = datetime.datetime.now(tz=datetime.timezone.utc)
# go back 10 minutes
td = datetime.timedelta(seconds=10 * 60)
odt = dt - td
td = datetime.timedelta(seconds=duration)
expiration = timing.iso8601(odt + td, aware=True)
odat["expiration"] = expiration
signer = "{}#0".format(sDid) # server sam signs
assert signer == "did:igo:Xq5YqaL6L48pf0fu7IUhL0JRaU2_RxFP0AL43wYn148=#0"
odat["signer"] = signer
odat["offerer"] = offerer
odat["offer"] = keyToKey64u(poser.encode("utf-8"))
oser = json.dumps(odat, indent=2)
osig = keyToKey64u(libnacl.crypto_sign(oser.encode("utf-8"), sSk)[:libnacl.crypto_sign_BYTES])
key = "{}/offer/{}".format(tDid, ouid)
# save offer to database, raise error if duplicate
dbing.putSigned(key=key, ser=oser, sig=osig, clobber=False) # no clobber so error
# save entry to offer expires table
result = dbing.putDidOfferExpire(did=tDid,
ouid=ouid,
expire=expiration)
# now get it from web service by uid
tDidUri = falcon.uri.encode_value(tDid)
location = "/thing/{}/offer?uid={}".format(tDidUri, ouid)
rep = client.get(location)
assert rep.status == falcon.HTTP_OK
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
sigs = parseSignatureHeader(rep.headers['signature'])
ssig = sigs['signer'] # signature changes everytime because expiration changes
assert rep.json == odat
assert rep.body == oser
assert verify64u(ssig, rep.body, keyToKey64u(sVk))
# now get list of all offers from web service
tDidUri = falcon.uri.encode_value(tDid)
location = "/thing/{}/offer?all=true".format(tDidUri)
rep = client.get(location)
assert rep.status == falcon.HTTP_OK
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
assert len(rep.json) == 1
offering = rep.json[-1]
assert offering['uid'] == ouid
assert offering['uid'] == 'o_00035d2976e6a000_26ace93'
# now get list of lastest offer from web service
tDidUri = falcon.uri.encode_value(tDid)
location = "/thing/{}/offer?latest=true".format(tDidUri)
rep = client.get(location)
assert rep.status == falcon.HTTP_OK
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
assert len(rep.json) == 1
offering = rep.json[0]
assert offering['uid'] == ouid
assert offering['uid'] == 'o_00035d2976e6a000_26ace93'
cleanupTmpBaseDir(dbEnv.path())
print("Done Test")
def setupTestPriorOffer(agents, things, ago=600.0):
"""
Utility function to create prior offer in database at ago seconds
in the past.
Agents and things are ODicts of Agents and things in database
offer request fields
{
"thing": thingDID,
"aspirant": AgentDID,
"duration": timeinsecondsofferisopen,
}
offer response fields
{
"thing": thingDID,
"aspirant": AgentDID,
"duration": timeinsecondsofferisopen,
"expiration": datetimeofexpiration,
"signer": serverkeydid,
"offerer": ownerkeydid,
"offer": Base64serrequest
}
"""
# Assumes database setup
dbEnv = dbing.gDbEnv
keeper = keeping.gKeeper
sDid, sVk, sSk = agents['sam'] # server keys
# post offer Ivy to Ann to transfer cam
hDid, hVk, hSk = agents['ivy']
aDid, aVk, aSk = agents['ann']
tDid, tVk, tSk = things['cam']
dt = datetime.datetime.now(tz=datetime.timezone.utc)
# go back ago seconds
td = datetime.timedelta(seconds=ago)
odt = dt - td
stamp = odt.timestamp() # make time.time value
ouid = timing.tuuid(stamp=stamp, prefix="o")
duration = PROPAGATION_DELAY * 2.0
offerer = "{}#0".format(hDid) # ivy is offerer
# build prior request offer for saved offer
poffer = ODict()
poffer['uid'] = ouid
poffer['thing'] = tDid
poffer['aspirant'] = aDid
poffer['duration'] = duration
poser = json.dumps(poffer, indent=2)
# now build offer in database
odat = ODict()
odat['uid'] = ouid
odat['thing'] = tDid
odat['aspirant'] = aDid
odat['duration'] = duration
td = datetime.timedelta(seconds=duration)
expiration = timing.iso8601(odt + td, aware=True)
odat["expiration"] = expiration
signer = "{}#0".format(sDid) # server sam signs
odat["signer"] = signer
odat["offerer"] = offerer
odat["offer"] = keyToKey64u(poser.encode("utf-8"))
oser = json.dumps(odat, indent=2)
osig = keyToKey64u(libnacl.crypto_sign(oser.encode("utf-8"), sSk)[:libnacl.crypto_sign_BYTES])
key = "{}/offer/{}".format(tDid, ouid)
# save offer to database, raise error if duplicate
dbing.putSigned(key=key, ser=oser, sig=osig, clobber=False) # no clobber so error
# save entry to offer expires table
result = dbing.putDidOfferExpire(did=tDid,
ouid=ouid,
expire=expiration)
return (tDid, ouid)
def test_post_ThingDidAccept(client): # client is a fixture in pytest_falcon
"""
Test POST to thing/did/accept with parameter offer uid.
"""
global store # use Global store
print("Testing POST /thing/{did}/accept?uid={ouid}")
priming.setupTest()
dbEnv = dbing.gDbEnv
keeper = keeping.gKeeper
kdid = keeper.did
# create local test server
valet = Valet(port=8101,
bufsize=131072,
store=store,
app=exapp,)
result = valet.open()
assert result
assert valet.servant.ha == ('0.0.0.0', 8101)
assert valet.servant.eha == ('127.0.0.1', 8101)
agents, things = setupTestDbAgentsThings()
agents['sam'] = (kdid, keeper.verkey, keeper.sigkey) # sam the server
for did, vk, sk in agents.values():
dat, ser, sig = dbing.getSelfSigned(did)
assert dat is not None
assert dat['did'] == did
for did, vk, sk in things.values():
dat, ser, sig = dbing.getSigned(did)
assert dat is not None
assert dat['did'] == did
sDid, sVk, sSk = agents['sam'] # server keys
# post offer Ivy to Ann
hDid, hVk, hSk = agents['ivy']
aDid, aVk, aSk = agents['ann']
tDid, tVk, tSk = things['cam']
odid, ouid = setupTestPriorOffer(agents=agents, things=things, ago=10.0) # to test that it checks for priors
assert odid == tDid
# We now have thing in database with offer from ivy to ann
# get think resource and change it to have ann as signer
tdat, tser, tsig = dbing.getSigned(tDid)
# now change signer field to ann as signer
index = 0
signer = "{}#{}".format(aDid, index) # signer field value key at index
assert signer == 'did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=#0'
tdat['signer'] = signer
# change hid field
tdat['hid'] = "hid:dns:localhost#03"
atser = json.dumps(tdat, indent=2)
assert atser == (
'{\n'
' "did": "did:igo:4JCM8dJWw_O57vM4kAtTt0yWqSgBuwiHpVgd55BioCM=",\n'
' "hid": "hid:dns:localhost#03",\n'
' "signer": "did:igo:Qt27fThWoNZsa88VrTkep6H-4HA8tr54sHON1vWl6FE=#0",\n'
' "changed": "2000-01-01T00:00:00+00:00",\n'
' "data": {\n'
' "keywords": [\n'
' "Canon",\n'
' "EOS Rebel T6",\n'
' "251440"\n'
' ],\n'
' "message": "If found please return."\n'
' }\n'
'}')
# now sign
atsig = keyToKey64u(libnacl.crypto_sign(atser.encode("utf-8"), aSk)[:libnacl.crypto_sign_BYTES])
assert atsig == 'm64m1gS1vh3hONpDbbz1MC9Lc412MYtC_H9K-IkMSucTJmqoTAklmg8Q7h3XtAHT-N4RhJmAqsVsjDqPos-zBA=='
# now post to accept offer with new thing resource using web service
headers = {"Content-Type": "text/html; charset=utf-8",
"Signature": 'signer="{}"'.format(atsig)}
body = atser # client.post encodes the body
# patron url quotes path for us so don't quote before
path = "http://{}:{}/thing/{}/accept".format('localhost',
valet.servant.eha[1],
tDid)
qargs = ODict(uid=ouid)
# instantiate Patron client
patron = Patron(bufsize=131072,
store=store,
method = 'POST',
path=path,
headers=headers,
qargs=qargs,
body=body,
reconnectable=True,)
patron.transmit()
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
rep = patron.respond()
assert rep
assert rep.status == 201
assert rep.reason == 'Created'
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
location = falcon.uri.decode(rep.headers['location'])
assert location == "/thing/{}".format(tDid)
assert rep.data == tdat
# verify that its in database
vdat, vser, vsig = dbing.getSigned(tDid)
assert vdat == tdat
assert vser == atser
assert vsig == atsig
# now get it from web service
headers = odict([('Accept', 'application/json'),
('Content-Length', 0)])
patron.request(method='GET', path=location, headers=headers)
timer = timing.StoreTimer(store, duration=1.0)
while (patron.requests or patron.connector.txes or not patron.responses or
not valet.idle()):
valet.serviceAll()
time.sleep(0.05)
patron.serviceAll()
time.sleep(0.05)
store.advanceStamp(0.1)
rep = patron.respond()
assert rep
assert rep.status == 200
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
sigs = parseSignatureHeader(rep.headers['signature'])
assert sigs['signer'] == atsig
assert rep.data == tdat
assert rep.body.decode() == atser
assert verify64u(atsig, rep.body.decode(), keyToKey64u(aVk))
cleanupTmpBaseDir(dbEnv.path())
valet.close()
patron.close()
print("Done Test")
def test_post_Anon(client): # client is a fixture in pytest_falcon
"""
Test POST to /anon?uid={}.
Post body is anon message
{
uid: "AQIDBAoLDA0=", # base64 url safe of 8 byte eid
content: "EjRWeBI0Vng=", # base64 url safe of 8 byte location
date: "2000-01-01T00:36:00+00:00", # ISO-8601 creation date of anon gateway time
}
uid is up 32 bytes
if anon ephemeral ID in base64 url safe
content is message up to 256 bytes
if location string in base 64 url safe
date is iso8601 datetime
This is augmented with server time stamp and stored in database
{
create: 1501774813367861, # creation in server time microseconds since epoch
expire: 1501818013367861, # expiration in server time microseconds since epoch
anon:
{
uid: "AQIDBAoLDA0=", # base64 url safe of 8 byte eid
content: "EjRWeBI0Vng=", # base64 url safe of 8 byte location
date: "2000-01-01T00:36:00+00:00", # ISO-8601 creation date of anon gateway time
}
}
"""
print("Testing POST ")
priming.setupTest()
dbEnv = dbing.gDbEnv
dt = datetime.datetime(2000, 1, 1, minute=30, tzinfo=datetime.timezone.utc)
#stamp = dt.timestamp() # make time.time value
# local time
td = datetime.timedelta(seconds=5)
date = timing.iso8601(dt=dt+td, aware=True)
assert date == '2000-01-01T00:30:05+00:00'
uid = 'AQIDBAoLDA0='
content = 'EjRWeBI0Vng='
anon = ODict()
anon['uid'] = uid
anon['content'] = content
anon['date'] = date
assert anon == {
"uid": "AQIDBAoLDA0=",
"content": "EjRWeBI0Vng=",
"date": "2000-01-01T00:30:05+00:00",
}
tser = json.dumps(anon, indent=2)
# now post anon
headers = {"Content-Type": "text/html; charset=utf-8"}
body = tser # client.post encodes the body
rep = client.post('/anon',
body=body,
headers=headers)
assert rep.status == falcon.HTTP_201
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
location = falcon.uri.decode(rep.headers['location'])
assert location == "/anon?uid={}".format(uid)
data = rep.json
assert data['anon'] == anon
create = rep.json['create']
expire = rep.json['expire']
assert expire > create
# verify that anon is in database
entries = dbing.getAnonMsgs(uid)
assert entries[0] == data
#verify expiration in its database
entries = dbing.getExpireUid(expire)
assert entries[0] == uid
# now get it from web service
rep = client.get(rep.headers['location'])
assert rep.status == falcon.HTTP_OK
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
assert rep.json[0] == data
# now uid list all from web service
rep = client.get("/anon?all=true")
assert rep.status == falcon.HTTP_OK
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
assert rep.json[0] == uid
cleanupTmpBaseDir(dbEnv.path())
print("Done Test")
def test_get_CheckHid(client): # client is a fixture in pytest_falcon
"""
Test GET /demo/check?did={}&check={}
response fields
{
signer: keyedsignerkeyfromagent,
check: did|issuer|date
}
"""
print("Testing GET /demo/check?did={}&check={}")
priming.setupTest()
dbEnv = dbing.gDbEnv
keeper = keeping.gKeeper
kdid = keeper.did
agents, things = setupTestDbAgentsThings()
agents['sam'] = (kdid, keeper.verkey, keeper.sigkey) # sam the server
sDid, sVk, sSk = agents['sam']
hDid, hVk, hSk = agents['ivy']
aDid, aVk, aSk = agents['ann']
tDid, tVk, tSk = things['cam']
iDid, iVk, iSk = agents['ike'] # issuer keys
dt = datetime.datetime(2000, 1, 3, tzinfo=datetime.timezone.utc)
did = iDid
assert did == "did:igo:3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA="
dat, ser, sig = dbing.getSelfSigned(did)
date = timing.iso8601(dt, aware=True)
assert date == '2000-01-03T00:00:00+00:00'
issuer = dat['issuants'][0]['issuer']
assert issuer == "localhost"
check = "{}|{}|{}".format(did, issuer, date)
assert check == "did:igo:3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA=|localhost|2000-01-03T00:00:00+00:00"
# now get it from web service
didUri = falcon.uri.encode_value(did)
checkUri = falcon.uri.encode_value(check)
location = "/demo/check?did={}&check={}".format(didUri, checkUri)
rep = client.get(location)
assert rep.status == falcon.HTTP_OK
assert rep.headers['content-type'] == 'application/json; charset=UTF-8'
sigs = parseSignatureHeader(rep.headers['signature'])
ssig = sigs['signer'] # signature changes everytime because expiration changes
assert ssig == 'efIU4jplMtZzjgaWc85gLjJpmmay6QoFvApMuinHn67UkQZ2it17ZPebYFvmCEKcd0weWQONaTO-ajwQxJe2DA=='
assert rep.json == {'check': 'did:igo:3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA=|localhost|2000-01-03T00:00:00+00:00',
'signer': 'did:igo:3syVH2woCpOvPF0SD9Z0bu_OxNe2ZgxKjTQ961LlMnA=#0'}
assert rep.json['check'] == check
sdid, sindex, keystr = extractDidSignerParts(rep.json['signer'])
assert keystr == dat['keys'][sindex]['key']
assert sindex == 0
assert keystr == keyToKey64u(iVk)
assert verify64u(ssig, rep.json['check'], keystr)
cleanupTmpBaseDir(dbEnv.path())
print("Done Test")
| 34.445393 | 136 | 0.604373 | 10,969 | 96,826 | 5.301942 | 0.071839 | 0.023368 | 0.009285 | 0.017401 | 0.835566 | 0.811304 | 0.77939 | 0.765652 | 0.741613 | 0.729233 | 0 | 0.061357 | 0.253992 | 96,826 | 2,810 | 137 | 34.457651 | 0.743774 | 0.130812 | 0 | 0.758333 | 0 | 0.017188 | 0.287993 | 0.150813 | 0 | 0 | 0 | 0 | 0.220833 | 1 | 0.010417 | false | 0 | 0.016146 | 0.000521 | 0.027604 | 0.020833 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d42c29bc797f28d92830d4fd624fe88937b337e7 | 10,002 | py | Python | Scripts/bgp-peers.py | BeTheMike/Meandering-Python | 5c57597e292e7530739bf9f8b649a82e3f1ca0cf | [
"Apache-2.0"
] | null | null | null | Scripts/bgp-peers.py | BeTheMike/Meandering-Python | 5c57597e292e7530739bf9f8b649a82e3f1ca0cf | [
"Apache-2.0"
] | null | null | null | Scripts/bgp-peers.py | BeTheMike/Meandering-Python | 5c57597e292e7530739bf9f8b649a82e3f1ca0cf | [
"Apache-2.0"
] | 1 | 2018-06-16T22:41:02.000Z | 2018-06-16T22:41:02.000Z | import re
import sys
import socket
import getpass
from lxml import etree
import jxmlease
from jnpr.junos import Device
def ip_lookup(user_input):
pat = re.compile("^\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}$")
lookup_ip = pat.match(user_input)
if lookup_ip:
print('Looking up: ', rev_dns((lookup_ip)))
return lookup_ip
else:
try:
lookup_ip = socket.gethostbyname(user_input)
print("Looking for: ", user_input, "IP - ", lookup_ip)
return lookup_ip
except socket.gaierror:
print("")
print("No DNS Entry exists for that host - exiting script")
print("")
sys.exit(1)
def rev_dns(ip_address):
try:
#ip_address = "'{}'".format(ip_address)
return socket.gethostbyaddr(ip_address)[0]
except:
#ip_address = "'{}'".format(ip_address)
return ip_address
def local_ip_clean(local_ip):
try:
str = local_ip.split("+")
return str[0]
except:
return local_ip
def peer_status(status):
if status == 'Connect':
return 'Con'
elif status == 'Active':
return 'Act'
elif status == 'Established':
return 'Est'
else:
return (status)
def my_Sort(s):
return s[-3:]
def peer_full():
try:
# Log into the switch
hostname = ip_lookup(input('IP or Hostname of device: '))
username = input('Username: ')
password = getpass.getpass('Password: ')
dev = Device(host=hostname, user=username, passwd=password, gather_facts=False)
dev.open()
conf = dev.rpc.get_config()
conf_parsed = jxmlease.parse(etree.tostring(conf, encoding='unicode'))
routing_instances = []
try:
for r_instance in conf_parsed['configuration']['routing-instances']['instance']:
routing_instances.append((r_instance['name'].get_cdata()))
except:
routing_instances.append('Empty')
sorted_instances = sorted(routing_instances, key=my_Sort)
for instance in sorted_instances:
if 'Empty' not in sorted_instances:
try:
rpc = dev.rpc.get_bgp_summary_information(instance=instance)
result = jxmlease.parse(etree.tostring(rpc, pretty_print=True, encoding='unicode'))
print('Routing Instance: ', instance)
for neighbor in result['bgp-information']['bgp-peer']:
print(neighbor['peer-as'] + ":" + rev_dns(neighbor['peer-address']) + '({})'.format(
neighbor['peer-address']),
"[" + peer_status(neighbor['peer-state']), neighbor['elapsed-time'] + "]")
else:
pass
except:
pass
else:
rpc = dev.rpc.get_bgp_summary_information()
result = jxmlease.parse(etree.tostring(rpc, pretty_print=True, encoding='unicode'))
for neighbor in result['bgp-information']['bgp-peer']:
print(neighbor['peer-as'] + ":" + rev_dns(neighbor['peer-address']),
"[" + peer_status(neighbor['peer-state']),
neighbor['elapsed-time'] + "]")
dev.close()
except:
sys.exit('Bad password. Exiting')
def peer24h():
try:
# Log into the switch
hostname = input('IP or Hostname of device: ')
username = input('Username: ')
password = getpass.getpass('Password: ')
dev = Device(host=hostname, user=username, passwd=password, gather_facts=False)
dev.open()
print('\n'*3)
print('This may take a second...')
print('\n'*3)
conf = dev.rpc.get_config()
conf_parsed = jxmlease.parse(etree.tostring(conf, encoding='unicode'))
routing_instances = []
try:
for r_instance in conf_parsed['configuration']['routing-instances']['instance']:
routing_instances.append((r_instance['name'].get_cdata()))
except:
routing_instances.append('Empty')
sorted_instances = sorted(routing_instances, key=my_Sort)
clean_instances = []
for instance in sorted_instances:
if 'Empty' not in sorted_instances:
try:
rpc = dev.rpc.get_bgp_summary_information(instance=instance)
result = jxmlease.parse(etree.tostring(rpc, pretty_print=True, encoding='unicode'))
for neighbor in result['bgp-information']['bgp-peer']:
if 'd' not in neighbor['elapsed-time']:
clean_instances.append(instance)
except:
pass
else:
rpc = dev.rpc.get_bgp_summary_information()
result = jxmlease.parse(etree.tostring(rpc, pretty_print=True, encoding='unicode'))
for neighbor in result['bgp-information']['bgp-peer']:
if 'd' not in neighbor['elapsed-time']:
print(neighbor['peer-as'] + ":" + rev_dns(neighbor['peer-address']),
"[" + peer_status(neighbor['peer-state']),
neighbor['elapsed-time'] + "]")
final_instances = (list(set(clean_instances)))
sorted_final_instances = sorted(final_instances, key=my_Sort)
for instance in sorted_final_instances:
if not sorted_final_instances:
pass
else:
print('Routing Instance: ', instance)
rpc = dev.rpc.get_bgp_summary_information(instance=instance)
result = jxmlease.parse(etree.tostring(rpc, pretty_print=True, encoding='unicode'))
for neighbor in result['bgp-information']['bgp-peer']:
if 'd' not in neighbor['elapsed-time']:
print(neighbor['peer-as'] + ":" + rev_dns(neighbor['peer-address']) + '({})'.format(
neighbor['peer-address']), "[" + peer_status(neighbor['peer-state']),
neighbor['elapsed-time'] + "]")
else:
pass
dev.close()
except:
sys.exit('Bad password. Exiting')
def peer_established():
try:
# Log into the switch
hostname = input('IP or Hostname of device: ')
username = input('Username: ')
password = getpass.getpass('Password: ')
dev = Device(host=hostname, user=username, passwd=password, gather_facts=False)
dev.open()
print('\n'*3)
print('This may take a sec...')
print('\n'*3)
conf = dev.rpc.get_config()
conf_parsed = jxmlease.parse(etree.tostring(conf, encoding='unicode'))
routing_instances = []
try:
for r_instance in conf_parsed['configuration']['routing-instances']['instance']:
routing_instances.append((r_instance['name'].get_cdata()))
except:
routing_instances.append('Empty')
sorted_instances = sorted(routing_instances, key=my_Sort)
clean_instances = []
for instance in sorted_instances:
if 'Empty' not in sorted_instances:
try:
rpc = dev.rpc.get_bgp_summary_information(instance=instance)
result = jxmlease.parse(etree.tostring(rpc, pretty_print=True, encoding='unicode'))
for neighbor in result['bgp-information']['bgp-peer']:
if 'Established' not in neighbor['peer-state']:
clean_instances.append(instance)
else:
pass
except:
pass
else:
rpc = dev.rpc.get_bgp_summary_information()
result = jxmlease.parse(etree.tostring(rpc, pretty_print=True, encoding='unicode'))
for neighbor in result['bgp-information']['bgp-peer']:
if 'Established' not in neighbor['peer-state']:
print(neighbor['peer-as'] + ":" + rev_dns(neighbor['peer-address']),
"[" + peer_status(neighbor['peer-state']),
neighbor['elapsed-time'] + "]")
final_instances = (list(set(clean_instances)))
sorted_final_instances = sorted(final_instances, key=my_Sort)
for instance in sorted_final_instances:
if not sorted_final_instances:
pass
else:
print('Routing Instance: ', instance)
rpc = dev.rpc.get_bgp_summary_information(instance=instance)
result = jxmlease.parse(etree.tostring(rpc, pretty_print=True, encoding='unicode'))
for neighbor in result['bgp-information']['bgp-peer']:
if 'Established' not in neighbor['peer-state']:
print(neighbor['peer-as'] + ":" + rev_dns(neighbor['peer-address']) + '({})'.format(
neighbor['peer-address']), "[" + peer_status(neighbor['peer-state']),
neighbor['elapsed-time'] + "]")
else:
pass
dev.close()
except:
sys.exit('Bad password. Exiting')
def main():
choice = '0'
while choice =='0':
print('1. Full BGP Peers\n2. Partial - Non-Established Only\n3. Partial - Less Than 24 Hours\n4. Exit')
choice = input("Please select an option: ")
if choice == '4':
sys.exit('Exiting Script')
elif choice == '3':
peer24h()
elif choice == '2':
peer_established()
elif choice == '1':
peer_full()
else:
print("Please pick a valid option")
if __name__ == '__main__':
main()
| 38.321839 | 111 | 0.542591 | 1,036 | 10,002 | 5.089768 | 0.143822 | 0.054618 | 0.018775 | 0.054239 | 0.809406 | 0.801062 | 0.784563 | 0.784563 | 0.784563 | 0.774322 | 0 | 0.004792 | 0.332334 | 10,002 | 260 | 112 | 38.469231 | 0.784816 | 0.013597 | 0 | 0.733945 | 0 | 0.009174 | 0.148768 | 0.003651 | 0 | 0 | 0 | 0 | 0 | 1 | 0.041284 | false | 0.087156 | 0.03211 | 0.004587 | 0.123853 | 0.137615 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
d46ca4af3a4ef89a8bccdaa00867d03139c55456 | 627 | py | Python | tests/exception/__init__.py | isabella232/app-passwords | eb6e377b4005a80c263ea36dc5e97234c4f3a7af | [
"Apache-2.0"
] | 27 | 2017-03-06T02:52:59.000Z | 2018-07-17T22:14:21.000Z | tests/exception/__init__.py | LedgerHQ/app-passwords | 06d0795fd1025705db5d621d1e460d8b4dc92f34 | [
"Apache-2.0"
] | 23 | 2020-11-14T19:43:08.000Z | 2022-02-24T00:33:53.000Z | tests/exception/__init__.py | isabella232/app-passwords | eb6e377b4005a80c263ea36dc5e97234c4f3a7af | [
"Apache-2.0"
] | 6 | 2021-04-13T19:00:18.000Z | 2022-03-06T06:33:21.000Z | from .device_exception import DeviceException
from .types import (UnknownDeviceError,
WrongP1P2Error,
WrongDataLengthError,
InsNotSupportedError,
ClaNotSupportedError,
AppNameTooLongError,
ActionCancelledError,
MetadatasParsingError)
__all__ = [
"DeviceException",
"UnknownDeviceError",
"WrongP1P2Error",
"WrongDataLengthError",
"InsNotSupportedError",
"ClaNotSupportedError",
"AppNameTooLongError",
"ActionCancelledError",
"MetadatasParsingError"
]
| 28.5 | 45 | 0.60925 | 26 | 627 | 14.5 | 0.576923 | 0.169761 | 0.275862 | 0.381963 | 0.806366 | 0.806366 | 0.806366 | 0.806366 | 0 | 0 | 0 | 0.009479 | 0.326954 | 627 | 21 | 46 | 29.857143 | 0.883886 | 0 | 0 | 0 | 0 | 0 | 0.266348 | 0.033493 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.1 | 0 | 0.1 | 0 | 1 | 0 | 1 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
d4884402cbe838b412b811a34f60fb065fe9e0c1 | 25,064 | py | Python | structureimpute/explore/test.py | Tsinghua-gongjing/StructureImpute | 59e33e913998a8841c2cb552828f0f0cc19ebc21 | [
"MIT"
] | 9 | 2021-11-17T11:27:41.000Z | 2022-03-04T10:27:37.000Z | structureimpute/explore/test.py | Tsinghua-gongjing/StructureImpute | 59e33e913998a8841c2cb552828f0f0cc19ebc21 | [
"MIT"
] | null | null | null | structureimpute/explore/test.py | Tsinghua-gongjing/StructureImpute | 59e33e913998a8841c2cb552828f0f0cc19ebc21 | [
"MIT"
] | null | null | null | import pandas as pd
import numpy as np
import util
import subprocess
import generate_data_set
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.validation.txt'
# rpkm = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.exp_vs_null.txt'
# # fragment_data_split_by_rpkm(fragment, rpkm)
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.train_randomNULL.txt'
# rpkm = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.exp_vs_null.txt'
# fragment_data_split_by_rpkm(fragment, rpkm)
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.validationNoRandom.txt'
# np.random.seed(1234)
# util.data_random_null(fragment, null_pct=0.1)
# util.data_random_null(fragment, null_pct=0.2)
# util.data_random_null(fragment, null_pct=0.3)
# util.data_random_null(fragment, null_pct=0.4)
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo_rRNA/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.fulllength18S.validation.txt'
# np.random.seed(1234)
# util.fragment_split(fragment=fragment, train_frac=0.8)
# util.data_random_null(fragment, null_pct=0.1, col=9)
# util.data_random_null(fragment, null_pct=0.15, col=9)
# util.data_random_null(fragment, null_pct=0.2, col=9)
# util.data_random_null(fragment, null_pct=0.25, col=9)
# util.data_random_null(fragment, null_pct=0.3, col=9)
# util.data_random_null(fragment, null_pct=0.35, col=9)
# util.data_random_null(fragment, null_pct=0.4, col=9)
# util.data_random_null(fragment, null_pct=0.45, col=9)
# util.data_random_null(fragment, null_pct=0.5, col=9)
# fragment = '/data/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/train_x10_then_pct30_maxL20/windowLen100.sliding100.trainx10.txt'
# np.random.seed(1234)
# util.data_random_null(fragment, null_pct=0.3, col=9)
# for i in ['hek_ch_vivo', 'hek_cy_vivo', 'hek_np_vivo', 'hek_wc_vitro']:
# fragment = '/home/gongjing/project/shape_imputation/data/{}/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.train+validation_truenull.inwc6205.txt'.format(i)
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.validation_truenull.inwc6205.txt'
# fragment = '/home/gongjing/project/shape_imputation/data/mes_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.train+validation_truenull.txt'
# fragment = '/home/gongjing/project/shape_imputation/data/mes_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.validation_truenull_rmnull.txt'
# np.random.seed(1234)
# savefn = fragment.replace('truenull', 'truenull_randomNULL0.3')
# util.data_random_null(fragment, null_pct=0.3, col=9, savefn=savefn)
# seed_ls = [1234, 9999, 5678, 12315, 400100, 42, 1113, 2019, 19930426, 19491001]
# seed_ls = [1, 67789, 83920001, 20200202, 7381910, 987, 92029, 18273, 29191, 5362]
# seed_ls = [567, 3412, 9090, 20148, 191901, 90901, 782716, 9101919, 19181918, 27181910]
# for seed in seed_ls:
# np.random.seed(seed)
# # fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo_30/3.shape/shape.c200T2M0m0.out.windowsHasNull/random_null/windowLen100.sliding100.valid_both.selflabel.train_truenull.txt'
# fragment = '/home/gongjing/project/shape_imputation/data/hek_np_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/random_null/windowLen100.sliding100.train_truenull.txt'
# savefn=fragment.replace('.txt', '.random0.1.s{}.txt'.format(seed))
# util.data_random_null(fragment, null_pct=0.1, col=9, savefn=savefn)
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen50.sliding50.txt'
# util.fragment_to_format_data(fragment=fragment, fragment_len=50, split=1, dataset='train')
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.txt'
# util.fragment_split(fragment=fragment, train_frac=0.8)
out = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out'
# generate_data_set.generate_windows(out=out, window_len_ls=[100], sliding_ls=[10], species='human', all_valid_reactivity=1, null_pct_max=0.9)
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding10.txt'
# np.random.seed(1234)
# fragment_train,fragment_validate = util.fragment_split(fragment=fragment, train_frac=0.7, cols=8)
# util.data_random_null(fragment_train, null_pct=0.3, col=9, savefn=None)
# util.data_random_null(fragment_validate, null_pct=0.3, col=9, savefn=None)
# util.data_random_null('/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding10.blastn.validation.txt', null_pct=0.3, col=8, savefn=None)
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo_50/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.valid_both.txt'
# util.fragment_split(fragment=fragment, train_frac=0.7, cols=9)
# util.data_random_null('/home/gongjing/project/shape_imputation/data/hek_wc_vivo_50/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.valid_both.train_truenull.txt', null_pct=0.2, col=9)
# util.data_random_null('/home/gongjing/project/shape_imputation/data/hek_wc_vivo_50/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.valid_both.validation_truenull.txt', null_pct=0.2, col=9)
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo_50/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.allvalid.txt'
# util.fragment_split(fragment=fragment, train_frac=0.7, cols=9)
# util.data_random_null('/home/gongjing/project/shape_imputation/data/hek_wc_vivo_50/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.allvalid.train_truenull.txt', null_pct=0.1, col=9)
# util.data_random_null('/home/gongjing/project/shape_imputation/data/hek_wc_vivo_50/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.allvalid.validation_truenull.txt', null_pct=0.1, col=9)
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo_50/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.valid_both.txt'
# util.fragment_split(fragment=fragment, train_frac=0.7, cols=9)
# util.data_random_null('/home/gongjing/project/shape_imputation/data/hek_wc_vivo_50/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.valid_both.train_truenull.txt', null_pct=0.1, col=9)
# util.data_random_null('/home/gongjing/project/shape_imputation/data/hek_wc_vivo_50/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.valid_both.validation_truenull.txt', null_pct=0.1, col=9)
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/train_random/windowLen100.sliding100.train.txt'
# np.random.seed(1234)
# savefn=fragment.replace('.txt', '.{}.txt'.format(1234))
# util.data_random_null(fragment, null_pct=0.1, col=8, savefn=savefn)
# np.random.seed(9999)
# savefn=fragment.replace('.txt', '.{}.txt'.format(9999))
# util.data_random_null(fragment, null_pct=0.1, col=8, savefn=savefn)
# np.random.seed(5678)
# savefn=fragment.replace('.txt', '.{}.txt'.format(5678))
# util.data_random_null(fragment, null_pct=0.1, col=8, savefn=savefn)
# np.random.seed(12315)
# savefn=fragment.replace('.txt', '.{}.txt'.format(12315))
# util.data_random_null(fragment, null_pct=0.1, col=8, savefn=savefn)
# np.random.seed(400100)
# savefn=fragment.replace('.txt', '.{}.txt'.format(400100))
# util.data_random_null(fragment, null_pct=0.1, col=8, savefn=savefn)
# np.random.seed(42)
# savefn=fragment.replace('.txt', '.{}.txt'.format(42))
# util.data_random_null(fragment, null_pct=0.1, col=8, savefn=savefn)
# np.random.seed(1113)
# savefn=fragment.replace('.txt', '.{}.txt'.format(1113))
# util.data_random_null(fragment, null_pct=0.1, col=8, savefn=savefn)
# np.random.seed(2019)
# savefn=fragment.replace('.txt', '.{}.txt'.format(2019))
# util.data_random_null(fragment, null_pct=0.1, col=8, savefn=savefn)
# np.random.seed(19930426)
# savefn=fragment.replace('.txt', '.{}.txt'.format(19930426))
# util.data_random_null(fragment, null_pct=0.1, col=8, savefn=savefn)
# np.random.seed(19491001)
# savefn=fragment.replace('.txt', '.{}.txt'.format(19491001))
# util.data_random_null(fragment, null_pct=0.1, col=8, savefn=savefn)
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/train_randomnullfragment/windowLen100.sliding100.train.txt'
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/validation_randomnullfragment/windowLen100.sliding100.validation.txt'
# np.random.seed(1234)
# savefn=fragment.replace('.txt', '.1perfragmentL{}.S{}.txt'.format(5, 1234))
# util.data_random_nullfragament(fragment, null_pct=0.1, col=8, savefn=savefn, mode='1perfragment', null_len=5, window_len=100)
# seed_ls = [1234, 9999, 5678, 12315, 400100, 42, 1113, 2019, 19930426, 19491001]
# null_len_ls = [5,10,15,20]
# for seed in seed_ls:
# for null_len in null_len_ls:
# np.random.seed(seed)
# savefn=fragment.replace('.txt', '.1perfragmentL{}.S{}.txt'.format(null_len, seed))
# util.data_random_nullfragament(fragment, null_pct=0.1, col=8, savefn=savefn, mode='1perfragment', null_len=null_len, window_len=100)
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vitro/3.shape/shape.c200T2M0m0.out.windowsHasNull/random_null/windowLen100.sliding100.validation_truenull.txt'
# fragment = '/home/gongjing/project/shape_imputation/data/hek_cy_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/random_null/windowLen100.sliding100.train_truenull.txt'
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding10.random/windowLen100.sliding10.blastn.validation.txt'
# seed_ls = [1234,9999, 5678, 12315, 400100, 42, 1113, 2019, 19930426, 19491001]
# max_null_len_ls = [20]#[10, 20, 40]# [10, 20]#
# null_pct_ls = [0.3]#[0.3, 0.2, 0.1]#[0.3, 0.2, 0.1]
# for seed in seed_ls:
# for max_null_len in max_null_len_ls:
# for null_pct in null_pct_ls:
# np.random.seed(seed)
# savefn=fragment.replace('.txt', '.randomNperfragmentNullPct{}.maxL{}.S{}.txt'.format(null_pct, max_null_len, seed))
# util.data_random_nullfragament(fragment, null_pct=null_pct, col=9, savefn=savefn, mode='randomNperfragment', null_len=5, window_len=100, max_null_len=max_null_len)
# sample_ls = ['22_trainLossall', '23_trainLossall_Gmultiply', '24_trainLossall_biLSTM', '25_trainLossall_biLSTMHid256', '26_trainLossall_biLSTMLay3', '27_trainLossall_biLSTMHid256Lay3', '28_trainLossall_LR0001', '29_trainLossall_LR0001_rep2']
# sample_ls = ['40_trainLossall_GmultiplyX_noise2']
# sample_ls = ['35_trainLossall_GmultiplyX_channel','38_trainLossall_GmultiplyX_biLSTMHid256Lay3','37_trainLossall_GmultiplyX_biLSTMLay3','36_trainLossall_GmultiplyX_biLSTMHid256','32_trainLossall_GmultiplyX_meswc']
# for sample in sample_ls:
# log = '/home/gongjing/project/shape_imputation/exper/{}/log.txt'.format(sample)
# util.read_log(log=log, savefn=log.replace('log.txt', 'loss.pdf'))
# generate noise data, not include original batch
# fragment = '/home/gongjing/project/shape_imputation/data/mes_wc_vivo/3.shape/c200T2/w100s100.train_null0.1.txt'
# savefn = fragment.replace('.txt','.noise2.txt')
# util.data_add_noise(fragment=fragment, ratio=2, col=7, seed=1234, savefn=savefn, noise=0.05)
# savefn = fragment.replace('.txt','.noise3.txt')
# util.data_add_noise(fragment=fragment, ratio=3, col=7, seed=1234, savefn=savefn, noise=0.05)
# savefn = fragment.replace('.txt','.noise4.txt')
# util.data_add_noise(fragment=fragment, ratio=4, col=7, seed=1234, savefn=savefn, noise=0.05)
# savefn = fragment.replace('.txt','.noise5.txt')
# util.data_add_noise(fragment=fragment, ratio=5, col=7, seed=1234, savefn=savefn, noise=0.05)
# savefn = fragment.replace('.txt','.noise10.txt')
# util.data_add_noise(fragment=fragment, ratio=10, col=7, seed=1234, savefn=savefn, noise=0.05)
# savefn = fragment.replace('.txt','.0.1noise2.txt')
# util.data_add_noise(fragment=fragment, ratio=2, col=7, seed=1234, savefn=savefn, noise=0.1)
# savefn = fragment.replace('.txt','.0.1noise5.txt')
# util.data_add_noise(fragment=fragment, ratio=5, col=7, seed=1234, savefn=savefn, noise=0.1)
# savefn = fragment.replace('.txt','.0.1noise10.txt')
# util.data_add_noise(fragment=fragment, ratio=10, col=7, seed=1234, savefn=savefn, noise=0.1)
# get .fa
# bed = '/home/gongjing/project/shape_imputation/data/CLIP/human_trx_clip/CLIPDB20123_FXR2_HEK293_trx.tx_has_shape_region_null_ok.bed'
# util.bed_get_fa(bed=bed, species='human')
# # search with fimo
# # /home/gongjing/software/meme_4.12.0/bin/fimo -oc ./test --thresh 0.05 --norc ./motif/Collapsed.used.meme ./human_trx_clip/CLIPDB20123_FXR2_HEK293_trx.tx_has_shape_region_null_ok.bed.fa
# motif_meme='/home/gongjing/project/shape_imputation/data/CLIP/motif/Collapsed.used.meme'
# fimo_dir=bed.replace('.bed', '.fimo')
# fimo='/home/gongjing/software/meme_4.12.0/bin/fimo'
# subprocess.call(["{} -oc {} --thresh 0.05 --norc {} {}".format(fimo, fimo_dir, motif_meme, bed.replace('.bed', '.fa'))], shell=True)
# util.fimo_convert('{}/fimo.txt'.format(fimo_dir))
# util.bed_fimo(bed=bed, species='human')
# bed = '/home/gongjing/project/shape_imputation/data/CLIP/human_trx_clip/CLIPDB20123_FXR2_HEK293_trx.tx_has_shape_region_null_exceed.bed'
# util.bed_fimo(bed=bed, species='human')
# util.bed_fimo(bed='/home/gongjing/project/shape_imputation/data/CLIP/human_trx_clip/CLIPDB20121_FXR1_HEK293_trx.tx_has_shape_region_null_ok.bed', species='human', motif='FXR')
# util.bed_fimo(bed='/home/gongjing/project/shape_imputation/data/CLIP/human_trx_clip/CLIPDB20121_FXR1_HEK293_trx.tx_has_shape_region_null_exceed.bed', species='human', motif='FXR')
# util.bed_fimo(bed='/home/gongjing/project/shape_imputation/data/CLIP/human_trx_clip/CLIPDB20122_FXR1_HEK293_trx.tx_has_shape_region_null_ok.bed', species='human', motif='FXR')
# util.bed_fimo(bed='/home/gongjing/project/shape_imputation/data/CLIP/human_trx_clip/CLIPDB20122_FXR1_HEK293_trx.tx_has_shape_region_null_exceed.bed', species='human', motif='FXR')
# util.bed_fimo(bed='/home/gongjing/project/shape_imputation/data/CLIP/human_trx_clip/CLIPDB20124_FXR2_HEK293_trx.tx_has_shape_region_null_ok.bed', species='human', motif='FXR')
# util.bed_fimo(bed='/home/gongjing/project/shape_imputation/data/CLIP/human_trx_clip/CLIPDB20124_FXR2_HEK293_trx.tx_has_shape_region_null_exceed.bed', species='human', motif='FXR')
# util.bed_fimo(bed='/home/gongjing/project/shape_imputation/data/CLIP/human_trx_clip/CLIPDB20173_LIN28A_HEK293_trx.tx_has_shape_region_null_ok.bed', species='human', motif='LIN28')
# util.bed_fimo(bed='/home/gongjing/project/shape_imputation/data/CLIP/human_trx_clip/CLIPDB20173_LIN28A_HEK293_trx.tx_has_shape_region_null_exceed.bed', species='human', motif='LIN28')
# IGFBP1
# util.bed_fimo(bed='/home/gongjing/project/shape_imputation/data/CLIP/human_trx_clip/CLIPDB20161_IGF2BP1_HEK293_trx.tx_has_shape_region_null_ok.bed', species='human', motif='IGF2BP1_11')
# util.bed_fimo(bed='/home/gongjing/project/shape_imputation/data/CLIP/human_trx_clip/CLIPDB20162_IGF2BP1_HEK293_trx.tx_has_shape_region_null_ok.bed', species='human', motif='IGF2BP1_11')
# util.bed_fimo(bed='/home/gongjing/project/shape_imputation/data/CLIP/human_trx_clip/CLIPDB20161_IGF2BP1_HEK293_trx.tx_has_shape_region_null_exceed.bed', species='human', motif='IGF2BP1_11')
# util.bed_fimo(bed='/home/gongjing/project/shape_imputation/data/CLIP/human_trx_clip/CLIPDB20162_IGF2BP1_HEK293_trx.tx_has_shape_region_null_exceed.bed', species='human', motif='IGF2BP1_11')
# bed = '/home/gongjing/project/shape_imputation/data/RBMbase/download_20191204/RMBase_hg38_all_m6AOld_site.tran.tx_has_shape_base_valid.bed'
# fa0 = util.bed_get_fa(bed=bed, species='human', extend=0, write_new_bed=1)
# fa5 = util.bed_get_fa(bed=bed, species='human', extend=5, write_new_bed=1)
# fa10 = util.bed_get_fa(bed=bed, species='human', extend=10, write_new_bed=1)
# fa50 = util.bed_get_fa(bed=bed, species='human', extend=50, write_new_bed=1)
# subprocess.call(["/home/gongjing/.local/bin/weblogo -f {} -D fasta -o {} -F pdf".format(fa0, fa0.replace('.fa', '.pdf'))], shell=True)
# subprocess.call(["/home/gongjing/.local/bin/weblogo -f {} -D fasta -o {} -F pdf".format(fa5, fa5.replace('.fa', '.pdf'))], shell=True)
# subprocess.call(["/home/gongjing/.local/bin/weblogo -f {} -D fasta -o {} -F pdf".format(fa10, fa10.replace('.fa', '.pdf'))], shell=True)
# subprocess.call(["/home/gongjing/.local/bin/weblogo -f {} -D fasta -o {} -F pdf".format(fa50, fa50.replace('.fa', '.pdf'))], shell=True)
# fragment = '/home/gongjing/project/shape_imputation/data/RBMbase/download_20191204/RMBase_hg38_all_m6A_site.tran.tx_has_shape_base_valid.e10.bed.shape100.txt'
# util.flter_null_fragment(fragment=fragment, col=7, null_pct=0.2)
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/train_random/windowLen100.sliding100.train.txt'
# np.random.seed(1234)
# savefn = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/train_evenrandom/windowLen100.sliding100.train.txt'
# util.data_null_even_in_interval(fragment=fragment, null_pct=0.1, col=8, savefn=savefn)
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/train_random/windowLen100.sliding100.train.txt'
# np.random.seed(1234)
# savefn = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/train_randomnullfragment/windowLen100.sliding100.train.1perfragment_even.S{}.txt'.format(1234)
# util.data_random_nullfragament(fragment, null_pct=0.1, col=8, savefn=savefn, mode='1perfragment_even', null_len=5, window_len=100, max_null_len=10)
# seed_ls = [1234, 9999, 5678, 12315, 400100, 42, 1113, 2019, 19930426, 19491001]
# null_len_ls = [20]#[5,10,15,20]
# for seed in seed_ls:
# for null_len in null_len_ls:
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/train_random/windowLen100.sliding100.train.txt'
# np.random.seed(seed)
# savefn = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/train_randomnullfragment/windowLen100.sliding100.train.1perfragment_evenL{}.S{}.txt'.format(null_len, seed)
# util.data_random_nullfragament(fragment, null_pct=0.1, col=8, savefn=savefn, mode='1perfragment_even', null_len=null_len, window_len=100, max_null_len=10)
# seed_ls = [1234, 9999, 5678, 12315, 400100, 42, 1113, 2019, 19930426, 19491001]
# for seed in seed_ls:
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/train_random/windowLen100.sliding100.train.txt'
# savefn = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/train_evenrandom/windowLen100.sliding100.train.pct0.15.s{}.txt'.format(seed)
# util.data_null_even_in_interval(fragment=fragment, null_pct=0.15, col=8, savefn=savefn, seed=seed)
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/validation_evenrandom/windowLen100.sliding100.validation.txt'
# np.random.seed(1234)
# savefn = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/validation_evenrandom/windowLen100.sliding100.validation.s1234.txt'
# util.data_null_even_in_interval(fragment=fragment, null_pct=0.1, col=8, savefn=savefn, seed=1234)
# fn = '/home/gongjing/project/shape_imputation/data/RBMbase/download_20191204/RMBase_hg38_all_m6A_site.tran.tx_has_shape_base_valid.e0.bed.sort.shape'
# savefn = '/home/gongjing/project/shape_imputation/data/RBMbase/download_20191204/RMBase_hg38_all_m6A_site.tran.tx_has_shape_base_valid.e0.bed.sort.shape.heatmap.pdf'
# util.plot_heatmap(fn=fn, savefn=savefn, value_col=3, fig_size_x=10, fig_size_y=20, cmap='summer', facecolor='black')
# fn = '/home/gongjing/project/shape_imputation/data/RBMbase/download_20191204/RMBase_hg38_all_m6A_site.tran.tx_has_shape_base_null.e0.bed.sort.shape'
# savefn = '/home/gongjing/project/shape_imputation/data/RBMbase/download_20191204/RMBase_hg38_all_m6A_site.tran.tx_has_shape_base_null.e0.bed.sort.shape.heatmap.pdf'
# util.plot_heatmap(fn=fn, savefn=savefn, value_col=3, fig_size_x=10, fig_size_y=20, cmap='summer', facecolor='black')
# fn = '/home/gongjing/project/shape_imputation/data/RBMbase/download_20191204/RMBase_hg38_all_m6AsearchNegative_site.tran.tx_has_shape_base_null.e0.bed.sort.shape'
# savefn = '/home/gongjing/project/shape_imputation/data/RBMbase/download_20191204/RMBase_hg38_all_m6AsearchNegative_site.tran.tx_has_shape_base_null.e0.bed.sort.shape.heatmap.pdf'
# util.plot_heatmap(fn=fn, savefn=savefn, value_col=3, fig_size_x=10, fig_size_y=20, cmap='summer', facecolor='black')
### mes/hek293 cy/np/ch
# fragment = '/home/gongjing/project/shape_imputation/data/mes_wc_vitro/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.txt'
# np.random.seed(1234)
# fragment_train,fragment_validate = util.fragment_split(fragment=fragment, train_frac=0.7, cols=8)
# util.data_random_null(fragment_train, null_pct=0.1, col=9, savefn=None)
# util.data_random_null(fragment_validate, null_pct=0.1, col=9, savefn=None)
# sample_ls = ['mes_ch_vitro','mes_ch_vivo', 'mes_cy_vitro', 'mes_cy_vivo', 'mes_np_vitro', 'mes_np_vivo']
# sample_ls = ['hek_ch_vitro','hek_ch_vivo', 'hek_cy_vitro', 'hek_cy_vivo', 'hek_np_vitro', 'hek_np_vivo']
# sample_ls = ['hek_wc_vitro']
# species = 'human'
# for sample in sample_ls:
# out = '/home/gongjing/project/shape_imputation/data/{}/3.shape/shape.c200T2M0m0.out'.format(sample)
# generate_data_set.generate_windows(out=out, window_len_ls=None, sliding_ls=None, species=species, all_valid_reactivity=1, null_pct_max=0.9)
# fragment = '/home/gongjing/project/shape_imputation/data/{}/3.shape/shape.c200T2M0m0.out.windowsHasNull/windowLen100.sliding100.txt'.format(sample)
# np.random.seed(1234)
# fragment_train,fragment_validate = util.fragment_split(fragment=fragment, train_frac=0.7, cols=8)
# util.data_random_null(fragment_train, null_pct=0.1, col=9, savefn=None)
# util.data_random_null(fragment_validate, null_pct=0.1, col=9, savefn=None)
# out = '/home/gongjing/project/shape_imputation/data/CIRSseq/CIRSseq_mES.out'
# species = 'mouse(CIRSseq)'
# generate_data_set.generate_windows(out=out, window_len_ls=None, sliding_ls=None, species=species, all_valid_reactivity=1, null_pct_max=0.9)
# fragment = '/home/gongjing/project/shape_imputation/data/CIRSseq/CIRSseq_mES.out.windowsHasNull/windowLen100.sliding100.txt'
# np.random.seed(1234)
# fragment_train,fragment_validate = util.fragment_split(fragment=fragment, train_frac=0.7, cols=8)
# util.data_random_null(fragment_train, null_pct=0.1, col=9, savefn=None)
# util.data_random_null(fragment_validate, null_pct=0.1, col=9, savefn=None)
# fragment = '/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/shape.c200T2M0m0.out.windowsHasNull/validation_randomnullfragment/windowLen100.sliding100.validation.txt'
# seed_ls = [1234, 9999, 5678, 12315, 400100, 42, 1113, 2019, 19930426, 19491001]
# for seed in seed_ls:
# np.random.seed(seed)
# savefn = fragment.replace('.txt', '.randomNullDist.S{}.txt'.format(seed))
# util.fragment_random_based_on_dist(fragment=fragment, col=8, savefn=savefn)
icshape_fragment_pct_plus_exceed_pct2='/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/test_prediction/hek_wc.out.c80.newwithNULL.nominus.predict/iteration1/allfragment.0.5+exceed0.5.txt2'
icshape_fragment_pct_plus_exceed_predict='/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/test_prediction/hek_wc.out.c80.newwithNULL.nominus.predict/iteration1/allfragment.0.5+exceed0.5.txt2.predict'
icshape_fragment_pct_plus_exceed_predict_shapeout='/home/gongjing/project/shape_imputation/data/hek_wc_vivo/3.shape/test_prediction/hek_wc.out.c80.newwithNULL.nominus.predict/iteration1/allfragment.0.5+exceed0.5.txt2.predict.out'
util.predict_to_shape(validation=icshape_fragment_pct_plus_exceed_pct2, predict=icshape_fragment_pct_plus_exceed_predict, shape_out=icshape_fragment_pct_plus_exceed_predict_shapeout)
| 79.066246 | 243 | 0.790576 | 3,827 | 25,064 | 4.926836 | 0.078652 | 0.052188 | 0.081676 | 0.122514 | 0.887298 | 0.869425 | 0.841793 | 0.82551 | 0.81188 | 0.774596 | 0 | 0.08357 | 0.059488 | 25,064 | 316 | 244 | 79.316456 | 0.716286 | 0.935685 | 0 | 0 | 1 | 0.4 | 0.444444 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 11 |
2e05d573e692fc7490b95e03547bb706b1413ad9 | 13,672 | py | Python | tests/nuodb_statement_management_test.py | jgetto/nuodb-python | 3a22260e801d8f9d9bd33f911a694e8caeba7282 | [
"BSD-3-Clause"
] | null | null | null | tests/nuodb_statement_management_test.py | jgetto/nuodb-python | 3a22260e801d8f9d9bd33f911a694e8caeba7282 | [
"BSD-3-Clause"
] | null | null | null | tests/nuodb_statement_management_test.py | jgetto/nuodb-python | 3a22260e801d8f9d9bd33f911a694e8caeba7282 | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
These tests assume that the quickstart database exists.
To create it run /opt/nuodb/run-quickstart or use the web console.
"""
import unittest
import decimal
from nuodb_base import NuoBase
class NuoDBStatementManagementTest(NuoBase):
def test_stable_statement(self):
con = self._connect()
cursor = con.cursor()
init_handle = extract_statement_handle(cursor)
cursor.execute("drop table typetest if exists")
try:
cursor.execute("create table typetest (id integer GENERATED ALWAYS AS IDENTITY, smallint_col smallint, "
"integer_col integer, bigint_col bigint, numeric_col numeric(10, 2), "
"decimal_col decimal(10, 2), number_col number, double_col double precision)")
cursor.execute("insert into typetest (smallint_col, integer_col, bigint_col, numeric_col, decimal_col, "
"number_col, double_col) values (0, 0, 0, 0, 0, 0, 0)")
con.commit()
cursor.execute("select * from typetest order by id desc limit 1")
row = cursor.fetchone()
for i in xrange(1, len(row)):
self.assertEqual(row[i], 0)
current_handle = extract_statement_handle(cursor)
self.assertEqual(init_handle, current_handle)
finally:
try:
cursor.execute("drop table typetest if exists")
finally:
con.close()
def test_statement_per_cursor(self):
con = self._connect()
try:
cursor1 = con.cursor()
cursor2 = con.cursor()
cursor3 = con.cursor()
self.assertNotEqual(extract_statement_handle(cursor1), extract_statement_handle(cursor2))
self.assertNotEqual(extract_statement_handle(cursor2), extract_statement_handle(cursor3))
self.assertNotEqual(extract_statement_handle(cursor1), extract_statement_handle(cursor3))
finally:
con.close()
def test_prepared_statement_cache(self):
con = self._connect()
cursor = con.cursor()
cursor.execute("drop table typetest if exists")
try:
cursor.execute("create table typetest (id integer GENERATED ALWAYS AS IDENTITY, smallint_col smallint, "
"integer_col integer, bigint_col bigint, numeric_col numeric(10, 2), "
"decimal_col decimal(10, 2), number_col number, double_col double)")
test_vals = (3424, 23453464, 45453453454545, decimal.Decimal('234355.33'), decimal.Decimal('976.2'),
decimal.Decimal('34524584057.3434234'), 10000.999)
query = "insert into typetest (smallint_col, integer_col, bigint_col, numeric_col, decimal_col, " \
"number_col, double_col) values (?, ?, ?, ?, ?, ?, ?)"
cursor.execute(query, test_vals)
cursor.execute("select * from typetest order by id desc limit 1")
row = cursor.fetchone()
for i in xrange(1, len(row)):
self.assertEqual(row[i], test_vals[i - 1])
ps_cache = extract_prepared_statement_dict(cursor)
self.assertEqual(1, len(ps_cache))
self.assertIn(query, ps_cache)
finally:
try:
cursor.execute("drop table typetest if exists")
finally:
con.close()
def test_prepared_statement_cache_should_not_grow(self):
con = self._connect()
cursor = con.cursor()
cursor.execute("drop table typetest if exists")
try:
cursor.execute("create table typetest (id integer GENERATED ALWAYS AS IDENTITY, smallint_col smallint, "
"integer_col integer, bigint_col bigint, numeric_col numeric(10, 2), "
"decimal_col decimal(10, 2), number_col number, double_col double)")
test_vals = (3424, 23453464, 45453453454545, decimal.Decimal('234355.33'), decimal.Decimal('976.2'),
decimal.Decimal('34524584057.3434234'), 10000.999)
query = "insert into typetest (smallint_col, integer_col, bigint_col, numeric_col, decimal_col, " \
"number_col, double_col) values (?, ?, ?, ?, ?, ?, ?)"
for _ in xrange(0, 20):
cursor.execute(query, test_vals)
cursor.execute("select * from typetest order by id desc limit 1")
row = cursor.fetchone()
for i in xrange(1, len(row)):
self.assertEqual(row[i], test_vals[i - 1])
ps_cache = extract_prepared_statement_dict(cursor)
self.assertEqual(1, len(ps_cache))
self.assertIn(query, ps_cache)
finally:
try:
cursor.execute("drop table typetest if exists")
finally:
con.close()
def test_prepared_statement_cache_stable(self):
con = self._connect()
cursor = con.cursor()
cursor.execute("drop table typetest if exists")
try:
cursor.execute("create table typetest (id integer GENERATED ALWAYS AS IDENTITY, smallint_col smallint, "
"integer_col integer, bigint_col bigint, numeric_col numeric(10, 2), "
"decimal_col decimal(10, 2), number_col number, double_col double)")
test_vals = (3424, 23453464, 45453453454545, decimal.Decimal('234355.33'), decimal.Decimal('976.2'),
decimal.Decimal('34524584057.3434234'), 10000.999)
query = "insert into typetest (smallint_col, integer_col, bigint_col, numeric_col, decimal_col, " \
"number_col, double_col) values (?, ?, ?, ?, ?, ?, ?)"
handle = None
for _ in xrange(0, 20):
cursor.execute(query, test_vals)
cursor.execute("select * from typetest order by id desc limit 1")
row = cursor.fetchone()
for i in xrange(1, len(row)):
self.assertEqual(row[i], test_vals[i - 1])
ps_cache = extract_prepared_statement_dict(cursor)
self.assertEqual(1, len(ps_cache))
self.assertIn(query, ps_cache)
if handle is None:
handle = ps_cache[query].handle
else:
self.assertEqual(handle, ps_cache[query].handle)
finally:
try:
cursor.execute("drop table typetest if exists")
finally:
con.close()
def test_prepared_statement_cache_should_grow(self):
con = self._connect()
cursor = con.cursor()
cursor.execute("drop table typetest if exists")
try:
cursor.execute("create table typetest (id integer GENERATED ALWAYS AS IDENTITY, smallint_col smallint, "
"integer_col integer, bigint_col bigint, numeric_col numeric(10, 2), "
"decimal_col decimal(10, 2), number_col number, double_col double)")
test_vals = (3424, 23453464, 45453453454545, decimal.Decimal('234355.33'), decimal.Decimal('976.2'),
decimal.Decimal('34524584057.3434234'), 10000.999)
queries = ["insert into typetest (smallint_col, integer_col, bigint_col, numeric_col, decimal_col, "
"number_col, double_col) values (?, ?, ?, ?, ?, ?, ?)",
"insert into typetest (smallint_col, integer_col, bigint_col, numeric_col, decimal_col, "
"number_col) values (?, ?, ?, ?, ?, ?)",
"insert into typetest (smallint_col, integer_col, bigint_col, numeric_col, decimal_col) values "
"(?, ?, ?, ?, ?)",
"insert into typetest (smallint_col, integer_col, bigint_col, numeric_col) values (?, ?, ?, ?)",
"insert into typetest (smallint_col, integer_col, bigint_col) values (?, ?, ?)",
"insert into typetest (smallint_col, integer_col) values (?, ?)",
"insert into typetest (smallint_col) values (?)"]
for _ in xrange(0, 10):
for i in xrange(0, len(queries)):
cursor.execute(queries[i], test_vals[0:len(queries) - i])
ps_cache = extract_prepared_statement_dict(cursor)
self.assertEqual(len(queries), len(ps_cache))
for query in queries:
self.assertIn(query, ps_cache)
finally:
try:
cursor.execute("drop table typetest if exists")
finally:
con.close()
def test_prepared_statement_cache_eviction(self):
con = self._connect()
cache_size = 5
cursor = con.cursor(prepared_statement_cache_size=cache_size)
cursor.execute("drop table typetest if exists")
try:
cursor.execute("create table typetest (id integer GENERATED ALWAYS AS IDENTITY, smallint_col smallint, "
"integer_col integer, bigint_col bigint, numeric_col numeric(10, 2), "
"decimal_col decimal(10, 2), number_col number, double_col double)")
test_vals = (3424, 23453464, 45453453454545, decimal.Decimal('234355.33'), decimal.Decimal('976.2'),
decimal.Decimal('34524584057.3434234'), 10000.999)
queries = ["insert into typetest (smallint_col, integer_col, bigint_col, numeric_col, decimal_col, "
"number_col, double_col) values (?, ?, ?, ?, ?, ?, ?)",
"insert into typetest (smallint_col, integer_col, bigint_col, numeric_col, decimal_col, "
"number_col) values (?, ?, ?, ?, ?, ?)",
"insert into typetest (smallint_col, integer_col, bigint_col, numeric_col, decimal_col) values "
"(?, ?, ?, ?, ?)",
"insert into typetest (smallint_col, integer_col, bigint_col, numeric_col) values (?, ?, ?, ?)",
"insert into typetest (smallint_col, integer_col, bigint_col) values (?, ?, ?)",
"insert into typetest (smallint_col, integer_col) values (?, ?)",
"insert into typetest (smallint_col) values (?)"]
for i in xrange(0, len(queries)):
cursor.execute(queries[i], test_vals[0:len(queries) - i])
ps_cache = extract_prepared_statement_dict(cursor)
self.assertEqual(cache_size, len(ps_cache))
for query in queries[len(queries) - cache_size:]:
self.assertIn(query, ps_cache)
finally:
try:
cursor.execute("drop table typetest if exists")
finally:
con.close()
def test_prepared_statement_cache_eviction_lru(self):
con = self._connect()
cache_size = 5
cursor = con.cursor(prepared_statement_cache_size=cache_size)
cursor.execute("drop table typetest if exists")
try:
cursor.execute("create table typetest (id integer GENERATED ALWAYS AS IDENTITY, smallint_col smallint, "
"integer_col integer, bigint_col bigint, numeric_col numeric(10, 2), "
"decimal_col decimal(10, 2), number_col number, double_col double)")
test_vals = (3424, 23453464, 45453453454545, decimal.Decimal('234355.33'), decimal.Decimal('976.2'),
decimal.Decimal('34524584057.3434234'), 10000.999)
queries = ["insert into typetest (smallint_col, integer_col, bigint_col, numeric_col, decimal_col, "
"number_col, double_col) values (?, ?, ?, ?, ?, ?, ?)",
"insert into typetest (smallint_col, integer_col, bigint_col, numeric_col, decimal_col, "
"number_col) values (?, ?, ?, ?, ?, ?)",
"insert into typetest (smallint_col, integer_col, bigint_col, numeric_col, decimal_col) values "
"(?, ?, ?, ?, ?)",
"insert into typetest (smallint_col, integer_col, bigint_col, numeric_col) values (?, ?, ?, ?)",
"insert into typetest (smallint_col, integer_col, bigint_col) values (?, ?, ?)",
"insert into typetest (smallint_col, integer_col) values (?, ?)",
"insert into typetest (smallint_col) values (?)"]
query_order = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 5, 5, 3, 1, 4, 6, 1, 5, 4, 5, 2, 3, 1, 5, 6, 4, 3,
6, 1, 5, 6, 1, 6, 3, 1, 2, 1, 1]
for i in query_order:
cursor.execute(queries[i], test_vals[0:len(queries) - i])
ps_cache = extract_prepared_statement_dict(cursor)
self.assertEqual(cache_size, len(ps_cache))
for query in [queries[1], queries[2], queries[3], queries[5], queries[6]]:
self.assertIn(query, ps_cache)
for query in [queries[0], queries[4]]:
self.assertNotIn(query, ps_cache)
finally:
try:
cursor.execute("drop table typetest if exists")
finally:
con.close()
def extract_statement_handle(cursor):
return cursor._statement_cache._statement.handle
def extract_prepared_statement_dict(cursor):
return cursor._statement_cache._ps_cache
if __name__ == '__main__':
unittest.main()
| 46.503401 | 119 | 0.573727 | 1,499 | 13,672 | 5.023349 | 0.08539 | 0.055246 | 0.059761 | 0.086321 | 0.904648 | 0.871049 | 0.866667 | 0.857769 | 0.851926 | 0.833599 | 0 | 0.056086 | 0.317949 | 13,672 | 293 | 120 | 46.662116 | 0.751421 | 0.012142 | 0 | 0.792035 | 0 | 0 | 0.359784 | 0 | 0 | 0 | 0 | 0 | 0.097345 | 1 | 0.044248 | false | 0 | 0.013274 | 0.00885 | 0.070796 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
2e522b9f67fd3b2e3a08e9cdbff4d080feb86d6b | 23,582 | py | Python | tests/pygenome/representations/test_tree.py | jorgetavares/pygenome | 2b529ea55feff8c4a0214b37354d4d7c273202a3 | [
"MIT"
] | 1 | 2019-11-18T14:41:20.000Z | 2019-11-18T14:41:20.000Z | tests/pygenome/representations/test_tree.py | jorgetavares/pygenome | 2b529ea55feff8c4a0214b37354d4d7c273202a3 | [
"MIT"
] | null | null | null | tests/pygenome/representations/test_tree.py | jorgetavares/pygenome | 2b529ea55feff8c4a0214b37354d4d7c273202a3 | [
"MIT"
] | null | null | null | import numpy as np
import pygenome as pg
import operator as op
def protected_div(a, b):
if b > 0:
return op.truediv(a, b)
else:
return b
def test_transverse_tree():
np.random.seed(42)
pset = pg.PrimitiveSet()
pset.addFunction(op.add, 2)
pset.addFunction(op.sub, 2)
pset.addTerminal(1)
pset.addTerminal(2)
pset.addTerminal(3)
pset.addVariable("x")
tree = np.array([1, 3, 1, 3, 4, 0, 0])
assert pg.transverse_tree(pset, tree, 0) == 5
assert pg.transverse_tree(pset, tree, 1) == 2
assert pg.transverse_tree(pset, tree, 2) == 5
def test_count_tree_internals():
pset = pg.PrimitiveSet()
pset.addFunction(op.add, 2)
pset.addFunction(op.sub, 2)
pset.addTerminal(1)
pset.addTerminal(2)
pset.addTerminal(3)
pset.addVariable("x")
tree1 = np.array([1, 2, 1, 5, 1, 2, 2, 6, 5, 2, 3, 4, 2, 2, 4, 4, 2, 6, 6, 2, 3, 3, 1,
2, 5, 5, 1, 5, 6, 0, 0, 0])
depth, nodes = pg.count_tree_internals(pset, tree1)
assert depth == 7
assert nodes == 29
tree2 = np.array([5, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0])
depth, nodes = pg.count_tree_internals(pset, tree2)
assert depth == 1
assert nodes == 1
def test_count_tree_internals_typed():
np.random.seed(42)
pset = pg.PrimitiveSet(typed=True)
pset.addFunction(op.add, 2, types=[int, int, int])
pset.addFunction(op.sub, 2, types=[int, int, int])
pset.addFunction(op.mul, 2, types=[int, int, int])
pset.addFunction(protected_div, 2, types=[float, float, float])
num_constants = 5
for i in range(num_constants):
pset.addTerminal(np.random.randint(-5, 5), types=[int])
for i in range(num_constants):
pset.addTerminal(np.random.uniform(), types=[float])
pset.addVariable("x", types=[int])
tree1 = np.array([ 3, 2, 1, 6, 8, 2, 1, 3, 7, 6, 3, 7, 8, 8, 1, 1, 8,
6, 2, 14, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
depth, nodes = pg.count_tree_internals(pset, tree1)
assert depth == 6
assert nodes == 21
np.random.seed(42)
pset = pg.PrimitiveSet(typed=True)
pset.addFunction(op.add, 2, types=[int, int, int])
pset.addFunction(op.sub, 2, types=[int, float, float])
pset.addFunction(op.mul, 2, types=[float, int, int])
pset.addFunction(protected_div, 2, types=[float, float, float])
num_constants = 5
for i in range(num_constants):
pset.addTerminal(np.random.randint(-5, 5), types=[int])
for i in range(num_constants):
pset.addTerminal(np.random.uniform(), types=[float])
pset.addVariable("x", types=[int])
tree2 = np.array([ 1, 1, 1, 7, 6, 2, 3, 6, 8, 12, 2, 3, 7, 8, 3, 8, 5,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
depth, nodes = pg.count_tree_internals(pset, tree2)
assert depth == 5
assert nodes == 17
def test_interpreter_str():
pset = pg.PrimitiveSet()
pset.addFunction(op.add, 2)
pset.addFunction(op.sub, 2)
pset.addTerminal(1)
pset.addTerminal(2)
pset.addTerminal(3)
pset.addVariable("x")
tree = np.array([1, 2, 1, 1, 1, 2, 3, 3, 1, 4, 5, 1, 1, 5, 6, 1, 6, 6, 2, 1, 2, 3, 4,
2, 6, 4, 2, 2, 6, 6, 1, 3, 6, 2, 2, 1, 2, 3, 3, 1, 5, 5, 2, 2, 6, 6,
2, 5, 4, 2, 1, 2, 5, 6, 1, 6, 6, 1, 1, 3, 5, 1, 3, 3, 1, 2, 2, 1, 2,
4, 4, 2, 3, 4, 1, 2, 6, 6, 1, 6, 5, 2, 1, 2, 5, 5, 2, 3, 4, 2, 2, 6,
4, 2, 4, 4, 2, 2, 2, 1, 5, 4, 2, 6, 4, 2, 2, 6, 4, 1, 6, 5, 2, 2, 1,
6, 3, 2, 6, 3, 2, 1, 4, 5, 1, 6, 4, 0])
tree_str = pg.interpreter(pset, tree)
assert tree_str == 'add(sub(add(add(add(sub(1, 1), add(2, 3)), add(add(3, x), add(x, x))), sub(add(sub(1, 2), sub(x, 2)), sub(sub(x, x), add(1, x)))), sub(sub(add(sub(1, 1), add(3, 3)), sub(sub(x, x), sub(3, 2))), sub(add(sub(3, x), add(x, x)), add(add(1, 3), add(1, 1))))), add(sub(sub(add(sub(2, 2), sub(1, 2)), add(sub(x, x), add(x, 3))), sub(add(sub(3, 3), sub(1, 2)), sub(sub(x, 2), sub(2, 2)))), sub(sub(sub(add(3, 2), sub(x, 2)), sub(sub(x, 2), add(x, 3))), sub(sub(add(x, 1), sub(x, 1)), sub(add(2, 3), add(x, 2))))))'
def test_interpreter_str_typed():
pset = pg.PrimitiveSet(typed=True)
pset.addFunction(op.add, 2, [float, int, int])
pset.addFunction(op.sub, 2, [int, float, float])
pset.addFunction(op.mul, 2, [float, int, int])
pset.addTerminal(1, [int])
pset.addTerminal(2, [int])
pset.addTerminal(3, [int])
pset.addTerminal(4.0, [float])
pset.addTerminal(5.0, [float])
pset.addTerminal(6.0, [float])
pset.addVariable("x", [float])
pset.addVariable("y", [int])
tree = np.array([ 1, 2, 3, 2, 1, 2, 9, 9, 2, 10, 7, 1, 2, 9, 8, 2, 9,
9, 2, 1, 2, 9, 10, 2, 7, 10, 3, 2, 10, 9, 2, 8, 7, 1,
2, 3, 2, 10, 9, 2, 9, 8, 1, 2, 10, 7, 2, 7, 10, 2, 3,
2, 7, 9, 2, 10, 7, 1, 2, 9, 9, 2, 8, 8, 2, 3, 2, 3,
2, 10, 10, 2, 9, 9, 3, 2, 8, 8, 2, 9, 10, 2, 3, 2, 10,
10, 2, 7, 8, 1, 2, 9, 8, 2, 7, 7, 3, 2, 3, 2, 10, 7,
2, 10, 8, 3, 2, 7, 9, 2, 9, 8, 2, 3, 2, 10, 10, 2, 9,
10, 1, 2, 10, 7, 2, 10, 9, 0])
tree_str = pg.interpreter(pset, tree)
assert tree_str == 'add(sub(mul(sub(add(sub(6.0, 6.0), sub(x, 4.0)), add(sub(6.0, 5.0), sub(6.0, 6.0))), sub(add(sub(6.0, x), sub(4.0, x)), mul(sub(x, 6.0), sub(5.0, 4.0)))), add(sub(mul(sub(x, 6.0), sub(6.0, 5.0)), add(sub(x, 4.0), sub(4.0, x))), sub(mul(sub(4.0, 6.0), sub(x, 4.0)), add(sub(6.0, 6.0), sub(5.0, 5.0))))), sub(mul(sub(mul(sub(x, x), sub(6.0, 6.0)), mul(sub(5.0, 5.0), sub(6.0, x))), sub(mul(sub(x, x), sub(4.0, 5.0)), add(sub(6.0, 5.0), sub(4.0, 4.0)))), mul(sub(mul(sub(x, 4.0), sub(x, 5.0)), mul(sub(4.0, 6.0), sub(6.0, 5.0))), sub(mul(sub(x, x), sub(6.0, x)), add(sub(x, 4.0), sub(x, 6.0))))))'
def test_interpreter_run():
pset = pg.PrimitiveSet()
pset.addFunction(op.add, 2)
pset.addFunction(op.sub, 2)
pset.addTerminal(1)
pset.addTerminal(2)
pset.addTerminal(3)
pset.addTerminal(4)
tree = np.array([1, 2, 1, 1, 1, 2, 3, 3, 1, 4, 5, 1, 1, 5, 6, 1, 6, 6, 2, 1, 2, 3, 4,
2, 6, 4, 2, 2, 6, 6, 1, 3, 6, 2, 2, 1, 2, 3, 3, 1, 5, 5, 2, 2, 6, 6,
2, 5, 4, 2, 1, 2, 5, 6, 1, 6, 6, 1, 1, 3, 5, 1, 3, 3, 1, 2, 2, 1, 2,
4, 4, 2, 3, 4, 1, 2, 6, 6, 1, 6, 5, 2, 1, 2, 5, 5, 2, 3, 4, 2, 2, 6,
4, 2, 4, 4, 2, 2, 2, 1, 5, 4, 2, 6, 4, 2, 2, 6, 4, 1, 6, 5, 2, 2, 1,
6, 3, 2, 6, 3, 2, 1, 4, 5, 1, 6, 4, 0])
tree_run = pg.interpreter(pset, tree, run=True)
assert tree_run == 20
def test_interpreter_run_typed():
pset = pg.PrimitiveSet(typed=True)
pset.addFunction(op.add, 2, [float, int, int])
pset.addFunction(op.sub, 2, [int, float, float])
pset.addFunction(op.mul, 2, [float, int, int])
pset.addTerminal(1, [int])
pset.addTerminal(2, [int])
pset.addTerminal(3, [int])
pset.addTerminal(4.0, [float])
pset.addTerminal(5.0, [float])
pset.addTerminal(6.0, [float])
pset.addTerminal(7.0, [float])
pset.addTerminal(8, [int])
tree = np.array([ 1, 2, 3, 2, 1, 2, 9, 9, 2, 10, 7, 1, 2, 9, 8, 2, 9,
9, 2, 1, 2, 9, 10, 2, 7, 10, 3, 2, 10, 9, 2, 8, 7, 1,
2, 3, 2, 10, 9, 2, 9, 8, 1, 2, 10, 7, 2, 7, 10, 2, 3,
2, 7, 9, 2, 10, 7, 1, 2, 9, 9, 2, 8, 8, 2, 3, 2, 3,
2, 10, 10, 2, 9, 9, 3, 2, 8, 8, 2, 9, 10, 2, 3, 2, 10,
10, 2, 7, 8, 1, 2, 9, 8, 2, 7, 7, 3, 2, 3, 2, 10, 7,
2, 10, 8, 3, 2, 7, 9, 2, 9, 8, 2, 3, 2, 10, 10, 2, 9,
10, 1, 2, 10, 7, 2, 10, 9, 0])
tree_run = pg.interpreter(pset, tree, run=True)
assert tree_run == 27.0
def test_interpreter_run_inputs():
pset = pg.PrimitiveSet()
pset.addFunction(op.add, 2)
pset.addFunction(op.sub, 2)
pset.addTerminal(1)
pset.addTerminal(2)
pset.addTerminal(3)
pset.addVariable("x")
tree = np.array([1, 2, 1, 1, 1, 2, 3, 3, 1, 4, 5, 1, 1, 5, 6, 1, 6, 6, 2, 1, 2, 3, 4,
2, 6, 4, 2, 2, 6, 6, 1, 3, 6, 2, 2, 1, 2, 3, 3, 1, 5, 5, 2, 2, 6, 6,
2, 5, 4, 2, 1, 2, 5, 6, 1, 6, 6, 1, 1, 3, 5, 1, 3, 3, 1, 2, 2, 1, 2,
4, 4, 2, 3, 4, 1, 2, 6, 6, 1, 6, 5, 2, 1, 2, 5, 5, 2, 3, 4, 2, 2, 6,
4, 2, 4, 4, 2, 2, 2, 1, 5, 4, 2, 6, 4, 2, 2, 6, 4, 1, 6, 5, 2, 2, 1,
6, 3, 2, 6, 3, 2, 1, 4, 5, 1, 6, 4, 0])
inputs = {"x": 4}
tree_run = pg.interpreter(pset, tree, run=True, vars_inputs=inputs)
assert tree_run == 20
def test_interpreter_run_typed_inputs():
pset = pg.PrimitiveSet(typed=True)
pset.addFunction(op.add, 2, [float, int, int])
pset.addFunction(op.sub, 2, [int, float, float])
pset.addFunction(op.mul, 2, [float, int, int])
pset.addTerminal(1, [int])
pset.addTerminal(2, [int])
pset.addTerminal(3, [int])
pset.addTerminal(4.0, [float])
pset.addTerminal(5.0, [float])
pset.addTerminal(6.0, [float])
pset.addVariable("x", [float])
pset.addVariable("y", [int])
inputs = {"x": 7.0, "y": 8}
tree = np.array([ 1, 2, 3, 2, 1, 2, 9, 9, 2, 10, 7, 1, 2, 9, 8, 2, 9,
9, 2, 1, 2, 9, 10, 2, 7, 10, 3, 2, 10, 9, 2, 8, 7, 1,
2, 3, 2, 10, 9, 2, 9, 8, 1, 2, 10, 7, 2, 7, 10, 2, 3,
2, 7, 9, 2, 10, 7, 1, 2, 9, 9, 2, 8, 8, 2, 3, 2, 3,
2, 10, 10, 2, 9, 9, 3, 2, 8, 8, 2, 9, 10, 2, 3, 2, 10,
10, 2, 7, 8, 1, 2, 9, 8, 2, 7, 7, 3, 2, 3, 2, 10, 7,
2, 10, 8, 3, 2, 7, 9, 2, 9, 8, 2, 3, 2, 10, 10, 2, 9,
10, 1, 2, 10, 7, 2, 10, 9, 0])
tree_run = pg.interpreter(pset, tree, run=True, vars_inputs=inputs)
assert tree_run == 27.0
def test_max_size_from_tree_max_depth_arity2():
pset = pg.PrimitiveSet()
pset.addFunction(op.add, 2)
pset.addFunction(op.sub, 2)
pset.addTerminal(1)
pset.addTerminal(2)
pset.addTerminal(3)
pset.addTerminal(4)
assert pg.max_size_from_tree_max_depth(pset, 4) == 16
assert pg.max_size_from_tree_max_depth(pset, 6) == 64
assert pg.max_size_from_tree_max_depth(pset, 8) == 256
assert pg.max_size_from_tree_max_depth(pset, 10) == 1024
assert pg.max_size_from_tree_max_depth(pset, 12) == 4096
assert pg.max_size_from_tree_max_depth(pset, 14) == 16384
assert pg.max_size_from_tree_max_depth(pset, 16) == 65536
assert pg.max_size_from_tree_max_depth(pset, 18) == 262144
assert pg.max_size_from_tree_max_depth(pset, 20) == 1048576
def test_max_size_from_tree_max_depth_arity3():
pset = pg.PrimitiveSet()
pset.addFunction(op.add, 2)
pset.addFunction(op.sub, 3)
pset.addTerminal(1)
pset.addTerminal(2)
pset.addTerminal(3)
pset.addTerminal(4)
assert pg.max_size_from_tree_max_depth(pset, 4) == 40
assert pg.max_size_from_tree_max_depth(pset, 6) == 364
assert pg.max_size_from_tree_max_depth(pset, 8) == 3280
assert pg.max_size_from_tree_max_depth(pset, 10) == 29524
assert pg.max_size_from_tree_max_depth(pset, 12) == 265720
assert pg.max_size_from_tree_max_depth(pset, 14) == 2391484
assert pg.max_size_from_tree_max_depth(pset, 16) == 21523360
assert pg.max_size_from_tree_max_depth(pset, 18) == 193710244
assert pg.max_size_from_tree_max_depth(pset, 20) == 1743392200
def test_full_tree1():
np.random.seed(42)
pset = pg.PrimitiveSet()
pset.addFunction(op.add, 2)
pset.addFunction(op.sub, 2)
pset.addTerminal(1)
pset.addTerminal(2)
pset.addTerminal(3)
pset.addVariable("x")
init_max_depth = 3
max_depth = 6
tree = pg.full_tree(pset, init_max_depth, max_depth)
assert np.array_equal(tree, np.array([1, 2, 3, 5, 1, 6, 3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]))
def test_full_tree2():
np.random.seed(42)
pset = pg.PrimitiveSet()
pset.addFunction(op.add, 2)
pset.addFunction(op.sub, 2)
pset.addTerminal(1)
pset.addTerminal(2)
pset.addTerminal(3)
pset.addVariable("x")
init_max_depth = 8
max_depth = 8
tree = pg.full_tree(pset, init_max_depth, max_depth)
assert np.array_equal(tree, np.array([1, 2, 1, 1, 1, 2, 1, 3, 5, 2, 5, 5, 1, 1, 6, 3, 2, 6, 6, 1, 2, 1, 4,
6, 2, 4, 4, 2, 2, 6, 3, 1, 6, 4, 2, 1, 2, 1, 3, 5, 1, 5, 4, 2, 2, 6,
6, 1, 4, 4, 1, 2, 1, 6, 5, 2, 6, 3, 1, 1, 5, 5, 1, 3, 5, 2, 2, 1, 2,
2, 4, 4, 1, 4, 3, 2, 2, 6, 5, 2, 5, 6, 1, 2, 1, 5, 4, 1, 6, 4, 2, 2,
4, 4, 2, 4, 4, 2, 2, 1, 1, 4, 4, 2, 4, 4, 2, 2, 4, 5, 2, 5, 6, 2, 1,
2, 3, 4, 2, 3, 6, 1, 2, 5, 3, 2, 4, 3, 2, 2, 2, 1, 1, 1, 5, 3, 1, 3,
5, 1, 2, 3, 6, 2, 6, 5, 1, 1, 1, 6, 5, 1, 3, 5, 1, 2, 5, 4, 1, 6, 5,
1, 2, 2, 2, 3, 6, 1, 5, 4, 2, 1, 5, 6, 2, 4, 5, 1, 1, 1, 3, 5, 2, 5,
3, 1, 2, 5, 5, 2, 5, 5, 1, 1, 1, 2, 2, 6, 3, 1, 5, 6, 1, 1, 6, 3, 2,
6, 4, 1, 1, 1, 3, 5, 1, 3, 6, 1, 2, 5, 5, 1, 4, 6, 2, 2, 1, 2, 3, 3,
2, 6, 6, 2, 2, 6, 4, 2, 5, 6, 2, 1, 2, 3, 5, 2, 3, 3, 1, 1, 4, 3, 2,
3, 3, 0]))
def test_grow_tree1():
np.random.seed(42)
pset = pg.PrimitiveSet()
pset.addFunction(op.add, 2)
pset.addFunction(op.sub, 2)
pset.addTerminal(1)
pset.addTerminal(2)
pset.addTerminal(3)
pset.addVariable("x")
init_max_depth = 3
max_depth = 6
tree = pg.grow_tree(pset, init_max_depth, max_depth)
assert np.array_equal(tree, np.array([4, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]))
def test_grow_tree2():
np.random.seed(45345)
pset = pg.PrimitiveSet()
pset.addFunction(op.add, 2)
pset.addFunction(op.sub, 2)
pset.addTerminal(1)
pset.addTerminal(2)
pset.addTerminal(3)
pset.addVariable("x")
init_max_depth = 3
max_depth = 6
tree = pg.grow_tree(pset, init_max_depth, max_depth)
assert np.array_equal(tree, np.array([1, 4, 3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]))
def test_grow_tree3():
np.random.seed(45345)
pset = pg.PrimitiveSet()
pset.addFunction(op.add, 2)
pset.addFunction(op.sub, 2)
pset.addTerminal(1)
pset.addTerminal(2)
pset.addTerminal(3)
pset.addVariable("x")
init_max_depth = 2
max_depth = 2
tree = pg.full_tree(pset, init_max_depth, max_depth)
assert np.array_equal(tree, np.array([1, 6, 5, 0]))
#####
def test_full_tree1_typed():
np.random.seed(42)
pset = pg.PrimitiveSet(typed=True)
pset.addFunction(op.add, 2, [int, int, int])
pset.addFunction(op.sub, 2, [int, int, int])
pset.addTerminal(1, [int])
pset.addTerminal(2, [int])
pset.addTerminal(3, [int])
pset.addVariable("x", [int])
init_max_depth = 3
max_depth = 6
tree = pg.full_tree(pset, init_max_depth, max_depth)
assert np.array_equal(tree, np.array([1, 2, 3, 5, 1, 6, 3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]))
def test_full_tree2_typed():
np.random.seed(42)
pset = pg.PrimitiveSet(typed=True)
pset.addFunction(op.add, 2, [int, int, int])
pset.addFunction(op.sub, 2, [float, int, int])
pset.addFunction(op.mul, 2, [int, float, float])
pset.addFunction(op.truediv, 2, [float, float, float])
pset.addTerminal(1, [int])
pset.addTerminal(2, [int])
pset.addTerminal(3, [int])
pset.addTerminal(4.0, [float])
pset.addTerminal(5.0, [float])
pset.addTerminal(6.0, [float])
pset.addVariable("x", [float])
pset.addVariable("y", [int])
init_max_depth = 7
max_depth = 7
tree = pg.full_tree(pset, init_max_depth, max_depth)
assert np.array_equal(tree, np.array([ 3, 4, 2, 1, 1, 3, 8, 8, 1, 6, 7, 1, 1, 7, 12, 1, 12,
12, 3, 2, 3, 8, 9, 3, 11, 9, 4, 4, 11, 11, 2, 5, 12, 4,
4, 2, 3, 8, 8, 1, 7, 7, 4, 4, 11, 11, 4, 10, 9, 4, 2,
3, 10, 11, 1, 12, 12, 2, 1, 5, 7, 1, 5, 5, 2, 3, 4, 2,
3, 9, 9, 3, 8, 9, 2, 3, 11, 11, 1, 12, 7, 4, 2, 3, 10,
10, 3, 8, 11, 4, 4, 11, 9, 4, 9, 9, 3, 4, 4, 2, 7, 6,
4, 11, 9, 4, 4, 11, 9, 2, 12, 7, 4, 4, 2, 12, 5, 4, 11,
8, 4, 2, 6, 7, 2, 12, 6, 0]))
def test_grow_tree1():
np.random.seed(45345)
pset = pg.PrimitiveSet(typed=True)
pset.addFunction(op.add, 2, [int, int, int])
pset.addFunction(op.sub, 2, [int, int, int])
pset.addTerminal(1, [int])
pset.addTerminal(2, [int])
pset.addTerminal(3, [int])
pset.addVariable("x", [int])
init_max_depth = 3
max_depth = 6
tree = pg.grow_tree(pset, init_max_depth, max_depth)
assert np.array_equal(tree, np.array([1, 4, 3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]))
def test_grow_tree2():
np.random.seed(12345)
pset = pg.PrimitiveSet(typed=True)
pset.addFunction(op.add, 2, [int, int, int])
pset.addFunction(op.sub, 2, [float, int, int])
pset.addFunction(op.mul, 2, [int, float, float])
pset.addFunction(op.truediv, 2, [float, float, float])
pset.addTerminal(1, [int])
pset.addTerminal(2, [int])
pset.addTerminal(3, [int])
pset.addTerminal(4.0, [float])
pset.addTerminal(5.0, [float])
pset.addTerminal(6.0, [float])
pset.addVariable("x", [float])
pset.addVariable("y", [int])
init_max_depth = 4
max_depth = 4
tree = pg.grow_tree(pset, init_max_depth, max_depth)
assert np.array_equal(tree, np.array([ 3, 11, 11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]))
def test_grow_tree3():
np.random.seed(12345)
pset = pg.PrimitiveSet(typed=True)
pset.addFunction(op.add, 2, [int, int, int])
pset.addFunction(op.sub, 2, [float, int, int])
pset.addFunction(op.mul, 2, [int, float, float])
pset.addFunction(op.truediv, 2, [float, float, float])
pset.addTerminal(1, [int])
pset.addTerminal(2, [int])
pset.addTerminal(3, [int])
pset.addTerminal(4.0, [float])
pset.addTerminal(5.0, [float])
pset.addTerminal(6.0, [float])
pset.addVariable("x", [float])
pset.addVariable("y", [int])
init_max_depth = 2
max_depth = 2
tree = pg.grow_tree(pset, init_max_depth, max_depth)
assert np.array_equal(tree, np.array([3, 9, 9, 0]))
def float_constants():
return np.random.uniform()
def test_full_tree_ephemeral1():
np.random.seed(42)
pset = pg.PrimitiveSet()
pset.addFunction(op.add, 2)
pset.addFunction(op.sub, 2)
pset.addTerminal(float_constants, types=None, ephemeral=True)
init_max_depth = 3
max_depth = 6
tree = pg.full_tree(pset, init_max_depth, max_depth)
tree_str = pg.interpreter(pset, tree)
assert np.array_equal(tree, np.array([1, 2, 4, 5, 1, 6, 7, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]))
assert tree_str == 'add(sub(0.9507143064099162, 0.7319939418114051), add(0.596850157946487, 0.44583275285359114))'
assert pset.num_primitives == 7
assert pset.ephemeral_cache == {3}
assert pset.ephemeral_constants == {4: (0.9507143064099162, None), 5: (0.7319939418114051, None), 6: (0.596850157946487, None), 7: (0.44583275285359114, None)}
def test_full_tree_ephemeral_typed1():
np.random.seed(42)
pset = pg.PrimitiveSet(typed=True)
pset.addFunction(op.add, 2, [float, float, float])
pset.addFunction(op.sub, 2, [float, float, float])
pset.addTerminal(float_constants, types=[float], ephemeral=True)
init_max_depth = 3
max_depth = 6
tree = pg.full_tree(pset, init_max_depth, max_depth)
tree_str = pg.interpreter(pset, tree)
assert np.array_equal(tree, np.array([1, 2, 4, 5, 1, 6, 7, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]))
assert tree_str == 'add(sub(0.9507143064099162, 0.7319939418114051), add(0.596850157946487, 0.44583275285359114))'
assert pset.num_primitives == 7
assert pset.ephemeral_cache == {3}
assert pset.ephemeral_constants == {4: (0.9507143064099162, [float]), 5: (0.7319939418114051, [float]), 6: (0.596850157946487, [float]), 7: (0.44583275285359114, [float])}
def test_grow_tree_ephemeral1():
np.random.seed(45345)
pset = pg.PrimitiveSet()
pset.addFunction(op.add, 2)
pset.addFunction(op.sub, 2)
pset.addTerminal(float_constants, types=None, ephemeral=True)
init_max_depth = 5
max_depth = 6
tree = pg.grow_tree(pset, init_max_depth, max_depth)
tree_str = pg.interpreter(pset, tree)
assert np.array_equal(tree, np.array([ 1, 4, 2, 2, 1, 5, 6, 2, 7, 8, 1, 1, 9, 10, 11, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]))
assert tree_str == 'add(0.6948470578046806, sub(sub(add(0.23851088045334068, 0.7485138905722925), sub(0.9421360052961212, 0.5983584346144624)), add(add(0.6952518271609337, 0.41227782045980343), 0.13638623666489624)))'
assert pset.num_primitives == 11
assert pset.ephemeral_cache == {3}
assert pset.ephemeral_constants == {4: (0.6948470578046806, None), 5: (0.23851088045334068, None), 6: (0.7485138905722925, None), 7: (0.9421360052961212, None), 8: (0.5983584346144624, None), 9: (0.6952518271609337, None), 10: (0.41227782045980343, None), 11: (0.13638623666489624, None)}
def test_grow_tree_ephemeral_typed1():
np.random.seed(45345)
pset = pg.PrimitiveSet(typed=True)
pset.addFunction(op.add, 2, [float, float, float])
pset.addFunction(op.sub, 2, [float, float, float])
pset.addTerminal(float_constants, types=[float], ephemeral=True)
init_max_depth = 5
max_depth = 6
tree = pg.grow_tree(pset, init_max_depth, max_depth)
tree_str = pg.interpreter(pset, tree)
assert np.array_equal(tree, np.array([ 1, 4, 2, 2, 1, 5, 6, 2, 7, 8, 1, 1, 9, 10, 11, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]))
assert tree_str == 'add(0.6948470578046806, sub(sub(add(0.23851088045334068, 0.7485138905722925), sub(0.9421360052961212, 0.5983584346144624)), add(add(0.6952518271609337, 0.41227782045980343), 0.13638623666489624)))'
assert pset.num_primitives == 11
assert pset.ephemeral_cache == {3}
assert pset.ephemeral_constants == {4: (0.6948470578046806, [float]), 5: (0.23851088045334068, [float]), 6: (0.7485138905722925, [float]), 7: (0.9421360052961212, [float]), 8: (0.5983584346144624, [float]), 9: (0.6952518271609337, [float]), 10: (0.41227782045980343, [float]), 11: (0.13638623666489624, [float])}
| 46.512821 | 618 | 0.552413 | 4,437 | 23,582 | 2.860041 | 0.028623 | 0.089992 | 0.131442 | 0.170843 | 0.877384 | 0.853586 | 0.831363 | 0.820883 | 0.805595 | 0.799685 | 0 | 0.193336 | 0.251633 | 23,582 | 506 | 619 | 46.604743 | 0.525725 | 0 | 0 | 0.764211 | 0 | 0.012632 | 0.072189 | 0.024431 | 0 | 0 | 0 | 0 | 0.136842 | 1 | 0.056842 | false | 0 | 0.006316 | 0.002105 | 0.069474 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
2e55b82512a1f84e7ea9a44070bd526523d9d56b | 6,192 | py | Python | server_settings.py | RAiU14/Chat_Filter_Bot | 9a3aeb144e178516f8b853e9c95afc4955d2c5cb | [
"MIT"
] | null | null | null | server_settings.py | RAiU14/Chat_Filter_Bot | 9a3aeb144e178516f8b853e9c95afc4955d2c5cb | [
"MIT"
] | null | null | null | server_settings.py | RAiU14/Chat_Filter_Bot | 9a3aeb144e178516f8b853e9c95afc4955d2c5cb | [
"MIT"
] | null | null | null | from typing import Union
from imports import *
# Adds a channelid to a list of ignored channels into a JSON file.
def add_ignore_channels(filename: str, obj: Union[int, list]):
channelid = []
for item in obj:
channelid.append(item.id)
try:
data = {'channel_id': [], 'role_id': []}
if os.path.isfile(filename):
with open(filename) as json_reading:
data = json.load(json_reading)
data['channel_id'].extend(channelid) # Add more value to the list.
data['channel_id'] = list(set(data['channel_id'])) # Removes duplicates.
with open(filename, 'w') as json_writing:
json.dump(data, json_writing, indent=4)
embed = discord.Embed(title="Sucess!", description="Added the following channel to ignore list!",
colour=discord.Color.green())
for x in obj:
embed.add_field(name=x.name, value=x.id)
return embed
except Exception:
embed = discord.Embed(title="Oops!", description="Some error occured, try maybe?",
color=discord.Color.red())
return embed
# Remove a channel id from the list of ignored channels present in the JSON file.
def del_ingored_channel(filename: str, obj: list = None):
channelid = []
data = {'channel_id': [], 'role_id': []}
try:
if obj is None:
with open(filename) as json_reading:
data = json.load(json_reading)
data[
'channel_id'].clear() # This is for removing everything in channel_id. [list.clear() will clear the complete list.]
with open(filename, 'w') as json_writing:
json.dump(data, json_writing, indent=4)
embed = discord.Embed(title='Success!', description="The ignore channel list is now cleared.",
colour=discord.Colour.green())
return embed
if len(obj) == 1:
channelid = [obj[0].id]
else:
for item in obj:
channelid.append(item.id)
print(type(channelid))
if os.path.isfile(filename):
with open(filename) as json_reading:
data = json.load(json_reading)
data['channel_id'] = list(set(data['channel_id']) - set(channelid)) # This will delete value from the list.
with open(filename, 'w') as json_writing:
json.dump(data, json_writing, indent=4)
embed = discord.Embed(title="Sucess!", description="Deleted the following channel from the ignore list!",
colour=discord.Color.green())
for x in obj:
embed.add_field(name=x.name, value=x.id)
return embed
except Exception:
embed = discord.Embed(title="Oops!", description="Some error occured, try maybe?",
color=discord.Color.red())
return embed
# Adds a list of ignored roles to JSON file.
def add_ignore_roles(filename: str, obj: list):
roleid = []
try:
for item in obj:
roleid.append(item.id)
data = {'channel_id': [], 'role_id': []}
if os.path.isfile(filename):
with open(filename) as json_reading:
data = json.load(json_reading)
data['role_id'].extend(roleid)
data['role_id'] = list(set(data['role_id'])) # Removes duplicates.
with open(filename, 'w') as json_writing:
json.dump(data, json_writing, indent=4)
embed = discord.Embed(title="Sucess!", description="Added the following roles to ignore list!",
colour=discord.Color.green())
for x in obj:
embed.add_field(name=x.name, value=x.id)
return embed
except Exception:
embed = discord.Embed(title="Oops!", description="Some error occured, try maybe?",
color=discord.Color.red())
return embed
# Remove a role id from the list of ignored role present in the JSON file.
def del_ingored_roles(filename: str, obj: list = None):
roleid = []
data = {'channel_id': [], 'role_id': []}
try:
if obj is None:
with open(filename) as json_reading:
data = json.load(json_reading)
data[
'role_id'].clear() # This is for removing everything in channel_id. [list.clear() will clear the complete list.]
with open(filename, 'w') as json_writing:
json.dump(data, json_writing, indent=4)
embed = discord.Embed(title='Success!', description="The ignore channel list is now cleared.",
colour=discord.Colour.green())
return embed
if len(obj) == 1:
roleid = [obj[0].id]
else:
for item in obj:
roleid.append(item.id)
if os.path.isfile(filename):
with open(filename) as json_reading:
data = json.load(json_reading)
data['role_id'] = list(set(data['role_id']) - set(roleid)) # This will delete value from the list.
with open(filename, 'w') as json_writing:
json.dump(data, json_writing, indent=4)
embed = discord.Embed(title="Sucess!", description="Deleted the following channel from the ignore list!",
colour=discord.Color.green())
for x in obj:
embed.add_field(name=x.name, value=x.id)
return embed
except Exception:
embed = discord.Embed(title="Oops!", description="Some error occured, try maybe?",
color=discord.Color.red())
return embed
# View all the ignored channels
def view_ignored(filename: str, a_type: str):
# make it a way where if a_type is none, then display everything
try:
with open(filename) as json_reading:
data = json.load(json_reading)
if a_type == 'channel':
ignored_channels = data['channel_id']
return ignored_channels
else:
ignored_roles = data['role_id']
return ignored_roles
except Exception:
return False
| 43 | 132 | 0.580588 | 769 | 6,192 | 4.583875 | 0.141743 | 0.035745 | 0.059007 | 0.062411 | 0.840284 | 0.808511 | 0.796028 | 0.796028 | 0.710071 | 0.710071 | 0 | 0.002342 | 0.310562 | 6,192 | 143 | 133 | 43.300699 | 0.823378 | 0.109981 | 0 | 0.795276 | 0 | 0 | 0.117882 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.03937 | false | 0 | 0.015748 | 0 | 0.15748 | 0.007874 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
2e5864d4906a071a3aff17b4b3f89d03ed455939 | 147 | py | Python | app/account/__init__.py | Mateus-Brito/flask-start | 3a772d65708c7ef050d4060db8182b88e6b7bc5f | [
"MIT"
] | 1 | 2021-11-22T17:40:39.000Z | 2021-11-22T17:40:39.000Z | app/account/__init__.py | Mateus-Brito/flask-start | 3a772d65708c7ef050d4060db8182b88e6b7bc5f | [
"MIT"
] | 1 | 2018-08-25T20:44:47.000Z | 2018-08-25T20:44:47.000Z | app/account/__init__.py | Mateus-Brito/flask-start | 3a772d65708c7ef050d4060db8182b88e6b7bc5f | [
"MIT"
] | null | null | null | from flask import Blueprint
account_controller = Blueprint('account_controller', __name__, template_folder='templates')
from .controller import * | 29.4 | 91 | 0.823129 | 16 | 147 | 7.125 | 0.625 | 0.280702 | 0.45614 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095238 | 147 | 5 | 92 | 29.4 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0.182432 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0.666667 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 7 |
2e777ec5b2edfb1bfe40e6eb9de102e0c7c788ae | 47 | py | Python | bikeshed/refs/__init__.py | saschanaz/bikeshed | fb1e763a4f49852a7dabe8d783c6980416b238ef | [
"CC0-1.0"
] | 775 | 2015-01-06T16:58:59.000Z | 2022-03-31T23:49:10.000Z | bikeshed/refs/__init__.py | saschanaz/bikeshed | fb1e763a4f49852a7dabe8d783c6980416b238ef | [
"CC0-1.0"
] | 1,495 | 2015-01-06T01:06:00.000Z | 2022-03-31T20:16:13.000Z | bikeshed/refs/__init__.py | frivoal/bikeshed | 132fff3bd80d0059b5a2ac0cd4e3317db34dec12 | [
"CC0-1.0"
] | 196 | 2015-01-26T23:56:59.000Z | 2022-03-23T20:35:59.000Z | from .ReferenceManager import ReferenceManager
| 23.5 | 46 | 0.893617 | 4 | 47 | 10.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085106 | 47 | 1 | 47 | 47 | 0.976744 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
cf071a2fa0714e702dadb5d1044cea7f5dc07518 | 15 | py | Python | code/sample_2-3-12.py | KoyanagiHitoshi/AtCoder-Python-Introduction | 6d014e333a873f545b4d32d438e57cf428b10b96 | [
"MIT"
] | 1 | 2022-03-29T13:50:12.000Z | 2022-03-29T13:50:12.000Z | code/sample_2-3-12.py | KoyanagiHitoshi/AtCoder-Python-Introduction | 6d014e333a873f545b4d32d438e57cf428b10b96 | [
"MIT"
] | null | null | null | code/sample_2-3-12.py | KoyanagiHitoshi/AtCoder-Python-Introduction | 6d014e333a873f545b4d32d438e57cf428b10b96 | [
"MIT"
] | null | null | null | print(10 <= 9)
| 7.5 | 14 | 0.533333 | 3 | 15 | 2.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25 | 0.2 | 15 | 1 | 15 | 15 | 0.416667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
cf711e3a73dc6ac48e89137339cbe8ad40418777 | 9,989 | py | Python | utilities/train_eval_classification.py | saiabinesh/EdgeNets | 2b232d3f7fb60658755dad1ebca0ffc895cc795e | [
"MIT"
] | 392 | 2019-06-08T00:34:58.000Z | 2022-03-26T18:46:44.000Z | utilities/train_eval_classification.py | saiabinesh/EdgeNets | 2b232d3f7fb60658755dad1ebca0ffc895cc795e | [
"MIT"
] | 37 | 2019-06-23T07:37:36.000Z | 2022-03-02T17:24:30.000Z | utilities/train_eval_classification.py | saiabinesh/EdgeNets | 2b232d3f7fb60658755dad1ebca0ffc895cc795e | [
"MIT"
] | 87 | 2019-06-11T16:32:07.000Z | 2022-01-30T14:44:29.000Z | #============================================
__author__ = "Sachin Mehta"
__maintainer__ = "Sachin Mehta"
#============================================
import time
import torch
from utilities.utils import AverageMeter
from utilities.metrics.classification_accuracy import accuracy
from torch.nn import functional as F
from utilities.print_utils import *
'''
Training loop
'''
def train(data_loader, model, criteria, optimizer, epoch, device='cuda'):
batch_time = AverageMeter()
data_time = AverageMeter()
losses = AverageMeter()
top1 = AverageMeter()
top5 = AverageMeter()
# switch to train mode
model.train()
end = time.time()
for i, (input, target) in enumerate(data_loader):
# measure data loading time
data_time.update(time.time() - end)
input = input.to(device)
target = target.to(device)
# compute output
output = model(input)
# compute loss
loss = criteria(output, target)
# measure accuracy and record loss
prec1, prec5 = accuracy(output, target, topk=(1, 5))
#losses.update(loss.data[0], input.size(0))
losses.update(loss.item(), input.size(0))
top1.update(prec1[0].item(), input.size(0))
top5.update(prec5[0].item(), input.size(0))
# compute gradient and do SGD step
optimizer.zero_grad()
loss.backward()
optimizer.step()
# measure elapsed time
batch_time.update(time.time() - end)
end = time.time()
if i % 10 == 0: #print after every 100 batches
print_log_message("Epoch: %d[%d/%d]\t\tBatch Time:%.4f\t\tLoss:%.4f\t\ttop1:%.4f (%.4f)\t\ttop5:%.4f (%.4f)" %
(epoch, i, len(data_loader), batch_time.avg, losses.avg, top1.val, top1.avg, top5.val, top5.avg))
return top1.avg, losses.avg
'''
Validation loop
'''
def validate(data_loader, model, criteria=None, device='cuda'):
batch_time = AverageMeter()
top1 = AverageMeter()
top5 = AverageMeter()
if criteria:
losses = AverageMeter()
# switch to evaluate mode
model.eval()
# with torch.no_grad():
end = time.time()
with torch.no_grad():
for i, (input, target) in enumerate(data_loader):
input = input.to(device)
target = target.to(device)
# compute output
output = model(input)
if criteria:
loss = criteria(output, target)
# measure accuracy and record loss
prec1, prec5 = accuracy(output, target, topk=(1, 5))
if criteria:
losses.update(loss.item(), input.size(0))
top1.update(prec1[0].item(), input.size(0))
top5.update(prec5[0].item(), input.size(0))
# measure elapsed time
batch_time.update(time.time() - end)
end = time.time()
if i % 10 == 0 and criteria: # print after every 100 batches
print_log_message("Batch:[%d/%d]\t\tBatchTime:%.3f\t\tLoss:%.3f\t\ttop1:%.3f (%.3f)\t\ttop5:%.3f(%.3f)" %
(i, len(data_loader), batch_time.avg, losses.avg, top1.val, top1.avg, top5.val, top5.avg))
elif i % 10:
print_log_message(
"Batch:[%d/%d]\t\tBatchTime:%.3f\t\ttop1:%.3f (%.3f)\t\ttop5:%.3f(%.3f)" %
(i, len(data_loader), batch_time.avg, top1.val, top1.avg, top5.val, top5.avg))
print_info_message(' * Prec@1:%.3f Prec@5:%.3f' % (top1.avg, top5.avg))
if criteria:
return top1.avg, losses.avg
else:
return top1.avg
def train_multi(data_loader, model, criteria, optimizer, epoch, device='cuda'):
batch_time = AverageMeter()
losses = AverageMeter()
prec = AverageMeter()
rec = AverageMeter()
# switch to train mode
model.train()
end = time.time()
tp, fp, fn, tn, count = 0, 0, 0, 0, 0
p_o, r_o, f_o = 0.0, 0.0, 0.0
for i, (input, target) in enumerate(data_loader):
target = target.to(device=device)
target = target.max(dim=1)[0]
# compute output
output = model(input)
loss = criteria(output, target.float()) * 80.0
# measure accuracy and record loss
pred = output.gt(0.0).long()
tp += (pred + target).eq(2).sum(dim=0)
fp += (pred - target).eq(1).sum(dim=0)
fn += (pred - target).eq(-1).sum(dim=0)
tn += (pred + target).eq(0).sum(dim=0)
count += input.size(0)
this_tp = (pred + target).eq(2).sum()
this_fp = (pred - target).eq(1).sum()
this_fn = (pred - target).eq(-1).sum()
this_tn = (pred + target).eq(0).sum()
this_acc = (this_tp + this_tn).float() / (this_tp + this_tn + this_fp + this_fn).float()
this_prec = this_tp.float() / (this_tp + this_fp).float() * 100.0 if this_tp + this_fp != 0 else 0.0
this_rec = this_tp.float() / (this_tp + this_fn).float() * 100.0 if this_tp + this_fn != 0 else 0.0
losses.update(float(loss), input.size(0))
prec.update(float(this_prec), input.size(0))
rec.update(float(this_rec), input.size(0))
# compute gradient and do SGD step
optimizer.zero_grad()
loss.backward()
optimizer.step()
# measure elapsed time
batch_time.update(time.time() - end)
end = time.time()
p_c = [float(tp[i].float() / (tp[i] + fp[i]).float()) * 100.0 if tp[i] > 0 else 0.0 for i in range(len(tp))]
r_c = [float(tp[i].float() / (tp[i] + fn[i]).float()) * 100.0 if tp[i] > 0 else 0.0 for i in range(len(tp))]
f_c = [2 * p_c[i] * r_c[i] / (p_c[i] + r_c[i]) if tp[i] > 0 else 0.0 for i in range(len(tp))]
p_o = tp.sum().float() / (tp + fp).sum().float() * 100.0
r_o = tp.sum().float() / (tp + fn).sum().float() * 100.0
f_o = 2 * p_o * r_o / (p_o + r_o)
if i % 100 == 0:
print_log_message('Epoch: [{0}][{1}/{2}]\t'
'Time {batch_time.val:.3f} ({batch_time.avg:.3f})\t'
'Loss {loss.val:.4f} ({loss.avg:.4f})\t'
'Precision {prec.val:.2f} ({prec.avg:.2f})\t'
'Recall {rec.val:.2f} ({rec.avg:.2f})'.format(
epoch, i, len(data_loader), batch_time=batch_time,
loss=losses, prec=prec, rec=rec))
return f_o, losses.avg
def validate_multi(data_loader, model, criteria, device='cuda'):
batch_time = AverageMeter()
losses = AverageMeter()
prec = AverageMeter()
rec = AverageMeter()
# switch to evaluate mode
model.eval()
end = time.time()
tp, fp, fn, tn, count = 0, 0, 0, 0, 0
tp_size, fn_size = 0, 0
with torch.no_grad():
for i, (input, target) in enumerate(data_loader):
input = input.to(device=device)
target = target.to(device=device)
original_target = target
target = target.max(dim=1)[0]
# compute output
output = model(input)
loss = criteria(output, target.float())
# measure accuracy and record loss
pred = output.data.gt(0.0).long()
tp += (pred + target).eq(2).sum(dim=0)
fp += (pred - target).eq(1).sum(dim=0)
fn += (pred - target).eq(-1).sum(dim=0)
tn += (pred + target).eq(0).sum(dim=0)
three_pred = pred.unsqueeze(1).expand(-1, 3, -1) # n, 3, 80
tp_size += (three_pred + original_target).eq(2).sum(dim=0)
fn_size += (three_pred - original_target).eq(-1).sum(dim=0)
count += input.size(0)
this_tp = (pred + target).eq(2).sum()
this_fp = (pred - target).eq(1).sum()
this_fn = (pred - target).eq(-1).sum()
this_tn = (pred + target).eq(0).sum()
this_acc = (this_tp + this_tn).float() / (this_tp + this_tn + this_fp + this_fn).float()
this_prec = this_tp.float() / (this_tp + this_fp).float() * 100.0 if this_tp + this_fp != 0 else 0.0
this_rec = this_tp.float() / (this_tp + this_fn).float() * 100.0 if this_tp + this_fn != 0 else 0.0
losses.update(float(loss), input.size(0))
prec.update(float(this_prec), input.size(0))
rec.update(float(this_rec), input.size(0))
# measure elapsed time
batch_time.update(time.time() - end)
end = time.time()
p_c = [float(tp[i].float() / (tp[i] + fp[i]).float()) * 100.0 if tp[i] > 0 else 0.0 for i in range(len(tp))]
r_c = [float(tp[i].float() / (tp[i] + fn[i]).float()) * 100.0 if tp[i] > 0 else 0.0 for i in range(len(tp))]
f_c = [2 * p_c[i] * r_c[i] / (p_c[i] + r_c[i]) if tp[i] > 0 else 0.0 for i in range(len(tp))]
mean_p_c = sum(p_c) / len(p_c)
mean_r_c = sum(r_c) / len(r_c)
mean_f_c = sum(f_c) / len(f_c)
p_o = tp.sum().float() / (tp + fp).sum().float() * 100.0
r_o = tp.sum().float() / (tp + fn).sum().float() * 100.0
f_o = 2 * p_o * r_o / (p_o + r_o)
if i % 100 == 0:
print_log_message('Test: [{0}/{1}]\t'
'Time {batch_time.val:.3f} ({batch_time.avg:.3f})\t'
'Loss {loss.val:.4f} ({loss.avg:.4f})\t'
'Precision {prec.val:.2f} ({prec.avg:.2f})\t'
'Recall {rec.val:.2f} ({rec.avg:.2f})'.format(
i, len(data_loader), batch_time=batch_time, loss=losses,
prec=prec, rec=rec))
print('P_C {:.2f} R_C {:.2f} F_C {:.2f} P_O {:.2f} R_O {:.2f} F_O {:.2f}'
.format(mean_p_c, mean_r_c, mean_f_c, p_o, r_o, f_o))
print_info_message(' * P_C {:.2f} R_C {:.2f} F_C {:.2f} P_O {:.2f} R_O {:.2f} F_O {:.2f}'
.format(mean_p_c, mean_r_c, mean_f_c, p_o, r_o, f_o))
return f_o, losses.avg | 37.133829 | 127 | 0.534087 | 1,455 | 9,989 | 3.52921 | 0.094158 | 0.010127 | 0.03739 | 0.013632 | 0.854333 | 0.794547 | 0.780331 | 0.751509 | 0.72483 | 0.719377 | 0 | 0.040374 | 0.293323 | 9,989 | 269 | 128 | 37.133829 | 0.687066 | 0.069176 | 0 | 0.726257 | 0 | 0.027933 | 0.08821 | 0.025358 | 0 | 0 | 0 | 0 | 0 | 1 | 0.022346 | false | 0 | 0.03352 | 0 | 0.083799 | 0.050279 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
cf887193c27ed79cf906809fa820157e34f294da | 16,620 | py | Python | lib/antlr-3.0.1/runtime/Python/unittests/testtree.py | dherre3/mclab-core | ccdcd6f46ee42285c7ad055ff0a9ea3361112e11 | [
"Apache-2.0"
] | 11 | 2015-05-31T16:11:35.000Z | 2021-02-16T00:04:48.000Z | lib/antlr-3.0.1/runtime/Python/unittests/testtree.py | sshrdp/mclab | 1843078edb13e647c0261d1944320ffbcf02ad99 | [
"Apache-2.0"
] | 12 | 2015-05-04T16:21:04.000Z | 2019-04-24T21:49:33.000Z | lib/antlr-3.0.1/runtime/Python/unittests/testtree.py | sshrdp/mclab | 1843078edb13e647c0261d1944320ffbcf02ad99 | [
"Apache-2.0"
] | 13 | 2015-05-31T17:16:45.000Z | 2021-02-09T17:08:26.000Z | # -*- coding: utf-8 -*-
import os
import unittest
from StringIO import StringIO
from antlr3.tree import CommonTreeNodeStream, CommonTree
from antlr3 import CommonToken, UP, DOWN, EOF
class TestTreeNodeStream(unittest.TestCase):
"""Test case for the TreeNodeStream class."""
def newStream(self, t):
"""Build new stream; let's us override to test other streams."""
return CommonTreeNodeStream(t)
def testSingleNode(self):
t = CommonTree(CommonToken(101))
stream = self.newStream(t)
expecting = "101"
found = self.toNodesOnlyString(stream)
self.failUnlessEqual(expecting, found)
expecting = "101"
found = str(stream)
self.failUnlessEqual(expecting, found)
def test4Nodes(self):
# ^(101 ^(102 103) 104)
t = CommonTree(CommonToken(101))
t.addChild(CommonTree(CommonToken(102)))
t.getChild(0).addChild(CommonTree(CommonToken(103)))
t.addChild(CommonTree(CommonToken(104)))
stream = self.newStream(t)
expecting = "101 102 103 104"
found = self.toNodesOnlyString(stream)
self.failUnlessEqual(expecting, found)
expecting = "101 2 102 2 103 3 104 3"
found = str(stream)
self.failUnlessEqual(expecting, found)
def testList(self):
root = CommonTree(None)
t = CommonTree(CommonToken(101))
t.addChild(CommonTree(CommonToken(102)))
t.getChild(0).addChild(CommonTree(CommonToken(103)))
t.addChild(CommonTree(CommonToken(104)))
u = CommonTree(CommonToken(105))
root.addChild(t)
root.addChild(u)
stream = CommonTreeNodeStream(root)
expecting = "101 102 103 104 105"
found = self.toNodesOnlyString(stream)
self.failUnlessEqual(expecting, found)
expecting = "101 2 102 2 103 3 104 3 105"
found = str(stream)
self.failUnlessEqual(expecting, found)
def testFlatList(self):
root = CommonTree(None)
root.addChild(CommonTree(CommonToken(101)))
root.addChild(CommonTree(CommonToken(102)))
root.addChild(CommonTree(CommonToken(103)))
stream = CommonTreeNodeStream(root)
expecting = "101 102 103"
found = self.toNodesOnlyString(stream)
self.failUnlessEqual(expecting, found)
expecting = "101 102 103"
found = str(stream)
self.failUnlessEqual(expecting, found)
def testListWithOneNode(self):
root = CommonTree(None)
root.addChild(CommonTree(CommonToken(101)))
stream = CommonTreeNodeStream(root)
expecting = "101"
found = self.toNodesOnlyString(stream)
self.failUnlessEqual(expecting, found)
expecting = "101"
found = str(stream)
self.failUnlessEqual(expecting, found)
def testAoverB(self):
t = CommonTree(CommonToken(101))
t.addChild(CommonTree(CommonToken(102)))
stream = self.newStream(t)
expecting = "101 102"
found = self.toNodesOnlyString(stream)
self.failUnlessEqual(expecting, found)
expecting = "101 2 102 3"
found = str(stream)
self.failUnlessEqual(expecting, found)
def testLT(self):
# ^(101 ^(102 103) 104)
t = CommonTree(CommonToken(101))
t.addChild(CommonTree(CommonToken(102)))
t.getChild(0).addChild(CommonTree(CommonToken(103)))
t.addChild(CommonTree(CommonToken(104)))
stream = self.newStream(t)
self.failUnlessEqual(101, stream.LT(1).getType())
self.failUnlessEqual(DOWN, stream.LT(2).getType())
self.failUnlessEqual(102, stream.LT(3).getType())
self.failUnlessEqual(DOWN, stream.LT(4).getType())
self.failUnlessEqual(103, stream.LT(5).getType())
self.failUnlessEqual(UP, stream.LT(6).getType())
self.failUnlessEqual(104, stream.LT(7).getType())
self.failUnlessEqual(UP, stream.LT(8).getType())
self.failUnlessEqual(EOF, stream.LT(9).getType())
# check way ahead
self.failUnlessEqual(EOF, stream.LT(100).getType())
def testMarkRewindEntire(self):
# ^(101 ^(102 103 ^(106 107) ) 104 105)
# stream has 7 real + 6 nav nodes
# Sequence of types: 101 DN 102 DN 103 106 DN 107 UP UP 104 105 UP EOF
r0 = CommonTree(CommonToken(101))
r1 = CommonTree(CommonToken(102))
r0.addChild(r1)
r1.addChild(CommonTree(CommonToken(103)))
r2 = CommonTree(CommonToken(106))
r2.addChild(CommonTree(CommonToken(107)))
r1.addChild(r2)
r0.addChild(CommonTree(CommonToken(104)))
r0.addChild(CommonTree(CommonToken(105)))
stream = CommonTreeNodeStream(r0)
m = stream.mark() # MARK
for _ in range(13): # consume til end
stream.LT(1)
stream.consume()
self.failUnlessEqual(EOF, stream.LT(1).getType())
self.failUnlessEqual(UP, stream.LT(-1).getType())
stream.rewind(m) # REWIND
# consume til end again :)
for _ in range(13): # consume til end
stream.LT(1)
stream.consume()
self.failUnlessEqual(EOF, stream.LT(1).getType())
self.failUnlessEqual(UP, stream.LT(-1).getType())
def testMarkRewindInMiddle(self):
# ^(101 ^(102 103 ^(106 107) ) 104 105)
# stream has 7 real + 6 nav nodes
# Sequence of types: 101 DN 102 DN 103 106 DN 107 UP UP 104 105 UP EOF
r0 = CommonTree(CommonToken(101))
r1 = CommonTree(CommonToken(102))
r0.addChild(r1)
r1.addChild(CommonTree(CommonToken(103)))
r2 = CommonTree(CommonToken(106))
r2.addChild(CommonTree(CommonToken(107)))
r1.addChild(r2)
r0.addChild(CommonTree(CommonToken(104)))
r0.addChild(CommonTree(CommonToken(105)))
stream = CommonTreeNodeStream(r0)
for _ in range(7): # consume til middle
#System.out.println(tream.LT(1).getType())
stream.consume()
self.failUnlessEqual(107, stream.LT(1).getType())
m = stream.mark() # MARK
stream.consume() # consume 107
stream.consume() # consume UP
stream.consume() # consume UP
stream.consume() # consume 104
stream.rewind(m) # REWIND
self.failUnlessEqual(107, stream.LT(1).getType())
stream.consume()
self.failUnlessEqual(UP, stream.LT(1).getType())
stream.consume()
self.failUnlessEqual(UP, stream.LT(1).getType())
stream.consume()
self.failUnlessEqual(104, stream.LT(1).getType())
stream.consume()
# now we're past rewind position
self.failUnlessEqual(105, stream.LT(1).getType())
stream.consume()
self.failUnlessEqual(UP, stream.LT(1).getType())
stream.consume()
self.failUnlessEqual(EOF, stream.LT(1).getType())
self.failUnlessEqual(UP, stream.LT(-1).getType())
def testMarkRewindNested(self):
# ^(101 ^(102 103 ^(106 107) ) 104 105)
# stream has 7 real + 6 nav nodes
# Sequence of types: 101 DN 102 DN 103 106 DN 107 UP UP 104 105 UP EOF
r0 = CommonTree(CommonToken(101))
r1 = CommonTree(CommonToken(102))
r0.addChild(r1)
r1.addChild(CommonTree(CommonToken(103)))
r2 = CommonTree(CommonToken(106))
r2.addChild(CommonTree(CommonToken(107)))
r1.addChild(r2)
r0.addChild(CommonTree(CommonToken(104)))
r0.addChild(CommonTree(CommonToken(105)))
stream = CommonTreeNodeStream(r0)
m = stream.mark() # MARK at start
stream.consume() # consume 101
stream.consume() # consume DN
m2 = stream.mark() # MARK on 102
stream.consume() # consume 102
stream.consume() # consume DN
stream.consume() # consume 103
stream.consume() # consume 106
stream.rewind(m2) # REWIND to 102
self.failUnlessEqual(102, stream.LT(1).getType())
stream.consume()
self.failUnlessEqual(DOWN, stream.LT(1).getType())
stream.consume()
# stop at 103 and rewind to start
stream.rewind(m) # REWIND to 101
self.failUnlessEqual(101, stream.LT(1).getType())
stream.consume()
self.failUnlessEqual(DOWN, stream.LT(1).getType())
stream.consume()
self.failUnlessEqual(102, stream.LT(1).getType())
stream.consume()
self.failUnlessEqual(DOWN, stream.LT(1).getType())
def testSeek(self):
# ^(101 ^(102 103 ^(106 107) ) 104 105)
# stream has 7 real + 6 nav nodes
# Sequence of types: 101 DN 102 DN 103 106 DN 107 UP UP 104 105 UP EOF
r0 = CommonTree(CommonToken(101))
r1 = CommonTree(CommonToken(102))
r0.addChild(r1)
r1.addChild(CommonTree(CommonToken(103)))
r2 = CommonTree(CommonToken(106))
r2.addChild(CommonTree(CommonToken(107)))
r1.addChild(r2)
r0.addChild(CommonTree(CommonToken(104)))
r0.addChild(CommonTree(CommonToken(105)))
stream = CommonTreeNodeStream(r0)
stream.consume() # consume 101
stream.consume() # consume DN
stream.consume() # consume 102
stream.seek(7) # seek to 107
self.failUnlessEqual(107, stream.LT(1).getType())
stream.consume() # consume 107
stream.consume() # consume UP
stream.consume() # consume UP
self.failUnlessEqual(104, stream.LT(1).getType())
def testSeekFromStart(self):
# ^(101 ^(102 103 ^(106 107) ) 104 105)
# stream has 7 real + 6 nav nodes
# Sequence of types: 101 DN 102 DN 103 106 DN 107 UP UP 104 105 UP EOF
r0 = CommonTree(CommonToken(101))
r1 = CommonTree(CommonToken(102))
r0.addChild(r1)
r1.addChild(CommonTree(CommonToken(103)))
r2 = CommonTree(CommonToken(106))
r2.addChild(CommonTree(CommonToken(107)))
r1.addChild(r2)
r0.addChild(CommonTree(CommonToken(104)))
r0.addChild(CommonTree(CommonToken(105)))
stream = CommonTreeNodeStream(r0)
stream.seek(7) # seek to 107
self.failUnlessEqual(107, stream.LT(1).getType())
stream.consume() # consume 107
stream.consume() # consume UP
stream.consume() # consume UP
self.failUnlessEqual(104, stream.LT(1).getType())
def toNodesOnlyString(self, nodes):
buf = []
for i in range(nodes.size()):
t = nodes.LT(i+1)
type = nodes.getTreeAdaptor().getType(t)
if not (type==DOWN or type==UP):
buf.append(str(type))
return ' '.join(buf)
class TestCommonTreeNodeStream(unittest.TestCase):
"""Test case for the CommonTreeNodeStream class."""
def testPushPop(self):
# ^(101 ^(102 103) ^(104 105) ^(106 107) 108 109)
# stream has 9 real + 8 nav nodes
# Sequence of types: 101 DN 102 DN 103 UP 104 DN 105 UP 106 DN 107 UP 108 109 UP
r0 = CommonTree(CommonToken(101))
r1 = CommonTree(CommonToken(102))
r1.addChild(CommonTree(CommonToken(103)))
r0.addChild(r1)
r2 = CommonTree(CommonToken(104))
r2.addChild(CommonTree(CommonToken(105)))
r0.addChild(r2)
r3 = CommonTree(CommonToken(106))
r3.addChild(CommonTree(CommonToken(107)))
r0.addChild(r3)
r0.addChild(CommonTree(CommonToken(108)))
r0.addChild(CommonTree(CommonToken(109)))
stream = CommonTreeNodeStream(r0)
expecting = "101 2 102 2 103 3 104 2 105 3 106 2 107 3 108 109 3"
found = str(stream)
self.failUnlessEqual(expecting, found)
# Assume we want to hit node 107 and then "call 102" then return
indexOf102 = 2
indexOf107 = 12
for _ in range(indexOf107):# consume til 107 node
stream.consume()
# CALL 102
self.failUnlessEqual(107, stream.LT(1).getType())
stream.push(indexOf102)
self.failUnlessEqual(102, stream.LT(1).getType())
stream.consume() # consume 102
self.failUnlessEqual(DOWN, stream.LT(1).getType())
stream.consume() # consume DN
self.failUnlessEqual(103, stream.LT(1).getType())
stream.consume() # consume 103
self.failUnlessEqual(UP, stream.LT(1).getType())
# RETURN
stream.pop()
self.failUnlessEqual(107, stream.LT(1).getType())
def testNestedPushPop(self):
# ^(101 ^(102 103) ^(104 105) ^(106 107) 108 109)
# stream has 9 real + 8 nav nodes
# Sequence of types: 101 DN 102 DN 103 UP 104 DN 105 UP 106 DN 107 UP 108 109 UP
r0 = CommonTree(CommonToken(101))
r1 = CommonTree(CommonToken(102))
r1.addChild(CommonTree(CommonToken(103)))
r0.addChild(r1)
r2 = CommonTree(CommonToken(104))
r2.addChild(CommonTree(CommonToken(105)))
r0.addChild(r2)
r3 = CommonTree(CommonToken(106))
r3.addChild(CommonTree(CommonToken(107)))
r0.addChild(r3)
r0.addChild(CommonTree(CommonToken(108)))
r0.addChild(CommonTree(CommonToken(109)))
stream = CommonTreeNodeStream(r0)
# Assume we want to hit node 107 and then "call 102", which
# calls 104, then return
indexOf102 = 2
indexOf107 = 12
for _ in range(indexOf107): # consume til 107 node
stream.consume()
self.failUnlessEqual(107, stream.LT(1).getType())
# CALL 102
stream.push(indexOf102)
self.failUnlessEqual(102, stream.LT(1).getType())
stream.consume() # consume 102
self.failUnlessEqual(DOWN, stream.LT(1).getType())
stream.consume() # consume DN
self.failUnlessEqual(103, stream.LT(1).getType())
stream.consume() # consume 103
# CALL 104
indexOf104 = 6
stream.push(indexOf104)
self.failUnlessEqual(104, stream.LT(1).getType())
stream.consume() # consume 102
self.failUnlessEqual(DOWN, stream.LT(1).getType())
stream.consume() # consume DN
self.failUnlessEqual(105, stream.LT(1).getType())
stream.consume() # consume 103
self.failUnlessEqual(UP, stream.LT(1).getType())
# RETURN (to UP node in 102 subtree)
stream.pop()
self.failUnlessEqual(UP, stream.LT(1).getType())
# RETURN (to empty stack)
stream.pop()
self.failUnlessEqual(107, stream.LT(1).getType())
def testPushPopFromEOF(self):
# ^(101 ^(102 103) ^(104 105) ^(106 107) 108 109)
# stream has 9 real + 8 nav nodes
# Sequence of types: 101 DN 102 DN 103 UP 104 DN 105 UP 106 DN 107 UP 108 109 UP
r0 = CommonTree(CommonToken(101))
r1 = CommonTree(CommonToken(102))
r1.addChild(CommonTree(CommonToken(103)))
r0.addChild(r1)
r2 = CommonTree(CommonToken(104))
r2.addChild(CommonTree(CommonToken(105)))
r0.addChild(r2)
r3 = CommonTree(CommonToken(106))
r3.addChild(CommonTree(CommonToken(107)))
r0.addChild(r3)
r0.addChild(CommonTree(CommonToken(108)))
r0.addChild(CommonTree(CommonToken(109)))
stream = CommonTreeNodeStream(r0)
while stream.LA(1) != EOF:
stream.consume()
indexOf102 = 2
indexOf104 = 6
self.failUnlessEqual(EOF, stream.LT(1).getType())
# CALL 102
stream.push(indexOf102)
self.failUnlessEqual(102, stream.LT(1).getType())
stream.consume() # consume 102
self.failUnlessEqual(DOWN, stream.LT(1).getType())
stream.consume() # consume DN
self.failUnlessEqual(103, stream.LT(1).getType())
stream.consume() # consume 103
self.failUnlessEqual(UP, stream.LT(1).getType())
# RETURN (to empty stack)
stream.pop()
self.failUnlessEqual(EOF, stream.LT(1).getType())
# CALL 104
stream.push(indexOf104)
self.failUnlessEqual(104, stream.LT(1).getType())
stream.consume() # consume 102
self.failUnlessEqual(DOWN, stream.LT(1).getType())
stream.consume() # consume DN
self.failUnlessEqual(105, stream.LT(1).getType())
stream.consume() # consume 103
self.failUnlessEqual(UP, stream.LT(1).getType())
# RETURN (to empty stack)
stream.pop()
self.failUnlessEqual(EOF, stream.LT(1).getType())
if __name__ == "__main__":
unittest.main(testRunner=unittest.TextTestRunner(verbosity=2))
| 35.137421 | 88 | 0.613237 | 1,911 | 16,620 | 5.326531 | 0.084772 | 0.169172 | 0.046861 | 0.080165 | 0.864721 | 0.839179 | 0.817467 | 0.788191 | 0.756165 | 0.71451 | 0 | 0.102131 | 0.265945 | 16,620 | 472 | 89 | 35.211864 | 0.732213 | 0.145608 | 0 | 0.826471 | 0 | 0 | 0.01391 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05 | false | 0 | 0.014706 | 0 | 0.076471 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d84b58b10fc6d23e4b3f5fe80c0aa69a675237c1 | 3,956 | py | Python | lib/restler/decorators.py | agostodev/agar | 66b7937a35ae93717d5e9683c7dc7c80c4bcc5d6 | [
"MIT"
] | 1 | 2015-07-22T15:58:06.000Z | 2015-07-22T15:58:06.000Z | lib/restler/decorators.py | agostodev/agar | 66b7937a35ae93717d5e9683c7dc7c80c4bcc5d6 | [
"MIT"
] | null | null | null | lib/restler/decorators.py | agostodev/agar | 66b7937a35ae93717d5e9683c7dc7c80c4bcc5d6 | [
"MIT"
] | null | null | null |
def ae_common_encoder(obj):
"""
Common type specific encoders for app engine
"""
from google.appengine.api import users
if isinstance(obj, users.User):
return obj.user_id() or obj.email()
def ae_common_extra_types(obj):
"""
Non-core python or app engine specific types that can be serialized
"""
from webapp2 import cached_property
if isinstance(obj, cached_property):
return True
else:
return False
def ae_db_serializer(cls):
"""
Restler class decorator for google.appengine.ext.db.Model for serialization
"""
from google.appengine.ext import blobstore, db
@classmethod
def restler_collection_types(cls, obj):
"""
Allows Restler to handle a collection type by retrieving the models
"""
if isinstance(obj, db.Query):
return True
else:
return False
@classmethod
def restler_encoder(cls, obj):
"""
Type specific encoders
"""
if isinstance(obj, db.GeoPt):
return "%s %s" % (obj.lat, obj.lon)
if isinstance(obj, db.IM):
return "%s %s" % (obj.protocol, obj.address)
if isinstance(obj, blobstore.BlobInfo):
return str(obj.key()) # TODO is this correct?
if ae_common_encoder(obj):
return ae_common_encoder(obj)
@classmethod
def restler_kind(cls, model):
"""
The lowercase model classname
"""
return model.kind().lower()
@classmethod
def restler_properties(cls, model):
"""
List of model property names if *include_all_fields=True*
Property must be from **google.appengine.ext.db.Property**
"""
return list(model.properties().iterkeys())
@classmethod
def restler_extra_types(cls, obj):
"""
Non-core python or app engine specific types that can be serialized
"""
return ae_common_extra_types(obj)
cls.restler_collection_types = restler_collection_types
cls.restler_encoder = restler_encoder
cls.restler_kind = restler_kind
cls.restler_properties = restler_properties
cls.restler_extra_types = restler_extra_types
return cls
def ae_ndb_serializer(cls):
"""
Restler class decorator for google.appengine.ext.ndb.Model for serialization
"""
from google.appengine.ext import ndb
@classmethod
def restler_collection_types(cls, obj):
"""
Allows Restler to handle a collection type by retrieving the models
"""
if isinstance(obj, ndb.query.Query):
return True
else:
return False
@classmethod
def restler_encoder(cls, obj):
"""
Type specific encoders
"""
if isinstance(obj, ndb.GeoPt):
return "%s %s" % (obj.lat, obj.lon)
if ae_common_encoder(obj):
return ae_common_encoder(obj)
@classmethod
def restler_kind(cls, model):
"""
The lowercase model classname
"""
try:
return model.__class__.__name__.lower()
except:
# TODO When is this the case?
return model.__name__.lower()
@classmethod
def restler_properties(cls, model):
"""
List of model property names if *include_all_fields=True*
Property must be from **google.appengine.ext.ndb.Property**
"""
return list(model._properties.iterkeys())
@classmethod
def restler_extra_types(cls, obj):
"""
Non-core python or app engine specific types that can be serialized
"""
return ae_common_extra_types(obj)
cls.restler_collection_types = restler_collection_types
cls.restler_encoder = restler_encoder
cls.restler_kind = restler_kind
cls.restler_properties = restler_properties
cls.restler_extra_types = restler_extra_types
return cls
| 27.664336 | 80 | 0.62639 | 465 | 3,956 | 5.146237 | 0.2 | 0.050146 | 0.087756 | 0.03761 | 0.810698 | 0.792729 | 0.792729 | 0.792729 | 0.751776 | 0.684079 | 0 | 0.000355 | 0.287917 | 3,956 | 142 | 81 | 27.859155 | 0.84913 | 0.234833 | 0 | 0.653333 | 0 | 0 | 0.005521 | 0 | 0 | 0 | 0 | 0.014085 | 0 | 1 | 0.186667 | false | 0 | 0.053333 | 0 | 0.533333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d8de9985ae8cb4722e735481eabda51f7c696bc0 | 53,083 | py | Python | deepobs/pytorch/runners/runner.py | H0merJayS1mpson/deepobscustom | e85816ce42466326dac18841c58b79f87a4a1a7c | [
"MIT"
] | null | null | null | deepobs/pytorch/runners/runner.py | H0merJayS1mpson/deepobscustom | e85816ce42466326dac18841c58b79f87a4a1a7c | [
"MIT"
] | null | null | null | deepobs/pytorch/runners/runner.py | H0merJayS1mpson/deepobscustom | e85816ce42466326dac18841c58b79f87a4a1a7c | [
"MIT"
] | null | null | null | """Module implementing StandardRunner."""
from __future__ import print_function
import torch
import importlib
import abc
from deepobs import config as global_config
from .. import config
from .. import testproblems
from . import runner_utils
from deepobs.abstract_runner.abstract_runner import Runner
import numpy as np
import warnings
from random import seed
from copy import deepcopy
class PTRunner(Runner):
"""The abstract class for runner in the pytorch framework."""
def __init__(self, optimizer_class, hyperparameter_names):
super(PTRunner, self).__init__(optimizer_class, hyperparameter_names)
@abc.abstractmethod
def training(self, tproblem,
hyperparams,
num_epochs,
print_train_iter,
train_log_interval,
tb_log,
tb_log_dir,
**training_params):
return
@staticmethod
def create_testproblem(testproblem, initializations, batch_size, weight_decay, random_seed):
"""Sets up the deepobs.pytorch.testproblems.testproblem instance.
Args:
testproblem (str): The name of the testproblem.
batch_size (int): Batch size that is used for training
weight_decay (float): Regularization factor
random_seed (int): The random seed of the framework
Returns:
deepobs.pytorch.testproblems.testproblem: An instance of deepobs.pytorch.testproblems.testproblem
"""
# set the seed and GPU determinism
if config.get_is_deterministic():
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False
else:
torch.backends.cudnn.deterministic = False
torch.backends.cudnn.benchmark = True
seed(random_seed)
np.random.seed(random_seed)
torch.manual_seed(random_seed)
# Find testproblem by name and instantiate with batch size and weight decay.
try:
testproblem_mod = importlib.import_module(testproblem)
testproblem_cls = getattr(testproblem_mod, testproblem)
print("Loading local testproblem.")
except:
testproblem_cls = getattr(testproblems, testproblem)
# if the user specified a weight decay, use that one
if weight_decay is not None:
tproblem = testproblem_cls(batch_size, weight_decay)
# else use the default of the testproblem
else:
tproblem = testproblem_cls(batch_size)
# Set up the testproblem.
tproblem.set_up(initializations)
return tproblem
# Wrapper functions for the evaluation phase.
@staticmethod
def evaluate(tproblem, phase, get_next_batch=True):
"""Evaluates the performance of the current state of the model
of the testproblem instance.
Has to be called in the beggining of every epoch within the
training method. Returns the losses and accuracies.
Args:
tproblem (testproblem): The testproblem instance to evaluate
phase (str): The phase of the evaluation. Must be one of 'TRAIN', 'VALID' or 'TEST'
Returns:
float: The loss of the current state.
float: The accuracy of the current state.
:param get_next_batch:
"""
if phase == 'TEST':
tproblem.test_init_op()
msg = "TEST:"
elif phase == 'TRAIN':
tproblem.train_eval_init_op()
msg = "TRAIN:"
elif phase == 'VALID':
tproblem.valid_init_op()
msg = "VALID:"
# evaluation loop over every batch of the corresponding evaluation set
loss = 0.0
accuracy = 0.0
batchCount = 0.0
i = 0
while True:
try:
batch_loss, batch_accuracy = tproblem.get_batch_loss_and_accuracy(get_next_batch=get_next_batch)
batchCount += 1.0
loss += batch_loss.item()
accuracy += batch_accuracy
except StopIteration:
break
loss /= batchCount
accuracy /= batchCount
if accuracy != 0.0:
print("{0:s} loss {1:g}, acc {2:f}".format(msg, loss, accuracy))
else:
print("{0:s} loss {1:g}".format(msg, loss))
return loss, accuracy
def evaluate_all(self,
epoch_count,
num_epochs,
tproblem,
train_losses,
valid_losses,
test_losses,
train_accuracies,
valid_accuracies,
test_accuracies,
get_next_batch=True):
print("********************************")
print("Evaluating after {0:d} of {1:d} epochs...".format(epoch_count, num_epochs))
loss_, acc_ = self.evaluate(tproblem, phase='TRAIN', get_next_batch=get_next_batch)
train_losses.append(loss_)
train_accuracies.append(acc_)
loss_, acc_ = self.evaluate(tproblem, phase='VALID', get_next_batch=get_next_batch)
valid_losses.append(loss_)
valid_accuracies.append(acc_)
loss_, acc_ = self.evaluate(tproblem, phase='TEST', get_next_batch=get_next_batch)
test_losses.append(loss_)
test_accuracies.append(acc_)
print("********************************")
class StandardRunner(PTRunner):
"""A standard runner. Can run a normal training loop with fixed
hyperparams. It should be used as a template to implement custom runners.
"""
def __init__(self, optimizer_class, hyperparameter_names):
super(StandardRunner, self).__init__(optimizer_class, hyperparameter_names)
def training(self,
tproblem,
hyperparams,
num_epochs,
print_train_iter,
train_log_interval,
tb_log,
tb_log_dir):
opt = self._optimizer_class(tproblem.net.parameters(), **hyperparams)
# Lists to log train/test loss and accuracy.
train_losses = []
valid_losses = []
test_losses = []
train_accuracies = []
valid_accuracies = []
test_accuracies = []
minibatch_train_losses = []
if tb_log:
try:
from torch.utils.tensorboard import SummaryWriter
summary_writer = SummaryWriter(log_dir=tb_log_dir)
except ImportError as e:
warnings.warn('Not possible to use tensorboard for pytorch. Reason: ' + e.msg, RuntimeWarning)
tb_log = False
global_step = 0
for epoch_count in range(num_epochs + 1):
# Evaluate at beginning of epoch.
self.evaluate_all(epoch_count,
num_epochs,
tproblem,
train_losses,
valid_losses,
test_losses,
train_accuracies,
valid_accuracies,
test_accuracies)
# Break from train loop after the last round of evaluation
if epoch_count == num_epochs:
break
### Training ###
# set to training mode
tproblem.train_init_op()
batch_count = 0
while True:
try:
opt.zero_grad()
batch_loss, _ = tproblem.get_batch_loss_and_accuracy()
batch_loss.backward()
opt.step()
if batch_count % train_log_interval == 0:
minibatch_train_losses.append(batch_loss.item())
if print_train_iter:
print("Epoch {0:d}, step {1:d}: loss {2:g}".format(epoch_count, batch_count, batch_loss))
if tb_log:
summary_writer.add_scalar('loss', batch_loss.item(), global_step)
batch_count += 1
global_step += 1
except StopIteration:
break
if not np.isfinite(batch_loss.item()):
self._abort_routine(
epoch_count,
num_epochs,
train_losses,
valid_losses,
test_losses,
train_accuracies,
valid_accuracies,
test_accuracies,
minibatch_train_losses)
break
else:
continue
if tb_log:
summary_writer.close()
# Put results into output dictionary.
output = {
"train_losses": train_losses,
'valid_losses': valid_losses,
"test_losses": test_losses,
"minibatch_train_losses": minibatch_train_losses,
"train_accuracies": train_accuracies,
'valid_accuracies': valid_accuracies,
"test_accuracies": test_accuracies
}
return output
class LearningRateScheduleRunner(PTRunner):
"""A runner for learning rate schedules. Can run a normal training loop with fixed hyperparams or a learning rate
schedule. It should be used as a template to implement custom runners.
"""
def __init__(self, optimizer_class, hyperparameter_names):
super(LearningRateScheduleRunner, self).__init__(optimizer_class, hyperparameter_names)
def _add_training_params_to_argparse(self, parser, args, training_params):
try:
args['lr_sched_epochs'] = training_params['lr_sched_epochs']
except KeyError:
parser.add_argument(
"--lr_sched_epochs",
nargs="+",
type=int,
help="""One or more epoch numbers (positive integers) that mark
learning rate changes. The base learning rate has to be passed via
'--learing_rate' and the factors by which to change have to be passed
via '--lr_sched_factors'. Example: '--lr 0.3 --lr_sched_epochs 50 100
--lr_sched_factors 0.1 0.01' will start with a learning rate of 0.3,
then decrease to 0.1*0.3=0.03 after training for 50 epochs, and
decrease to 0.01*0.3=0.003' after training for 100 epochs.""")
try:
args['lr_sched_factors'] = training_params['lr_sched_factors']
except KeyError:
parser.add_argument(
"--lr_sched_factors",
nargs="+",
type=float,
help="""One or more factors (floats) by which to change the learning
rate. The base learning rate has to be passed via '--learing_rate' and
the epochs at which to change the learning rate have to be passed via
'--lr_sched_factors'. Example: '--lr 0.3 --lr_sched_epochs 50 100
--lr_sched_factors 0.1 0.01' will start with a learning rate of 0.3,
then decrease to 0.1*0.3=0.03 after training for 50 epochs, and
decrease to 0.01*0.3=0.003' after training for 100 epochs.""")
def training(self,
tproblem,
hyperparams,
num_epochs,
print_train_iter,
train_log_interval,
tb_log,
tb_log_dir,
# the following are the training_params
lr_sched_epochs=None,
lr_sched_factors=None):
"""Performs the training and stores the metrices.
Args:
tproblem (deepobs.[tensorflow/pytorch].testproblems.testproblem): The testproblem instance to train on.
hyperparams (dict): The optimizer hyperparameters to use for the training.
num_epochs (int): The number of training epochs.
print_train_iter (bool): Whether to print the training progress at every train_log_interval
train_log_interval (int): Mini-batch interval for logging.
tb_log (bool): Whether to use tensorboard logging or not
tb_log_dir (str): The path where to save tensorboard events.
lr_sched_epochs (list): The epochs where to adjust the learning rate.
lr_sched_factors (list): The corresponding factors by which to adjust the learning rate.
Returns:
dict: The logged metrices. Is of the form: \
{'test_losses' : [...], \
'valid_losses': [...], \
'train_losses': [...], \
'test_accuracies': [...], \
'valid_accuracies': [...], \
'train_accuracies': [...] \
} \
where the metrices values are lists that were filled during training.
"""
opt = self._optimizer_class(tproblem.net.parameters(), **hyperparams)
if lr_sched_epochs is not None:
lr_schedule = runner_utils.make_lr_schedule(optimizer=opt, lr_sched_epochs=lr_sched_epochs,
lr_sched_factors=lr_sched_factors)
# Lists to log train/test loss and accuracy.
train_losses = []
valid_losses = []
test_losses = []
train_accuracies = []
valid_accuracies = []
test_accuracies = []
minibatch_train_losses = []
for epoch_count in range(num_epochs + 1):
# Evaluate at beginning of epoch.
self.evaluate_all(epoch_count,
num_epochs,
tproblem,
train_losses,
valid_losses,
test_losses,
train_accuracies,
valid_accuracies,
test_accuracies)
# Break from train loop after the last round of evaluation
if epoch_count == num_epochs:
break
### Training ###
if lr_sched_epochs is not None:
# get the next learning rate
lr_schedule.step(epoch_count)
if epoch_count in lr_sched_epochs:
print("Setting learning rate to {0}".format(lr_schedule.get_lr()))
# set to training mode
tproblem.train_init_op()
batch_count = 0
while True:
try:
opt.zero_grad()
batch_loss, _ = tproblem.get_batch_loss_and_accuracy()
batch_loss.backward()
opt.step()
if batch_count % train_log_interval == 0:
minibatch_train_losses.append(batch_loss.item())
if print_train_iter:
print("Epoch {0:d}, step {1:d}: loss {2:g}".format(epoch_count, batch_count, batch_loss))
batch_count += 1
except StopIteration:
break
# break from training if it goes wrong
if not np.isfinite(batch_loss.item()):
self._abort_routine(epoch_count,
num_epochs,
train_losses,
valid_losses,
test_losses,
train_accuracies,
valid_accuracies,
test_accuracies)
break
else:
continue
# Put results into output dictionary.
output = {
"train_losses": train_losses,
"valid_losses": valid_losses,
"test_losses": test_losses,
"minibatch_train_losses": minibatch_train_losses,
"train_accuracies": train_accuracies,
'valid_accuracies': valid_accuracies,
"test_accuracies": test_accuracies
}
return output
class CustomRunner(PTRunner):
"""A standard runner. Can run a normal training loop with fixed
hyperparams. It should be used as a template to implement custom runners.
"""
def evaluate(tproblem, phase, get_next_batch=True):
"""Evaluates the performance of the current state of the model
of the testproblem instance.
Has to be called in the beggining of every epoch within the
training method. Returns the losses and accuracies.
Args:
tproblem (testproblem): The testproblem instance to evaluate
phase (str): The phase of the evaluation. Must be one of 'TRAIN', 'VALID' or 'TEST'
Returns:
float: The loss of the current state.
float: The accuracy of the current state.
:param get_next_batch:
"""
if phase == 'TEST':
tproblem.test_init_op()
msg = "TEST:"
elif phase == 'TRAIN':
tproblem.train_eval_init_op()
msg = "TRAIN:"
elif phase == 'VALID':
tproblem.valid_init_op()
msg = "VALID:"
# evaluation loop over every batch of the corresponding evaluation set
loss = 0.0
accuracy = 0.0
batchCount = 0.0
while True:
try:
batch_loss, batch_accuracy = tproblem.get_batch_loss_and_accuracy(get_next_batch=get_next_batch)
batchCount += 1.0
loss += batch_loss.item()
accuracy += batch_accuracy
except StopIteration:
break
loss /= batchCount
accuracy /= batchCount
if accuracy != 0.0:
print("{0:s} loss {1:g}, acc {2:f}".format(msg, loss, accuracy))
else:
print("{0:s} loss {1:g}".format(msg, loss))
return loss, accuracy
def evaluate_all(self,
epoch_count,
num_epochs,
tproblem,
train_losses,
valid_losses,
test_losses,
train_accuracies,
valid_accuracies,
test_accuracies,
get_next_batch=True):
print("********************************")
print("Evaluating after {0:d} of {1:d} epochs...".format(epoch_count, num_epochs))
loss_, acc_ = CustomRunner.evaluate(tproblem, phase='TRAIN', get_next_batch=get_next_batch)
train_losses.append(loss_)
train_accuracies.append(acc_)
loss_, acc_ = CustomRunner.evaluate(tproblem, phase='VALID', get_next_batch=get_next_batch)
valid_losses.append(loss_)
valid_accuracies.append(acc_)
loss_, acc_ = CustomRunner.evaluate(tproblem, phase='TEST', get_next_batch=get_next_batch)
test_losses.append(loss_)
test_accuracies.append(acc_)
print("********************************")
def __init__(self, optimizer_class, hyperparameter_names):
super(CustomRunner, self).__init__(optimizer_class, hyperparameter_names)
def create_testproblem(self, testproblem, initializations, batch_size, weight_decay, random_seed):
"""Sets up the deepobs.pytorch.testproblems.testproblem instance.
Args:
testproblem (str): The name of the testproblem.
batch_size (int): Batch size that is used for training
weight_decay (float): Regularization factor
random_seed (int): The random seed of the framework
:param initializations: dictionary of the initialazation Methods per layer-Name
Returns:
deepobs.pytorch.testproblems.testproblem: An instance of deepobs.pytorch.testproblems.testproblem
"""
# set the seed and GPU determinism
if config.get_is_deterministic():
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False
else:
torch.backends.cudnn.deterministic = False
torch.backends.cudnn.benchmark = True
seed(random_seed)
np.random.seed(random_seed)
torch.manual_seed(random_seed)
# Find testproblem by name and instantiate with batch size and weight decay.
try:
testproblem_mod = importlib.import_module(testproblem)
testproblem_cls = getattr(testproblem_mod, testproblem)
print("Loading local testproblem.")
except:
testproblem_cls = getattr(testproblems, testproblem)
# if the user specified a weight decay, use that one
if weight_decay is not None:
tproblem = testproblem_cls(batch_size, weight_decay)
# else use the default of the testproblem
else:
tproblem = testproblem_cls(batch_size)
# Set up the testproblem.
tproblem.set_up(initializations)
return tproblem
def training(self,
tproblem,
hyperparams,
num_epochs,
print_train_iter,
train_log_interval,
tb_log,
tb_log_dir):
opt = self._optimizer_class(tproblem.net.parameters(), **hyperparams)
# Lists to log train/test loss and accuracy.
train_losses = []
valid_losses = []
test_losses = []
train_accuracies = []
valid_accuracies = []
test_accuracies = []
minibatch_train_losses = []
if tb_log:
try:
from torch.utils.tensorboard import SummaryWriter
summary_writer = SummaryWriter(log_dir=tb_log_dir)
except ImportError as e:
warnings.warn('Not possible to use tensorboard for pytorch. Reason: ' + e.msg, RuntimeWarning)
tb_log = False
global_step = 0
for epoch_count in range(num_epochs + 1):
# Evaluate at beginning of epoch.
self.evaluate_all(epoch_count,
num_epochs,
tproblem,
train_losses,
valid_losses,
test_losses,
train_accuracies,
valid_accuracies,
test_accuracies)
# Break from train loop after the last round of evaluation
if epoch_count == num_epochs:
break
### Training ###
# set to training mode
tproblem.train_init_op()
batch_count = 0
while True:
try:
#opt.zero_grad()
def closure(backward=True, get_next_batch=True):
opt.zero_grad()
batch_loss, _ = tproblem.get_batch_loss_and_accuracy(get_next_batch=get_next_batch)
if backward:
batch_loss.backward()
return batch_loss
batch_loss = opt.step(closure)
if batch_count % train_log_interval == 0:
minibatch_train_losses.append(batch_loss.item())
if print_train_iter:
print("Epoch {0:d}, step {1:d}: loss {2:g}".format(epoch_count, batch_count, batch_loss))
if tb_log:
summary_writer.add_scalar('loss', batch_loss.item(), global_step)
batch_count += 1
global_step += 1
except StopIteration:
break
if not np.isfinite(batch_loss.item()):
self._abort_routine(
epoch_count,
num_epochs,
train_losses,
valid_losses,
test_losses,
train_accuracies,
valid_accuracies,
test_accuracies,
minibatch_train_losses)
break
else:
continue
if tb_log:
summary_writer.close()
# Put results into output dictionary.
output = {
"train_losses": train_losses,
'valid_losses': valid_losses,
"test_losses": test_losses,
"minibatch_train_losses": minibatch_train_losses,
"train_accuracies": train_accuracies,
'valid_accuracies': valid_accuracies,
"test_accuracies": test_accuracies
}
return output
class CustomLearningRateScheduleRunner(PTRunner):
"""A runner for learning rate schedules. Can run a normal training loop with fixed hyperparams or a learning rate
schedule. It should be used as a template to implement custom runners.
"""
def __init__(self, optimizer_class, hyperparameter_names):
super(CustomLearningRateScheduleRunner, self).__init__(optimizer_class, hyperparameter_names)
def _add_training_params_to_argparse(self, parser, args, training_params):
try:
args['lr_sched_epochs'] = training_params['lr_sched_epochs']
except KeyError:
parser.add_argument(
"--lr_sched_epochs",
nargs="+",
type=int,
help="""One or more epoch numbers (positive integers) that mark
learning rate changes. The base learning rate has to be passed via
'--learing_rate' and the factors by which to change have to be passed
via '--lr_sched_factors'. Example: '--lr 0.3 --lr_sched_epochs 50 100
--lr_sched_factors 0.1 0.01' will start with a learning rate of 0.3,
then decrease to 0.1*0.3=0.03 after training for 50 epochs, and
decrease to 0.01*0.3=0.003' after training for 100 epochs.""")
try:
args['lr_sched_factors'] = training_params['lr_sched_factors']
except KeyError:
parser.add_argument(
"--lr_sched_factors",
nargs="+",
type=float,
help=
"""One or more factors (floats) by which to change the learning
rate. The base learning rate has to be passed via '--learing_rate' and
the epochs at which to change the learning rate have to be passed via
'--lr_sched_factors'. Example: '--lr 0.3 --lr_sched_epochs 50 100
--lr_sched_factors 0.1 0.01' will start with a learning rate of 0.3,
then decrease to 0.1*0.3=0.03 after training for 50 epochs, and
decrease to 0.01*0.3=0.003' after training for 100 epochs.""")
def create_testproblem(self, testproblem, initializations, batch_size, weight_decay, random_seed):
"""Sets up the deepobs.pytorch.testproblems.testproblem instance.
Args:
testproblem (str): The name of the testproblem.
batch_size (int): Batch size that is used for training
weight_decay (float): Regularization factor
random_seed (int): The random seed of the framework
:param initializations: dictionary of the initialazation Methods per layer-Name
Returns:
deepobs.pytorch.testproblems.testproblem: An instance of deepobs.pytorch.testproblems.testproblem
"""
# set the seed and GPU determinism
if config.get_is_deterministic():
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False
else:
torch.backends.cudnn.deterministic = False
torch.backends.cudnn.benchmark = True
seed(random_seed)
np.random.seed(random_seed)
torch.manual_seed(random_seed)
# Find testproblem by name and instantiate with batch size and weight decay.
try:
testproblem_mod = importlib.import_module(testproblem)
testproblem_cls = getattr(testproblem_mod, testproblem)
print("Loading local testproblem.")
except:
testproblem_cls = getattr(testproblems, testproblem)
# if the user specified a weight decay, use that one
if weight_decay is not None:
tproblem = testproblem_cls(batch_size, weight_decay)
# else use the default of the testproblem
else:
tproblem = testproblem_cls(batch_size)
# Set up the testproblem.
tproblem.set_up(initializations)
return tproblem
def evaluate(tproblem, phase, get_next_batch=True):
"""Evaluates the performance of the current state of the model
of the testproblem instance.
Has to be called in the beggining of every epoch within the
training method. Returns the losses and accuracies.
Args:
tproblem (testproblem): The testproblem instance to evaluate
phase (str): The phase of the evaluation. Must be one of 'TRAIN', 'VALID' or 'TEST'
Returns:
float: The loss of the current state.
float: The accuracy of the current state.
:param get_next_batch:
"""
if phase == 'TEST':
tproblem.test_init_op()
msg = "TEST:"
elif phase == 'TRAIN':
tproblem.train_eval_init_op()
msg = "TRAIN:"
elif phase == 'VALID':
tproblem.valid_init_op()
msg = "VALID:"
# evaluation loop over every batch of the corresponding evaluation set
loss = 0.0
accuracy = 0.0
batchCount = 0.0
while True:
try:
batch_loss, batch_accuracy = tproblem.get_batch_loss_and_accuracy(get_next_batch=get_next_batch)
batchCount += 1.0
loss += batch_loss.item()
accuracy += batch_accuracy
except StopIteration:
break
loss /= batchCount
accuracy /= batchCount
if accuracy != 0.0:
print("{0:s} loss {1:g}, acc {2:f}".format(msg, loss, accuracy))
else:
print("{0:s} loss {1:g}".format(msg, loss))
return loss, accuracy
def evaluate_all(self,
epoch_count,
num_epochs,
tproblem,
train_losses,
valid_losses,
test_losses,
train_accuracies,
valid_accuracies,
test_accuracies,
get_next_batch=True):
print("********************************")
print("Evaluating after {0:d} of {1:d} epochs...".format(epoch_count, num_epochs))
loss_, acc_ = CustomLearningRateScheduleRunner.evaluate(tproblem, phase='TRAIN', get_next_batch=get_next_batch)
train_losses.append(loss_)
train_accuracies.append(acc_)
loss_, acc_ = CustomLearningRateScheduleRunner.evaluate(tproblem, phase='VALID', get_next_batch=get_next_batch)
valid_losses.append(loss_)
valid_accuracies.append(acc_)
loss_, acc_ = CustomLearningRateScheduleRunner.evaluate(tproblem, phase='TEST', get_next_batch=get_next_batch)
test_losses.append(loss_)
test_accuracies.append(acc_)
print("********************************")
def training(self,
tproblem,
hyperparams,
num_epochs,
print_train_iter,
train_log_interval,
tb_log,
tb_log_dir,
# the following are the training_params
lr_sched_epochs=None,
lr_sched_factors=None):
"""Performs the training and stores the metrices.
Args:
tproblem (deepobs.[tensorflow/pytorch].testproblems.testproblem): The testproblem instance to train on.
hyperparams (dict): The optimizer hyperparameters to use for the training.
num_epochs (int): The number of training epochs.
print_train_iter (bool): Whether to print the training progress at every train_log_interval
train_log_interval (int): Mini-batch interval for logging.
tb_log (bool): Whether to use tensorboard logging or not
tb_log_dir (str): The path where to save tensorboard events.
lr_sched_epochs (list): The epochs where to adjust the learning rate.
lr_sched_factors (list): The corresponding factors by which to adjust the learning rate.
Returns:
dict: The logged metrices. Is of the form: \
{'test_losses' : [...], \
'valid_losses': [...], \
'train_losses': [...], \
'test_accuracies': [...], \
'valid_accuracies': [...], \
'train_accuracies': [...] \
} \
where the metrices values are lists that were filled during training.
"""
opt = self._optimizer_class(tproblem.net.parameters(), **hyperparams)
if lr_sched_epochs is not None:
lr_schedule = runner_utils.make_lr_schedule(optimizer=opt, lr_sched_epochs=lr_sched_epochs,
lr_sched_factors=lr_sched_factors)
# Lists to log train/test loss and accuracy.
train_losses = []
valid_losses = []
test_losses = []
train_accuracies = []
valid_accuracies = []
test_accuracies = []
minibatch_train_losses = []
if tb_log:
try:
from torch.utils.tensorboard import SummaryWriter
summary_writer = SummaryWriter(log_dir=tb_log_dir)
except ImportError as e:
warnings.warn('Not possible to use tensorboard for pytorch. Reason: ' + e.msg, RuntimeWarning)
tb_log = False
global_step = 0
for epoch_count in range(num_epochs + 1):
# Evaluate at beginning of epoch.
self.evaluate_all(epoch_count,
num_epochs,
tproblem,
train_losses,
valid_losses,
test_losses,
train_accuracies,
valid_accuracies,
test_accuracies)
# Break from train loop after the last round of evaluation
if epoch_count == num_epochs:
break
### Training ###
if lr_sched_epochs is not None:
# get the next learning rate
lr_schedule.step(epoch_count)
if epoch_count in lr_sched_epochs:
print("Setting learning rate to {0}".format(lr_schedule.get_lr()))
# set to training mode
tproblem.train_init_op()
batch_count = 0
while True:
try:
opt.zero_grad()
def closure(backward=True, get_next_batch=True):
# opt.zero_grad()
batch_loss, _ = tproblem.get_batch_loss_and_accuracy(get_next_batch=get_next_batch)
if backward:
batch_loss.backward()
return batch_loss
batch_loss = opt.step(closure)
if batch_count % train_log_interval == 0:
minibatch_train_losses.append(batch_loss.item())
if print_train_iter:
print(
"Epoch {0:d}, step {1:d}: loss {2:g}".format(epoch_count, batch_count, batch_loss))
if tb_log:
summary_writer.add_scalar('loss', batch_loss.item(), global_step)
batch_count += 1
global_step += 1
except StopIteration:
break
if not np.isfinite(batch_loss.item()):
self._abort_routine(
epoch_count,
num_epochs,
train_losses,
valid_losses,
test_losses,
train_accuracies,
valid_accuracies,
test_accuracies,
minibatch_train_losses)
break
else:
continue
if tb_log:
summary_writer.close()
# Put results into output dictionary.
output = {
"train_losses": train_losses,
'valid_losses': valid_losses,
"test_losses": test_losses,
"minibatch_train_losses": minibatch_train_losses,
"train_accuracies": train_accuracies,
'valid_accuracies': valid_accuracies,
"test_accuracies": test_accuracies
}
return output
# opt = self._optimizer_class(tproblem.net.parameters(), **hyperparams)
#
# if lr_sched_epochs is not None:
# lr_schedule = runner_utils.make_lr_schedule(optimizer=opt, lr_sched_epochs=lr_sched_epochs,
# lr_sched_factors=lr_sched_factors)
#
# # Lists to log train/test loss and accuracy.
# train_losses = []
# valid_losses = []
# test_losses = []
# train_accuracies = []
# valid_accuracies = []
# test_accuracies = []
#
# minibatch_train_losses = []
#
# for epoch_count in range(num_epochs + 1):
# # Evaluate at beginning of epoch.
# self.evaluate_all(epoch_count,
# num_epochs,
# tproblem,
# train_losses,
# valid_losses,
# test_losses,
# train_accuracies,
# valid_accuracies,
# test_accuracies)
#
# # Break from train loop after the last round of evaluation
# if epoch_count == num_epochs:
# break
#
# ### Training ###
# if lr_sched_epochs is not None:
# # get the next learning rate
# lr_schedule.step(epoch_count)
#
# if epoch_count in lr_sched_epochs:
# print("Setting learning rate to {0}".format(lr_schedule.get_lr()))
#
# # set to training mode
# tproblem.train_init_op()
# batch_count = 0
# while True:
# try:
# def closure(backward=True, get_next_batch=True):
# # opt.zero_grad()
# batch_loss, _ = tproblem.get_batch_loss_and_accuracy(get_next_batch=get_next_batch)
# if backward:
# batch_loss.backward()
# return batch_loss
#
# batch_loss = opt.step(closure)
#
#
# if batch_count % train_log_interval == 0:
# minibatch_train_losses.append(batch_loss.item())
# if print_train_iter:
# print("Epoch {0:d}, step {1:d}: loss {2:g}".format(epoch_count, batch_count, batch_loss))
# batch_count += 1
#
# except StopIteration:
# break
#
# # break from training if it goes wrong
# if not np.isfinite(batch_loss.item()):
# self._abort_routine(epoch_count,
# num_epochs,
# train_losses,
# valid_losses,
# test_losses,
# train_accuracies,
# valid_accuracies,
# test_accuracies)
# break
# else:
# continue
#
# # Put results into output dictionary.
# output = {
# "train_losses": train_losses,
# "valid_losses": valid_losses,
# "test_losses": test_losses,
# "minibatch_train_losses": minibatch_train_losses,
# "train_accuracies": train_accuracies,
# 'valid_accuracies': valid_accuracies,
# "test_accuracies": test_accuracies
# }
#
# return output
class PalRunner(PTRunner):
"""A standard runner. Can run a normal training loop with fixed
hyperparams. It should be used as a template to implement custom runners.
"""
def evaluate(tproblem, phase, get_next_batch=True):
"""Evaluates the performance of the current state of the model
of the testproblem instance.
Has to be called in the beggining of every epoch within the
training method. Returns the losses and accuracies.
Args:
tproblem (testproblem): The testproblem instance to evaluate
phase (str): The phase of the evaluation. Must be one of 'TRAIN', 'VALID' or 'TEST'
Returns:
float: The loss of the current state.
float: The accuracy of the current state.
:param get_next_batch:
"""
if phase == 'TEST':
tproblem.test_init_op()
msg = "TEST:"
elif phase == 'TRAIN':
tproblem.train_eval_init_op()
msg = "TRAIN:"
elif phase == 'VALID':
tproblem.valid_init_op()
msg = "VALID:"
# evaluation loop over every batch of the corresponding evaluation set
loss = 0.0
accuracy = 0.0
batchCount = 0.0
i = 0
while True:
try:
batch_loss, batch_accuracy = tproblem.get_batch_loss_and_accuracy(get_next_batch=get_next_batch)
batchCount += 1.0
loss += batch_loss.item()
accuracy += batch_accuracy
except StopIteration:
break
loss /= batchCount
accuracy /= batchCount
if accuracy != 0.0:
print("{0:s} loss {1:g}, acc {2:f}".format(msg, loss, accuracy))
else:
print("{0:s} loss {1:g}".format(msg, loss))
return loss, accuracy
def evaluate_all(self,
epoch_count,
num_epochs,
tproblem,
train_losses,
valid_losses,
test_losses,
train_accuracies,
valid_accuracies,
test_accuracies,
get_next_batch=True):
print("********************************")
print("Evaluating after {0:d} of {1:d} epochs...".format(epoch_count, num_epochs))
loss_, acc_ = CustomRunner.evaluate(tproblem, phase='TRAIN', get_next_batch=get_next_batch)
train_losses.append(loss_)
train_accuracies.append(acc_)
loss_, acc_ = CustomRunner.evaluate(tproblem, phase='VALID', get_next_batch=get_next_batch)
valid_losses.append(loss_)
valid_accuracies.append(acc_)
loss_, acc_ = CustomRunner.evaluate(tproblem, phase='TEST', get_next_batch=get_next_batch)
test_losses.append(loss_)
test_accuracies.append(acc_)
print("********************************")
def __init__(self, optimizer_class, hyperparameter_names):
super(PalRunner, self).__init__(optimizer_class, hyperparameter_names)
def create_testproblem(self, testproblem, initializations, batch_size, weight_decay, random_seed):
"""Sets up the deepobs.pytorch.testproblems.testproblem instance.
Args:
testproblem (str): The name of the testproblem.
batch_size (int): Batch size that is used for training
weight_decay (float): Regularization factor
random_seed (int): The random seed of the framework
:param initializations: dictionary of the initialazation Methods per layer-Name
Returns:
deepobs.pytorch.testproblems.testproblem: An instance of deepobs.pytorch.testproblems.testproblem
"""
# set the seed and GPU determinism
if config.get_is_deterministic():
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False
else:
torch.backends.cudnn.deterministic = False
torch.backends.cudnn.benchmark = True
seed(random_seed)
np.random.seed(random_seed)
torch.manual_seed(random_seed)
# Find testproblem by name and instantiate with batch size and weight decay.
try:
testproblem_mod = importlib.import_module(testproblem)
testproblem_cls = getattr(testproblem_mod, testproblem)
print("Loading local testproblem.")
except:
testproblem_cls = getattr(testproblems, testproblem)
# if the user specified a weight decay, use that one
if weight_decay is not None:
tproblem = testproblem_cls(batch_size, weight_decay)
# else use the default of the testproblem
else:
tproblem = testproblem_cls(batch_size)
# Set up the testproblem.
tproblem.set_up(initializations)
return tproblem
def training(self,
tproblem,
hyperparams,
num_epochs,
print_train_iter,
train_log_interval,
tb_log,
tb_log_dir):
opt = self._optimizer_class(tproblem.net.parameters(), **hyperparams)
# Lists to log train/test loss and accuracy.
train_losses = []
valid_losses = []
test_losses = []
train_accuracies = []
valid_accuracies = []
test_accuracies = []
minibatch_train_losses = []
print(type(tproblem.data))
global_step = 0
net = tproblem.net
criterion = tproblem.loss_function()
device = tproblem._device
data = tproblem.data
if tb_log:
try:
from torch.utils.tensorboard import SummaryWriter
summary_writer = SummaryWriter(log_dir=tb_log_dir)
except ImportError as e:
warnings.warn('Not possible to use tensorboard for pytorch. Reason: ' + e.msg, RuntimeWarning)
tb_log = False
def valid(epoch_):
tproblem.valid_init_op()
valid_loss = 0
train_loss = 0
correct = 0
total = 0
with torch.no_grad():
for batch_idx, (inputs, targets) in enumerate(data._train_eval_dataloader):
inputs, targets = inputs.to(device), targets.to(device)
if batch_idx == 0 and epoch_ == 0:
intitial_loss = criterion(net(inputs), targets)
print("Initial Loss ", intitial_loss, "batch ID: ", batch_idx)
#inputs, targets = inputs.to(device), targets.to(device)
outputs = net(inputs)
loss = criterion(outputs, targets)
train_loss += loss.item()
_, predicted = outputs.max(1)
total += targets.size(0)
correct += predicted.eq(targets).sum().item()
acc = correct / total
train_losses.append(train_loss / (batch_idx+1))
valid_accuracies.append(acc)
print("********************************")
print("Evaluating after {0:d} of {1:d} epochs...".format(epoch_, num_epochs))
print("TRAIN:" + "loss {0:g}, acc {1:f}".format((train_loss / (batch_idx+1)), acc))
for batch_idx, (inputs, targets) in enumerate(data._valid_dataloader):
inputs, targets = inputs.to(device), targets.to(device)
outputs = net(inputs)
loss = criterion(outputs, targets)
valid_loss += loss.item()
_, predicted = outputs.max(1)
total += targets.size(0)
correct += predicted.eq(targets).sum().item()
acc = correct / total
valid_losses.append(valid_loss / (batch_idx+1))
valid_accuracies.append(acc)
print("VALID:" + "loss {0:g}, acc {1:f}".format((valid_loss / (batch_idx+1)), acc))
def test(epoch_):
global best_acc
net.eval()
test_loss = 0
correct = 0
total = 0
with torch.no_grad():
for batch_idx, (inputs, targets) in enumerate(data._test_dataloader):
inputs, targets = inputs.to(device), targets.to(device)
outputs = net(inputs)
loss = criterion(outputs, targets)
test_loss += loss.item()
_, predicted = outputs.max(1)
total += targets.size(0)
correct += predicted.eq(targets).sum().item()
acc = correct / total
test_accuracies.append(acc)
test_losses.append(test_loss/(batch_idx+1))
print("TEST:" + "loss {0:g}, acc {1:f}".format(test_loss/(batch_idx+1), acc))
print("********************************")
def train(epoch_):
batch_size = 0
tproblem.train_init_op()
train_loss = 0
correct = 0
total = 0
for batch_idx, (inputs, targets) in enumerate(data._train_dataloader):
inputs, targets = inputs.to(device), targets.to(device)
opt.zero_grad()
batch_size +=1
# print("batch ID: ", batch_idx)
# print(inputs.size())
# if batch_idx == 0 and epoch_ == 0:
# intitial_loss = criterion(net(inputs), targets)
# print("Initial Loss ", intitial_loss, "batch ID: ", batch_idx)
def loss_fn(backward=True):
out_ = net(inputs)
# print(batch_idx, inputs[0])
loss_ = criterion(out_, targets)
# print(out_[127], targets)
# if Vonwo != '':
# print(Vonwo, " ", batch_idx, " ", loss_)
# print(targets, out_.max(1))
if backward:
loss_.backward()
return loss_, out_, batch_idx
loss, outputs = opt.step(loss_fn)
# train_loss += loss.item()
# _, predicted = outputs.max(1)
# total += targets.size(0)
# correct += predicted.eq(targets).sum().item()
# acc = 100. * correct / total
# train_losses.append(train_loss/batch_size)
# train_accuracies.append(acc)
# cur_time = int((time.time() - time_start))
# logger.debug('train time: {:4.2f} min'.format(cur_time / 60))
# logger.info(formatted_str('TRAIN:', epoch_, train_loss / (batch_idx + 1), correct / total))
#
# for s, t in [('time', cur_time), ('epoch', epoch_)]:
# writer.add_scalar('train-%s/accuracy' % s, correct / total, t)
# writer.add_scalar('train-%s/train_loss' % s, train_loss / (batch_idx + 1), t)
minibatch_train_losses=[]
if tb_log:
summary_writer.close()
# Put results into output dictionary.
for epoch_count in range(num_epochs + 1):
valid(epoch_count)
test(epoch_count)
train(epoch_count)
if epoch_count == num_epochs:
print("ENDE")
break
output = {
"train_losses": train_losses,
'valid_losses': valid_losses,
"test_losses": test_losses,
"minibatch_train_losses": minibatch_train_losses,
"train_accuracies": train_accuracies,
'valid_accuracies': valid_accuracies,
"test_accuracies": test_accuracies
}
return output
| 39.060338 | 119 | 0.542189 | 5,455 | 53,083 | 5.04253 | 0.060495 | 0.026793 | 0.023121 | 0.020795 | 0.943287 | 0.936271 | 0.932744 | 0.928346 | 0.921693 | 0.910641 | 0 | 0.009295 | 0.37577 | 53,083 | 1,358 | 120 | 39.089102 | 0.820829 | 0.247217 | 0 | 0.889549 | 0 | 0.003563 | 0.099775 | 0.012914 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038005 | false | 0.007126 | 0.029691 | 0.001188 | 0.095012 | 0.059382 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
2b61a2c2d554d1c9bbc5219806b412cbf819770d | 109,898 | py | Python | skidl/libs/microchip_pic18mcu_sklib.py | arjenroodselaar/skidl | 0bf801bd3b74e6ef94bd9aa1b68eef756b568276 | [
"MIT"
] | 700 | 2016-08-16T21:12:50.000Z | 2021-10-10T02:15:18.000Z | skidl/libs/microchip_pic18mcu_sklib.py | 0dvictor/skidl | 458709a10b28a864d25ae2c2b44c6103d4ddb291 | [
"MIT"
] | 118 | 2016-08-16T20:51:05.000Z | 2021-10-10T08:07:18.000Z | skidl/libs/microchip_pic18mcu_sklib.py | 0dvictor/skidl | 458709a10b28a864d25ae2c2b44c6103d4ddb291 | [
"MIT"
] | 94 | 2016-08-25T14:02:28.000Z | 2021-09-12T05:17:08.000Z | from skidl import SKIDL, TEMPLATE, Part, Pin, SchLib
SKIDL_lib_version = '0.0.1'
microchip_pic18mcu = SchLib(tool=SKIDL).add_parts(*[
Part(name='PIC18F13K50-E/P',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8bit CMOS Microcontroller XLP',description='16K Flash, 768 SRAM, 256 EEPROM, USB, nanoWatt XLP, DIP20',ref_prefix='U',num_units=1,fplist=['DIP*W7.62mm*', 'PDIP*W7.62mm*'],do_erc=True,aliases=['PIC18F14K50-E/P', 'PIC18LF14K50-E/P', 'PIC18LF13K50-E/P'],pins=[
Pin(num='1',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='2',name='RA5/IOCA5/OSC1/CLKIN',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA4/IOCA3/AN3/OSC2/CLKOUT',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA3/IOCA3/~MCLR~/Vpp',do_erc=True),
Pin(num='5',name='T0CKI/CCP1/P1A/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='SRQ/C12OUT/P1B/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='PGM/C12IN3-/P1C/AN7/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='T1OSCI/T13CKI/SS/AN8/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='T1OSCO/SDO/AN9/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='RB7/IOCB7/TX/CK',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='11',name='RB6/IOCB6/SCK/SCL',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='RB5/IOCB5/AN11/RX/DT',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='RB4/IOCB4/AN10/SDI/SDA',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='INT2/CVREF/C12IN2-/P1D/AN6/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='INT1/C12IN1-/VREF-AN5/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='INT0/C12IN+/VREF+/AN4/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='VUSB',func=Pin.PASSIVE,do_erc=True),
Pin(num='18',name='RA1/IOCA1/D-/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='RA0/IOCA0/D+/PGD',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F13K50-E/SO',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8bit CMOS Microcontroller XLP',description='16K Flash, 768 SRAM, 256 EEPROM, USB, nanoWatt XLP, SOIC20',ref_prefix='U',num_units=1,fplist=['SOIC*W*7.5x12.8mm*Pitch1.27mm*'],do_erc=True,aliases=['PIC18LF13K50-E/SO', 'PIC18LF14K50-E/SO', 'PIC18F14K50-E/SO'],pins=[
Pin(num='1',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='2',name='RA5/IOCA5/OSC1/CLKIN',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA4/IOCA3/AN3/OSC2/CLKOUT',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA3/IOCA3/~MCLR~/Vpp',do_erc=True),
Pin(num='5',name='T0CKI/CCP1/P1A/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='SRQ/C12OUT/P1B/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='PGM/C12IN3-/P1C/AN7/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='T1OSCI/T13CKI/SS/AN8/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='T1OSCO/SDO/AN9/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='RB7/IOCB7/TX/CK',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='11',name='RB6/IOCB6/SCK/SCL',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='RB5/IOCB5/AN11/RX/DT',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='RB4/IOCB4/AN10/SDI/SDA',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='INT2/CVREF/C12IN2-/P1D/AN6/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='INT1/C12IN1-/VREF-AN5/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='INT0/C12IN+/VREF+/AN4/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='VUSB',func=Pin.PASSIVE,do_erc=True),
Pin(num='18',name='RA1/IOCA1/D-/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='RA0/IOCA0/D+/PGD',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F13K50-E/SS',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8bit CMOS Microcontroller XLP',description='16K Flash, 768 SRAM, 256 EEPROM, USB, nanoWatt XLP, SSOP20',ref_prefix='U',num_units=1,fplist=['SSOP*5.3x7.2mm*'],do_erc=True,aliases=['PIC18LF13K50-E/SS', 'PIC18LF14K50-E/SS', 'PIC18F14K50-E/SS'],pins=[
Pin(num='1',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='2',name='RA5/IOCA5/OSC1/CLKIN',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA4/IOCA3/AN3/OSC2/CLKOUT',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA3/IOCA3/~MCLR~/Vpp',do_erc=True),
Pin(num='5',name='T0CKI/CCP1/P1A/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='SRQ/C12OUT/P1B/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='PGM/C12IN3-/P1C/AN7/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='T1OSCI/T13CKI/SS/AN8/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='T1OSCO/SDO/AN9/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='RB7/IOCB7/TX/CK',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='11',name='RB6/IOCB6/SCK/SCL',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='RB5/IOCB5/AN11/RX/DT',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='RB4/IOCB4/AN10/SDI/SDA',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='INT2/CVRef/C12IN2-/P1D/AN6/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='INT1/C12IN1-/VREF-AN5/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='INT0/C12IN+/VREF+/AN4/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='VUSB',func=Pin.PASSIVE,do_erc=True),
Pin(num='18',name='RA1/IOCA1/D-/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='RA0/IOCA0/D+/PGD',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F2331-I/ML',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='16K Flash, 768B SRAM, 256B EEPROM, nanoWatt XLP, ADC, PWM, QFN28',ref_prefix='U',num_units=1,fplist=['QFN*1EP*6x6mm*Pitch0.65mm*'],do_erc=True,aliases=['PIC18LF2331-I/ML', 'PIC18LF2431-I/ML', 'PIC18F2431-I/ML'],pins=[
Pin(num='1',name='RA2/AN2/Vref-/CAP1/INDX',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='RA3/AN3/Vref+/CAP2/QEA',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA4/AN4/CAP3/QEB',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='AVDD',func=Pin.PWRIN,do_erc=True),
Pin(num='5',name='AVSS',func=Pin.PWRIN,do_erc=True),
Pin(num='6',name='RA7/OSC1/CLKI',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='RA6/OSC2/CLKO',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='T1OSO/T1CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='~FLTA~/T1OSI/CCP2/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='~FLTB~/CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='RB2/PWM2',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='INT0/T5CKI/T0CKI/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RB3/PWM3',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='INT1/SDI/SDA/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RB4/KBI0/PWM5',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='INT2/SCK/SCL/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='RB5/KBI1/PWM4/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='TX/~SS~/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='RX/SDO/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='26',name='~MCLR~/Vpp',do_erc=True),
Pin(num='17',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='27',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='RB0/PWM0',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='RB1/PWM1',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F2331-I/SO',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='16K Flash, 768B SRAM, 256B EEPROM, nanoWatt XLP, ADC, PWM, SOIC28',ref_prefix='U',num_units=1,fplist=['SO*W*7.5x17.9mm*Pitch1.27mm*'],do_erc=True,aliases=['PIC18LF2331-I/SO', 'PIC18LF2431-I/SO', 'PIC18F2431-I/SO'],pins=[
Pin(num='1',name='~MCLR~/Vpp',do_erc=True),
Pin(num='2',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA2/AN2/Vref-/CAP1/INDX',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RA3/AN3/Vref+/CAP2/QEA',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='RA4/AN4/CAP3/QEB',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='AVDD',func=Pin.PWRIN,do_erc=True),
Pin(num='8',name='AVSS',func=Pin.PWRIN,do_erc=True),
Pin(num='9',name='RA7/OSC1/CLKI',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='RA6/OSC2/CLKO',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='11',name='T1OSO/T1CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RB0/PWM0',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='~FLTA~/T1OSI/CCP2/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RB1/PWM1',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='~FLTB~/CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='RB2/PWM2',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='INT0/T5CKI/T0CKI/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RB3/PWM3',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='INT1/SDI/SDA/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='RB4/KBI0/PWM5',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='INT2/SCK/SCL/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='RB5/KBI1/PWM4/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='TX/~SS~/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='RX/SDO/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='VSS',func=Pin.PWRIN,do_erc=True)]),
Part(name='PIC18F2331-I/SP',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='16K Flash, 768B SRAM, 256B EEPROM, nanoWatt XLP, ADC, PWM, SPDIP28',ref_prefix='U',num_units=1,fplist=['DIP*W7.62mm*', 'PDIP*W7.62mm*'],do_erc=True,aliases=['PIC18LF2331-I/SP', 'PIC18LF2431-I/SP', 'PIC18F2431-I/SP'],pins=[
Pin(num='1',name='~MCLR~/Vpp',do_erc=True),
Pin(num='2',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA2/AN2/Vref-/CAP1/INDX',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RA3/AN3/Vref+/CAP2/QEA',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='RA4/AN4/CAP3/QEB',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='AVDD',func=Pin.PWRIN,do_erc=True),
Pin(num='8',name='AVSS',func=Pin.PWRIN,do_erc=True),
Pin(num='9',name='RA7/OSC1/CLKI',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='RA6/OSC2/CLKO',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='11',name='T1OSO/T1CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RB0/PWM0',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='~FLTA~/T1OSI/CCP2/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RB1/PWM1',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='~FLTB~/CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='RB2/PWM2',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='INT0/T5CKI/T0CKI/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RB3/PWM3',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='INT1/SDI/SDA/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='RB4/KBI0/PWM5',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='INT2/SCK/SCL/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='RB5/KBI1/PWM4/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='TX/~SS~/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='RX/SDO/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='VSS',func=Pin.PWRIN,do_erc=True)]),
Part(name='PIC18F2450-I/ML',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='16K Flash, 8K SRAM, USB, nanoWatt XLP, QFN28',ref_prefix='U',num_units=1,fplist=['QFN*1EP*6x6mm*Pitch0.65mm*'],do_erc=True,aliases=['PIC18LF2450-I/ML'],pins=[
Pin(num='1',name='RA2/AN2/Vref-',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='RA3/AN3/Vref+',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA4/T0CKI/RCV',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA5/AN4/HLVDIN',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='6',name='OSC1/CLKI',do_erc=True),
Pin(num='7',name='RA6/OSC2/CLKO',func=Pin.OUTPUT,do_erc=True),
Pin(num='8',name='T1OSO/T1CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='~UOE~/T1OSI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='RB2/AN8/INT2/VMO',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='VUSB',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RB3/AN9/VPO',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='VM/D-/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RB4/AN11/KBI0',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='VP/D+/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='RB5/KBI1/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='TX/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='RX/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='26',name='Vpp/~MCLR~/RE3',do_erc=True),
Pin(num='17',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='27',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='RB0/AN12/INT0',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='RB1/AN10/INT1',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F2450-I/SO',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='16K Flash, 8K SRAM, USB, nanoWatt XLP, SOIC28',ref_prefix='U',num_units=1,fplist=['SOIC*W*7.5x12.8mm*Pitch1.27mm*'],do_erc=True,aliases=['PIC18LF2450-I/SO'],pins=[
Pin(num='1',name='Vpp/~MCLR~/RE3',do_erc=True),
Pin(num='2',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA2/AN2/Vref-',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RA3/AN3/Vref+',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='RA4/T0CKI/RCV',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='RA5/AN4/HLVDIN',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='9',name='OSC1/CLKI',do_erc=True),
Pin(num='10',name='RA6/OSC2/CLKO',func=Pin.OUTPUT,do_erc=True),
Pin(num='20',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='11',name='T1OSO/T1CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RB0/AN12/INT0',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='~UOE~/T1OSI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RB1/AN10/INT1',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='RB2/AN8/INT2/VMO',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='VUSB',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RB3/AN9/VPO',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='VM/D-/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='RB4/AN11/KBI0',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='VP/D+/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='RB5/KBI1/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='TX/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='RX/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='VSS',func=Pin.PWRIN,do_erc=True)]),
Part(name='PIC18F2450-I/SP',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='16K Flash, 8K SRAM, USB, nanoWatt XLP, SPDIP28',ref_prefix='U',num_units=1,fplist=['SPDIP*W7.62mm*', 'DIP*W7.62mm*', 'PDIP*W7.62mm*'],do_erc=True,aliases=['PIC18LF2450-I/SP'],pins=[
Pin(num='1',name='Vpp/~MCLR~/RE3',do_erc=True),
Pin(num='2',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA2/AN2/Vref-',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RA3/AN3/Vref+',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='RA4/T0CKI/RCV',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='RA5/AN4/HLVDIN',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='9',name='OSC1/CLKI',do_erc=True),
Pin(num='10',name='RA6/OSC2/CLKO',func=Pin.OUTPUT,do_erc=True),
Pin(num='20',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='11',name='T1OSO/T1CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RB0/AN12/INT0',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='~UOE~/T1OSI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RB1/AN10/INT1',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='RB2/AN8/INT2/VMO',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='VUSB',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RB3/AN9/VPO',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='VM/D-/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='RB4/AN11/KBI0',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='VP/D+/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='RB5/KBI1/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='TX/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='RX/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='VSS',func=Pin.PWRIN,do_erc=True)]),
Part(name='PIC18F2455-I/SO',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='32K Flash, 2K SRAM, 256 EEPROM, USB, nanoWatt XLP, SOIC28',ref_prefix='U',num_units=1,fplist=['SO*W*7.5x17.9mm_Pitch1.27mm*'],do_erc=True,aliases=['PIC18LF2455-I/SO', 'PIC18LF2550-I/SO', 'PIC18F2550-I/SO'],pins=[
Pin(num='1',name='Vpp/~MCLR~/RE3',do_erc=True),
Pin(num='2',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA2/AN2/Vref-/CVref',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RA3/AN3/Vref+',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='RA4/T0CKI/C1OUT/RCV',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='RA5/AN4/~SS~/HLVDIN/C2OUT',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='9',name='OSC1/CLKI',do_erc=True),
Pin(num='10',name='RA6/OSC2/CLKO',func=Pin.OUTPUT,do_erc=True),
Pin(num='20',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='11',name='T1OSO/T13CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RB0/AN12/INT0/FLT0/SDI/SDA',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='~UOE~/CCP2/T1OSI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RB1/AN10/INT1/SCK/SCL',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='RB2/AN8/INT2/VMO',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='VUSB',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RB3/AN9/CCP2/VPO',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='VM/D-/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='RB4/AN11/KBI0',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='VP/D+/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='RB5/KBI1/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='TX/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='SDO/RX/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='VSS',func=Pin.PWRIN,do_erc=True)]),
Part(name='PIC18F2455-I/SP',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='32K Flash, 2K SRAM, 256 EEPROM, USB, nanoWatt XLP, SPDIP28',ref_prefix='U',num_units=1,fplist=['SPDIP*28_W7.62mm*', 'DIP*28_W7.62mm*', 'PDIP*28_W7.62mm*'],do_erc=True,aliases=['PIC18LF2455-I/SP', 'PIC18LF2550-I/SP', 'PIC18F2550-I/SP'],pins=[
Pin(num='1',name='Vpp/~MCLR~/RE3',do_erc=True),
Pin(num='2',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA2/AN2/Vref-/CVref',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RA3/AN3/Vref+',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='RA4/T0CKI/C1OUT/RCV',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='RA5/AN4/~SS~/HLVDIN/C2OUT',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='9',name='OSC1/CLKI',do_erc=True),
Pin(num='10',name='RA6/OSC2/CLKO',func=Pin.OUTPUT,do_erc=True),
Pin(num='20',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='11',name='T1OSO/T13CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RB0/AN12/INT0/FLT0/SDI/SDA',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='~UOE~/CCP2/T1OSI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RB1/AN10/INT1/SCK/SCL',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='RB2/AN8/INT2/VMO',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='VUSB',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RB3/AN9/CCP2/VPO',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='VM/D-/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='RB4/AN11/KBI0',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='VP/D+/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='RB5/KBI1/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='TX/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='SDO/RX/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='VSS',func=Pin.PWRIN,do_erc=True)]),
Part(name='PIC18F4331-I/ML',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='16K Flash, 768B SRAM, 256B EEPROM, nanoWatt XLP, ADC, PWM, QFN44',ref_prefix='U',num_units=1,fplist=['QFN*1EP*8x8mm*Pitch0.65mm*'],do_erc=True,aliases=['PIC18LF4331-I/ML', 'PIC18LF4431-I/ML', 'PIC18F4431-I/ML'],pins=[
Pin(num='1',name='RX/SDO/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='~FLTA~/RD4',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='PWM4/RD5',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='PWM6/RD6',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='PWM7/RD7',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='7',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='8',name='AVDD',func=Pin.PWRIN,do_erc=True),
Pin(num='9',name='RB0/PWM0',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='RB1/PWM1',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='AVSS',func=Pin.PWRIN,do_erc=True),
Pin(num='13',name='NC',func=Pin.NOCONNECT,do_erc=True),
Pin(num='40',name='SDA/SDI/RD2',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='RB2/PWM2',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RA2/AN2/Vref-/CAP1/INDX',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='41',name='SCL/SCK/RD3',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='RB3/PWM3',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RA3/AN3/Vref+/CAP2/QEA',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='RA7/OSC1/CLKI',func=Pin.BIDIR,do_erc=True),
Pin(num='42',name='INT1/SDI/SDA/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='RA4/AN4/CAP3/QEB',func=Pin.BIDIR,do_erc=True),
Pin(num='33',name='RA6/OSC2/CLKO',func=Pin.BIDIR,do_erc=True),
Pin(num='43',name='INT2/SCK/SCL/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='RB4/KBI0/PWM5',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RA5/AN5/LVDIN',func=Pin.BIDIR,do_erc=True),
Pin(num='34',name='T1OSO/T1CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='44',name='TX/~SS~/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='RB5/KBI1/PWM4/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='AN6/RE0',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='~FLTA~/T1OSI/CCP2/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='AN7/RE1',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='~FLTB~/CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='AN8/RE2',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='INT0/T5CKI/T0CKI/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='Vpp/~MCLR~/RE3',do_erc=True),
Pin(num='28',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='38',name='T5CKI/T0CKI/RD0',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='AVDD',func=Pin.PWRIN,do_erc=True),
Pin(num='39',name='SDO/RD1',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F4331-I/P',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='16K Flash, 768B SRAM, 256B EEPROM, nanoWatt XLP, ADC, PWM, DIP40',ref_prefix='U',num_units=1,fplist=['DIP*W15.24*', 'PDIP*W15.24*'],do_erc=True,aliases=['PIC18LF4331-I/P', 'PIC18LF4431-I/P', 'PIC18F4431-I/P'],pins=[
Pin(num='1',name='Vpp/~MCLR~/RE3',do_erc=True),
Pin(num='2',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA2/AN2/Vref-/CAP1/INDX',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RA3/AN3/Vref+/CAP2/QEA',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='RA4/AN4/CAP3/QEB',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='RA5/AN5/LVDIN',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='AN6/RE0',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='AN7/RE1',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='AN8/RE2',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='SDO/RD1',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='PWM7/RD7',func=Pin.BIDIR,do_erc=True),
Pin(num='40',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='AVDD',func=Pin.PWRIN,do_erc=True),
Pin(num='21',name='SDA/SDI/RD2',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='12',name='AVSS',func=Pin.PWRIN,do_erc=True),
Pin(num='22',name='SCL/SCK/RD3',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='13',name='RA7/OSC1/CLKI',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='INT1/SDI/SDA/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='33',name='RB0/PWM0',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='RA6/OSC2/CLKO',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='INT2/SCK/SCL/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='34',name='RB1/PWM1',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='T1OSO/T1CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='TX/~SS~/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='RB2/PWM2',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='~FLTA~/T1OSI/CCP2/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='RX/SDO/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='RB3/PWM3',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='~FLTB~/CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='~FLTA~/RD4',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='RB4/KBI0/PWM5',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='INT0/T5CKI/T0CKI/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='PWM4/RD5',func=Pin.BIDIR,do_erc=True),
Pin(num='38',name='RB5/KBI1/PWM4/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='T5CKI/T0CKI/RD0',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='PWM6/RD6',func=Pin.BIDIR,do_erc=True),
Pin(num='39',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F4331-I/PT',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='16K Flash, 768B SRAM, 256B EEPROM, nanoWatt XLP, ADC, PWM, TQFP44',ref_prefix='U',num_units=1,fplist=['TQFP*10x10mm*Pitch0.8mm*'],do_erc=True,aliases=['PIC18LF4331-I/PT', 'PIC18LF4431-I/PT', 'PIC18F4431-I/PT'],pins=[
Pin(num='1',name='RX/SDO/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='~FLTA~/RD4',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='PWM4/RD5',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='PWM6/RD6',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='PWM7/RD7',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='7',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='8',name='RB0/PWM0',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='RB1/PWM1',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='RB2/PWM2',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='RA7/OSC1/CLKI',func=Pin.BIDIR,do_erc=True),
Pin(num='40',name='SDA/SDI/RD2',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='RB3/PWM3',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RA2/AN2/Vref-/CAP1/INDX',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='RA6/OSC2/CLKO',func=Pin.BIDIR,do_erc=True),
Pin(num='41',name='SCL/SCK/RD3',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RA3/AN3/Vref+/CAP2/QEA',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='T1OSO/T1CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='42',name='INT1/SDI/SDA/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='RA4/AN4/CAP3/QEB',func=Pin.BIDIR,do_erc=True),
Pin(num='43',name='INT2/SCK/SCL/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='RB4/KBI0/PWM5',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RA5/AN5/LVDIN',func=Pin.BIDIR,do_erc=True),
Pin(num='44',name='TX/~SS~/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='RB5/KBI1/PWM4/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='AN6/RE0',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='~FLTA~/T1OSI/CCP2/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='AN7/RE1',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='~FLTB~/CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='AN8/RE2',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='INT0/T5CKI/T0CKI/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='Vpp/~MCLR~/RE3',do_erc=True),
Pin(num='28',name='AVDD',func=Pin.PWRIN,do_erc=True),
Pin(num='38',name='T5CKI/T0CKI/RD0',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='AVSS',func=Pin.PWRIN,do_erc=True),
Pin(num='12',name='NC',func=Pin.NOCONNECT,do_erc=True),
Pin(num='13',name='NC',func=Pin.NOCONNECT,do_erc=True),
Pin(num='33',name='NC',func=Pin.NOCONNECT,do_erc=True),
Pin(num='34',name='NC',func=Pin.NOCONNECT,do_erc=True),
Pin(num='39',name='SDO/RD1',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F442-I/P',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller',description='32K Flash, 1536B SRAM, 256 EEPROM, ADC, DIP40',ref_prefix='U',num_units=1,fplist=['DIP*W15.24mm*', 'PDIP*W15.24mm*'],do_erc=True,aliases=['PIC18LF442-I/P', 'PIC18LF452-I/P', 'PIC18F452-I/P'],pins=[
Pin(num='1',name='~MCLR~/VPP',do_erc=True),
Pin(num='2',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA2/AN2/Vref-',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RA3/AN3/Vref+',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='RA4/TOCKI',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='RA5/AN4/~SS~/LVDIN',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='~RD~/AN5/RE0',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='~WR~/AN6/RE1',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='~CS~/AN7/RE2',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='PSP1/RD1',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='PSP7/RD7',func=Pin.BIDIR,do_erc=True),
Pin(num='40',name='RB7/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='21',name='PSP2/RD2',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='12',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='22',name='PSP3/RD3',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='13',name='OSC1/CLKIN',do_erc=True),
Pin(num='23',name='SDI/SDA/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='33',name='RB0/INT0',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='RA6/OSC2/CLKO',func=Pin.OUTPUT,do_erc=True),
Pin(num='24',name='SDO/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='34',name='RB1/INT1',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='T10S0/T1CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='TX/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='RB2/INT2',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='CCP2/T1OSI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='RX/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='RB3/CCP2',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='PSP4/RD4',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='RB4',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='SCK/SCL/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='PSP5/RD5',func=Pin.BIDIR,do_erc=True),
Pin(num='38',name='RB5/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='PSP0/RD0',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='PSP6/RD6',func=Pin.BIDIR,do_erc=True),
Pin(num='39',name='RB6/PGC',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F442-I/PT',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller',description='32K Flash, 1536B SRAM, 256 EEPROM, ADC, TQFP-44',ref_prefix='U',num_units=1,fplist=['TQFP*10x10mm*Pitch0.8mm*'],do_erc=True,aliases=['PIC18LF442-I/PT', 'PIC18LF452-I/PT', 'PIC18F452-I/PT'],pins=[
Pin(num='1',name='DT/RX/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='PSP4/RD4',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='PSP5/RD5',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='PSP6/RD6',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='PSP7/RD7',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='12',name='NC',func=Pin.NOCONNECT,do_erc=True),
Pin(num='13',name='NC',func=Pin.NOCONNECT,do_erc=True),
Pin(num='33',name='NC',func=Pin.NOCONNECT,do_erc=True),
Pin(num='34',name='NC',func=Pin.NOCONNECT,do_erc=True),
Pin(num='7',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='8',name='RB0/INT0',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='RB1/INT1',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='RB2/INT2',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='OSC1/CLKI',do_erc=True),
Pin(num='40',name='PSP2/RD2',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='RB3/CCP2',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RA2/AN2/Vref-',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='RA6/OSC2/CLKO',func=Pin.BIDIR,do_erc=True),
Pin(num='41',name='PSP3/RD3',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RA3/AN3/Vref+',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='T1OSO/T1CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='42',name='SDI/SDA/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='RA4/T0CKI',func=Pin.BIDIR,do_erc=True),
Pin(num='43',name='SDO/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='RB4',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RA5/AN4/~SS~/LVDin',func=Pin.BIDIR,do_erc=True),
Pin(num='44',name='CK/TX/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='RB5/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='~RD~/AN5/RE0',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='T1OSI/CCP2/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='RB6/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='~WR~/AN6/RE1',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='RB7/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='~CS~/AN7/RE2',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='SCK/SCL/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='~MCLR~/Vpp',do_erc=True),
Pin(num='28',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='38',name='PSP0/RD0',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='39',name='PSP1/RD1',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F4450-I/ML',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='16K Flash, 8K SRAM, USB, nanoWatt XLP, QFN44',ref_prefix='U',num_units=1,fplist=['QFN*1EP*8x8mm*Pitch0.65mm*'],do_erc=True,aliases=['PIC18LF4450-I/ML'],pins=[
Pin(num='1',name='RX/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='RD4',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RD5',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RD6',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RD7',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='7',name='AVDD',func=Pin.PWRIN,do_erc=True),
Pin(num='8',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='9',name='RB0/AN12/INT0',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='RB1/AN10/INT1',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='AVSS',func=Pin.PWRIN,do_erc=True),
Pin(num='13',name='NC',func=Pin.NOCONNECT,do_erc=True),
Pin(num='40',name='RD2',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='RB2/AN8/INT2/VMO',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RA2/AN2/Vref-',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='41',name='RD3',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='RB3/AN9/VPO',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RA3/AN3/Vref+',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='OSC1/CLKI',do_erc=True),
Pin(num='42',name='VM/D-/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='RA4/T0CKI/RCV',func=Pin.BIDIR,do_erc=True),
Pin(num='33',name='RA6/OSC2/CLKO',func=Pin.OUTPUT,do_erc=True),
Pin(num='43',name='VP/D+/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='RB4/AN11/KBI0',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RA5/AN4/HLVDIN',func=Pin.BIDIR,do_erc=True),
Pin(num='34',name='T1OSO/T1CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='44',name='TX/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='RB5/KBI1/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='AN5/RE0',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='~UOE~/T1OSI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='AN6/RE1',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='AN7/RE2',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='VUSB',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='Vpp/~MCLR~/RE3',do_erc=True),
Pin(num='28',name='AVDD',func=Pin.PWRIN,do_erc=True),
Pin(num='38',name='RD0',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='39',name='RD1',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F4450-I/P',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='16K Flash, 8K SRAM, USB, nanoWatt XLP, PDIP40',ref_prefix='U',num_units=1,fplist=['DIP*W15.24mm*', 'PDIP*W15.24mm*'],do_erc=True,aliases=['PIC18LF4450-I/P'],pins=[
Pin(num='1',name='Vpp/~MCLR~/RE3',do_erc=True),
Pin(num='2',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA2/AN2/Vref-',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RA3/AN3/Vref+',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='RA4/T0CKI/RCV',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='RA5/AN4/HLVDIN',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='AN5/RE0',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='AN6/RE1',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='AN7/RE2',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='RD1',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='RD7',func=Pin.BIDIR,do_erc=True),
Pin(num='40',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='21',name='RD2',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='12',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='22',name='RD3',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='13',name='OSC1/CLKI',do_erc=True),
Pin(num='23',name='VM/D-/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='33',name='RB0/AN12/INT0',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='RA6/OSC2/CLKO',func=Pin.OUTPUT,do_erc=True),
Pin(num='24',name='VP/D+/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='34',name='RB1/AN10/INT1',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='T1OSO/T1CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='TX/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='RB2/AN8/INT2/VMO',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='~UOE~/T1OSI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='RX/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='RB3/AN9/VPO',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='RD4',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='RB4/AN11/KBI0',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='VUSB',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='RD5',func=Pin.BIDIR,do_erc=True),
Pin(num='38',name='RB5/KBI1/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='RD0',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='RD6',func=Pin.BIDIR,do_erc=True),
Pin(num='39',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F4450-I/PT',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='16K Flash, 8K SRAM, USB, nanoWatt XLP, TQFP44',ref_prefix='U',num_units=1,fplist=['TQFP*10x10mm*Pitch0.8mm*'],do_erc=True,aliases=['PIC18LF4450-I/PT'],pins=[
Pin(num='1',name='RX/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='RD4',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RD5',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RD6',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RD7',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='7',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='8',name='RB0/AN12/INT0',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='RB1/AN10/INT1',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='RB2/AN8/INT2/VMO',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='OSC1/CLKI',do_erc=True),
Pin(num='40',name='RD2',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='RB3/AN9/VPO',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RA2/AN2/Vref-',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='RA6/OSC2/CLKO',func=Pin.OUTPUT,do_erc=True),
Pin(num='41',name='RD3',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='(ICCK/ICPGC)',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RA3/AN3/Vref+',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='T1OSO/T1CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='42',name='VM/D-/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='(ICDT/ICPGD)',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='RA4/T0CKI/RCV',func=Pin.BIDIR,do_erc=True),
Pin(num='33',name='(~ICRST~/ICVpp)',do_erc=True),
Pin(num='43',name='VP/D+/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='RB4/AN11/KBI0',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RA5/AN4/HLVDIN',func=Pin.BIDIR,do_erc=True),
Pin(num='34',name='(ICPORTS)',func=Pin.BIDIR,do_erc=True),
Pin(num='44',name='TX/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='RB5/KBI1/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='AN5/RE0',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='~UOE~/T1OSI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='AN6/RE1',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='AN7/RE2',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='VUSB',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='Vpp/~MCLR~/RE3',do_erc=True),
Pin(num='28',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='38',name='RD0',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='39',name='RD1',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F4455-I/ML',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='32K Flash, 2K SRAM, 256 EEPROM, USB, nanoWatt XLP, QFN44',ref_prefix='U',num_units=1,fplist=['QFN*1EP*8x8mm*Pitch0.65mm*'],do_erc=True,aliases=['PIC18LF4455-I/ML', 'PIC18LF4550-I/ML', 'PIC18F4550-I/ML'],pins=[
Pin(num='1',name='SDO/RX/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='SPP4/RD4',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='P1B/SPP5/RD5',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='P1C/SPP6/RD6',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='P1D/SPP7/RD7',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='NC',func=Pin.NOCONNECT,do_erc=True),
Pin(num='6',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='7',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='8',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='9',name='RB0/AN12/INT0/FLT0/SDI/SDA',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='RB1/AN10/INT1/SCK/SCL',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='40',name='SPP2/RD2',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='RB2/AN8/INT2/VMO',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RA2/AN2/Vref-/CVref',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='41',name='SPP3/RD3',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='RB3/AN9/CCP2/VPO',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RA3/AN3/Vref+',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='OSC1/CLKI',do_erc=True),
Pin(num='42',name='VM/D-/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='RA4/T0CKI/C1OUT/RCV',func=Pin.BIDIR,do_erc=True),
Pin(num='33',name='RA6/OSC2/CLKO',func=Pin.OUTPUT,do_erc=True),
Pin(num='43',name='VP/D+/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='RB4/AN11/KBI0/CSSPP',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RA5/AN4/~SS~/HLVDIN/C2OUT',func=Pin.BIDIR,do_erc=True),
Pin(num='34',name='T1OSO/T13CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='44',name='TX/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='RB5/KBI1/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='CK1SPP/AN5/RE0',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='~UOE~/CCP2/T1OSI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='CK2SPP/AN6/RE1',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='P1A/CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='OESPP/AN7/RE2',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='VUSB',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='Vpp/~MCLR~/RE3',do_erc=True),
Pin(num='28',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='38',name='SPP0/RD0',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='39',name='SPP1/RD1',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F4455-I/P',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='32K Flash, 2K SRAM, 256 EEPROM, USB, nanoWatt XLP, DIP40',ref_prefix='U',num_units=1,fplist=['DIP*W15.24mm*', 'PDIP*W15.24mm*'],do_erc=True,aliases=['PIC18LF4455-I/P', 'PIC18LF4550-I/P', 'PIC18F4550-I/P'],pins=[
Pin(num='1',name='Vpp/~MCLR~/RE3',do_erc=True),
Pin(num='2',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA2/AN2/Vref-/CVref',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RA3/AN3/Vref+',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='RA4/T0CKI/C1OUT/RCV',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='RA5/AN4/~SS~/HLVDIN/C2OUT',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='CK1SPP/AN5/RE0',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='CK2SPP/AN6/RE1',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='OESPP/AN7/RE2',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='SPP1/RD1',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='P1D/SPP7/RD7',func=Pin.BIDIR,do_erc=True),
Pin(num='40',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='21',name='SPP2/RD2',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='12',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='22',name='SPP3/RD3',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='13',name='OSC1/CLKI',do_erc=True),
Pin(num='23',name='VM/D-/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='33',name='RB0/AN12/INT0/FLT0/SDI/SDA',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='RA6/OSC2/CLKO',func=Pin.OUTPUT,do_erc=True),
Pin(num='24',name='VP/D+/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='34',name='RB1/AN10/INT1/SCK/SCL',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='T1OSO/T13CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='TX/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='RB2/AN8/INT2/VMO',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='~UOE~/CCP2/T1OSI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='SDO/RX/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='RB3/AN9/CCP2/VPO',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='P1A/CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='SPP4/RD4',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='RB4/AN11/KBI0/CSSPP',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='VUSB',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='P1B/SPP5/RD5',func=Pin.BIDIR,do_erc=True),
Pin(num='38',name='RB5/KBI1/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='SPP0/RD0',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='P1C/SPP6/RD6',func=Pin.BIDIR,do_erc=True),
Pin(num='39',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F4455-I/PT',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='32K Flash, 2K SRAM, 256 EEPROM, USB, nanoWatt XLP, TQFP44',ref_prefix='U',num_units=1,fplist=['TQFP*10x10mm*Pitch0.8mm*'],do_erc=True,aliases=['PIC18LF4455-I/PT', 'PIC18LF4550-I/PT', 'PIC18F4550-I/PT'],pins=[
Pin(num='1',name='SDO/RX/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='SPP4/RD4',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='P1B/SPP5/RD5',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='P1C/SPP6/RD6',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='P1D/SPP7/RD7',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='7',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='8',name='RB0/AN12/INT0/FLT0/SDI/SDA',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='RB1/AN10/INT1/SCK/SCL',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='RB2/AN8/INT2/VMO',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='OSC1/CLKI',do_erc=True),
Pin(num='40',name='SPP2/RD2',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='RB3/AN9/CCP2/VPO',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RA2/AN2/Vref-/CVref',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='RA6/OSC2/CLKO',func=Pin.OUTPUT,do_erc=True),
Pin(num='41',name='SPP3/RD3',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='(ICCK/ICPGC)',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RA3/AN3/Vref+',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='T1OSO/T13CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='42',name='VM/D-/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='(ICDT/ICPGD)',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='RA4/T0CKI/C1OUT/RCV',func=Pin.BIDIR,do_erc=True),
Pin(num='33',name='(~ICRST~/ICVpp)',do_erc=True),
Pin(num='43',name='VP/D+/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='RB4/AN11/KBI0/CSSPP',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RA5/AN4/~SS~/HLVDIN/C2OUT',func=Pin.BIDIR,do_erc=True),
Pin(num='34',name='(ICPORTS)',func=Pin.BIDIR,do_erc=True),
Pin(num='44',name='TX/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='RB5/KBI1/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='CK1SPP/AN5/RE0',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='~UOE~/CCP2/T1OSI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='CK2SPP/AN6/RE1',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='P1A/CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='OESPP/AN7/RE2',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='VUSB',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='Vpp/~MCLR~/RE3',do_erc=True),
Pin(num='28',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='38',name='SPP0/RD0',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='39',name='SPP1/RD1',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F4458-I/ML',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='32K Flash, 2K SRAM, 256 EEPROM, USB, 12-Bit A/D, nanoWatt XLP, QFN44',ref_prefix='U',num_units=1,fplist=['QFN*1EP*8x8mm*Pitch0.65mm*'],do_erc=True,aliases=['PIC18LF4458-I/ML', 'PIC18F4553-I/ML', 'PIC18LF4553-I/ML'],pins=[
Pin(num='1',name='SDO/RX/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='SPP4/RD4',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='P1B/SPP5/RD5',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='P1C/SPP6/RD6',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='P1D/SPP7/RD7',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='7',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='8',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='9',name='RB0/AN12/INT0/FLT0/SDI/SDA',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='RB1/AN10/INT1/SCK/SCL',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='40',name='SPP2/RD2',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='RB2/AN8/INT2/VMO',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RA2/AN2/Vref-/CVref',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='41',name='SPP3/RD3',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='RB3/AN9/CCP2/VPO',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RA3/AN3/Vref+',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='OSC1/CLKI',do_erc=True),
Pin(num='42',name='VM/D-/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='NC',func=Pin.NOCONNECT,do_erc=True),
Pin(num='23',name='RA4/T0CKI/C1OUT/RCV',func=Pin.BIDIR,do_erc=True),
Pin(num='33',name='RA6/OSC2/CLKO',func=Pin.BIDIR,do_erc=True),
Pin(num='43',name='VP/D+/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='RB4/AN11/KBI0/CSSPP',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RA5/AN4/~SS~/HLVDIN/C2OUT',func=Pin.BIDIR,do_erc=True),
Pin(num='34',name='T1OSO/T13CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='44',name='TX/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='RB5/KBI1/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='CK1SPP/AN5/RE0',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='~UOE~/CCP2/T1OSI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='45',name='PAD',func=Pin.PWRIN,do_erc=True),
Pin(num='16',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='CK2SPP/AN6/RE1',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='P1A/CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='OESPP/AN7/RE2',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='VUSB',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='Vpp/~MCLR~/RE3',do_erc=True),
Pin(num='28',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='38',name='SPP0/RD0',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='39',name='SPP1/RD1',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F4458-I/P',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='32K Flash, 2K SRAM, 256 EEPROM, USB, 12-Bit A/D, nanoWatt XLP, DIP40',ref_prefix='U',num_units=1,fplist=['DIP*W15.24mm*', 'PDIP*W15.24mm*'],do_erc=True,aliases=['PIC18LF4458-I/P', 'PIC18F4553-I/P', 'PIC18LF4553-I/P'],pins=[
Pin(num='1',name='Vpp/~MCLR~/RE3',do_erc=True),
Pin(num='2',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA2/AN2/Vref-/CVref',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RA3/AN3/Vref+',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='RA4/T0CKI/C1OUT/RCV',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='RA5/AN4/~SS~/HLVDIN/C2OUT',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='CK1SPP/AN5/RE0',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='CK2SPP/AN6/RE1',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='OESPP/AN7/RE2',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='SPP1/RD1',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='P1D/SPP7/RD7',func=Pin.BIDIR,do_erc=True),
Pin(num='40',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='21',name='SPP2/RD2',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='12',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='22',name='SPP3/RD3',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='13',name='OSC1/CLKI',do_erc=True),
Pin(num='23',name='VM/D-/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='33',name='RB0/AN12/INT0/FLT0/SDI/SDA',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='RA6/OSC2/CLKO',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='VP/D+/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='34',name='RB1/AN10/INT1/SCK/SCL',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='T1OSO/T13CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='TX/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='RB2/AN8/INT2/VMO',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='~UOE~/CCP2/T1OSI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='SDO/RX/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='RB3/AN9/CCP2/VPO',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='P1A/CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='SPP4/RD4',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='RB4/AN11/KBI0/CSSPP',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='VUSB',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='P1B/SPP5/RD5',func=Pin.BIDIR,do_erc=True),
Pin(num='38',name='RB5/KBI1/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='SPP0/RD0',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='P1C/SPP6/RD6',func=Pin.BIDIR,do_erc=True),
Pin(num='39',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F4458-I/PT',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller XLP',description='32K Flash, 2K SRAM, 256 EEPROM, USB, 12-Bit A/D, nanoWatt XLP, TQFP44',ref_prefix='U',num_units=1,fplist=['TQFP*10x10mm*Pitch0.8mm*'],do_erc=True,aliases=['PIC18LF4458-I/PT', 'PIC18F4553-I/PT', 'PIC18LF4553-I/PT'],pins=[
Pin(num='1',name='SDO/RX/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='SPP4/RD4',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='P1B/SPP5/RD5',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='P1C/SPP6/RD6',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='P1D/SPP7/RD7',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='7',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='8',name='RB0/AN12/INT0/FLT0/SDI/SDA',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='RB1/AN10/INT1/SCK/SCL',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='RB2/AN8/INT2/VMO',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='OSC1/CLKI',do_erc=True),
Pin(num='40',name='SPP2/RD2',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='RB3/AN9/CCP2/VPO',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RA2/AN2/Vref-/CVref',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='RA6/OSC2/CLKO',func=Pin.BIDIR,do_erc=True),
Pin(num='41',name='SPP3/RD3',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='(ICCK/ICPGC)',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RA3/AN3/Vref+',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='T1OSO/T13CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='42',name='VM/D-/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='(ICDT/ICPGD)',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='RA4/T0CKI/C1OUT/RCV',func=Pin.BIDIR,do_erc=True),
Pin(num='33',name='(~ICRST~/ICVpp)',do_erc=True),
Pin(num='43',name='VP/D+/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='RB4/AN11/KBI0/CSSPP',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RA5/AN4/~SS~/HLVDIN/C2OUT',func=Pin.BIDIR,do_erc=True),
Pin(num='34',name='(ICPORTS)',do_erc=True),
Pin(num='44',name='TX/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='RB5/KBI1/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='CK1SPP/AN5/RE0',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='~UOE~/CCP2/T1OSI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='CK2SPP/AN6/RE1',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='P1A/CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='OESPP/AN7/RE2',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='VUSB',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='Vpp/~MCLR~/RE3',do_erc=True),
Pin(num='28',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='38',name='SPP0/RD0',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='39',name='SPP1/RD1',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F448-I/P',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller CAN',description='32K Flash, 1536B SRAM, 256 EEPROM, CAN, DIP40',ref_prefix='U',num_units=1,fplist=['DIP*W15.24mm*', 'PDIP*W15.24mm*'],do_erc=True,aliases=['PIC18LF448-I/P', 'PIC18LF458-I/P', 'PIC18F458-I/P'],pins=[
Pin(num='1',name='~MCLR~/VPP',do_erc=True),
Pin(num='2',name='RA0/AN0/CVref',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA2/AN2/Vref-',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RA3/AN3/Vref+',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='RA4/TOCKI',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='RA5/AN4/~SS~/LVDIN',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='~RD~/AN5/RE0',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='C1OUT/~WR~/AN6/RE1',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='C2OUT/~CS~/AN7/RE2',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='C1IN-/PSP1/RD1',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='P1D/PSP7/RD7',func=Pin.BIDIR,do_erc=True),
Pin(num='40',name='RB7/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='21',name='C2IN+/PSP2/RD2',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='12',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='22',name='C2IN-/PSP3/RD3',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='13',name='OSC1/CLKIN',do_erc=True),
Pin(num='23',name='SDI/SDA/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='33',name='RB0/INT0',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='RA6/OSC2/CLKO',func=Pin.OUTPUT,do_erc=True),
Pin(num='24',name='SDO/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='34',name='RB1/INT1',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='T10S0/T1CKI/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='TX/CK/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='RB2/INT2/CANTX',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='T1OSI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='RX/DT/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='RB3/CANRX',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='P1A/ECCP1/PSP4/RD4',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='RB4',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='SCK/SCL/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='P1B/PSP5/RD5',func=Pin.BIDIR,do_erc=True),
Pin(num='38',name='RB5/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='C1IN+/PSP0/RD0',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='P1C/PSP6/RD6',func=Pin.BIDIR,do_erc=True),
Pin(num='39',name='RB6/PGC',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F44J10-I/P',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller',description='32K Flash, 1K SRAM, ADC, DIP40',ref_prefix='U',num_units=1,fplist=['DIP*W15.24mm*', 'PDIP*W15.24mm*'],do_erc=True,aliases=['PIC18LF44J10-I/P', 'PIC18LF45J10-I/P', 'PIC18F45J10-I/P'],pins=[
Pin(num='1',name='~MCLR~',do_erc=True),
Pin(num='2',name='RA0/AN0',func=Pin.TRISTATE,do_erc=True),
Pin(num='3',name='RA1/AN1',func=Pin.TRISTATE,do_erc=True),
Pin(num='4',name='RA2/AN2/Vref-/CVref',func=Pin.TRISTATE,do_erc=True),
Pin(num='5',name='RA3/AN3/Vref+',func=Pin.TRISTATE,do_erc=True),
Pin(num='6',name='VDDCORE/VCAP',func=Pin.PASSIVE,do_erc=True),
Pin(num='7',name='RA5/AN4/~SS1~/C2OUT',func=Pin.TRISTATE,do_erc=True),
Pin(num='8',name='~RD~/AN5/RE0',func=Pin.TRISTATE,do_erc=True),
Pin(num='9',name='~WR~/AN6/RE1',func=Pin.TRISTATE,do_erc=True),
Pin(num='10',name='~CS~/AN7/RE2',func=Pin.TRISTATE,do_erc=True),
Pin(num='20',name='PSP1/SDI2/SDA2/RD1',func=Pin.TRISTATE,do_erc=True),
Pin(num='30',name='PSP7/P1D/RD7',func=Pin.TRISTATE,do_erc=True),
Pin(num='40',name='RB7/KBI3/PGD',func=Pin.TRISTATE,do_erc=True),
Pin(num='11',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='21',name='PSP2/SDO2/RD2',func=Pin.TRISTATE,do_erc=True),
Pin(num='31',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='12',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='22',name='PSP3/~SS2~/RD3',func=Pin.TRISTATE,do_erc=True),
Pin(num='32',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='13',name='OSC1/CLKI',do_erc=True),
Pin(num='23',name='SDI1/SDA1/RC4',func=Pin.TRISTATE,do_erc=True),
Pin(num='33',name='RB0/INT0/FLT0/AN12',func=Pin.TRISTATE,do_erc=True),
Pin(num='14',name='OSC2/CLK0',func=Pin.OUTPUT,do_erc=True),
Pin(num='24',name='SDO1/RC5',func=Pin.TRISTATE,do_erc=True),
Pin(num='34',name='RB1/INT1/AN10',func=Pin.TRISTATE,do_erc=True),
Pin(num='15',name='T1OSO/T1CKI/RC0',func=Pin.TRISTATE,do_erc=True),
Pin(num='25',name='TX/CK/RC6',func=Pin.TRISTATE,do_erc=True),
Pin(num='35',name='RB2/INT2/AN8',func=Pin.TRISTATE,do_erc=True),
Pin(num='16',name='T1OSI/CCP2/RC1',func=Pin.TRISTATE,do_erc=True),
Pin(num='26',name='RX/DT/RC7',func=Pin.TRISTATE,do_erc=True),
Pin(num='36',name='RB3/CCP2/AN9',func=Pin.TRISTATE,do_erc=True),
Pin(num='17',name='P1A/CCP1/RC2',func=Pin.TRISTATE,do_erc=True),
Pin(num='27',name='PSP4/RD4',func=Pin.TRISTATE,do_erc=True),
Pin(num='37',name='RB4/KBI0/AN11',func=Pin.TRISTATE,do_erc=True),
Pin(num='18',name='SCK1/SCL1/RC3',func=Pin.TRISTATE,do_erc=True),
Pin(num='28',name='PSP5/P1B/RD5',func=Pin.TRISTATE,do_erc=True),
Pin(num='38',name='RB5/KBI1/T0CKI/C1OUT',func=Pin.TRISTATE,do_erc=True),
Pin(num='19',name='PSP0/SCK2/SCL2/RD0',func=Pin.TRISTATE,do_erc=True),
Pin(num='29',name='PSP6/P1C/RD6',func=Pin.TRISTATE,do_erc=True),
Pin(num='39',name='RB6/KBI2/PGC',func=Pin.TRISTATE,do_erc=True)]),
Part(name='PIC18F4580-I/P',dest=TEMPLATE,tool=SKIDL,keywords='Flash-Based 8-Bit Microcontroller CAN',description='32K Flash, 768B SRAM, 256 EEPROM, ECAN, DIP40',ref_prefix='U',num_units=1,fplist=['DIP*W15.24mm*', 'PDIP*W15.24mm*'],do_erc=True,aliases=['PIC18LF4580-I/P', 'PIC18LF4480-I/P', 'PIC18F4480-I/P'],pins=[
Pin(num='1',name='~MCLR~/VPP/RE3',do_erc=True),
Pin(num='2',name='RA0/AN0/CVref',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA2/AN2/Vref-',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RA3/AN3/Vref+',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='RA4/TOCKI',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='RA5/ANA4/~SS~/HLVDIN',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='AN5/~RD~/RE0',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='C1OUT/AN6/~WR~/RE1',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='C2OUT/AN7/~CS~/RE2',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='C1IN-/PSP1/RD1',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='P1D/PSP7/RD7',func=Pin.BIDIR,do_erc=True),
Pin(num='40',name='RB7/KBI3/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='21',name='C2IN+/PSP2/RD2',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='12',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='22',name='C2IN-/PSP3/RD3',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='13',name='OSC1/CLKI/RA7',do_erc=True),
Pin(num='23',name='SDA/SDI/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='33',name='RB0/INT0/FLT0/AN10',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='OSC2/CLKO/RA6',func=Pin.OUTPUT,do_erc=True),
Pin(num='24',name='SDO/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='34',name='RB1/INT1/AN8',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='T13CKI/T1OSO/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='CK/TX/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='RB2/INT2/CANTX',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='T1OSI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='DT/RX/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='RB3/CANRX',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='P1A/ECCP1/PSP4/RD4',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='RB4/KBI0/AN9',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='SCL/SCK/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='P1B/PSP5/RD5',func=Pin.BIDIR,do_erc=True),
Pin(num='38',name='RB5/KBI1/PGM',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='C1IN+/PSP0/RD0',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='P1C/PSP6/RD6',func=Pin.BIDIR,do_erc=True),
Pin(num='39',name='RB6/KBI2/PGC',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F45K80-I/PT',dest=TEMPLATE,tool=SKIDL,keywords='microchip microcontroller PIC18 flash ECAN XLP nanoWatt',description='64K Flash, 3.5K RAM, 1K EEPROM PIC18 Microcontroller ADC PWM CAN SPI I2C USART in TQFP44 package',ref_prefix='U',num_units=1,fplist=['TQFP*10x10mm*Pitch0.8mm*'],do_erc=True,aliases=['PIC18LF45K80-I/PT', 'PIC18LF46K80-I/PT', 'PIC18F46K80-I/PT'],pins=[
Pin(num='1',name='RC7/CANRX/RX1/DT1/CCP4',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='RD4/ECCP1/P1A/PSP4',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RD5/P1B/PSP5',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RD6/TX2/CK2/P1C/PSP6',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RD7/RX2/DT2/P1D/PSP7',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='7',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='8',name='AN10/FLT0/INT0/RB0',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='AN8/CTDIN/INT1/RB1',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='CANTX/CTED1/INT2/RB2',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='AN1/C1INC/RA1',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='OSC1/CLKIN/RA7',do_erc=True),
Pin(num='40',name='RD2/C2INA/PSP2',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='CANRX/CTED2/INT3/RB3',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='VREF-/AN2/C2INC/RA2',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='OSC2/CLKOUT/RA6',func=Pin.OUTPUT,do_erc=True),
Pin(num='41',name='RD3/C2INB/CTMUI/PSP3',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='N/C',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='Vref+/AN3/RA3',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='RC0/SOSCO/SCLKI',func=Pin.BIDIR,do_erc=True),
Pin(num='42',name='RC4/SDA/SDI',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='N/C',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='VDDCORE/VCAP',func=Pin.PASSIVE,do_erc=True),
Pin(num='33',name='N/C',func=Pin.BIDIR,do_erc=True),
Pin(num='43',name='RC5/SDO',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='AN9/CTPLS/KBI0/RB4',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='AN4/HLVDIN/T1CKI/~SS~/RA5',func=Pin.BIDIR,do_erc=True),
Pin(num='34',name='N/C',func=Pin.BIDIR,do_erc=True),
Pin(num='44',name='RC6/CANTX/TX1/CK1/CCP3',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='T0CKI/T3CKI/CCP5/KBI1/RB5',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='RE0/AN5/~RD~',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='RC1/SOSCI',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='PGC/KBI2/RB6',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='RE1/AN6/C1OUT/~WR~',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='RC2/T1G/CCP2',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='PGD/T3G/KBI3/RB7',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='RE2/AN7/C2OUT/~CS~',func=Pin.PWRIN,do_erc=True),
Pin(num='37',name='RC3/REFO/SCL/SCK',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='~MCLR~/RE3',do_erc=True),
Pin(num='28',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='38',name='RD0/C1INA/PSP0',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='CVref/AN0/ULPWU/RA0',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='39',name='RD1/C1INB/PSP1',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F1220-SO',dest=TEMPLATE,tool=SKIDL,keywords='RAM ADC UART PWM',description='18-Pin Flash Microcontroller 8K Flash 256byte RAM',ref_prefix='U',num_units=1,fplist=['SOIC*W*7.5x11.6mm*Pitch1.27mm*'],do_erc=True,aliases=['PIC18LF1220-SO', 'PIC18LF1320-SO', 'PIC18F1320-SO'],pins=[
Pin(num='1',name='RA0/AN0',do_erc=True),
Pin(num='2',name='RA1/AN1/LVDIN',do_erc=True),
Pin(num='3',name='RA4/TOCKI',do_erc=True),
Pin(num='4',name='~MCLR~/Vpp/RA5',do_erc=True),
Pin(num='5',name='Vss',func=Pin.PWRIN,do_erc=True),
Pin(num='6',name='RA2/AN2/Vref-',do_erc=True),
Pin(num='7',name='RA3/AN3/Vref+',do_erc=True),
Pin(num='8',name='INT0/AN4/RB0',do_erc=True),
Pin(num='9',name='INT1/CK/TX/AN5/RB1',do_erc=True),
Pin(num='10',name='KBI0/DT/RX/AN6/RB4',do_erc=True),
Pin(num='11',name='KBI1/PGM/RB5',do_erc=True),
Pin(num='12',name='KBI2/P1C/T13CKI/T1OSO/PGC/RB6',do_erc=True),
Pin(num='13',name='KBI3/P1D/T1OSI/PGD/RB7',do_erc=True),
Pin(num='14',name='Vdd',func=Pin.PWRIN,do_erc=True),
Pin(num='15',name='OSC2/CLKO/RA6',do_erc=True),
Pin(num='16',name='OSC1/CLKI/RA7',do_erc=True),
Pin(num='17',name='INT2/P1B/RB2',do_erc=True),
Pin(num='18',name='P1A/CCP1/RB3',do_erc=True)]),
Part(name='PIC18F23K20_I/SS',dest=TEMPLATE,tool=SKIDL,keywords='microcontroller PIC18F flash XLP',description='64K Flash, 3936Byte RAM, 1KByte EEPROM, PIC18 Microcontroller ADC PWM SPI I2C USART in SSOP28 package',ref_prefix='U',num_units=1,fplist=['SSOP*5.3x10.2mm*Pitch0.65mm*'],do_erc=True,aliases=['PIC18F24K20_I/SS', 'PIC18F25K20_I/SS', 'PIC18F26K20_I/SS'],pins=[
Pin(num='1',name='~MCLR~/RE3',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA2/Vref-AN2',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RA3/Vref+AN3',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='RA4',func=Pin.PWRIN,do_erc=True),
Pin(num='7',name='RA5/AN4',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='Vss',func=Pin.PWRIN,do_erc=True),
Pin(num='9',name='RA7/OSC1/CLKIN',do_erc=True),
Pin(num='10',name='RA6/OSC2/CLKOUT',do_erc=True),
Pin(num='20',name='Vdd',func=Pin.PWRIN,do_erc=True),
Pin(num='11',name='RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='AN12/INT0/RB0',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='CCP2/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='AN10/INT1/RB1',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='CCP1/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='AN8/INT2/RB2',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='SCL/SCK/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='AN9/RB3',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='SDA/SDI/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='AN11/RB4',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='SDO/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='PGM/RB5',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='TX/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='PGC/RB6',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='RX/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='PGD/RB7',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='Vss',func=Pin.PWRIN,do_erc=True)]),
Part(name='PIC18F25K80_I/ML',dest=TEMPLATE,tool=SKIDL,keywords='microchip microcontroller PIC18 flash ECAN XLP nanoWatt',description='64K Flash, 3.5K RAM, 1K EEPROM PIC18 Microcontroller ADC PWM CAN SPI I2C USART in QFN28 package',ref_prefix='U',num_units=1,fplist=['QFN*1EP*6x6mm*Pitch0.65mm*'],do_erc=True,aliases=['PIC18F26K80_I/ML', 'PIC18LF25K80_I/ML', 'PIC18LF26K80_I/ML'],pins=[
Pin(num='1',name='RA2/Vref-/AN2',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='RA3/Vref+/AN3',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='Vcap',func=Pin.PWRIN,do_erc=True),
Pin(num='4',name='RA5/AN4',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='Vss',func=Pin.PWRIN,do_erc=True),
Pin(num='6',name='OSC1/CLKIN',do_erc=True),
Pin(num='7',name='OSC2/CLKOUT',do_erc=True),
Pin(num='8',name='SOSCO/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='SOSCI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='CCP2/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='CANTX/INT2/RB2',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='SCL/SCK/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='CANRX/INT3/RB3',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='SDA/SDI/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='AN9/ECCP1/RB4',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='SDO/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='CCP5/RB5',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='CANTX/TX1/CCP3/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='PGC/TX2/RB6',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='CANRX/RX1/CCP4/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='PGD/RX2/RB7',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='Vss',func=Pin.PWRIN,do_erc=True),
Pin(num='26',name='~MCLR~/RE3',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='Vdd',func=Pin.PWRIN,do_erc=True),
Pin(num='27',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='AN10/INT0/RB0',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='AN8/INT1/RB1',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='PAD',func=Pin.PWRIN,do_erc=True)]),
Part(name='PIC18F25K80_I/SS',dest=TEMPLATE,tool=SKIDL,keywords='microchip microcontroller PIC18 flash ECAN XLP nanoWatt',description='64K Flash, 3.5K RAM, 1K EEPROM PIC18 Microcontroller ADC PWM CAN SPI I2C USART in SSOP28 package',ref_prefix='U',num_units=1,fplist=['SSOP*5.3x10.2mm*Pitch0.65mm*'],do_erc=True,aliases=['PIC18F26K80_I/SS', 'PIC18LF25K80_I/SS', 'PIC18LF26K80_I/SS'],pins=[
Pin(num='1',name='~MCLR~/RE3',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='RA0/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='RA2/Vref-/AN2',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RA3/Vref+/AN3',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='Vcap',func=Pin.PWRIN,do_erc=True),
Pin(num='7',name='RA5/AN4',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='Vss',func=Pin.PWRIN,do_erc=True),
Pin(num='9',name='OSC1/CLKIN',do_erc=True),
Pin(num='10',name='OSC2/CLKOUT',do_erc=True),
Pin(num='11',name='SOSCO/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='AN10/INT0/RB0',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='SOSCI/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='AN8/INT1/RB1',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='CCP2/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='CANTX/INT2/RB2',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='SCL/SCK/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='CANRX/INT3/RB3',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='SDA/SDI/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='AN9/ECCP1/RB4',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='SDO/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='CCP5/RB5',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='CANTX/TX1/CCP3/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='Vdd',func=Pin.PWRIN,do_erc=True),
Pin(num='27',name='PGC/TX2/RB6',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='CANRX/RX1/CCP4/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='PGD/RX2/RB7',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='Vss',func=Pin.PWRIN,do_erc=True)]),
Part(name='PIC18F45K80-I/ML',dest=TEMPLATE,tool=SKIDL,keywords='microchip microcontroller PIC18 flash ECAN XLP nanoWatt',description='64K Flash, 3.5K RAM, 1K EEPROM PIC18 Microcontroller ADC PWM CAN SPI I2C USART in QFN44 package',ref_prefix='U',num_units=1,fplist=['QFN*1EP*8x8mm*Pitch0.65mm*'],do_erc=True,aliases=['PIC18F46K80-I/ML', 'PIC18LF45K80-I/ML', 'PIC18LF46K80-I/ML'],pins=[
Pin(num='1',name='CANRX/RX1/CCP4/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='ECCP1/PSP4/RD4',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='PSP5/RD5',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='TX2/PSP6/RD6',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='RX2/PSP7/RD7',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='NC',func=Pin.NOCONNECT,do_erc=True),
Pin(num='13',name='NC',func=Pin.NOCONNECT,do_erc=True),
Pin(num='33',name='NC',func=Pin.NOCONNECT,do_erc=True),
Pin(num='34',name='NC',func=Pin.NOCONNECT,do_erc=True),
Pin(num='6',name='Vss',func=Pin.PWRIN,do_erc=True),
Pin(num='7',name='Vdd',func=Pin.PWRIN,do_erc=True),
Pin(num='8',name='RB0/AN10/INT0',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='RB1/AN8/INT1',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='RB2/CANTX/INT2',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='RA1/AN1',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='OSC1/CLKIN/RA7',do_erc=True),
Pin(num='40',name='C2INA/PSP2/RD2',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='RB3/CANRX/INT3',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RA2/Vref-/AN2',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='OSC2/CLKOUT/RA6',do_erc=True),
Pin(num='41',name='C2INB/PSP3/RD3',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RA3/Vref+/AN3',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='42',name='SDA/SDI/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='Vddcore/Vcap',func=Pin.PWRIN,do_erc=True),
Pin(num='43',name='SDO/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='RB4/AN9',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RA5/AN4',func=Pin.BIDIR,do_erc=True),
Pin(num='44',name='CANTX/TX1/CCP3/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='RB5/CCP5',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='RE0/AN5',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='45',name='PAD',func=Pin.PWRIN,do_erc=True),
Pin(num='16',name='RB6/PGC',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='RE1/AN6',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='CCP2/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='RB7/PGD',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='RE2/AN7',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='SCL/SCK/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='~MLCR~/RE3',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='Vdd',func=Pin.PWRIN,do_erc=True),
Pin(num='38',name='C1INA/PSP0/RD0',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='RA0/CVref/AN0',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='Vss',func=Pin.PWRIN,do_erc=True),
Pin(num='39',name='C1INB/PSP1/RD1',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F66J60-I/PT',dest=TEMPLATE,tool=SKIDL,keywords='Flash Based 8-Bit Microcontroller Ethernet Controller PHY',description='128K Flash, 3.7K SRAM, Ethernet Controller with PHY, 8K Buffer, TQFP64',ref_prefix='U',num_units=1,fplist=['TQFP*10x10mm*Pitch0.5mm*'],do_erc=True,aliases=['PIC18F66J65-I/PT', 'PIC18F67J60-I/PT'],pins=[
Pin(num='1',name='RE1/P2C',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='RE0/P2D',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='INT0/FLT0/RB0',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='INT1/RB1',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='INT2/RB2',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='INT3/RB3',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='~MCLR~',do_erc=True),
Pin(num='8',name='RG4/CCP5/P1D',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='10',name='VDDCORE/VCAP',func=Pin.PASSIVE,do_erc=True),
Pin(num='20',name='AVSS',func=Pin.PWRIN,do_erc=True),
Pin(num='30',name='T13CKI/T1OSO/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='40',name='OSC2/CLK2',func=Pin.OUTPUT,do_erc=True),
Pin(num='50',name='TPOUT-',func=Pin.PASSIVE,do_erc=True),
Pin(num='60',name='RD0/P1B',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='RF7/~SS1~',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='Vref+/AN3/RA3',func=Pin.BIDIR,do_erc=True),
Pin(num='31',name='TX1/CK1/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='41',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='51',name='TPOUT+',func=Pin.PASSIVE,do_erc=True),
Pin(num='61',name='RE5/P1C',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='RF6/AN11',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='Vref-/AN2/RA2',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='RX1/DT1/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='42',name='PGC/KBI2/RB6',func=Pin.BIDIR,do_erc=True),
Pin(num='52',name='VSSTX',func=Pin.PWRIN,do_erc=True),
Pin(num='62',name='RE4/P3B',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='RF5/AN10/CVref',func=Pin.BIDIR,do_erc=True),
Pin(num='23',name='LEDB/AN1/RA1',func=Pin.BIDIR,do_erc=True),
Pin(num='33',name='ECCP1/P1A/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='43',name='KBI1/RB5',func=Pin.BIDIR,do_erc=True),
Pin(num='53',name='RBIAS',func=Pin.PASSIVE,do_erc=True),
Pin(num='63',name='RE3/P3C',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='RF4/AN9',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='LEDA/AN0/RA0',func=Pin.BIDIR,do_erc=True),
Pin(num='34',name='SCK1/SCL1/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='44',name='KBI0/RB4',func=Pin.BIDIR,do_erc=True),
Pin(num='54',name='VDDPLL',func=Pin.PWRIN,do_erc=True),
Pin(num='64',name='RE2/P2B',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='RF3/AN8',func=Pin.BIDIR,do_erc=True),
Pin(num='25',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='35',name='SDI1/SDA1/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='45',name='VSSRX',func=Pin.PWRIN,do_erc=True),
Pin(num='55',name='VSSPLL',func=Pin.PWRIN,do_erc=True),
Pin(num='16',name='RF2/AN7/C1OUT',func=Pin.BIDIR,do_erc=True),
Pin(num='26',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='36',name='SDO1/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='46',name='TPIN-',func=Pin.PASSIVE,do_erc=True),
Pin(num='56',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='17',name='RF1/AN6/C2OUT',func=Pin.BIDIR,do_erc=True),
Pin(num='27',name='AN4/RA5',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='PGD/KBI3/RB7',func=Pin.BIDIR,do_erc=True),
Pin(num='47',name='TPIN+',func=Pin.PASSIVE,do_erc=True),
Pin(num='57',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='18',name='ENVREG',func=Pin.PASSIVE,do_erc=True),
Pin(num='28',name='T0CKI/RA4',func=Pin.BIDIR,do_erc=True),
Pin(num='38',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='48',name='VDDRX',func=Pin.PWRIN,do_erc=True),
Pin(num='58',name='RD2/CCP4/P3D',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='AVDD',func=Pin.PWRIN,do_erc=True),
Pin(num='29',name='T1OSI/ECCP2/P2A/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='39',name='OSC1/CLKI',do_erc=True),
Pin(num='49',name='VDDTX',func=Pin.PWRIN,do_erc=True),
Pin(num='59',name='RD1/ECCP3/P3A',func=Pin.BIDIR,do_erc=True)]),
Part(name='PIC18F96J60-I/PT',dest=TEMPLATE,tool=SKIDL,keywords='Flash Based 8-Bit Microcontroller Ethernet Controller PHY',description='128K Flash, 3.7K SRAM, Ethernet Controller with PHY, 8K Buffer, TQFP100',ref_prefix='U',num_units=1,fplist=['TQFP*12x12mm*Pitch0.4mm*', 'TQFP*14x14mm*Pitch0.5mm*'],do_erc=True,aliases=['PIC18F96J65-I/PT', 'PIC18F97J60-I/PT', 'PIC18F96J60-I/PF', 'PIC18F96J65-I/PF', 'PIC18F97J60-I/PF'],pins=[
Pin(num='1',name='RH2/A18',func=Pin.BIDIR,do_erc=True),
Pin(num='2',name='RH3/A19',func=Pin.BIDIR,do_erc=True),
Pin(num='3',name='~WR~/AD9/P2C/RE1',func=Pin.BIDIR,do_erc=True),
Pin(num='4',name='~RD~/AD8/P2D/RE0',func=Pin.BIDIR,do_erc=True),
Pin(num='5',name='INT0/FLT0/RB0',func=Pin.BIDIR,do_erc=True),
Pin(num='6',name='INT1/RB1',func=Pin.BIDIR,do_erc=True),
Pin(num='7',name='INT2/RB2',func=Pin.BIDIR,do_erc=True),
Pin(num='8',name='INT3/ECCP2/P2A/RB3',func=Pin.BIDIR,do_erc=True),
Pin(num='10',name='RG6',func=Pin.BIDIR,do_erc=True),
Pin(num='20',name='RF5/AN10/CVref',func=Pin.BIDIR,do_erc=True),
Pin(num='30',name='AVDD',func=Pin.PWRIN,do_erc=True),
Pin(num='50',name='RJ1/~OE~',func=Pin.BIDIR,do_erc=True),
Pin(num='40',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='60',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='70',name='RG1/TX2/CK2',func=Pin.BIDIR,do_erc=True),
Pin(num='80',name='RBIAS',func=Pin.PASSIVE,do_erc=True),
Pin(num='90',name='AD2/PSP2/RD2',func=Pin.BIDIR,do_erc=True),
Pin(num='11',name='RG5',func=Pin.BIDIR,do_erc=True),
Pin(num='21',name='RF4/AN9',func=Pin.BIDIR,do_erc=True),
Pin(num='9',name='NC',func=Pin.NOCONNECT,do_erc=True),
Pin(num='31',name='AVSS',func=Pin.PWRIN,do_erc=True),
Pin(num='41',name='AN4/RA5',func=Pin.BIDIR,do_erc=True),
Pin(num='51',name='RG3/CCP4/P3D',func=Pin.BIDIR,do_erc=True),
Pin(num='61',name='RJ3/~WRH~',func=Pin.BIDIR,do_erc=True),
Pin(num='71',name='RG0/ECCP3/P3A',func=Pin.BIDIR,do_erc=True),
Pin(num='81',name='VDDPLL',func=Pin.PWRIN,do_erc=True),
Pin(num='91',name='AD1/PSP1/RD1',func=Pin.BIDIR,do_erc=True),
Pin(num='12',name='RF0/AN5',func=Pin.BIDIR,do_erc=True),
Pin(num='22',name='RF3/AN8',func=Pin.BIDIR,do_erc=True),
Pin(num='32',name='Vref+/AN3/RA3',func=Pin.BIDIR,do_erc=True),
Pin(num='42',name='TOCKI/RA4',func=Pin.BIDIR,do_erc=True),
Pin(num='52',name='RG2/RX2/DT2',func=Pin.BIDIR,do_erc=True),
Pin(num='62',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='72',name='VSSRX',func=Pin.PWRIN,do_erc=True),
Pin(num='82',name='VSSPLL',func=Pin.PWRIN,do_erc=True),
Pin(num='92',name='AD0/PSP0/RD0',func=Pin.BIDIR,do_erc=True),
Pin(num='13',name='~MCLR~',do_erc=True),
Pin(num='23',name='RF2/AN7/C1OUT',func=Pin.BIDIR,do_erc=True),
Pin(num='33',name='Vref-/AN2/RA2',func=Pin.BIDIR,do_erc=True),
Pin(num='43',name='T1OSI/ECCP2/P2A/RC1',func=Pin.BIDIR,do_erc=True),
Pin(num='53',name='ECCP1/P1A/RC2',func=Pin.BIDIR,do_erc=True),
Pin(num='63',name='OSC1/CLKI',do_erc=True),
Pin(num='73',name='TPIN-',func=Pin.PASSIVE,do_erc=True),
Pin(num='83',name='AD7/PSP7/~SS2~/RD7',func=Pin.BIDIR,do_erc=True),
Pin(num='93',name='ECCP2/AD15/P2A/RE7',func=Pin.BIDIR,do_erc=True),
Pin(num='14',name='RG4/CCP5/P1D',func=Pin.BIDIR,do_erc=True),
Pin(num='24',name='RH7/AN15/P1B',func=Pin.BIDIR,do_erc=True),
Pin(num='34',name='LEDB/AN1/RA1',func=Pin.BIDIR,do_erc=True),
Pin(num='44',name='T13CKI/T1OSO/RC0',func=Pin.BIDIR,do_erc=True),
Pin(num='54',name='SCK1/SCL1/RC3',func=Pin.BIDIR,do_erc=True),
Pin(num='64',name='OSC2/CLK0',func=Pin.OUTPUT,do_erc=True),
Pin(num='74',name='TPIN+',func=Pin.PASSIVE,do_erc=True),
Pin(num='84',name='AD6/PSP6/SCK2/SCL2/RD6',func=Pin.BIDIR,do_erc=True),
Pin(num='94',name='AD14/P1B/RE6',func=Pin.BIDIR,do_erc=True),
Pin(num='15',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='25',name='RH6/AN14/P1C',func=Pin.BIDIR,do_erc=True),
Pin(num='35',name='LEDA/AN0/RA0',func=Pin.BIDIR,do_erc=True),
Pin(num='45',name='TX1/CK1/RC6',func=Pin.BIDIR,do_erc=True),
Pin(num='55',name='SDI1/SDA1/RC4',func=Pin.BIDIR,do_erc=True),
Pin(num='65',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='75',name='VDDRX',func=Pin.PWRIN,do_erc=True),
Pin(num='85',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='95',name='AD13/P1C/RE5',func=Pin.BIDIR,do_erc=True),
Pin(num='16',name='VDDCORE/VCAP',func=Pin.PASSIVE,do_erc=True),
Pin(num='26',name='RH5/AN13/P3B',func=Pin.BIDIR,do_erc=True),
Pin(num='36',name='VSS',func=Pin.PWRIN,do_erc=True),
Pin(num='46',name='RX1/DT1/RC7',func=Pin.BIDIR,do_erc=True),
Pin(num='56',name='SDO1/RC5',func=Pin.BIDIR,do_erc=True),
Pin(num='66',name='RJ2/~WRL~',func=Pin.BIDIR,do_erc=True),
Pin(num='76',name='VDDTX',func=Pin.PWRIN,do_erc=True),
Pin(num='86',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='96',name='AD12/P3B/RE4',func=Pin.BIDIR,do_erc=True),
Pin(num='17',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='27',name='RH4/AN12/P3C',func=Pin.BIDIR,do_erc=True),
Pin(num='37',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='47',name='RJ4/BA0',func=Pin.BIDIR,do_erc=True),
Pin(num='57',name='PGD/KBI3/RB7',func=Pin.BIDIR,do_erc=True),
Pin(num='67',name='PGC/KBI2/RB6',func=Pin.BIDIR,do_erc=True),
Pin(num='77',name='TPOUT-',func=Pin.PASSIVE,do_erc=True),
Pin(num='87',name='AD5/PSP5/SDI2/SDA2/RD5',func=Pin.BIDIR,do_erc=True),
Pin(num='97',name='AD11/P3C/RE3',func=Pin.BIDIR,do_erc=True),
Pin(num='18',name='RF7/~SS1~',func=Pin.BIDIR,do_erc=True),
Pin(num='28',name='RF1/AN6/C2OUT',func=Pin.BIDIR,do_erc=True),
Pin(num='38',name='RG7',func=Pin.BIDIR,do_erc=True),
Pin(num='48',name='RJ5/~CE~',func=Pin.BIDIR,do_erc=True),
Pin(num='58',name='RJ6/~LB~',func=Pin.BIDIR,do_erc=True),
Pin(num='68',name='KBI1/RB5',func=Pin.BIDIR,do_erc=True),
Pin(num='78',name='TP0UT+',func=Pin.PASSIVE,do_erc=True),
Pin(num='88',name='SD02/AD4/PSP4/RD4',func=Pin.BIDIR,do_erc=True),
Pin(num='98',name='~CS~/AD10/P2B/RE2',func=Pin.BIDIR,do_erc=True),
Pin(num='19',name='RF6/AN11',func=Pin.BIDIR,do_erc=True),
Pin(num='29',name='ENVREG',func=Pin.PASSIVE,do_erc=True),
Pin(num='39',name='RJ7/~UB~',func=Pin.BIDIR,do_erc=True),
Pin(num='49',name='RJ0/ALE',func=Pin.BIDIR,do_erc=True),
Pin(num='59',name='VDD',func=Pin.PWRIN,do_erc=True),
Pin(num='69',name='KBI0/RB4',func=Pin.BIDIR,do_erc=True),
Pin(num='79',name='VSSTX',func=Pin.PWRIN,do_erc=True),
Pin(num='89',name='AD3/PSP3/RD3',func=Pin.BIDIR,do_erc=True),
Pin(num='99',name='RH0/A16',func=Pin.BIDIR,do_erc=True),
Pin(num='100',name='RH1/A17',func=Pin.BIDIR,do_erc=True)])])
| 78.554682 | 435 | 0.603014 | 18,836 | 109,898 | 3.439425 | 0.033181 | 0.107509 | 0.193517 | 0.244686 | 0.9605 | 0.959281 | 0.950961 | 0.940465 | 0.919642 | 0.91197 | 0 | 0.069881 | 0.168593 | 109,898 | 1,398 | 436 | 78.610873 | 0.639159 | 0 | 0 | 0.606734 | 0 | 0.005014 | 0.216719 | 0.02101 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.01361 | 0.000716 | 0 | 0.000716 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
510ae4960de347bf4e9c5ef20547d47c9ddde128 | 7,311 | py | Python | botx/bots/mixins/collecting/system_events.py | ExpressApp/pybotx | 97c8b1ce5d45a05567ed01d545cb43174a2dcbb9 | [
"MIT"
] | 13 | 2021-01-21T12:43:10.000Z | 2022-03-23T11:11:59.000Z | botx/bots/mixins/collecting/system_events.py | ExpressApp/pybotx | 97c8b1ce5d45a05567ed01d545cb43174a2dcbb9 | [
"MIT"
] | 259 | 2020-02-26T08:51:03.000Z | 2022-03-23T11:08:36.000Z | botx/bots/mixins/collecting/system_events.py | ExpressApp/pybotx | 97c8b1ce5d45a05567ed01d545cb43174a2dcbb9 | [
"MIT"
] | 5 | 2019-12-02T16:19:22.000Z | 2021-11-22T20:33:34.000Z | """Mixin that defines handler decorator."""
from typing import Any, Callable, Optional, Sequence
from botx.collecting.collectors.collector import Collector
from botx.dependencies.models import Depends
from botx.models.enums import SystemEvents
class SystemEventsHandlerMixin: # noqa: WPS214
"""Mixin that defines handler decorator."""
collector: Collector
def system_event( # noqa: WPS211
self,
handler: Optional[Callable] = None,
*,
event: Optional[SystemEvents] = None,
events: Optional[Sequence[SystemEvents]] = None,
name: Optional[str] = None,
dependencies: Optional[Sequence[Depends]] = None,
dependency_overrides_provider: Any = None,
) -> Callable:
"""Register handler for system event.
Arguments:
handler: callable that will be used for executing handler.
event: event for triggering this handler.
events: a sequence of events that will trigger handler.
name: optional name for handler that will be used in generating body.
dependencies: sequence of dependencies that should be executed before
handler.
dependency_overrides_provider: mock of callable for handler.
Returns:
Passed in `handler` callable.
"""
return self.collector.system_event(
handler=handler,
event=event,
events=events,
name=name,
dependencies=dependencies,
dependency_overrides_provider=dependency_overrides_provider,
)
def chat_created(
self,
handler: Optional[Callable] = None,
*,
dependencies: Optional[Sequence[Depends]] = None,
dependency_overrides_provider: Any = None,
) -> Callable:
"""Register handler for `system:chat_created` event.
Arguments:
handler: callable that will be used for executing handler.
dependencies: sequence of dependencies that should be executed before
handler.
dependency_overrides_provider: mock of callable for handler.
Returns:
Passed in `handler` callable.
"""
return self.collector.chat_created(
handler=handler,
dependencies=dependencies,
dependency_overrides_provider=dependency_overrides_provider,
)
def file_transfer(
self,
handler: Optional[Callable] = None,
*,
dependencies: Optional[Sequence[Depends]] = None,
dependency_overrides_provider: Any = None,
) -> Callable:
"""Register handler for `file_transfer` event.
Arguments:
handler: callable that will be used for executing handler.
dependencies: sequence of dependencies that should be executed before
handler.
dependency_overrides_provider: mock of callable for handler.
Returns:
Passed in `handler` callable.
"""
return self.collector.file_transfer(
handler=handler,
dependencies=dependencies,
dependency_overrides_provider=dependency_overrides_provider,
)
def added_to_chat(
self,
handler: Optional[Callable] = None,
*,
dependencies: Optional[Sequence[Depends]] = None,
dependency_overrides_provider: Any = None,
) -> Callable:
"""Register handler for `added_to_chat` event.
Arguments:
handler: callable that will be used for executing handler.
dependencies: sequence of dependencies that should be executed before
handler.
dependency_overrides_provider: mock of callable for handler.
Returns:
Passed in `handler` callable.
"""
return self.collector.added_to_chat(
handler=handler,
dependencies=dependencies,
dependency_overrides_provider=dependency_overrides_provider,
)
def deleted_from_chat(
self,
handler: Optional[Callable] = None,
*,
dependencies: Optional[Sequence[Depends]] = None,
dependency_overrides_provider: Any = None,
) -> Callable:
"""Register handler for `deleted_from_chat` event.
Arguments:
handler: callable that will be used for executing handler.
dependencies: sequence of dependencies that should be executed before
handler.
dependency_overrides_provider: mock of callable for handler.
Returns:
Passed in `handler` callable.
"""
return self.collector.deleted_from_chat(
handler=handler,
dependencies=dependencies,
dependency_overrides_provider=dependency_overrides_provider,
)
def left_from_chat(
self,
handler: Optional[Callable] = None,
*,
dependencies: Optional[Sequence[Depends]] = None,
dependency_overrides_provider: Any = None,
) -> Callable:
"""Register handler for `left_from_chat` event.
Arguments:
handler: callable that will be used for executing handler.
dependencies: sequence of dependencies that should be executed before
handler.
dependency_overrides_provider: mock of callable for handler.
Returns:
Passed in `handler` callable.
"""
return self.collector.left_from_chat(
handler=handler,
dependencies=dependencies,
dependency_overrides_provider=dependency_overrides_provider,
)
def cts_login(
self,
handler: Optional[Callable] = None,
*,
dependencies: Optional[Sequence[Depends]] = None,
dependency_overrides_provider: Any = None,
) -> Callable:
"""Register handler for `cts_login` event.
Arguments:
handler: callable that will be used for executing handler.
dependencies: sequence of dependencies that should be executed before
handler.
dependency_overrides_provider: mock of callable for handler.
Returns:
Passed in `handler` callable.
"""
return self.collector.cts_login(
handler=handler,
dependencies=dependencies,
dependency_overrides_provider=dependency_overrides_provider,
)
def cts_logout(
self,
handler: Optional[Callable] = None,
*,
dependencies: Optional[Sequence[Depends]] = None,
dependency_overrides_provider: Any = None,
) -> Callable:
"""Register handler for `cts_logout` event.
Arguments:
handler: callable that will be used for executing handler.
dependencies: sequence of dependencies that should be executed before
handler.
dependency_overrides_provider: mock of callable for handler.
Returns:
Passed in `handler` callable.
"""
return self.collector.cts_logout(
handler=handler,
dependencies=dependencies,
dependency_overrides_provider=dependency_overrides_provider,
)
| 34.004651 | 81 | 0.622213 | 683 | 7,311 | 6.519766 | 0.102489 | 0.136537 | 0.194027 | 0.028296 | 0.847294 | 0.82596 | 0.82596 | 0.82596 | 0.82596 | 0.80777 | 0 | 0.001193 | 0.311859 | 7,311 | 214 | 82 | 34.163551 | 0.88392 | 0.386951 | 0 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.074074 | false | 0 | 0.037037 | 0 | 0.203704 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
510d17bce508a2f374d91ce226a95ccf59cf25a8 | 138 | py | Python | smartbox/__init__.py | graham33/smartbox | f7deaf85b50f0821fba265aaf54319e03cf86dbc | [
"MIT"
] | 2 | 2021-03-08T14:40:12.000Z | 2021-11-12T14:50:01.000Z | smartbox/__init__.py | graham33/smartbox | f7deaf85b50f0821fba265aaf54319e03cf86dbc | [
"MIT"
] | 7 | 2021-03-09T22:06:03.000Z | 2022-03-06T22:30:45.000Z | smartbox/__init__.py | graham33/smartbox | f7deaf85b50f0821fba265aaf54319e03cf86dbc | [
"MIT"
] | 3 | 2021-02-23T09:32:22.000Z | 2021-09-27T09:37:16.000Z | from .error import SmartboxError # noqa: F401
from .session import Session # noqa: F401
from .socket import SocketSession # noqa: F401
| 34.5 | 47 | 0.76087 | 18 | 138 | 5.833333 | 0.5 | 0.228571 | 0.228571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078947 | 0.173913 | 138 | 3 | 48 | 46 | 0.842105 | 0.231884 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
51116957d6d2aa225af7480a8dea787647f8fccf | 2,554 | py | Python | tests/processor/test_ProgramCounter.py | jamesjiang52/Bitwise | c71f151d23034b3f9e2a939f637be0eaa16c45c3 | [
"MIT"
] | null | null | null | tests/processor/test_ProgramCounter.py | jamesjiang52/Bitwise | c71f151d23034b3f9e2a939f637be0eaa16c45c3 | [
"MIT"
] | null | null | null | tests/processor/test_ProgramCounter.py | jamesjiang52/Bitwise | c71f151d23034b3f9e2a939f637be0eaa16c45c3 | [
"MIT"
] | null | null | null | import bitwise as bw
class TestProgramCounter:
def test_ProgramCounter(self):
input_bus = bw.wire.Bus16()
up = bw.wire.Wire()
load = bw.wire.Wire()
clock = bw.wire.Wire()
output_bus = bw.wire.Bus16()
a = bw.processor.ProgramCounter(input_bus, up, load, clock, output_bus)
clock.value = 0
clock.value = 1
assert output_bus.wire_values == (
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
up.value = 1
clock.value = 0
clock.value = 1
assert output_bus.wire_values == (
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1)
up.value = 1
clock.value = 0
clock.value = 1
assert output_bus.wire_values == (
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0)
up.value = 1
clock.value = 0
clock.value = 1
assert output_bus.wire_values == (
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1)
up.value = 0
clock.value = 0
clock.value = 1
assert output_bus.wire_values == (
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1)
clock.value = 0
clock.value = 1
assert output_bus.wire_values == (
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1)
input_bus.wire_values = (
0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1)
load.value = 1
clock.value = 0
clock.value = 1
assert output_bus.wire_values == (
0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1)
load.value = 0
up.value = 1
clock.value = 0
clock.value = 1
assert output_bus.wire_values == (
0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0)
clock.value = 0
clock.value = 1
assert output_bus.wire_values == (
0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1)
input_bus.wire_values = (
0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
load.value = 1
clock.value = 0
clock.value = 1
assert output_bus.wire_values == (
0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
print(a.__doc__)
print(a)
a(
data_bus=(0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1),
up=0,
load=1,
clock=0,
output_bus=None
)
a(clock=1)
assert output_bus.wire_values == (
0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1)
| 28.696629 | 79 | 0.437745 | 434 | 2,554 | 2.490783 | 0.06682 | 0.296022 | 0.39408 | 0.458834 | 0.757632 | 0.757632 | 0.757632 | 0.757632 | 0.757632 | 0.757632 | 0 | 0.172643 | 0.410337 | 2,554 | 88 | 80 | 29.022727 | 0.545153 | 0 | 0 | 0.643836 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.150685 | 1 | 0.013699 | false | 0 | 0.013699 | 0 | 0.041096 | 0.027397 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
511b619584d8f89e4e193626f66fc995b01a06b8 | 205 | py | Python | production/pygsl-0.9.5/gsl_dist/array_includes.py | juhnowski/FishingRod | 457e7afb5cab424296dff95e1acf10ebf70d32a9 | [
"MIT"
] | 1 | 2019-07-29T02:53:51.000Z | 2019-07-29T02:53:51.000Z | production/pygsl-0.9.5/gsl_dist/array_includes.py | juhnowski/FishingRod | 457e7afb5cab424296dff95e1acf10ebf70d32a9 | [
"MIT"
] | 1 | 2021-09-11T14:30:32.000Z | 2021-09-11T14:30:32.000Z | Dockerfiles/gedlab-khmer-filter-abund/pymodules/python2.7/lib/python/pygsl/gsl_dist/array_includes.py | poojavade/Genomics_Docker | 829b5094bba18bbe03ae97daf925fee40a8476e8 | [
"Apache-2.0"
] | 2 | 2016-12-19T02:27:46.000Z | 2019-07-29T02:53:54.000Z | """
WARNING: File Generated during build. DO NOT MODIFY!!!
"""
array_include_dirs = []
from numpy.distutils.misc_util import get_numpy_include_dirs
array_include_dirs = get_numpy_include_dirs()
| 22.777778 | 62 | 0.756098 | 28 | 205 | 5.142857 | 0.642857 | 0.305556 | 0.222222 | 0.263889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15122 | 205 | 8 | 63 | 25.625 | 0.827586 | 0.263415 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
51545f5ee08776d3eaa2bffb8f7910cd6cb8d5c2 | 38,162 | py | Python | synchronized.py | brookman1/synchronized | 1b753aca6886b86f20d036ac9540a347e3531f3b | [
"Unlicense"
] | null | null | null | synchronized.py | brookman1/synchronized | 1b753aca6886b86f20d036ac9540a347e3531f3b | [
"Unlicense"
] | null | null | null | synchronized.py | brookman1/synchronized | 1b753aca6886b86f20d036ac9540a347e3531f3b | [
"Unlicense"
] | null | null | null | import functools
import threading
class NotAbleToLock(Exception):
pass
class ThreadSleep(object):
__slots__ = ['lock', 'condition', ]
def __init__(self):
self.lock = threading.Lock()
self.condition = threading.Condition(self.lock)
self.condition.acquire()
def __call__(self, seconds):
self.condition.wait(seconds)
class RWSynchronization(object):
'''
Some objects can be viewed by many readers at the same time but only one
writer can be allowed to modify it at a time, and all the readers usually
need to wait for write to complete so there would not be dirty reads.
RWSynchronization is a context manager that can use separetly created
semaphor and a mutex to enable read/write. If default, class local, mutex
and semaphore are selected, then object is either a glofied resource,
access limiter or not so spiffy mutex.
'''
def __init__(self, counting_semaphore=None, mutex_lock=None,
simultaneous_reads=1, blocking=True,
retries_until_locked=False, raise_this=NotAbleToLock,
write_lock=False):
self.counting_semaphore = counting_semaphore or \
threading.Semaphore(simultaneous_reads)
self.mutex_lock = mutex_lock or threading.Lock()
self.blocking = blocking
self.retries_until_locked = retries_until_locked or 0
self.simultaneous_reads = simultaneous_reads
self.write_lock = write_lock
self.to_raise = raise_this or NotAbleToLock
self.thread_sleep = ThreadSleep()
def _get_read_lock(self):
retry = self.retries_until_locked or 1
while retry:
if self.mutex_lock.acquire(self.blocking):
if self.counting_semaphore.acquire(self.blocking):
self.mutex_lock.release()
return True
else:
self.mutex_lock.release()
self.thread_sleep(0.0005)
retry -= 1
return False
def _unlock_reader(self):
self.counting_semaphore.release()
def _get_write_lock(self):
retry = self.retries_until_locked or 1
while retry > 0:
lockout_readers = self.simultaneous_reads
if self.mutex_lock.acquire(self.blocking):
while lockout_readers and retry:
if self.counting_semaphore.acquire(self.blocking):
lockout_readers -= 1
else:
while lockout_readers <= self.simultaneous_reads:
self.counting_semaphore.release()
lockout_readers += 1
self.thread_sleep(0.5)
retry -= 1
if lockout_readers == 0:
return True
if not retry:
break
self.thread_sleep(0.0005)
retry -= 1
return False
def _unlock_writer(self):
unlock_readers = self.simultaneous_reads
while unlock_readers:
self.counting_semaphore.release()
unlock_readers -= 1
self.mutex_lock.release()
def __enter__(self, ):
locked = self._get_write_lock() if self.write_lock \
else self._get_read_lock()
if not locked:
raise self.to_raise("Not able to lock in Synchronization %s" %
"Read Lock" if not self.write_lock
else "Write Lock")
return self
def __exit__(self, *args):
if self.write_lock:
self._unlock_writer()
else:
self._unlock_reader()
class RWSynchronized(object):
__slots__ = ['rwSynchronization', ]
def __init__(self, counting_semaphore=None, mutex_lock=None,
simultaneous_reads=1, blocking=True,
retries_until_locked=False, raise_this=NotAbleToLock,
write_lock=False):
self.rwSynchronization = RWSynchronization(
counting_semaphore=counting_semaphore,
mutex_lock=mutex_lock,
simultaneous_reads=simultaneous_reads,
blocking=blocking,
retries_until_locked=retries_until_locked,
raise_this=raise_this,
write_lock=write_lock
)
def __call__(self, f,):
def synchronized_wrapper(*args, **kwargs):
self.rwSynchronization.__enter__()
try:
return f(*args, **kwargs)
except Exception:
raise
finally:
self.rwSynchronization.__exit__()
functools.update_wrapper(synchronized_wrapper, f)
return synchronized_wrapper
rwSynchronized = RWSynchronized
class Synchronization(object):
def __init__(self, locks=None, rlock=True, blocking=True,
raise_this=NotAbleToLock, retries_until_locked=False):
'''
Here we need to descide what locks we are going to use.
locks -- a list of lock to spin through, best to have the right order
that locks should be spun through.
blocking -- always false if more than 1 lock in list otherwise true if
not specified otherwise
raise_this -- exception that will be raised, default is NotAbleToLock
retries_until_locked -- if False then defaul value for several locks is
10. Any retry is meant to be used by a different decorator.
Exception -- NotAbleToLock is defualt exception raised
'''
if locks:
self.locks = locks
else:
if rlock:
self.locks = [threading.RLock(), ]
else:
self.locks = [threading.Lock(), ]
def default_blocking():
if len(locks) <= 1:
return blocking
else:
return False
self.blocking = default_blocking()
def default_retries_until_locked():
if len(locks) <= 1 or retries_until_locked > 0:
return retries_until_locked
return 10
self.to_raise = raise_this or NotAbleToLock
self.retries_until_locked = default_retries_until_locked()
self.thread_sleep = ThreadSleep()
def _get_locks(self):
locked = []
for lock in self.locks:
if lock.acquire(self.blocking):
locked.append(lock)
else:
for lock in locked:
lock.release()
return False
return True
def _unlock(self):
for lock in self.locks:
lock.release()
def __enter__(self, ):
isLocked = self._get_locks()
if not isLocked and not self.retries_until_locked:
raise self.to_raise("Not able to acquire lock in synchronization.")
retry = self.retries_until_locked or 0
while retry and not isLocked:
self.thread_sleep(0.0005)
isLocked = self._get_locks()
if isLocked:
break
retry -= 1
if not isLocked:
raise self.to_raise("Not able to acquire lock in synchronization.")
return self
def __exit__(self, *args):
self._unlock()
class Synchronized(object):
'''
Making a decorator class for Synchronization, putting in
self.Synchronization as the underlying mechanism.
__slots__ below are just
'''
__slots__ = ['synchronization', ]
def __init__(self, locks=None, rlock=True, blocking=True,
raise_this=NotAbleToLock, retries_until_locked=False):
'''
Here we need to descide what locks we are going to use.
locks -- a list of lock to spin through, best to have the right order
that locks should be spun through.
blocking -- always false if more than 1 lock in list otherwise true if
not specified otherwise
raise_this -- exception that will be raised, default is NotAbleToLock
retries_until_locked -- if False then defaul value for several locks is
10. Any retry is meant to be used by a different decorator.
Exception -- NotAbleToLock is raised
'''
self.synchronization = \
Synchronization(locks=locks, rlock=rlock, blocking=blocking,
raise_this=raise_this,
retries_until_locked=retries_until_locked)
def __call__(self, f):
def wrapped_synched_test_func(*args, **kwargs):
self.synchronization.__enter__()
try:
return f(*args, **kwargs)
except Exception:
raise
finally:
self.synchronization.__exit__()
functools.update_wrapper(wrapped_synched_test_func, f)
return wrapped_synched_test_func
synchronized = Synchronized # decorators freuently used as lowercase names.
######################################################################
# It is best to put this one in a separate file.
######################################################################
import unittest
class UnitTest_RWSynchronization_MultiTread(unittest.TestCase):
def setUp(self):
'''
Activities that will be started by thread,
test_thread_run_target_reader
test_thread_run_target_writer
both have sleep times set via kwargs sleep_time passed in by
threading.Thread(...) at startup. This makes one thread come to a
section when another has it locked.
'''
self.test_mutex_lock = threading.Lock()
self.test_semaphore = threading.Semaphore()
self.test_thread_activity1 = None
self.test_thread_activity2 = None
self.test_thread_activity3 = None
def test_thread_run_target_writer(*args, **kwargs):
thread_sleep = ThreadSleep()
thread_sleep(kwargs.get('sleep_time'))
if self.test_thread_activity1:
self.test_thread_activity1()
print('%s done.' % self.thread_1.name)
def test_thread_run_target_reader(*args, **kwargs):
thread_sleep = ThreadSleep()
thread_sleep(kwargs.get('sleep_time'))
if self.test_thread_activity2:
self.test_thread_activity2(*args)
print('%s done.' % self.thread_2.name)
self.thread_1 = threading.Thread(name=repr(type(self))+' thread 1 W',
target=test_thread_run_target_writer,
kwargs={'sleep_time': 0.3})
self.thread_2 = threading.Thread(name=repr(type(self))+' thread 2 R',
target=test_thread_run_target_reader,
kwargs={'sleep_time': 0.1},
args=(2, ))
self.thread_3 = threading.Thread(name=repr(type(self))+' thread 2 R',
target=test_thread_run_target_reader,
kwargs={'sleep_time': 0.31},
args=(3, ))
def setup_activitiesi_and_initial_state(self):
self.expected_number_of_items = 2
self.list_based_test_load = [0.14, 0.01, ]
def test_thread_activity_writer():
with RWSynchronization(mutex_lock=self.test_mutex_lock,
counting_semaphore=self.test_semaphore,
write_lock=True):
thread_sleep = ThreadSleep()
if self.expected_number_of_items != \
len(self.list_based_test_load):
self.assertFalse('Inconsistent thread work state')
print("Thread_activity")
sleep_interval = self.list_based_test_load.pop(0)
thread_sleep(sleep_interval)
self.expected_number_of_items = len(self.list_based_test_load)
print("slept %d" % sleep_interval)
def test_thread_activity_reader(start_order):
with RWSynchronization(mutex_lock=self.test_mutex_lock,
counting_semaphore=self.test_semaphore,
write_lock=False):
thread_sleep = ThreadSleep()
if self.expected_number_of_items != \
len(self.list_based_test_load):
self.assertFalse('Inconsistent thread work state')
print("Thread_activity")
sleep_interval = 0.01
if start_order == 2 and len(self.list_based_test_load) != 2:
self.assertFalse('Inconsistent thread work state')
# came after the writer executed.
if start_order == 3 and len(self.list_based_test_load) != 1:
self.assertFalse('Inconsistent thread_work state')
thread_sleep(sleep_interval)
self.test_thread_activity1 = test_thread_activity_writer
self.test_thread_activity2 = test_thread_activity_reader
self.test_thread_activity3 = test_thread_activity_reader
def test_sychronized_activity(self):
'''
using list-based queue check work state, number of items = counter.
Raise exception if state is inconsitent
Writer will,
1. make inconsistent thread condition, pop one item off
2. sleep a randon interval that came from the list
3. fix inconsistent state.
Reader will take an argument specifing it's expected start order,
1. If reader is expected to be at the synchronized section first,
it will check that list has both items
2. If reader gets to the synchronized section last
it checks that list only has one item
'''
self.setup_activitiesi_and_initial_state()
try:
self.thread_1.start()
self.thread_2.start()
self.thread_3.start()
self.thread_1.join()
self.thread_2.join()
self.thread_3.join()
except Exception as e:
self.assertFalse(str(e) + ' ' + repr(e))
else:
self.assertTrue(True)
class UnitTest_RWSynchronization_OneThread(unittest.TestCase):
def setUp(self):
self.test_semaphore = threading.Semaphore(10)
self.test_lock = threading.Lock()
def test_rwSynchronized_OneThread_readlock(self):
with RWSynchronization(counting_semaphore=self.test_semaphore,
mutex_lock=self.test_lock,
simultaneous_reads=1, blocking=True,
retries_until_locked=False,
raise_this=NotAbleToLock,
write_lock=False) as rw_synchronized:
self.assertTrue(rw_synchronized)
def test_rwSynchronized_OneThread_writelock(self):
with RWSynchronization(counting_semaphore=self.test_semaphore,
mutex_lock=self.test_lock,
simultaneous_reads=1, blocking=True,
retries_until_locked=False,
raise_this=NotAbleToLock,
write_lock=True) as rw_synchronized:
self.assertTrue(rw_synchronized)
def test_rwSynchronized_OneThread_read_fail(self):
self.test_lock.acquire()
with self.assertRaises(NotAbleToLock) as e:
with RWSynchronization(counting_semaphore=self.test_semaphore,
mutex_lock=self.test_lock,
simultaneous_reads=1, blocking=False,
retries_until_locked=False,
raise_this=NotAbleToLock,
write_lock=False) as rw_synchronized:
self.assertFalse("Should not have gotten locked, %s"
% rw_synchronized)
self.assertTrue(e)
with self.assertRaises(NotAbleToLock) as e:
with RWSynchronization(counting_semaphore=self.test_semaphore,
mutex_lock=self.test_lock,
simultaneous_reads=1, blocking=False,
retries_until_locked=1,
raise_this=NotAbleToLock,
write_lock=False) as rw_synchronized:
self.assertFalse("Should not have gotten locked, %s"
% rw_synchronized)
self.assertTrue(e)
with self.assertRaises(NotAbleToLock) as e:
with RWSynchronization(counting_semaphore=self.test_semaphore,
mutex_lock=self.test_lock,
simultaneous_reads=1, blocking=False,
retries_until_locked=3,
raise_this=NotAbleToLock,
write_lock=False) as rw_synchronized:
self.assertFalse("Should not have gotten locked, %s"
% rw_synchronized)
self.assertTrue(e)
self.test_lock.release()
def test_rwSynchronized_OneThread_write_fail(self):
self.test_lock.acquire()
with self.assertRaises(NotAbleToLock) as e:
with RWSynchronization(counting_semaphore=self.test_semaphore,
mutex_lock=self.test_lock,
simultaneous_reads=1, blocking=False,
retries_until_locked=False,
raise_this=NotAbleToLock,
write_lock=True) as rw_synchronized:
self.assertFalse("Should not have gotten locked, %s"
% rw_synchronized)
self.assertTrue(e)
with self.assertRaises(NotAbleToLock) as e:
with RWSynchronization(counting_semaphore=self.test_semaphore,
mutex_lock=self.test_lock,
simultaneous_reads=1, blocking=False,
retries_until_locked=1,
raise_this=NotAbleToLock,
write_lock=True) as rw_synchronized:
self.assertFalse("Should not have gotten locked, %s"
% rw_synchronized)
self.assertTrue(e)
with self.assertRaises(NotAbleToLock) as e:
with RWSynchronization(counting_semaphore=self.test_semaphore,
mutex_lock=self.test_lock,
simultaneous_reads=1, blocking=False,
retries_until_locked=3,
raise_this=NotAbleToLock,
write_lock=True) as rw_synchronized:
self.assertFalse("Should not have gotten locked, %s"
% rw_synchronized)
self.assertTrue(e)
self.test_lock.release()
class UnitTest_RWSynchronized_MultiTread(unittest.TestCase):
def setUp(self):
'''
Activities that will be started by thread,
test_thread_run_target_reader
test_thread_run_target_writer
both have sleep times set via kwargs sleep_time passed in by
threading.Thread(...) at startup. This makes one thread come to a
section when another has it locked.
'''
self.test_mutex_lock = threading.Lock()
self.test_semaphore = threading.Semaphore()
self.test_thread_activity1 = None
self.test_thread_activity2 = None
self.test_thread_activity3 = None
def test_thread_run_target_writer(*args, **kwargs):
thread_sleep = ThreadSleep()
thread_sleep(kwargs.get('sleep_time'))
if self.test_thread_activity1:
self.test_thread_activity1()
print('%s done.' % self.thread_1.name)
def test_thread_run_target_reader(*args, **kwargs):
thread_sleep = ThreadSleep()
thread_sleep(kwargs.get('sleep_time'))
if self.test_thread_activity2:
self.test_thread_activity2(*args)
print('%s done.' % self.thread_2.name)
self.thread_1 = threading.Thread(name=repr(type(self))+' thread 1 W',
target=test_thread_run_target_writer,
kwargs={'sleep_time': 0.3})
self.thread_2 = threading.Thread(name=repr(type(self))+' thread 2 R',
target=test_thread_run_target_reader,
kwargs={'sleep_time': 0.1},
args=(2, ))
self.thread_3 = threading.Thread(name=repr(type(self))+' thread 2 R',
target=test_thread_run_target_reader,
kwargs={'sleep_time': 0.31},
args=(3, ))
def setup_activitiesi_and_initial_state(self):
self.expected_number_of_items = 2
self.list_based_test_load = [0.14, 0.01, ]
@rwSynchronized(mutex_lock=self.test_mutex_lock,
counting_semaphore=self.test_semaphore,
write_lock=True)
def test_thread_activity_writer():
thread_sleep = ThreadSleep()
if self.expected_number_of_items != \
len(self.list_based_test_load):
self.assertFalse('Inconsistent thread work state')
print("Thread_activity")
sleep_interval = self.list_based_test_load.pop(0)
thread_sleep(sleep_interval)
self.expected_number_of_items = len(self.list_based_test_load)
print("slept %d" % sleep_interval)
@rwSynchronized(mutex_lock=self.test_mutex_lock,
counting_semaphore=self.test_semaphore,
write_lock=False)
def test_thread_activity_reader(start_order):
thread_sleep = ThreadSleep()
if self.expected_number_of_items != \
len(self.list_based_test_load):
self.assertFalse('Inconsistent thread work state')
print("Thread_activity")
sleep_interval = 0.01
if start_order == 2 and len(self.list_based_test_load) != 2:
self.assertFalse('Inconsistent thread work state')
# came after the writer executed.
if start_order == 3 and len(self.list_based_test_load) != 1:
self.assertFalse('Inconsistent thread_work state')
thread_sleep(sleep_interval)
self.test_thread_activity1 = test_thread_activity_writer
self.test_thread_activity2 = test_thread_activity_reader
self.test_thread_activity3 = test_thread_activity_reader
def test_sychronized_activity(self):
'''
using list-based queue check work state, number of items = counter.
Raise exception if state is inconsitent
Writer will,
1. make inconsistent thread condition, pop one item off
2. sleep a randon interval that came from the list
3. fix inconsistent state.
Reader will take an argument specifing it's expected start order,
1. If reader is expected to be at the synchronized section first,
it will check that list has both items
2. If reader gets to the synchronized section last
it checks that list only has one item
'''
self.setup_activitiesi_and_initial_state()
try:
self.thread_1.start()
self.thread_2.start()
self.thread_3.start()
self.thread_1.join()
self.thread_2.join()
self.thread_3.join()
except Exception as e:
self.assertFalse(str(e) + ' ' + repr(e))
else:
self.assertTrue(True)
class UnitTest_RWSynchronized_OneThread(unittest.TestCase):
def setUp(self):
self.test_semaphore = threading.Semaphore(10)
self.test_lock = threading.Lock()
def test_rwSynchronized_OneThread_readlock(self):
@rwSynchronized(counting_semaphore=self.test_semaphore,
mutex_lock=self.test_lock,
simultaneous_reads=1, blocking=True,
retries_until_locked=False,
raise_this=NotAbleToLock,
write_lock=False)
def test_synch_func():
self.assertTrue("function passed")
test_synch_func()
def test_rwSynchronized_OneThread_writelock(self):
@rwSynchronized(counting_semaphore=self.test_semaphore,
mutex_lock=self.test_lock,
simultaneous_reads=1, blocking=True,
retries_until_locked=False,
raise_this=NotAbleToLock,
write_lock=True)
def test_synch_func():
self.assertTrue("function passed")
test_synch_func()
def test_rwSynchronized_OneThread_read_fail(self):
self.test_lock.acquire()
with self.assertRaises(NotAbleToLock) as e:
@rwSynchronized(counting_semaphore=self.test_semaphore,
mutex_lock=self.test_lock,
simultaneous_reads=1, blocking=False,
retries_until_locked=False,
raise_this=NotAbleToLock,
write_lock=False)
def test_synch_func():
self.assertFalse("Should not have gotten locked")
test_synch_func()
self.assertTrue(e)
with self.assertRaises(NotAbleToLock) as e:
@rwSynchronized(counting_semaphore=self.test_semaphore,
mutex_lock=self.test_lock,
simultaneous_reads=1, blocking=False,
retries_until_locked=1,
raise_this=NotAbleToLock,
write_lock=False)
def test_synch_func():
self.assertFalse("Should not have gotten locked")
test_synch_func()
self.assertTrue(e)
with self.assertRaises(NotAbleToLock) as e:
@rwSynchronized(counting_semaphore=self.test_semaphore,
mutex_lock=self.test_lock,
simultaneous_reads=1, blocking=False,
retries_until_locked=3,
raise_this=NotAbleToLock,
write_lock=False)
def test_synch_func():
self.assertFalse("Should not have gotten locked")
test_synch_func()
self.assertTrue(e)
self.test_lock.release()
def test_rwSynchronized_OneThread_write_fail(self):
self.test_lock.acquire()
with self.assertRaises(NotAbleToLock) as e:
@rwSynchronized(counting_semaphore=self.test_semaphore,
mutex_lock=self.test_lock,
simultaneous_reads=1, blocking=False,
retries_until_locked=False,
raise_this=NotAbleToLock,
write_lock=True)
def test_synch_func():
self.assertFalse("Should not have gotten locked")
test_synch_func()
self.assertTrue(e)
with self.assertRaises(NotAbleToLock) as e:
@rwSynchronized(counting_semaphore=self.test_semaphore,
mutex_lock=self.test_lock,
simultaneous_reads=1, blocking=False,
retries_until_locked=1,
raise_this=NotAbleToLock,
write_lock=True)
def test_synch_func():
self.assertFalse("Should not have gotten locked")
test_synch_func()
self.assertTrue(e)
with self.assertRaises(NotAbleToLock) as e:
@rwSynchronized(counting_semaphore=self.test_semaphore,
mutex_lock=self.test_lock,
simultaneous_reads=1, blocking=False,
retries_until_locked=3,
raise_this=NotAbleToLock,
write_lock=True)
def test_synch_func():
self.assertFalse("Should not have gotten locked")
test_synch_func()
self.assertTrue(e)
self.test_lock.release()
class UnitTest_Synchronization_OneThread(unittest.TestCase):
def setUp(self):
self.test_rlock = threading.RLock()
self.test_lock = threading.Lock()
self.locks = [threading.Lock(), threading.Lock(), threading.Lock(), ]
def test_no_spawned_thread_run(self):
with Synchronization(self.locks):
self.assertEqual(1, 1)
def test_no_spawned_thread_run_prelocked(self):
self.locks[0].acquire()
with self.assertRaises(NotAbleToLock) as cm:
with Synchronization(self.locks):
self.assertEqual(0, -1)
self.assertTrue(cm.exception)
with self.assertRaises(NotAbleToLock) as cm:
with Synchronization(self.locks,
blocking=False,
retries_until_locked=10):
self.assertEqual(0, -1)
self.assertTrue(cm.exception)
class UnitTest_Synchronized_OneThread(unittest.TestCase):
'''
Tests the decorator version of synchronization.
'''
def setUp(self):
self.test_rlock = threading.RLock()
self.test_lock = threading.Lock()
self.locks = [threading.Lock(), threading.Lock(), threading.Lock(), ]
def test_no_spawned_thread_run(self):
@synchronized(self.locks)
def synchronized_test_func(*args, **kwargs):
pass
synchronized_test_func()
self.assertEqual(1, 1)
def test_no_spawned_thread_run_prelocked(self):
self.locks[0].acquire()
with self.assertRaises(NotAbleToLock) as cm:
@synchronized(self.locks, blocking=False)
def synchronized_test_func(*args, **kwargs):
self.assertEqual(0, -1)
synchronized_test_func()
self.assertTrue(cm.exception)
with self.assertRaises(NotAbleToLock) as cm:
@synchronized(self.locks, blocking=False, retries_until_locked=10)
def synchronized_test_func(*args, **kwargs):
self.assertEqual(0, -1)
synchronized_test_func()
self.assertTrue(cm.exception)
class UnitTest_Synchronization_MultiThread(unittest.TestCase):
def setUp(self):
'''
Activities that will be started by thread,
test_thread_run_target_1
test_thread_run_target_2
both have sleep times set via kwargs sleep_time passed in by
threading.Thread(...) at startup. This makes one thread come to a
section when another has it locked.
'''
self.test_rlock = threading.RLock()
self.test_lock = threading.Lock()
self.locks = [threading.Lock(), threading.Lock(), threading.Lock(), ]
self.test_thread_activity1 = None
self.test_thread_activity2 = None
def test_thread_run_target_1(*args, **kwargs):
thread_sleep = ThreadSleep()
thread_sleep(kwargs.get('sleep_time'))
if self.test_thread_activity1:
self.test_thread_activity1()
print('%s done.' % self.thread_1.name)
def test_thread_run_target_2(*args, **kwargs):
thread_sleep = ThreadSleep()
thread_sleep(kwargs.get('sleep_time'))
if self.test_thread_activity2:
self.test_thread_activity2()
print('%s done.' % self.thread_2.name)
self.thread_1 = threading.Thread(name=repr(type(self))+' thread 1',
target=test_thread_run_target_1,
kwargs={'sleep_time': 0.3})
self.thread_2 = threading.Thread(name=repr(type(self))+' thread 2',
target=test_thread_run_target_2,
kwargs={'sleep_time': 0.1})
def test_sychronized_activity(self):
'''
1. using list-based queue check work state, number of items = counter.
Raise exception if state is inconsitent
2. make inconsistent thread condition, pop one item off
3. sleep a randon interval
4. fix inconsistent state.
'''
self.expected_number_of_items = 2
self.list_based_test_load = [0.14, 0.01, ]
def test_thread_activity():
with Synchronization([self.test_lock]):
thread_sleep = ThreadSleep()
if self.expected_number_of_items != \
len(self.list_based_test_load):
self.assertFalse('Inconsistent thread work state')
print("Thread_activity")
sleep_interval = self.list_based_test_load.pop(0)
thread_sleep(sleep_interval)
self.expected_number_of_items = len(self.list_based_test_load)
print("slept %d" % sleep_interval)
self.test_thread_activity1 = test_thread_activity
self.test_thread_activity2 = test_thread_activity
try:
self.thread_1.start()
self.thread_2.start()
self.thread_1.join()
self.thread_2.join()
except Exception as e:
self.assertFalse(str(e) + ' ' + repr(e))
else:
self.assertTrue(True)
class UnitTest_Synchronized_MultiThread(unittest.TestCase):
def setUp(self):
self.test_rlock = threading.RLock()
self.test_lock = threading.Lock()
self.locks = [threading.Lock(), threading.Lock(), threading.Lock(), ]
self.test_thread_activity1 = None
self.test_thread_activity2 = None
def test_thread_run_target_1(*args, **kwargs):
thread_sleep = ThreadSleep()
thread_sleep(kwargs.get('sleep_time'))
if self.test_thread_activity1:
self.test_thread_activity1()
print('%s done.' % self.thread_1.name)
def test_thread_run_target_2(*args, **kwargs):
thread_sleep = ThreadSleep()
thread_sleep(kwargs.get('sleep_time'))
if self.test_thread_activity2:
self.test_thread_activity2()
print('%s done.' % self.thread_2.name)
self.thread_1 = threading.Thread(name=repr(type(self))+' thread 1',
target=test_thread_run_target_1,
kwargs={'sleep_time': 0.3})
self.thread_2 = threading.Thread(name=repr(type(self))+' thread 2',
target=test_thread_run_target_2,
kwargs={'sleep_time': 0.1})
def test_sychronized_activity(self):
'''
1. using list-based queue check work state, number of items = counter.
Raise exception if state is inconsitent
2. make inconsistent thread condition, pop one item off
3. sleep a randon interval
4. fix inconsistent state.
'''
self.expected_number_of_items = 2
self.list_based_test_load = [0.3, 0.4, ]
@synchronized([self.test_lock])
def test_thread_activity():
thread_sleep = ThreadSleep()
if self.expected_number_of_items != len(self.list_based_test_load):
raise Exception('Inconsistent thread work stated')
sleep_interval = self.list_based_test_load.pop(0)
print("Thread_activity")
thread_sleep(sleep_interval)
self.expected_number_of_items = len(self.list_based_test_load)
print("slept: ", sleep_interval)
self.test_thread_activity1 = test_thread_activity
self.test_thread_activity2 = test_thread_activity
try:
self.thread_1.start()
self.thread_2.start()
self.thread_1.join()
self.thread_2.join()
except Exception as e:
self.assertFalse(str(e) + ' ' + repr(e))
if __name__ == '__main__':
unittest.main()
| 40.641108 | 80 | 0.558173 | 3,908 | 38,162 | 5.187052 | 0.07088 | 0.040255 | 0.034631 | 0.022495 | 0.8413 | 0.825218 | 0.806768 | 0.795619 | 0.783977 | 0.78082 | 0 | 0.011222 | 0.367198 | 38,162 | 938 | 81 | 40.684435 | 0.828192 | 0.112442 | 0 | 0.810606 | 0 | 0 | 0.042831 | 0 | 0 | 0 | 0 | 0 | 0.106061 | 1 | 0.110606 | false | 0.006061 | 0.004545 | 0 | 0.165152 | 0.027273 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5aa09da29654ba4c20c8fb2eb814f922868ab209 | 22,564 | py | Python | pymaclab/dsge/updaters/one_off.py | radovankavicky/pymaclab | 21da758f64ed0b62969c9289576f677e977cfd98 | [
"Apache-2.0"
] | 96 | 2015-01-25T05:59:56.000Z | 2021-12-29T14:05:22.000Z | pymaclab/dsge/updaters/one_off.py | 1zinnur9/pymaclab | 21da758f64ed0b62969c9289576f677e977cfd98 | [
"Apache-2.0"
] | 3 | 2015-12-17T19:25:46.000Z | 2018-06-19T07:05:20.000Z | pymaclab/dsge/updaters/one_off.py | 1zinnur9/pymaclab | 21da758f64ed0b62969c9289576f677e977cfd98 | [
"Apache-2.0"
] | 36 | 2016-01-31T15:22:01.000Z | 2021-03-29T07:03:07.000Z | '''
.. module:: one_off
:platform: Linux
:synopsis: A collection of tools required for doing intelligent and dynamic DSGE model instance updating at runtime. The version in
this module is for the one-off updater behaviour.
.. moduleauthor:: Eric M. Scheffel <eric.scheffel@nottingham.edu.cn>
'''
from copy import deepcopy
from ..solvers.steadystate import ManualSteadyState
class Updaters(object):
def __init__(self):
pass
class dicwrap:
def __init__(self,other,wrapobj_str,initlev):
self.other = other
self.wrapobj_str = wrapobj_str
wrapobj = eval('other.'+wrapobj_str.split('.')[1])
self.wrapobj = wrapobj
self.initlev = initlev
if wrapobj_str == 'self.vardic':
self.wrapdic = deepcopy(other.vardic)
elif wrapobj_str == 'self.nlsubsdic':
self.wrapdic = deepcopy(other.nlsubsdic)
elif wrapobj_str == 'self.paramdic':
self.wrapdic = deepcopy(other.paramdic)
def __getattr__(self,attrname):
return getattr(self.wrapdic,attrname)
def __setitem__(self,key,value):
other = self.other
initlev = self.initlev
wrapobj_str = self.wrapobj_str
wrapobj = self.wrapobj
mesg = other._mesg
# ...and assign before test of inequality
old_value = deepcopy(wrapobj[key])
wrapobj[key] = value
# Test if the dictionary has changed relative to self.wrapdic
if self.wrapdic != wrapobj:
if mesg:
print "You have UPDATED in object "+wrapobj_str+"['"+key+"']:"
print str(old_value)+' --> '+str(value)
##### THE INITS #####################
other.inits.init1()
######## Copy correct values into the model instance ########
if wrapobj_str == 'self.paramdic':
for keyo in wrapobj.keys():
other.paramdic[keyo] = deepcopy(wrapobj[keyo])
elif wrapobj_str == 'self.nlsubsdic':
for keyo in wrapobj.keys():
other.nlsubsdic[keyo] = deepcopy(wrapobj[keyo])
elif wrapobj_str == 'self.vardic':
for keyo in wrapobj.keys():
other.vardic[keyo] = deepcopy(wrapobj[keyo])
##############################################################
if wrapobj_str == 'self.vardic':
other.vardic.update(wrapobj)
other.inits.init1a()
if wrapobj_str == 'self.nlsubsdic':
for i1,elem in enumerate(other.nlsubs_raw1):
other.nlsubs_raw1[i1][1] = self.wrapdic[other.nlsubs_raw1[i1][0]]
other.nlsubsdic.update(wrapobj)
other.inits.init1b()
if wrapobj_str == 'self.paramdic':
other.paramdic.update(wrapobj)
other.inits.init1c()
# Prepare DSGE model instance for manual SS computation
other.inits.init2()
if initlev == 0:
other.inits.init_out()
# Solve for SS automatically
other.inits.init3()
if initlev == 1:
other.inits.init_out()
# Solve model dynamically
other.inits.init4()
other.inits.init5()
if initlev == 2:
other.inits.init_out()
def __getitem__(self,key):
return self.wrapdic[key]
def update(self,dico):
self.updic = dico
other = self.other
initlev = self.initlev
wrapobj_str = self.wrapobj_str
wrapobj = self.wrapobj
wrapobj.update(dico)
# Check if new keys are already present in wrapdic
for keyo in dico.keys():
if keyo not in self.wrapdic.keys():
print "ERROR: You can only update existing keys, not introduce new ones."
return
# Check if any key's value has been updated relative to wrapdic
if self.wrapdic != wrapobj:
self.wrapdic.update(wrapobj)
##### THE INITS #####################
other.inits.init1()
######## Copy correct values into the model instance ########
if wrapobj_str == 'self.paramdic':
for keyo in wrapobj.keys():
other.paramdic[keyo] = deepcopy(wrapobj[keyo])
elif wrapobj_str == 'self.nlsubsdic':
for keyo in wrapobj.keys():
other.nlsubsdic[keyo] = deepcopy(wrapobj[keyo])
elif wrapobj_str == 'self.vardic':
for keyo in wrapobj.keys():
other.vardic[keyo] = deepcopy(wrapobj[keyo])
##############################################################
if wrapobj_str == 'self.vardic':
other.vardic = deepcopy(wrapobj)
other.inits.init1a()
if wrapobj_str == 'self.nlsubsdic':
for i1,elem in enumerate(other.nlsubs_raw1):
other.nlsubs_raw1[i1][1] = self.wrapdic[other.nlsubs_raw1[i1][0]]
other.nlsubsdic.update(wrapobj)
other.inits.init1b()
if wrapobj_str == 'self.paramdic':
other.paramdic.update(wrapobj)
other.inits.init1c()
# Prepare DSGE model instance for manual SS computation
other.inits.init2()
if initlev == 0:
other.inits.init_out()
# Solve for SS automatically
other.inits.init3()
if initlev == 1:
other.inits.init_out()
# Solve model dynamically
other.inits.init4()
other.inits.init5()
if initlev == 2:
other.inits.init_out()
def __repr__(self):
return self.wrapdic.__repr__()
def __str__(self):
return self.wrapdic.__str__()
class listwrapk:
def __init__(self,other,wrapobj,wrapobj_str,wrapdic,initlev):
self.other = other
self.wrapobj = wrapobj
self.wrapobj_str = wrapobj_str
self.wrapdic = wrapdic
self.initlev = initlev
self.wrapli = deepcopy(wrapobj)
def reinit(self,other):
# Need to copy manually as inner wrapped objects do not support deepcopy
if self.wrapobj_str == 'self.vardic':
var_keys = []
for keyo in other.vardic.keys():
var_keys.append(keyo)
del other.vardic
other.vardic = {}
for keyo in var_keys:
other.vardic[keyo]={}
other.vardic[keyo]['var']=[]
other.vardic[keyo]['mod']=[]
for keyo1 in self.wrapdic.keys():
for keyo2 in self.wrapdic[keyo1].keys():
other.vardic[keyo1][keyo2] = []
for i1,elem1 in enumerate(self.wrapdic[keyo1][keyo2]):
other.vardic[keyo1][keyo2].append([])
for i2,elem2 in enumerate(elem1):
other.vardic[keyo1][keyo2][i1].append(elem2)
def __getattr__(self,attrname):
return getattr(self.wrapli,attrname)
def __setslice__(self,ind1,ind2,into):
other = self.other
wrapobj = self.wrapobj
initlev = self.initlev
lengo = len(self.wrapli)
if ind2 >= lengo:
print "ERROR: Assignment out of bounds of original list"
return
##### THE INITS #####################
#other.init1()
if self.wrapli[ind1:ind2] != into:
self.wrapli[ind1:ind2] = into
wrapobj[ind1:ind2] = into
other.updaters.vardic.wrapobj.update(other.updaters.vardic)
self.reinit(other)
other.inits.init1a()
other.inits.init1b()
other.inits.init1c()
# Prepare DSGE model instance for manual SS computation
other.inits.init2()
if initlev == 0:
other.inits.init_out()
# Solve for SS automatically
other.inits.init3()
if initlev == 1:
other.inits.init_out()
# Solve model dynamically
other.inits.init4()
other.inits.init5()
if initlev == 2:
other.inits.init_out()
def __setitem__(self,ind,into):
other = self.other
initlev = self.initlev
wrapobj = self.wrapobj
lengo = len(self.wrapli)
if ind >= lengo:
print "ERROR: Assignment out of bounds of original list"
return
##### THE INITS #####################
#other.inits.init1()
if self.wrapli[ind] != into:
self.wrapli[ind] = into
wrapobj[ind] = into
other.updaters.vardic.wrapobj.update(other.updaters.vardic)
self.reinit(other)
other.inits.init1a()
other.inits.init1b()
other.inits.init1c()
# Prepare DSGE model instance for manual SS computation
other.inits.init2()
if initlev == 0:
other.inits.init_out()
# Solve for SS automatically
other.inits.init3()
if initlev == 1:
other.inits.init_out()
# Solve model dynamically
other.inits.init4()
other.inits.init5()
if initlev == 2:
other.inits.init_out()
def __getitem__(self,ind):
lengo = len(self.wrapli)
if ind >= lengo:
print "ERROR: Assignment out of bounds of original list"
return
else:
return self.wrapli[ind]
def __repr__(self):
return self.wrapli.__repr__()
def __str__(self):
return self.wrapli.__str__()
class dicwrapk:
def __init__(self,other,wrapobj,initlev):
self.other = other
self.wrapobj = wrapobj
self.initlev = initlev
self.wrapdic = deepcopy(wrapobj)
def __getattr__(self,attrname):
return getattr(self.wrapdic,attrname)
def __setitem__(self,key,value):
other = self.other
initlev = self.initlev
wrapobj = self.wrapobj
wrapobj[key] = value
# Test if the dictionary has changed relative to self.wrapdic
if self.wrapdic != wrapobj:
self.wrapdic[key] = value
##### THE INITS #####################
#other.inits.init1()
other.vardic.upate(self.wrapdic)
other.updaters.vardic.wrapobj.update(self.wrapdic)
other.inits.init1a()
other.inits.init1b()
other.inits.init1c()
# Prepare DSGE model instance for manual SS computation
other.inits.init2()
if initlev == 0:
other.inits.init_out()
# Solve for SS automatically
other.inits.init3()
if initlev == 1:
other.inits.init_out()
# Solve model dynamically
other.inits.init4()
other.inits.init5()
if initlev == 2:
other.inits.init_out()
def __getitem__(self,key):
return self.wrapdic[key]
def update(self,dico):
self.updic = dico
other = self.other
initlev = self.initlev
wrapobj = self.wrapobj
wrapobj.update(dico)
# Check if new keys are already present in wrapdic
for keyo in dico.keys():
if keyo not in self.wrapdic.keys():
print "ERROR: You can only update existing keys, not introduce new ones."
return
# Check if any key's value has been updated relative to wrapdic
if self.wrapdic != wrapobj:
self.wrapdic.update(dico)
##### THE INITS #####################
#other.inits.init1()
other.vardic = deepcopy(self.wrapdic)
other.updaters.vardic.wrapobj.update(self.wrapdic)
other.inits.init1a()
other.inits.init1b()
other.inits.init1c()
# Prepare DSGE model instance for manual SS computation
other.inits.init2()
if initlev == 0:
other.inits.init_out()
# Solve for SS automatically
other.inits.init3()
if initlev == 1:
other.inits.init_out()
# Solve model dynamically
other.inits.init4()
other.inits.init5()
if initlev == 2:
other.inits.init_out()
def __repr__(self):
return self.wrapdic.__repr__()
def __str__(self):
return self.wrapdic.__str__()
class dicwrap_deep:
def __init__(self,other,wrapobj_str,initlev):
self.other = other
self.wrapobj_str = wrapobj_str
self.initlev = initlev
# Create the wrapobj using the passed string
self.wrapobj = eval('other.'+wrapobj_str.split('.')[1])
if wrapobj_str == 'self.vardic':
self.wrapdic = deepcopy(self.wrapobj)
for keyo in self.wrapdic.keys():
self.wrapdic[keyo] = dicwrapk(other,self.wrapdic[keyo],initlev)
for keyo2 in self.wrapdic[keyo].keys():
self.wrapdic[keyo][keyo2] = dicwrapk(other,self.wrapdic[keyo][keyo2],initlev)
for i1,elem in enumerate(self.wrapdic[keyo][keyo2]):
self.wrapdic[keyo][keyo2][i1] = listwrapk(other,self.wrapdic[keyo][keyo2][i1],self.wrapobj_str,self.wrapdic,initlev)
def __getattr__(self,attrname):
return getattr(self.wrapdic,attrname)
def __setitem__(self,key,value):
other = self.other
initlev = self.initlev
wrapobj = self.wrapobj
wrapobj[key] = value
# Test if the dictionary has changed relative to self.wrapdic
if self.wrapdic[key] != wrapobj[key]:
self.wrapdic[key] = value
##### THE INITS #####################
#other.inits.init1()
# Need to copy manually as inner wrapped objects do not support deepcopy
if wrapobj_str == 'self.vardic':
var_keys = []
for keyo in other.vardic.keys():
var_keys.append(keyo)
del other.vardic
other.vardic = {}
for keyo in var_keys:
other.vardic[keyo]={}
other.vardic[keyo]['var']=[]
other.vardic[keyo]['mod']=[]
for keyo1 in self.wrapdic.keys():
for keyo2 in self.wrapdic[keyo1].keys():
other.vardic[keyo1][keyo2] = []
for i1,elem1 in enumerate(self.wrapdic[keyo1][keyo2]):
other.vardic[keyo1][keyo2].append([])
for i2,elem2 in enumerate(elem1):
other.vardic[keyo1][keyo2][i1].append(elem2)
other.inits.init1a()
if wrapobj_str == 'self.nlsubsdic':
# not a deep dic
pass
other.inits.init1b()
if wrapobj_str == 'self.paramdic':
# not a deep dic
pass
other.inits.init1c()
# Prepare DSGE model instance for manual SS computation
other.inits.init2()
if initlev == 0:
other.inits.init_out()
# Solve for SS automatically
other.inits.init3()
if initlev == 1:
other.inits.init_out()
# Solve model dynamically
other.inits.init4()
other.inits.init5()
if initlev == 2:
other.inits.init_out()
def __update__(self,dico):
other = self.other
initlev = self.initlev
wrapobj_str = self.wrapobj_str
wrapobj = self.wrapobj
wrapobj.update(dico)
# Test if the dictionary has changed relative to self.wrapdic
if self.wrapdic != wrapobj:
self.wrapdic.update(dico)
##### THE INITS #####################
#other.inits.init1()
# Need to copy manually as inner wrapped objects do not support deepcopy
if wrapobj_str == 'self.vardic':
var_keys = []
for keyo in other.vardic.keys():
var_keys.append(keyo)
del other.vardic
other.vardic = {}
for keyo in var_keys:
other.vardic[keyo]={}
other.vardic[keyo]['var']=[]
other.vardic[keyo]['mod']=[]
for keyo1 in self.wrapdic.keys():
for keyo2 in self.wrapdic[keyo1].keys():
other.vardic[keyo1][keyo2] = []
for i1,elem1 in enumerate(self.wrapdic[keyo1][keyo2]):
other.vardic[keyo1][keyo2].append([])
for i2,elem2 in enumerate(elem1):
other.vardic[keyo1][keyo2][i1].append(elem2)
other.inits.init1a()
other.inits.init1b()
other.inits.init1c()
# Prepare DSGE model instance for manual SS computation
other.inits.init2()
if initlev == 0:
other.inits.init_out()
# Solve for SS automatically
other.inits.init3()
if initlev == 1:
other.inits.init_out()
# Solve model dynamically
other.inits.init4()
other.inits.init5()
if initlev == 2:
other.inits.init_out()
class listwrap:
def __init__(self,other,wrapobj_str,initlev):
self.other = other
self.wrapobj_str = wrapobj_str
wrapobj = eval('other.'+wrapobj_str.split('.')[1])
self.wrapobj = wrapobj
self.initlev = initlev
self.wrapli = deepcopy(wrapobj)
def __getattr__(self,attrname):
return getattr(self.wrapli,attrname)
def __setslice__(self,ind1,ind2,into):
other = self.other
wrapobj_str = self.wrapobj_str
wrapobj = eval('other.'+wrapobj_str.split('.')[1])
initlev = self.initlev
do_rest = False
lengo = len(self.wrapli)
if ind2 >= lengo:
print "ERROR: Assignment out of bounds of original list"
return
##### THE INITS #####################
#other.inits.init1()
#other.inits.init1a()
#other.inits.init1b()
#other.inits.init1c()
if self.wrapli[ind1:ind2] != into:
self.wrapli[ind1:ind2] = into
wrapobj[ind1:ind2] = into
do_rest = True
if do_rest:
# Prepare DSGE model instance for manual SS computation
other.inits.init2()
if initlev == 0:
other.inits.init_out()
# Solve for SS automatically
other.inits.init3()
if initlev == 1:
other.inits.init_out()
# Solve model dynamically
other.inits.init4()
other.inits.init5()
if initlev == 2:
other.inits.init_out()
def __setitem__(self,ind,into):
other = self.other
wrapobj_str = self.wrapobj_str
wrapobj = eval('other.'+wrapobj_str.split('.')[1])
initlev = self.initlev
do_rest = False
lengo = len(self.wrapli)
if ind >= lengo:
print "ERROR: Assignment out of bounds of original list"
return
##### THE INITS #####################
#other.inits.init1()
#other.inits.init1a()
#other.inits.init1b()
#other.inits.init1c()
if self.wrapli[ind] != into:
self.wrapli[ind] = into
wrapobj[ind] = into
do_rest = True
if do_rest:
# Prepare DSGE model instance for manual SS computation
other.inits.init2()
if initlev == 0:
other.inits.init_out()
# Solve for SS automatically
other.inits.init3()
if initlev == 1:
other.inits.init_out()
# Solve model dynamically
other.inits.init4()
other.inits.init5()
if initlev == 2:
other.inits.init_out()
def __getitem__(self,ind):
lengo = len(self.wrapli)
if ind >= lengo:
print "ERROR: Assignment out of bounds of original list"
return
else:
return self.wrapli[ind]
def __repr__(self):
return self.wrapli.__repr__()
def __str__(self):
return self.wrapli.__str__()
class matwrap:
def __init__(self,other,wrapobj_str,initlev):
self.other = other
self.wrapobj_str = wrapobj_str
self.initlev = initlev
if wrapobj_str == 'self.sigma':
self.wrapmat = other.sigma
def __getattr__(self,attrname):
return getattr(self.wrapmat,attrname)
def __setitem__(self,ind,into):
other = self.other
wrapob_str = self.wrapobj_str
initlev = self.initlev
##### THE INITS #####################
other.inits.init1()
other.inits.init1a()
other.inits.init1b()
other.inits.init1c()
# Prepare DSGE model instance for manual SS computation
other.inits.init2()
if initlev == 0:
other.inits.init_out()
# Solve for SS automatically
other.inits.init3()
if initlev == 1:
other.inits.init_out()
other.inits.init4()
if self.wrapmat[ind[0],ind[1]] != into and wrapob_str == 'self.sigma':
self.wrapmat[ind[0],ind[1]] = into
other.sigma[ind[0],ind[1]] = into
# Solve model dynamically
other.inits.init5()
if initlev == 2:
other.inits.init_out() | 34.343988 | 140 | 0.516664 | 2,345 | 22,564 | 4.85032 | 0.077186 | 0.105504 | 0.040619 | 0.049323 | 0.90619 | 0.881132 | 0.878583 | 0.860735 | 0.832249 | 0.82205 | 0 | 0.016111 | 0.370059 | 22,564 | 657 | 141 | 34.343988 | 0.784086 | 0.107738 | 0 | 0.898488 | 0 | 0 | 0.040858 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.006479 | 0.00432 | null | null | 0.021598 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
5aec271929f4191aea48b636c5774da9e9d60dc0 | 90,772 | py | Python | pollination_sdk/api/projects_api.py | pollination/python-sdk | 599e8dbfc6e547c5e18aa903b27c70d7ffef84e5 | [
"RSA-MD"
] | 2 | 2020-01-30T23:28:59.000Z | 2020-05-06T16:43:47.000Z | pollination_sdk/api/projects_api.py | pollination/python-sdk | 599e8dbfc6e547c5e18aa903b27c70d7ffef84e5 | [
"RSA-MD"
] | 1 | 2020-10-02T18:00:25.000Z | 2020-10-02T18:00:25.000Z | pollination_sdk/api/projects_api.py | pollination/python-sdk | 599e8dbfc6e547c5e18aa903b27c70d7ffef84e5 | [
"RSA-MD"
] | null | null | null | # coding: utf-8
"""
pollination-server
Pollination Server OpenAPI Definition # noqa: E501
The version of the OpenAPI document: 0.16.0
Contact: info@pollination.cloud
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from pollination_sdk.api_client import ApiClient
from pollination_sdk.exceptions import ( # noqa: F401
ApiTypeError,
ApiValueError
)
class ProjectsApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def create_project(self, owner, project_create, **kwargs): # noqa: E501
"""Create a Project # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_project(owner, project_create, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param project_create: (required)
:type project_create: ProjectCreate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: CreatedContent
"""
kwargs['_return_http_data_only'] = True
return self.create_project_with_http_info(owner, project_create, **kwargs) # noqa: E501
def create_project_with_http_info(self, owner, project_create, **kwargs): # noqa: E501
"""Create a Project # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_project_with_http_info(owner, project_create, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param project_create: (required)
:type project_create: ProjectCreate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for an a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(CreatedContent, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'owner',
'project_create'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method create_project" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'owner' is set
if self.api_client.client_side_validation and ('owner' not in local_var_params or # noqa: E501
local_var_params['owner'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `owner` when calling `create_project`") # noqa: E501
# verify the required parameter 'project_create' is set
if self.api_client.client_side_validation and ('project_create' not in local_var_params or # noqa: E501
local_var_params['project_create'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `project_create` when calling `create_project`") # noqa: E501
collection_formats = {}
path_params = {}
if 'owner' in local_var_params:
path_params['owner'] = local_var_params['owner'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'project_create' in local_var_params:
body_params = local_var_params['project_create']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['APIKeyAuth', 'JWTAuth'] # noqa: E501
return self.api_client.call_api(
'/projects/{owner}', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CreatedContent', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def create_project_recipe_filter(self, owner, name, project_recipe_filter, **kwargs): # noqa: E501
"""Upsert a recipe filter to a project # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_project_recipe_filter(owner, name, project_recipe_filter, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param project_recipe_filter: (required)
:type project_recipe_filter: ProjectRecipeFilter
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: ProjectRecipeFilter
"""
kwargs['_return_http_data_only'] = True
return self.create_project_recipe_filter_with_http_info(owner, name, project_recipe_filter, **kwargs) # noqa: E501
def create_project_recipe_filter_with_http_info(self, owner, name, project_recipe_filter, **kwargs): # noqa: E501
"""Upsert a recipe filter to a project # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_project_recipe_filter_with_http_info(owner, name, project_recipe_filter, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param project_recipe_filter: (required)
:type project_recipe_filter: ProjectRecipeFilter
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for an a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(ProjectRecipeFilter, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'owner',
'name',
'project_recipe_filter'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method create_project_recipe_filter" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'owner' is set
if self.api_client.client_side_validation and ('owner' not in local_var_params or # noqa: E501
local_var_params['owner'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `owner` when calling `create_project_recipe_filter`") # noqa: E501
# verify the required parameter 'name' is set
if self.api_client.client_side_validation and ('name' not in local_var_params or # noqa: E501
local_var_params['name'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `name` when calling `create_project_recipe_filter`") # noqa: E501
# verify the required parameter 'project_recipe_filter' is set
if self.api_client.client_side_validation and ('project_recipe_filter' not in local_var_params or # noqa: E501
local_var_params['project_recipe_filter'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `project_recipe_filter` when calling `create_project_recipe_filter`") # noqa: E501
collection_formats = {}
path_params = {}
if 'owner' in local_var_params:
path_params['owner'] = local_var_params['owner'] # noqa: E501
if 'name' in local_var_params:
path_params['name'] = local_var_params['name'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'project_recipe_filter' in local_var_params:
body_params = local_var_params['project_recipe_filter']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['APIKeyAuth', 'JWTAuth'] # noqa: E501
return self.api_client.call_api(
'/projects/{owner}/{name}/recipes/filters', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ProjectRecipeFilter', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def delete_project(self, owner, name, **kwargs): # noqa: E501
"""Delete a Project # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_project(owner, name, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.delete_project_with_http_info(owner, name, **kwargs) # noqa: E501
def delete_project_with_http_info(self, owner, name, **kwargs): # noqa: E501
"""Delete a Project # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_project_with_http_info(owner, name, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for an a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'owner',
'name'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_project" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'owner' is set
if self.api_client.client_side_validation and ('owner' not in local_var_params or # noqa: E501
local_var_params['owner'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `owner` when calling `delete_project`") # noqa: E501
# verify the required parameter 'name' is set
if self.api_client.client_side_validation and ('name' not in local_var_params or # noqa: E501
local_var_params['name'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `name` when calling `delete_project`") # noqa: E501
collection_formats = {}
path_params = {}
if 'owner' in local_var_params:
path_params['owner'] = local_var_params['owner'] # noqa: E501
if 'name' in local_var_params:
path_params['name'] = local_var_params['name'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['APIKeyAuth', 'JWTAuth'] # noqa: E501
return self.api_client.call_api(
'/projects/{owner}/{name}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def delete_project_org_permission(self, owner, name, project_policy_subject, **kwargs): # noqa: E501
"""Remove a Project permissions # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_project_org_permission(owner, name, project_policy_subject, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param project_policy_subject: (required)
:type project_policy_subject: ProjectPolicySubject
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.delete_project_org_permission_with_http_info(owner, name, project_policy_subject, **kwargs) # noqa: E501
def delete_project_org_permission_with_http_info(self, owner, name, project_policy_subject, **kwargs): # noqa: E501
"""Remove a Project permissions # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_project_org_permission_with_http_info(owner, name, project_policy_subject, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param project_policy_subject: (required)
:type project_policy_subject: ProjectPolicySubject
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for an a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'owner',
'name',
'project_policy_subject'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_project_org_permission" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'owner' is set
if self.api_client.client_side_validation and ('owner' not in local_var_params or # noqa: E501
local_var_params['owner'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `owner` when calling `delete_project_org_permission`") # noqa: E501
# verify the required parameter 'name' is set
if self.api_client.client_side_validation and ('name' not in local_var_params or # noqa: E501
local_var_params['name'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `name` when calling `delete_project_org_permission`") # noqa: E501
# verify the required parameter 'project_policy_subject' is set
if self.api_client.client_side_validation and ('project_policy_subject' not in local_var_params or # noqa: E501
local_var_params['project_policy_subject'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `project_policy_subject` when calling `delete_project_org_permission`") # noqa: E501
collection_formats = {}
path_params = {}
if 'owner' in local_var_params:
path_params['owner'] = local_var_params['owner'] # noqa: E501
if 'name' in local_var_params:
path_params['name'] = local_var_params['name'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'project_policy_subject' in local_var_params:
body_params = local_var_params['project_policy_subject']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['APIKeyAuth', 'JWTAuth'] # noqa: E501
return self.api_client.call_api(
'/projects/{owner}/{name}/permissions', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def delete_project_recipe_filter(self, owner, name, project_recipe_filter, **kwargs): # noqa: E501
"""Remove a Project recipe filter # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_project_recipe_filter(owner, name, project_recipe_filter, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param project_recipe_filter: (required)
:type project_recipe_filter: ProjectRecipeFilter
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.delete_project_recipe_filter_with_http_info(owner, name, project_recipe_filter, **kwargs) # noqa: E501
def delete_project_recipe_filter_with_http_info(self, owner, name, project_recipe_filter, **kwargs): # noqa: E501
"""Remove a Project recipe filter # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_project_recipe_filter_with_http_info(owner, name, project_recipe_filter, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param project_recipe_filter: (required)
:type project_recipe_filter: ProjectRecipeFilter
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for an a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'owner',
'name',
'project_recipe_filter'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_project_recipe_filter" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'owner' is set
if self.api_client.client_side_validation and ('owner' not in local_var_params or # noqa: E501
local_var_params['owner'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `owner` when calling `delete_project_recipe_filter`") # noqa: E501
# verify the required parameter 'name' is set
if self.api_client.client_side_validation and ('name' not in local_var_params or # noqa: E501
local_var_params['name'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `name` when calling `delete_project_recipe_filter`") # noqa: E501
# verify the required parameter 'project_recipe_filter' is set
if self.api_client.client_side_validation and ('project_recipe_filter' not in local_var_params or # noqa: E501
local_var_params['project_recipe_filter'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `project_recipe_filter` when calling `delete_project_recipe_filter`") # noqa: E501
collection_formats = {}
path_params = {}
if 'owner' in local_var_params:
path_params['owner'] = local_var_params['owner'] # noqa: E501
if 'name' in local_var_params:
path_params['name'] = local_var_params['name'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'project_recipe_filter' in local_var_params:
body_params = local_var_params['project_recipe_filter']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['APIKeyAuth', 'JWTAuth'] # noqa: E501
return self.api_client.call_api(
'/projects/{owner}/{name}/recipes/filters', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def get_project(self, owner, name, **kwargs): # noqa: E501
"""Get a project # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_project(owner, name, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: Project
"""
kwargs['_return_http_data_only'] = True
return self.get_project_with_http_info(owner, name, **kwargs) # noqa: E501
def get_project_with_http_info(self, owner, name, **kwargs): # noqa: E501
"""Get a project # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_project_with_http_info(owner, name, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for an a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(Project, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'owner',
'name'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_project" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'owner' is set
if self.api_client.client_side_validation and ('owner' not in local_var_params or # noqa: E501
local_var_params['owner'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `owner` when calling `get_project`") # noqa: E501
# verify the required parameter 'name' is set
if self.api_client.client_side_validation and ('name' not in local_var_params or # noqa: E501
local_var_params['name'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `name` when calling `get_project`") # noqa: E501
collection_formats = {}
path_params = {}
if 'owner' in local_var_params:
path_params['owner'] = local_var_params['owner'] # noqa: E501
if 'name' in local_var_params:
path_params['name'] = local_var_params['name'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['APIKeyAuth', 'JWTAuth'] # noqa: E501
return self.api_client.call_api(
'/projects/{owner}/{name}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Project', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def get_project_access_permissions(self, owner, name, **kwargs): # noqa: E501
"""Get project access permissions # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_project_access_permissions(owner, name, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param page: Page number starting from 1
:type page: int
:param per_page: Number of items per page
:type per_page: int
:param subject_type: The type of access policy subject
:type subject_type: list[str]
:param permission: An access policy permission string
:type permission: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: ProjectAccessPolicyList
"""
kwargs['_return_http_data_only'] = True
return self.get_project_access_permissions_with_http_info(owner, name, **kwargs) # noqa: E501
def get_project_access_permissions_with_http_info(self, owner, name, **kwargs): # noqa: E501
"""Get project access permissions # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_project_access_permissions_with_http_info(owner, name, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param page: Page number starting from 1
:type page: int
:param per_page: Number of items per page
:type per_page: int
:param subject_type: The type of access policy subject
:type subject_type: list[str]
:param permission: An access policy permission string
:type permission: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for an a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(ProjectAccessPolicyList, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'owner',
'name',
'page',
'per_page',
'subject_type',
'permission'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_project_access_permissions" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'owner' is set
if self.api_client.client_side_validation and ('owner' not in local_var_params or # noqa: E501
local_var_params['owner'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `owner` when calling `get_project_access_permissions`") # noqa: E501
# verify the required parameter 'name' is set
if self.api_client.client_side_validation and ('name' not in local_var_params or # noqa: E501
local_var_params['name'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `name` when calling `get_project_access_permissions`") # noqa: E501
if self.api_client.client_side_validation and 'page' in local_var_params and local_var_params['page'] < 1: # noqa: E501
raise ApiValueError("Invalid value for parameter `page` when calling `get_project_access_permissions`, must be a value greater than or equal to `1`") # noqa: E501
if self.api_client.client_side_validation and 'per_page' in local_var_params and local_var_params['per_page'] > 100: # noqa: E501
raise ApiValueError("Invalid value for parameter `per_page` when calling `get_project_access_permissions`, must be a value less than or equal to `100`") # noqa: E501
collection_formats = {}
path_params = {}
if 'owner' in local_var_params:
path_params['owner'] = local_var_params['owner'] # noqa: E501
if 'name' in local_var_params:
path_params['name'] = local_var_params['name'] # noqa: E501
query_params = []
if 'page' in local_var_params and local_var_params['page'] is not None: # noqa: E501
query_params.append(('page', local_var_params['page'])) # noqa: E501
if 'per_page' in local_var_params and local_var_params['per_page'] is not None: # noqa: E501
query_params.append(('per-page', local_var_params['per_page'])) # noqa: E501
if 'subject_type' in local_var_params and local_var_params['subject_type'] is not None: # noqa: E501
query_params.append(('subject_type', local_var_params['subject_type'])) # noqa: E501
collection_formats['subject_type'] = 'multi' # noqa: E501
if 'permission' in local_var_params and local_var_params['permission'] is not None: # noqa: E501
query_params.append(('permission', local_var_params['permission'])) # noqa: E501
collection_formats['permission'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['APIKeyAuth', 'JWTAuth'] # noqa: E501
return self.api_client.call_api(
'/projects/{owner}/{name}/permissions', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ProjectAccessPolicyList', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def get_project_recipe_filters(self, owner, name, **kwargs): # noqa: E501
"""Get project recipe filters # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_project_recipe_filters(owner, name, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param page: Page number starting from 1
:type page: int
:param per_page: Number of items per page
:type per_page: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: ProjectRecipeFilterList
"""
kwargs['_return_http_data_only'] = True
return self.get_project_recipe_filters_with_http_info(owner, name, **kwargs) # noqa: E501
def get_project_recipe_filters_with_http_info(self, owner, name, **kwargs): # noqa: E501
"""Get project recipe filters # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_project_recipe_filters_with_http_info(owner, name, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param page: Page number starting from 1
:type page: int
:param per_page: Number of items per page
:type per_page: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for an a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(ProjectRecipeFilterList, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'owner',
'name',
'page',
'per_page'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_project_recipe_filters" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'owner' is set
if self.api_client.client_side_validation and ('owner' not in local_var_params or # noqa: E501
local_var_params['owner'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `owner` when calling `get_project_recipe_filters`") # noqa: E501
# verify the required parameter 'name' is set
if self.api_client.client_side_validation and ('name' not in local_var_params or # noqa: E501
local_var_params['name'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `name` when calling `get_project_recipe_filters`") # noqa: E501
if self.api_client.client_side_validation and 'page' in local_var_params and local_var_params['page'] < 1: # noqa: E501
raise ApiValueError("Invalid value for parameter `page` when calling `get_project_recipe_filters`, must be a value greater than or equal to `1`") # noqa: E501
if self.api_client.client_side_validation and 'per_page' in local_var_params and local_var_params['per_page'] > 100: # noqa: E501
raise ApiValueError("Invalid value for parameter `per_page` when calling `get_project_recipe_filters`, must be a value less than or equal to `100`") # noqa: E501
collection_formats = {}
path_params = {}
if 'owner' in local_var_params:
path_params['owner'] = local_var_params['owner'] # noqa: E501
if 'name' in local_var_params:
path_params['name'] = local_var_params['name'] # noqa: E501
query_params = []
if 'page' in local_var_params and local_var_params['page'] is not None: # noqa: E501
query_params.append(('page', local_var_params['page'])) # noqa: E501
if 'per_page' in local_var_params and local_var_params['per_page'] is not None: # noqa: E501
query_params.append(('per-page', local_var_params['per_page'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['APIKeyAuth', 'JWTAuth'] # noqa: E501
return self.api_client.call_api(
'/projects/{owner}/{name}/recipes/filters', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ProjectRecipeFilterList', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def get_project_recipes(self, owner, name, **kwargs): # noqa: E501
"""Get project recipes # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_project_recipes(owner, name, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param search: Search string to find recipes
:type search: str
:param page: Page number starting from 1
:type page: int
:param per_page: Number of items per page
:type per_page: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: RecipeInterfaceList
"""
kwargs['_return_http_data_only'] = True
return self.get_project_recipes_with_http_info(owner, name, **kwargs) # noqa: E501
def get_project_recipes_with_http_info(self, owner, name, **kwargs): # noqa: E501
"""Get project recipes # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_project_recipes_with_http_info(owner, name, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param search: Search string to find recipes
:type search: str
:param page: Page number starting from 1
:type page: int
:param per_page: Number of items per page
:type per_page: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for an a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(RecipeInterfaceList, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'owner',
'name',
'search',
'page',
'per_page'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_project_recipes" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'owner' is set
if self.api_client.client_side_validation and ('owner' not in local_var_params or # noqa: E501
local_var_params['owner'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `owner` when calling `get_project_recipes`") # noqa: E501
# verify the required parameter 'name' is set
if self.api_client.client_side_validation and ('name' not in local_var_params or # noqa: E501
local_var_params['name'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `name` when calling `get_project_recipes`") # noqa: E501
if self.api_client.client_side_validation and 'page' in local_var_params and local_var_params['page'] < 1: # noqa: E501
raise ApiValueError("Invalid value for parameter `page` when calling `get_project_recipes`, must be a value greater than or equal to `1`") # noqa: E501
if self.api_client.client_side_validation and 'per_page' in local_var_params and local_var_params['per_page'] > 100: # noqa: E501
raise ApiValueError("Invalid value for parameter `per_page` when calling `get_project_recipes`, must be a value less than or equal to `100`") # noqa: E501
collection_formats = {}
path_params = {}
if 'owner' in local_var_params:
path_params['owner'] = local_var_params['owner'] # noqa: E501
if 'name' in local_var_params:
path_params['name'] = local_var_params['name'] # noqa: E501
query_params = []
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'page' in local_var_params and local_var_params['page'] is not None: # noqa: E501
query_params.append(('page', local_var_params['page'])) # noqa: E501
if 'per_page' in local_var_params and local_var_params['per_page'] is not None: # noqa: E501
query_params.append(('per-page', local_var_params['per_page'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['APIKeyAuth', 'JWTAuth'] # noqa: E501
return self.api_client.call_api(
'/projects/{owner}/{name}/recipes', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='RecipeInterfaceList', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def list_projects(self, **kwargs): # noqa: E501
"""List Projects # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_projects(async_req=True)
>>> result = thread.get()
:param search: Search string to find projects
:type search: str
:param ids: The ID of a project to search for
:type ids: list[str]
:param names: The name of the project
:type names: list[str]
:param owner: Owner of the project
:type owner: list[str]
:param public: Boolean check for public/private projects
:type public: bool
:param permissions: Filter by permission on given resource
:type permissions: list[str]
:param sort_by: Key to sort the list by
:type sort_by: ProjectSortKey
:param sort_order: The order to sort the list
:type sort_order: SortEnum
:param page: Page number starting from 1
:type page: int
:param per_page: Number of items per page
:type per_page: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: ProjectList
"""
kwargs['_return_http_data_only'] = True
return self.list_projects_with_http_info(**kwargs) # noqa: E501
def list_projects_with_http_info(self, **kwargs): # noqa: E501
"""List Projects # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_projects_with_http_info(async_req=True)
>>> result = thread.get()
:param search: Search string to find projects
:type search: str
:param ids: The ID of a project to search for
:type ids: list[str]
:param names: The name of the project
:type names: list[str]
:param owner: Owner of the project
:type owner: list[str]
:param public: Boolean check for public/private projects
:type public: bool
:param permissions: Filter by permission on given resource
:type permissions: list[str]
:param sort_by: Key to sort the list by
:type sort_by: ProjectSortKey
:param sort_order: The order to sort the list
:type sort_order: SortEnum
:param page: Page number starting from 1
:type page: int
:param per_page: Number of items per page
:type per_page: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for an a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(ProjectList, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'search',
'ids',
'names',
'owner',
'public',
'permissions',
'sort_by',
'sort_order',
'page',
'per_page'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method list_projects" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
if self.api_client.client_side_validation and 'page' in local_var_params and local_var_params['page'] < 1: # noqa: E501
raise ApiValueError("Invalid value for parameter `page` when calling `list_projects`, must be a value greater than or equal to `1`") # noqa: E501
if self.api_client.client_side_validation and 'per_page' in local_var_params and local_var_params['per_page'] > 100: # noqa: E501
raise ApiValueError("Invalid value for parameter `per_page` when calling `list_projects`, must be a value less than or equal to `100`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'search' in local_var_params and local_var_params['search'] is not None: # noqa: E501
query_params.append(('search', local_var_params['search'])) # noqa: E501
if 'ids' in local_var_params and local_var_params['ids'] is not None: # noqa: E501
query_params.append(('ids', local_var_params['ids'])) # noqa: E501
collection_formats['ids'] = 'multi' # noqa: E501
if 'names' in local_var_params and local_var_params['names'] is not None: # noqa: E501
query_params.append(('names', local_var_params['names'])) # noqa: E501
collection_formats['names'] = 'multi' # noqa: E501
if 'owner' in local_var_params and local_var_params['owner'] is not None: # noqa: E501
query_params.append(('owner', local_var_params['owner'])) # noqa: E501
collection_formats['owner'] = 'multi' # noqa: E501
if 'public' in local_var_params and local_var_params['public'] is not None: # noqa: E501
query_params.append(('public', local_var_params['public'])) # noqa: E501
if 'permissions' in local_var_params and local_var_params['permissions'] is not None: # noqa: E501
query_params.append(('permissions', local_var_params['permissions'])) # noqa: E501
collection_formats['permissions'] = 'multi' # noqa: E501
if 'sort_by' in local_var_params and local_var_params['sort_by'] is not None: # noqa: E501
query_params.append(('sort_by', local_var_params['sort_by'])) # noqa: E501
if 'sort_order' in local_var_params and local_var_params['sort_order'] is not None: # noqa: E501
query_params.append(('sort_order', local_var_params['sort_order'])) # noqa: E501
if 'page' in local_var_params and local_var_params['page'] is not None: # noqa: E501
query_params.append(('page', local_var_params['page'])) # noqa: E501
if 'per_page' in local_var_params and local_var_params['per_page'] is not None: # noqa: E501
query_params.append(('per-page', local_var_params['per_page'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['APIKeyAuth', 'JWTAuth'] # noqa: E501
return self.api_client.call_api(
'/projects', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ProjectList', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def update(self, owner, name, project_update, **kwargs): # noqa: E501
"""Update a Project # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update(owner, name, project_update, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param project_update: (required)
:type project_update: ProjectUpdate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: UpdateAccepted
"""
kwargs['_return_http_data_only'] = True
return self.update_with_http_info(owner, name, project_update, **kwargs) # noqa: E501
def update_with_http_info(self, owner, name, project_update, **kwargs): # noqa: E501
"""Update a Project # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_with_http_info(owner, name, project_update, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param project_update: (required)
:type project_update: ProjectUpdate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for an a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(UpdateAccepted, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'owner',
'name',
'project_update'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method update" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'owner' is set
if self.api_client.client_side_validation and ('owner' not in local_var_params or # noqa: E501
local_var_params['owner'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `owner` when calling `update`") # noqa: E501
# verify the required parameter 'name' is set
if self.api_client.client_side_validation and ('name' not in local_var_params or # noqa: E501
local_var_params['name'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `name` when calling `update`") # noqa: E501
# verify the required parameter 'project_update' is set
if self.api_client.client_side_validation and ('project_update' not in local_var_params or # noqa: E501
local_var_params['project_update'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `project_update` when calling `update`") # noqa: E501
collection_formats = {}
path_params = {}
if 'owner' in local_var_params:
path_params['owner'] = local_var_params['owner'] # noqa: E501
if 'name' in local_var_params:
path_params['name'] = local_var_params['name'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'project_update' in local_var_params:
body_params = local_var_params['project_update']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['APIKeyAuth', 'JWTAuth'] # noqa: E501
return self.api_client.call_api(
'/projects/{owner}/{name}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UpdateAccepted', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def upsert_project_permission(self, owner, name, project_access_policy, **kwargs): # noqa: E501
"""Upsert a new permission to a project # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.upsert_project_permission(owner, name, project_access_policy, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param project_access_policy: (required)
:type project_access_policy: ProjectAccessPolicy
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: UpdateAccepted
"""
kwargs['_return_http_data_only'] = True
return self.upsert_project_permission_with_http_info(owner, name, project_access_policy, **kwargs) # noqa: E501
def upsert_project_permission_with_http_info(self, owner, name, project_access_policy, **kwargs): # noqa: E501
"""Upsert a new permission to a project # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.upsert_project_permission_with_http_info(owner, name, project_access_policy, async_req=True)
>>> result = thread.get()
:param owner: (required)
:type owner: str
:param name: (required)
:type name: str
:param project_access_policy: (required)
:type project_access_policy: ProjectAccessPolicy
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for an a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(UpdateAccepted, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'owner',
'name',
'project_access_policy'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method upsert_project_permission" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'owner' is set
if self.api_client.client_side_validation and ('owner' not in local_var_params or # noqa: E501
local_var_params['owner'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `owner` when calling `upsert_project_permission`") # noqa: E501
# verify the required parameter 'name' is set
if self.api_client.client_side_validation and ('name' not in local_var_params or # noqa: E501
local_var_params['name'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `name` when calling `upsert_project_permission`") # noqa: E501
# verify the required parameter 'project_access_policy' is set
if self.api_client.client_side_validation and ('project_access_policy' not in local_var_params or # noqa: E501
local_var_params['project_access_policy'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `project_access_policy` when calling `upsert_project_permission`") # noqa: E501
collection_formats = {}
path_params = {}
if 'owner' in local_var_params:
path_params['owner'] = local_var_params['owner'] # noqa: E501
if 'name' in local_var_params:
path_params['name'] = local_var_params['name'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'project_access_policy' in local_var_params:
body_params = local_var_params['project_access_policy']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['APIKeyAuth', 'JWTAuth'] # noqa: E501
return self.api_client.call_api(
'/projects/{owner}/{name}/permissions', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UpdateAccepted', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
| 47.007768 | 178 | 0.598136 | 10,220 | 90,772 | 5.062622 | 0.02407 | 0.048396 | 0.078199 | 0.025048 | 0.966061 | 0.959586 | 0.955818 | 0.951913 | 0.936587 | 0.931755 | 0 | 0.015149 | 0.32661 | 90,772 | 1,930 | 179 | 47.032124 | 0.831313 | 0.436599 | 0 | 0.736721 | 1 | 0.009238 | 0.206887 | 0.0566 | 0 | 0 | 0 | 0 | 0 | 1 | 0.028868 | false | 0 | 0.005774 | 0 | 0.06351 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
517af4acaa339f8a3425c209bd73f5bb0cade8b6 | 137 | py | Python | frappe_notification/frappe_notification/doctype/notification_channel/__init__.py | leam-tech/frappe_notification | 79e40f2c541d86d714a0b8d48b87f32b2f85076a | [
"MIT"
] | null | null | null | frappe_notification/frappe_notification/doctype/notification_channel/__init__.py | leam-tech/frappe_notification | 79e40f2c541d86d714a0b8d48b87f32b2f85076a | [
"MIT"
] | null | null | null | frappe_notification/frappe_notification/doctype/notification_channel/__init__.py | leam-tech/frappe_notification | 79e40f2c541d86d714a0b8d48b87f32b2f85076a | [
"MIT"
] | null | null | null | from .notification_channel import NotificationChannel # noqa
from .test_notification_channel import NotificationChannelFixtures # noqa
| 45.666667 | 74 | 0.868613 | 13 | 137 | 8.923077 | 0.615385 | 0.327586 | 0.431034 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.10219 | 137 | 2 | 75 | 68.5 | 0.943089 | 0.065693 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
51a4c097fc35a9461dde453b85aa78f2ed83c7c2 | 8,100 | py | Python | fonts/font7ctrl.py | robert-hh/SSD1963-TFT-Library-for-PyBoard | db9786cdd95f9dab5334a9de28bed4e26436815c | [
"MIT"
] | 16 | 2016-02-23T12:20:36.000Z | 2021-02-02T06:41:49.000Z | fonts/font7ctrl.py | robert-hh/SSD1963-TFT-Library-for-PyBoard-and-RP2040 | db9786cdd95f9dab5334a9de28bed4e26436815c | [
"MIT"
] | 2 | 2016-11-26T07:46:58.000Z | 2017-12-10T08:44:38.000Z | fonts/font7ctrl.py | robert-hh/SSD1963-TFT-Library-for-PyBoard | db9786cdd95f9dab5334a9de28bed4e26436815c | [
"MIT"
] | 9 | 2016-06-04T08:22:55.000Z | 2020-04-19T14:40:36.000Z | # Code generated by cfonts_to_trans_py.py
import TFTfont
_font7ctrl = b'\xe0\xa0\xa0\x80\x28\x28\x28\x38\x04\x04\x04\x07'\
b'\x60\x80\x40\x28\xd4\x14\x14\x09\x05\x07\x05\x05'\
b'\x60\x80\x40\x20\xdc\x08\x08\x08\x05\x02\x02\x05'\
b'\xe0\x80\xe0\x9c\xe8\x08\x08\x05\x02\x02\x05\x00'\
b'\xe0\x80\xe0\x88\xf4\x14\x08\x07\x02\x02\x02\x00'\
b'\xe0\x80\xe0\x9c\xf4\x14\x14\x0e\x0a\x0e\x02\x03'\
b'\xe0\xa0\xe0\xac\x10\x10\x0c\x00\x0a\x0c\x0a\x0a'\
b'\xc0\xa0\xc0\xa0\xd8\x20\x38\x20\x3c\x04\x04\x07'\
b'\x00\xe0\x50\x52\x65\x54\x52\xe1\x05\x02\x00\x00'\
b'\x00\x50\x50\x57\x72\x52\x52\x52\x02\x02\x00\x00'\
b'\x00\x40\x40\x47\x44\x44\x47\x74\x04\x04\x00\x00'\
b'\x00\x50\x50\x57\x52\x52\x52\x22\x22\x02\x00\x00'\
b'\x00\x78\x40\x47\x74\x44\x46\x44\x04\x04\x00\x00'\
b'\x00\x20\x50\x46\x45\x45\x46\x55\x25\x05\x00\x00'\
b'\x00\x20\x50\x42\x65\x35\x15\x55\x25\x02\x00\x00'\
b'\x00\x20\x50\x47\x62\x32\x12\x52\x22\x07\x00\x00'\
b'\xc0\xa8\xa8\xa8\xc8\x0e\x00\x07\x04\x07\x04\x07'\
b'\xc0\xa0\xac\xd0\x10\x0c\x02\x06\x02\x02\x07\x00'\
b'\xc0\xa0\xac\xd0\x10\x0c\x02\x05\x01\x02\x04\x07'\
b'\xc0\xa0\xac\xd0\x10\x0c\x0e\x02\x0e\x02\x0e\x00'\
b'\xc0\xa0\xac\xd0\x10\x0c\x00\x14\x14\x1e\x04\x04'\
b'\xe0\xa0\xa0\xa8\x14\x1c\x14\x00\x0a\x0c\x0a\x0a'\
b'\x60\x80\x40\x20\xc8\x28\x38\x08\x34\x0a\x0a\x0a'\
b'\xe0\x80\xe0\x9c\x88\xe8\x08\x06\x05\x06\x05\x06'\
b'\x40\xa0\x80\xa8\x54\x1c\x14\x14\x02\x05\x05\x05'\
b'\xe0\x80\xe0\x80\xe0\x00\x3e\x2a\x2a\x2a\x2a\x00'\
b'\x60\x8a\x4a\x2a\xca\x0e\x00\x1c\x0a\x0c\x0a\x1c'\
b'\xe0\x80\xcc\x90\xe8\x04\x18\x02\x05\x04\x05\x02'\
b'\x78\x40\x40\x70\x44\x4a\x48\x04\x02\x0a\x04\x00'\
b'\x60\x90\x80\xb0\x94\x9a\x68\x04\x02\x0a\x04\x00'\
b'\xc0\xa0\xa0\xc0\xa4\xaa\xa8\x04\x02\x0a\x04\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x30\x78\x78\x78\x30\x30\x00\x30\x30\x00\x00'\
b'\x00\x66\x66\x66\x24\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x6c\x6c\xfe\x6c\x6c\x6c\xfe\x6c\x6c\x00\x00'\
b'\x30\x30\x7c\xc0\xc0\x78\x0c\x0c\xf8\x30\x30\x00'\
b'\x00\x00\x00\xc4\xcc\x18\x30\x60\xcc\x8c\x00\x00'\
b'\x00\x70\xd8\xd8\x70\xfa\xde\xcc\xdc\x76\x00\x00'\
b'\x00\x30\x30\x30\x60\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x0c\x18\x30\x60\x60\x60\x30\x18\x0c\x00\x00'\
b'\x00\x60\x30\x18\x0c\x0c\x0c\x18\x30\x60\x00\x00'\
b'\x00\x00\x00\x66\x3c\xff\x3c\x66\x00\x00\x00\x00'\
b'\x00\x00\x00\x18\x18\x7e\x18\x18\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x38\x38\x60\x00'\
b'\x00\x00\x00\x00\x00\xfe\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x38\x38\x00\x00'\
b'\x00\x00\x02\x06\x0c\x18\x30\x60\xc0\x80\x00\x00'\
b'\x00\x7c\xc6\xce\xde\xd6\xf6\xe6\xc6\x7c\x00\x00'\
b'\x00\x10\x30\xf0\x30\x30\x30\x30\x30\xfc\x00\x00'\
b'\x00\x78\xcc\xcc\x0c\x18\x30\x60\xcc\xfc\x00\x00'\
b'\x00\x78\xcc\x0c\x0c\x38\x0c\x0c\xcc\x78\x00\x00'\
b'\x00\x0c\x1c\x3c\x6c\xcc\xfe\x0c\x0c\x1e\x00\x00'\
b'\x00\xfc\xc0\xc0\xc0\xf8\x0c\x0c\xcc\x78\x00\x00'\
b'\x00\x38\x60\xc0\xc0\xf8\xcc\xcc\xcc\x78\x00\x00'\
b'\x00\xfe\xc6\xc6\x06\x0c\x18\x30\x30\x30\x00\x00'\
b'\x00\x78\xcc\xcc\xec\x78\xdc\xcc\xcc\x78\x00\x00'\
b'\x00\x78\xcc\xcc\xcc\x7c\x18\x18\x30\x70\x00\x00'\
b'\x00\x00\x00\x38\x38\x00\x00\x38\x38\x00\x00\x00'\
b'\x00\x00\x00\x38\x38\x00\x00\x38\x38\x18\x30\x00'\
b'\x00\x0c\x18\x30\x60\xc0\x60\x30\x18\x0c\x00\x00'\
b'\x00\x00\x00\x00\x7e\x00\x7e\x00\x00\x00\x00\x00'\
b'\x00\x60\x30\x18\x0c\x06\x0c\x18\x30\x60\x00\x00'\
b'\x00\x78\xcc\x0c\x18\x30\x30\x00\x30\x30\x00\x00'\
b'\x00\x7c\xc6\xc6\xde\xde\xde\xc0\xc0\x7c\x00\x00'\
b'\x00\x30\x78\xcc\xcc\xcc\xfc\xcc\xcc\xcc\x00\x00'\
b'\x00\xfc\x66\x66\x66\x7c\x66\x66\x66\xfc\x00\x00'\
b'\x00\x3c\x66\xc6\xc0\xc0\xc0\xc6\x66\x3c\x00\x00'\
b'\x00\xf8\x6c\x66\x66\x66\x66\x66\x6c\xf8\x00\x00'\
b'\x00\xfe\x62\x60\x64\x7c\x64\x60\x62\xfe\x00\x00'\
b'\x00\xfe\x66\x62\x64\x7c\x64\x60\x60\xf0\x00\x00'\
b'\x00\x3c\x66\xc6\xc0\xc0\xce\xc6\x66\x3e\x00\x00'\
b'\x00\xcc\xcc\xcc\xcc\xfc\xcc\xcc\xcc\xcc\x00\x00'\
b'\x00\x78\x30\x30\x30\x30\x30\x30\x30\x78\x00\x00'\
b'\x00\x1e\x0c\x0c\x0c\x0c\xcc\xcc\xcc\x78\x00\x00'\
b'\x00\xe6\x66\x6c\x6c\x78\x6c\x6c\x66\xe6\x00\x00'\
b'\x00\xf0\x60\x60\x60\x60\x62\x66\x66\xfe\x00\x00'\
b'\x00\xc6\xee\xfe\xfe\xd6\xc6\xc6\xc6\xc6\x00\x00'\
b'\x00\xc6\xc6\xe6\xf6\xfe\xde\xce\xc6\xc6\x00\x00'\
b'\x00\x38\x6c\xc6\xc6\xc6\xc6\xc6\x6c\x38\x00\x00'\
b'\x00\xfc\x66\x66\x66\x7c\x60\x60\x60\xf0\x00\x00'\
b'\x00\x38\x6c\xc6\xc6\xc6\xce\xde\x7c\x0c\x1e\x00'\
b'\x00\xfc\x66\x66\x66\x7c\x6c\x66\x66\xe6\x00\x00'\
b'\x00\x78\xcc\xcc\xc0\x70\x18\xcc\xcc\x78\x00\x00'\
b'\x00\xfc\xb4\x30\x30\x30\x30\x30\x30\x78\x00\x00'\
b'\x00\xcc\xcc\xcc\xcc\xcc\xcc\xcc\xcc\x78\x00\x00'\
b'\x00\xcc\xcc\xcc\xcc\xcc\xcc\xcc\x78\x30\x00\x00'\
b'\x00\xc6\xc6\xc6\xc6\xd6\xd6\x6c\x6c\x6c\x00\x00'\
b'\x00\xcc\xcc\xcc\x78\x30\x78\xcc\xcc\xcc\x00\x00'\
b'\x00\xcc\xcc\xcc\xcc\x78\x30\x30\x30\x78\x00\x00'\
b'\x00\xfe\xce\x98\x18\x30\x60\x62\xc6\xfe\x00\x00'\
b'\x00\x3c\x30\x30\x30\x30\x30\x30\x30\x3c\x00\x00'\
b'\x00\x00\x80\xc0\x60\x30\x18\x0c\x06\x02\x00\x00'\
b'\x00\x3c\x0c\x0c\x0c\x0c\x0c\x0c\x0c\x3c\x00\x00'\
b'\x10\x38\x6c\xc6\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xff\x00'\
b'\x30\x30\x18\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x78\x0c\x7c\xcc\xcc\x76\x00\x00'\
b'\x00\xe0\x60\x60\x7c\x66\x66\x66\x66\xdc\x00\x00'\
b'\x00\x00\x00\x00\x78\xcc\xc0\xc0\xcc\x78\x00\x00'\
b'\x00\x1c\x0c\x0c\x7c\xcc\xcc\xcc\xcc\x76\x00\x00'\
b'\x00\x00\x00\x00\x78\xcc\xfc\xc0\xcc\x78\x00\x00'\
b'\x00\x38\x6c\x60\x60\xf8\x60\x60\x60\xf0\x00\x00'\
b'\x00\x00\x00\x00\x76\xcc\xcc\xcc\x7c\x0c\xcc\x78'\
b'\x00\xe0\x60\x60\x6c\x76\x66\x66\x66\xe6\x00\x00'\
b'\x00\x18\x18\x00\x78\x18\x18\x18\x18\x7e\x00\x00'\
b'\x00\x0c\x0c\x00\x3c\x0c\x0c\x0c\x0c\xcc\xcc\x78'\
b'\x00\xe0\x60\x60\x66\x6c\x78\x6c\x66\xe6\x00\x00'\
b'\x00\x78\x18\x18\x18\x18\x18\x18\x18\x7e\x00\x00'\
b'\x00\x00\x00\x00\xfc\xd6\xd6\xd6\xd6\xc6\x00\x00'\
b'\x00\x00\x00\x00\xf8\xcc\xcc\xcc\xcc\xcc\x00\x00'\
b'\x00\x00\x00\x00\x78\xcc\xcc\xcc\xcc\x78\x00\x00'\
b'\x00\x00\x00\x00\xdc\x66\x66\x66\x66\x7c\x60\xf0'\
b'\x00\x00\x00\x00\x76\xcc\xcc\xcc\xcc\x7c\x0c\x1e'\
b'\x00\x00\x00\x00\xec\x6e\x76\x60\x60\xf0\x00\x00'\
b'\x00\x00\x00\x00\x78\xcc\x60\x18\xcc\x78\x00\x00'\
b'\x00\x00\x20\x60\xfc\x60\x60\x60\x6c\x38\x00\x00'\
b'\x00\x00\x00\x00\xcc\xcc\xcc\xcc\xcc\x76\x00\x00'\
b'\x00\x00\x00\x00\xcc\xcc\xcc\xcc\x78\x30\x00\x00'\
b'\x00\x00\x00\x00\xc6\xc6\xd6\xd6\x6c\x6c\x00\x00'\
b'\x00\x00\x00\x00\xc6\x6c\x38\x38\x6c\xc6\x00\x00'\
b'\x00\x00\x00\x00\x66\x66\x66\x66\x3c\x0c\x18\xf0'\
b'\x00\x00\x00\x00\xfc\x8c\x18\x60\xc4\xfc\x00\x00'\
b'\x00\x1c\x30\x30\x60\xc0\x60\x30\x30\x1c\x00\x00'\
b'\x00\x18\x18\x18\x18\x00\x18\x18\x18\x18\x00\x00'\
b'\x00\xe0\x30\x30\x18\x0c\x18\x30\x30\xe0\x00\x00'\
b'\x00\x73\xda\xce\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x10\x38\x6c\xc6\xc6\xfe\x00\x00\x00'\
_font7ctrl_index = b'\x00\x00\x0c\x00\x18\x00\x24\x00\x30\x00\x3c\x00\x48\x00\x54\x00'\
b'\x60\x00\x6c\x00\x78\x00\x84\x00\x90\x00\x9c\x00\xa8\x00\xb4\x00'\
b'\xc0\x00\xcc\x00\xd8\x00\xe4\x00\xf0\x00\xfc\x00\x08\x01\x14\x01'\
b'\x20\x01\x2c\x01\x38\x01\x44\x01\x50\x01\x5c\x01\x68\x01\x74\x01'\
b'\x80\x01\x8c\x01\x98\x01\xa4\x01\xb0\x01\xbc\x01\xc8\x01\xd4\x01'\
b'\xe0\x01\xec\x01\xf8\x01\x04\x02\x10\x02\x1c\x02\x28\x02\x34\x02'\
b'\x40\x02\x4c\x02\x58\x02\x64\x02\x70\x02\x7c\x02\x88\x02\x94\x02'\
b'\xa0\x02\xac\x02\xb8\x02\xc4\x02\xd0\x02\xdc\x02\xe8\x02\xf4\x02'\
b'\x00\x03\x0c\x03\x18\x03\x24\x03\x30\x03\x3c\x03\x48\x03\x54\x03'\
b'\x60\x03\x6c\x03\x78\x03\x84\x03\x90\x03\x9c\x03\xa8\x03\xb4\x03'\
b'\xc0\x03\xcc\x03\xd8\x03\xe4\x03\xf0\x03\xfc\x03\x08\x04\x14\x04'\
b'\x20\x04\x2c\x04\x38\x04\x44\x04\x50\x04\x5c\x04\x68\x04\x74\x04'\
b'\x80\x04\x8c\x04\x98\x04\xa4\x04\xb0\x04\xbc\x04\xc8\x04\xd4\x04'\
b'\xe0\x04\xec\x04\xf8\x04\x04\x05\x10\x05\x1c\x05\x28\x05\x34\x05'\
b'\x40\x05\x4c\x05\x58\x05\x64\x05\x70\x05\x7c\x05\x88\x05\x94\x05'\
b'\xa0\x05\xac\x05\xb8\x05\xc4\x05\xd0\x05\xdc\x05\xe8\x05\xf4\x05'\
b'\x00\x06'
font7ctrl = TFTfont.TFTFont(_font7ctrl, _font7ctrl_index, 12, 8, 128, 0)
fonts = {"font7ctrl":font7ctrl,
}
| 52.258065 | 87 | 0.701235 | 1,965 | 8,100 | 2.886005 | 0.066667 | 0.270852 | 0.201552 | 0.160466 | 0.563393 | 0.470993 | 0.387762 | 0.29113 | 0.202962 | 0.119379 | 0 | 0.357719 | 0.021235 | 8,100 | 154 | 88 | 52.597403 | 0.357593 | 0.004815 | 0 | 0.013423 | 1 | 0.966443 | 0.89166 | 0.889551 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.006711 | 0 | 0.006711 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
cf91318828133b5e71f0f7a7e855c38cc0f53e58 | 18,236 | py | Python | qa327_test/Frontend/test_R4.py | lucidorangee/CISC-327-Course-Project | b86fe58e809f10a90134cbe33202c9e68a46d13b | [
"MIT"
] | null | null | null | qa327_test/Frontend/test_R4.py | lucidorangee/CISC-327-Course-Project | b86fe58e809f10a90134cbe33202c9e68a46d13b | [
"MIT"
] | null | null | null | qa327_test/Frontend/test_R4.py | lucidorangee/CISC-327-Course-Project | b86fe58e809f10a90134cbe33202c9e68a46d13b | [
"MIT"
] | null | null | null | import pytest
from seleniumbase import BaseCase
from qa327_test.conftest import base_url
from unittest.mock import patch
from qa327.models import db, User, TicketInfo
from werkzeug.security import generate_password_hash, check_password_hash
"""
This file defines unit tests for the frontend homepage.
The tests will only test the frontend portion of the program, by patching the backend to return
specfic values. For example:
@patch('qa327.backend.get_user', return_value=test_user)
Will patch the backend get_user function (within the scope of the current test case)
so that it return 'test_user' instance below rather than reading
the user from the database.
Annotate @patch before unit tests can mock backend methods (for that testing function)
"""
# Mock a smple user (login)
test_user_login = User(
email='login@gmail.com',
name='LetsTestL',
password=generate_password_hash('Tester327!'),
balance=10000
)
# Moch some sample tickets
test_tickets = TicketInfo(
email='login@gmail.com',
name='t1',
quantity=1,
price=100,
date='20210408'
)
class TestR4(BaseCase):
# Test Case R4.0.1
@pytest.mark.timeout(60)
@patch('qa327.backend.get_user', return_value=test_user_login)
def test_positive_sell(self, *_):
"""
Checking for positive case for the fields of ticket's selling form with lower boundaries
"""
# open logout page to invalid any logged-in sessions that may exist, then open login page
self.open(base_url + '/logout')
self.open(base_url + '/')
# test that redirection to /login has occurred
# fill email and password
self.type("#email", test_user_login.email)
self.type("#password", "Tester327!")
# click enter button
self.click('input[type="submit"]')
# enter Sell ticket form with low values
self.type("#name_sell", "Hello World 123")
self.type("#quantity_sell", "1")
self.type("#price_sell", "10")
self.type("#expdate_sell", "20210901")
# click sell button
self.click('input[value="Sell"]')
# assert no error text appears
self.assert_text_not_visible("Ticket name must be alphanumeric-only", "#message") #TODO these asserts have to be updated
self.assert_text_not_visible("Ticket name cannot begin with a space", "#message")
self.assert_text_not_visible("Ticket name cannot end with a space", "#message")
self.assert_text_not_visible("Ticket name cannot be longer than 60 characters", "#message")
self.assert_text_not_visible("At least 1 ticket must be sold", "#message")
self.assert_text_not_visible("At most 100 tickets can be sold", "#message")
self.assert_text_not_visible("Price of the ticket cannot be below 10", "#message")
self.assert_text_not_visible("Price of the ticket cannot be above 100", "#message")
self.assert_text_not_visible("Expiration date is in invalid format", "#message")
# open logout (for cleanup)
self.open(base_url + '/logout')
# Test Case R4.0.2
@pytest.mark.timeout(60)
@patch('qa327.backend.get_user', return_value=test_user_login)
def test_positive_sell_high(self, *_):
"""
Checking for positive case for the fields of ticket's selling form with higher boundaries
"""
# open logout page to invalid any logged-in sessions that may exist, then open login page
self.open(base_url + '/logout')
self.open(base_url + '/')
# test that redirection to /login has occurred
# fill email and password
self.type("#email", test_user_login.email)
self.type("#password", "Tester327!")
# click enter button
self.click('input[type="submit"]')
# enter Sell ticket form with low values
self.type("#name_sell", "Hello World 123")
self.type("#quantity_sell", "100")
self.type("#price_sell", "100")
self.type("#expdate_sell", "20210901")
# click sell button
self.click('input[value="Sell"]')
# asser no error text appears
self.assert_text_not_visible("Ticket name must be alphanumeric-only", "#message")
self.assert_text_not_visible("Ticket name cannot begin with a space", "#message")
self.assert_text_not_visible("Ticket name cannot end with a space", "#message")
self.assert_text_not_visible("Ticket name cannot be longer than 60 characters", "#message")
self.assert_text_not_visible("At least 1 ticket must be sold", "#message")
self.assert_text_not_visible("At most 100 tickets can be sold", "#message")
self.assert_text_not_visible("Price of the ticket cannot be below 10", "#message")
self.assert_text_not_visible("Price of the ticket cannot be above 100", "#message")
self.assert_text_not_visible("Expiration date is in invalid format", "#message")
# open logout (for cleanup)
self.open(base_url + '/logout')
# Test Case R4.1.1
@pytest.mark.timeout(60)
@patch('qa327.backend.get_user', return_value=test_user_login)
def test_alphanumeric_only(self, *_):
"""
Check if name of the ticket is alphanumeric-only
"""
# open logout page to invalid any logged-in sessions that may exist, then open login page
self.open(base_url + '/logout')
self.open(base_url + '/')
# test that redirection to /login has occurred
# fill email and password
self.type("#email", test_user_login.email)
self.type("#password", "Tester327!")
# click enter button
self.click('input[type="submit"]')
# enter Sell ticket form with low values
self.type("#name_sell", "Ht1&t2@!*\")(/.,<>[]-+")
self.type("#quantity_sell", "1")
self.type("#price_sell", "15")
self.type("#expdate_sell", "20210901")
# click sell button
self.click('input[value="Sell"]')
# assert proper error message
self.assert_text("The name of the ticket has to be alphanumeric-only, and space allowed only if it is not the "
"first or the last character.", "#sell_message")
# open logout (for cleanup)
self.open(base_url + '/logout')
# Test Case R4.1.2
@pytest.mark.timeout(60)
@patch('qa327.backend.get_user', return_value=test_user_login)
def test_spaces_only(self, *_):
"""
Check space is not allowed as first character
"""
# open logout page to invalid any logged-in sessions that may exist, then open login page
self.open(base_url + '/logout')
self.open(base_url + '/')
# test that redirection to /login has occurred
# fill email and password
self.type("#email", test_user_login.email)
self.type("#password", "Tester327!")
# click enter button
self.click('input[type="submit"]')
# enter Sell ticket form with low values
self.type("#name_sell", " t1")
self.type("#quantity_sell", "1")
self.type("#price_sell", "15")
self.type("#expdate_sell", "20210901")
# click sell button
self.click('input[value="Sell"]')
# assert proper error message
self.assert_text("The name of the ticket has to be alphanumeric-only, and space allowed only if it is not the "
"first or the last character.", "#sell_message")
# open logout (for cleanup)
self.open(base_url + '/logout')
# Test Case R4.1.3
@pytest.mark.timeout(60)
@patch('qa327.backend.get_user', return_value=test_user_login)
def test_spaces_only2(self, *_):
"""
Check space is not allowed as last character
"""
# open logout page to invalid any logged-in sessions that may exist, then open login page
self.open(base_url + '/logout')
self.open(base_url + '/')
# test that redirection to /login has occurred
# fill email and password
self.type("#email", test_user_login.email)
self.type("#password", "Tester327!")
# click enter button
self.click('input[type="submit"]')
# enter Sell ticket form with low values
self.type("#name_sell", "t1 ")
self.type("#quantity_sell", "1")
self.type("#price_sell", "15")
self.type("#expdate_sell", "20210901")
# click sell button
self.click('input[value="Sell"]')
# assert proper error message
self.assert_text("The name of the ticket has to be alphanumeric-only, and space allowed only if it is not the "
"first or the last character.", "#sell_message")
# open logout (for cleanup)
self.open(base_url + '/logout')
# Test Case R4.2.1
@pytest.mark.timeout(60)
@patch('qa327.backend.get_user', return_value=test_user_login)
def test_name_length(self, *_):
"""
The name of the ticket is no longer than 60 characters
"""
# open logout page to invalid any logged-in sessions that may exist, then open login page
self.open(base_url + '/logout')
self.open(base_url + '/')
# test that redirection to /login has occurred
# fill email and password
self.type("#email", test_user_login.email)
self.type("#password", "Tester327!")
# click enter button
self.click('input[type="submit"]')
# enter Sell ticket form with low values
self.type("#name_sell", "abcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyzabcdefghi")
self.type("#quantity_sell", "1")
self.type("#price_sell", "15")
self.type("#expdate_sell", "20210901")
# click sell button
self.click('input[value="Sell"]')
# assert proper error message
self.assert_text("Ticket name cannot be longer than 60 characters", "#sell_message")
# open logout (for cleanup)
self.open(base_url + '/logout')
# Test Case R4.3.1
@pytest.mark.timeout(60)
@patch('qa327.backend.get_user', return_value=test_user_login)
def test_quantity_bound(self, *_):
"""
The quantity of the tickets has to be more than 0
"""
# open logout page to invalid any logged-in sessions that may exist, then open login page
self.open(base_url + '/logout')
self.open(base_url + '/')
# test that redirection to /login has occurred
# fill email and password
self.type("#email", test_user_login.email)
self.type("#password", "Tester327!")
# click enter button
self.click('input[type="submit"]')
# enter Sell ticket form with low values
self.type("#name_sell", "t1")
self.type("#quantity_sell", "0")
self.type("#price_sell", "15")
self.type("#expdate_sell", "20210901")
# click sell button
self.click('input[value="Sell"]')
# assert proper error message
self.assert_text("The quantity of the tickets has to be more than 0, and less than or equal to 100.",
"#sell_message")
# open logout (for cleanup)
self.open(base_url + '/logout')
# Test Case R4.3.2
@pytest.mark.timeout(60)
@patch('qa327.backend.get_user', return_value=test_user_login)
def test_quantity_bound2(self, *_):
"""
The quantity of the tickets has to be less than or equal to 100
"""
# open logout page to invalid any logged-in sessions that may exist, then open login page
self.open(base_url + '/logout')
self.open(base_url + '/')
# test that redirection to /login has occurred
# fill email and password
self.type("#email", test_user_login.email)
self.type("#password", "Tester327!")
# click enter button
self.click('input[type="submit"]')
# enter Sell ticket form with low values
self.type("#name_sell", "t1")
self.type("#quantity_sell", "101")
self.type("#price_sell", "15")
self.type("#expdate_sell", "20210901")
# click sell button
self.click('input[value="Sell"]')
# assert proper error message
self.assert_text("The quantity of the tickets has to be more than 0, and less than or equal to 100.",
"#sell_message")
# open logout (for cleanup)
self.open(base_url + '/logout')
# Test Case R4.4.1
@pytest.mark.timeout(60)
@patch('qa327.backend.get_user', return_value=test_user_login)
def test_price_bound(self, *_):
"""
Price cannot be lower than 10
"""
# open logout page to invalid any logged-in sessions that may exist, then open login page
self.open(base_url + '/logout')
self.open(base_url + '/')
# test that redirection to /login has occurred
# fill email and password
self.type("#email", test_user_login.email)
self.type("#password", "Tester327!")
# click enter button
self.click('input[type="submit"]')
# enter Sell ticket form with low values
self.type("#name_sell", "t1")
self.type("#quantity_sell", "1")
self.type("#price_sell", "9")
self.type("#expdate_sell", "20210901")
# click sell button
self.click('input[value="Sell"]')
# assert proper error message
self.assert_text("Price has to be of range [10, 100]",
"#sell_message")
# open logout (for cleanup)
self.open(base_url + '/logout')
# Test Case R4.4.2
@pytest.mark.timeout(60)
@patch('qa327.backend.get_user', return_value=test_user_login)
def test_price_bound2(self, *_):
"""
Price cannot be higher than 100
"""
# open logout page to invalid any logged-in sessions that may exist, then open login page
self.open(base_url + '/logout')
self.open(base_url + '/')
# test that redirection to /login has occurred
# fill email and password
self.type("#email", test_user_login.email)
self.type("#password", "Tester327!")
# click enter button
self.click('input[type="submit"]')
# enter Sell ticket form with low values
self.type("#name_sell", "t1")
self.type("#quantity_sell", "1")
self.type("#price_sell", "101")
self.type("#expdate_sell", "20210901")
# click sell button
self.click('input[value="Sell"]')
# assert proper error message
self.assert_text("Price has to be of range [10, 100]", "#sell_message")
# open logout (for cleanup)
self.open(base_url + '/logout')
# Test Case R4.5.1
@pytest.mark.timeout(60)
@patch('qa327.backend.get_user', return_value=test_user_login)
def test_date_format(self, *_):
"""
Date must be given in the format YYYYMMDD (e.g. 20200901)
"""
# open logout page to invalid any logged-in sessions that may exist, then open login page
self.open(base_url + '/logout')
self.open(base_url + '/')
# test that redirection to /login has occurred
# fill email and password
self.type("#email", test_user_login.email)
self.type("#password", "Tester327!")
# click enter button
self.click('input[type="submit"]')
# enter Sell ticket form with low values
self.type("#name_sell", "t1")
self.type("#quantity_sell", "1")
self.type("#price_sell", "15")
self.type("#expdate_sell", "Sept. 9 2021")
# click sell button
self.click('input[value="Sell"]')
# assert proper error message
self.assert_text("Date must be given in the format YYYYMMDD (e.g. 20200901)", "#sell_message")
# open logout (for cleanup)
self.open(base_url + '/logout')
# Test Case R4.6.1
@pytest.mark.timeout(60)
@patch('qa327.backend.get_user', return_value=test_user_login)
def test_redirect_sell(self, *_):
"""
For any errors, redirect back to / and show an error message
"""
# open logout page to invalid any logged-in sessions that may exist, then open login page
self.open(base_url + '/logout')
self.open(base_url + '/')
# test that redirection to /login has occurred
# fill email and password
self.type("#email", test_user_login.email)
self.type("#password", "Tester327!")
# click enter button
self.click('input[type="submit"]')
# enter Sell ticket form with low values
self.type("#name_sell", "t1")
self.type("#quantity_sell", "1")
self.type("#price_sell", "15")
self.type("#expdate_sell", "Sept. 9 2021")
# click sell button
self.click('input[value="Sell"]')
# assert proper header
self.assert_element("#welcome-header")
# open logout (for cleanup)
self.open(base_url + '/logout')
# Test Case R4.7.1
@pytest.mark.timeout(60)
@patch('qa327.backend.get_user', return_value=test_user_login)
def test_posted(self, *_):
"""
The added new ticket information will be posted on the user profile page
"""
# open logout page to invalid any logged-in sessions that may exist, then open login page
self.open(base_url + '/logout')
self.open(base_url + '/')
# test that redirection to /login has occurred
# fill email and password
self.type("#email", test_user_login.email)
self.type("#password", "Tester327!")
# click enter button
self.click('input[type="submit"]')
# enter Sell ticket form with low values
self.type("#name_sell", "t1")
self.type("#quantity_sell", "1")
self.type("#price_sell", "15")
self.type("#expdate_sell", "20210901")
# click sell button
self.click('input[value="Sell"]')
# assert proper error message
self.assert_text("t1", "#tickets")
# open logout (for cleanup)
self.open(base_url + '/logout')
| 39.386609 | 128 | 0.621573 | 2,402 | 18,236 | 4.594921 | 0.092007 | 0.056537 | 0.042403 | 0.053004 | 0.883664 | 0.876325 | 0.874785 | 0.869711 | 0.866087 | 0.853402 | 0 | 0.030065 | 0.259487 | 18,236 | 462 | 129 | 39.471861 | 0.787248 | 0.2619 | 0 | 0.805907 | 0 | 0 | 0.31249 | 0.027711 | 0 | 0 | 0 | 0.002165 | 0.122363 | 1 | 0.054852 | false | 0.063291 | 0.025316 | 0 | 0.084388 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
cf97008a2d6531fa5fcc14f6b302e19390014e18 | 4,093 | py | Python | tests/records/test_records_views.py | lnielsen/invenio-communities | e6e032960abd5d4062a63824d6d349a6158339af | [
"MIT"
] | null | null | null | tests/records/test_records_views.py | lnielsen/invenio-communities | e6e032960abd5d4062a63824d6d349a6158339af | [
"MIT"
] | null | null | null | tests/records/test_records_views.py | lnielsen/invenio-communities | e6e032960abd5d4062a63824d6d349a6158339af | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
#
# This file is part of Invenio.
# Copyright (C) 2016-2021 CERN.
#
# Invenio is free software; you can redistribute it and/or modify it
# under the terms of the MIT License; see LICENSE file for more details.
"""Community module tests."""
import pytest
from flask import url_for
from invenio_accounts.testutils import login_user_via_session
@pytest.mark.skip()
def test_simple_flow(
db, es_clear, community, record, client, record_owner,
community_owner):
"""Test basics operations on records."""
comid, community = community
recid, record = record
login_user_via_session(client, user=record_owner)
community_records_list_url = url_for(
'invenio_communities_records.community_records_list',
pid_value=comid.pid_value)
resp = client.post(community_records_list_url, json={
'record_pid': recid.pid_value
})
assert resp.status_code == 201
community_record_links = resp.json['links']
community_record_id = resp.json['id']
login_user_via_session(client, user=community_owner)
resp = client.post(
community_record_links['comment'],
json={'message': 'Hello there'})
assert resp.status_code == 201
login_user_via_session(client, user=record_owner)
resp = client.post(
community_record_links['comment'],
json={'message': 'Oh hi mark'})
assert resp.status_code == 201
login_user_via_session(client, user=community_owner)
resp = client.get(community_record_links['self'])
assert resp.status_code == 200
resp = client.post(community_record_links['accept'])
assert resp.status_code == 201
login_user_via_session(client, user=record_owner)
community_record_item_url = url_for(
'invenio_communities_records.community_records_item',
pid_value=comid.pid_value,
community_record_id=community_record_id)
resp = client.delete(community_record_item_url)
assert resp.status_code == 204
@pytest.mark.skip()
def test_alternate_flow(
db, es_clear, community, record, client, record_owner,
community_owner):
"""Test basics operations on records."""
comid, community = community
recid, record = record
login_user_via_session(client, user=community_owner)
community_records_list_url = url_for(
'invenio_communities_records.community_records_list',
pid_value=comid.pid_value)
resp = client.post(community_records_list_url, json={
'record_pid': recid.pid_value
})
assert resp.status_code == 201
community_record_links = resp.json['links']
community_record_id = resp.json['id']
login_user_via_session(client, user=record_owner)
resp = client.post(
community_record_links['comment'],
json={'message': 'Hello there'})
assert resp.status_code == 201
login_user_via_session(client, user=community_owner)
resp = client.post(
community_record_links['comment'],
json={'message': 'Oh hi mark'})
assert resp.status_code == 201
login_user_via_session(client, user=record_owner)
resp = client.get(community_record_links['self'])
assert resp.status_code == 200
resp = client.post(community_record_links['accept'])
assert resp.status_code == 201
login_user_via_session(client, user=community_owner)
community_record_item_url = url_for(
'invenio_communities_records.community_records_item',
pid_value=comid.pid_value,
community_record_id=community_record_id)
resp = client.delete(community_record_item_url)
assert resp.status_code == 204
@pytest.mark.skip()
def test_records_permissions(db, es_clear, community, record, client, users):
"""Test permissions on records."""
# TODO
pass
@pytest.mark.skip()
def test_records_validation(db, es_clear, community, record, client, users):
"""Test records validation."""
# TODO
pass
@pytest.mark.skip()
def test_records_search(db, es_clear, community, record, client, users):
"""Test records search."""
# TODO
pass
| 30.318519 | 77 | 0.710237 | 530 | 4,093 | 5.175472 | 0.183019 | 0.136712 | 0.069996 | 0.087495 | 0.869851 | 0.862195 | 0.859643 | 0.859643 | 0.819176 | 0.780168 | 0 | 0.013538 | 0.187882 | 4,093 | 134 | 78 | 30.544776 | 0.811673 | 0.098705 | 0 | 0.909091 | 0 | 0 | 0.096465 | 0.05481 | 0 | 0 | 0 | 0.007463 | 0.136364 | 1 | 0.056818 | false | 0.034091 | 0.034091 | 0 | 0.090909 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
cfd1a080158b630f85b4f1faa5c69725fd7d334f | 144 | py | Python | rasa_core/cli/__init__.py | MalwareC500/rasa_core | d51dac663f7f4ce85bc61b269180cedad9610269 | [
"Apache-2.0"
] | null | null | null | rasa_core/cli/__init__.py | MalwareC500/rasa_core | d51dac663f7f4ce85bc61b269180cedad9610269 | [
"Apache-2.0"
] | 5 | 2019-12-16T21:49:06.000Z | 2022-02-10T00:37:32.000Z | rasa_core/cli/__init__.py | MalwareC500/rasa_core | d51dac663f7f4ce85bc61b269180cedad9610269 | [
"Apache-2.0"
] | 1 | 2021-12-09T14:35:02.000Z | 2021-12-09T14:35:02.000Z | import rasa_core.cli.arguments
import rasa_core.cli.test
import rasa_core.cli.run
import rasa_core.cli.train
import rasa_core.cli.visualization
| 24 | 34 | 0.861111 | 25 | 144 | 4.76 | 0.36 | 0.420168 | 0.588235 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.069444 | 144 | 5 | 35 | 28.8 | 0.88806 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
cfd482a0eedc63392d987ad2b884f7519d698bbf | 9,423 | py | Python | src/heap_tests.py | hashamtanveer/Data-Structures---Algorithms | 80a82b3f0cf2e1cf4c06e6c8d4652046672c6920 | [
"MIT"
] | null | null | null | src/heap_tests.py | hashamtanveer/Data-Structures---Algorithms | 80a82b3f0cf2e1cf4c06e6c8d4652046672c6920 | [
"MIT"
] | null | null | null | src/heap_tests.py | hashamtanveer/Data-Structures---Algorithms | 80a82b3f0cf2e1cf4c06e6c8d4652046672c6920 | [
"MIT"
] | null | null | null | from heap import *
length = 8
values = [18, 6, 7, -3, 44, 10, 6]
heap = MinHeap(length)
num_tests_passed = 0
num_tests_failed = 0
num_tests = 0
print("====================================")
print("PART 1")
print("Initializing with values", values)
print("====================================")
print("====================================")
print("Test Case 1")
print("Correct Initialization")
print("====================================")
num_tests += 1
if heap.get_heap_array() == [None, None, None, None, None, None, None, None]:
print("TEST PASSED")
num_tests_passed += 1
else:
print("TEST FAILED")
num_tests_failed += 1
print()
print("====================================")
print("Test Case 2")
print("Correct Insertion")
print("====================================")
num_tests += 1
for value in values:
heap.insert(value)
if heap.get_heap_array() == [-3, 6, 6, 18, 44, 10, 7, None]:
print("TEST PASSED")
num_tests_passed += 1
else:
print("TEST FAILED")
num_tests_failed += 1
print()
print("====================================")
print("Test Case 3")
print("Correct Extraction")
print("====================================")
num_tests += 1
if heap.extract_min() == -3:
if heap.get_heap_array() == [6, 7, 6, 18, 44, 10, None, None]:
print("TEST PASSED")
num_tests_passed += 1
else:
print("TEST FAILED: Incorrect array output")
num_tests_failed += 1
else:
print("TEST FAILED: Incorrect value output")
num_tests_failed += 1
print()
length = 8
values = [3, 3, 8, 6, 4, 2, 8, 1]
print("====================================")
print("PART 2")
print("Initializing with values", values)
print("====================================")
heap = MinHeap(length)
print("====================================")
print("Test Case 4")
print("Correct Initialization")
print("====================================")
num_tests += 1
if heap.get_heap_array() == [None, None, None, None, None, None, None, None]:
print("TEST PASSED")
num_tests_passed += 1
else:
print("TEST FAILED")
num_tests_failed += 1
print()
print("====================================")
print("Test Case 5")
print("Correct Insertion")
print("====================================")
num_tests += 1
for value in values:
heap.insert(value)
if heap.get_heap_array() == [1, 2, 3, 3, 4, 8, 8, 6]:
print("TEST PASSED")
num_tests_passed += 1
else:
print("TEST FAILED")
num_tests_failed += 1
print()
print("====================================")
print("Test Case 6")
print("Correct Extraction")
print("====================================")
num_tests += 1
if heap.extract_min() == 1:
if heap.get_heap_array() == [2, 3, 3, 6, 4, 8, 8, None]:
print("TEST PASSED")
num_tests_passed += 1
else:
print("TEST FAILED: Incorrect array output")
num_tests_failed += 1
else:
print("TEST FAILED: Incorrect value output")
num_tests_failed += 1
print()
length = 8
values = [None, None, None, None, None, None, None, None]
print("====================================")
print("PART 3")
print("Initializing with values", values)
print("====================================")
heap = MinHeap(length)
print("====================================")
print("Test Case 7")
print("Correct Initialization")
print("====================================")
num_tests += 1
if heap.get_heap_array() == [None, None, None, None, None, None, None, None]:
print("TEST PASSED")
num_tests_passed += 1
else:
print("TEST FAILED")
num_tests_failed += 1
print()
print("====================================")
print("Test Case 8")
print("Correct Insertion")
print("====================================")
num_tests += 1
for value in values:
heap.insert(value)
if heap.get_heap_array() == [None, None, None, None, None, None, None, None]:
print("TEST PASSED")
num_tests_passed += 1
else:
print("TEST FAILED")
num_tests_failed += 1
print()
print("====================================")
print("Test Case 9")
print("Correct Extraction")
print("====================================")
num_tests += 1
if heap.extract_min() == None:
if heap.get_heap_array() == values:
print("TEST PASSED")
num_tests_passed += 1
else:
print("TEST FAILED: Incorrect array output")
num_tests_failed += 1
else:
print("TEST FAILED: Incorrect value output")
num_tests_failed += 1
print()
length = 8
values = [3, 3, 8, 6, 4, 2, 8, 1, 9]
print("====================================")
print("PART 4")
print("Initializing with values", values)
print("====================================")
heap = MinHeap(length)
print("====================================")
print("Test Case 10")
print("Correct Initialization")
print("====================================")
num_tests += 1
if heap.get_heap_array() == [None, None, None, None, None, None, None, None]:
print("TEST PASSED")
num_tests_passed += 1
else:
print("TEST FAILED")
num_tests_failed += 1
print()
print("====================================")
print("Test Case 11")
print("Correct Insertion")
print("====================================")
num_tests += 1
for value in values:
heap.insert(value)
if heap.get_heap_array() == [1, 2, 3, 3, 4, 8, 8, 6]:
print("TEST PASSED")
num_tests_passed += 1
else:
print("TEST FAILED")
num_tests_failed += 1
print()
print("====================================")
print("Test Case 12")
print("Correct Extraction")
print("====================================")
num_tests += 1
if heap.extract_min() == 1:
if heap.get_heap_array() == [2, 3, 3, 6, 4, 8, 8, None]:
print("TEST PASSED")
num_tests_passed += 1
else:
print("TEST FAILED: Incorrect array output")
num_tests_failed += 1
else:
print("TEST FAILED: Incorrect value output")
num_tests_failed += 1
print()
length = 0
values = [3, 3, 8, 6, 4, 2, 8, 1, 9]
print("====================================")
print("PART 5")
print("Initializing with values", values)
print("====================================")
heap = MinHeap(length)
print("====================================")
print("Test Case 13")
print("Correct Initialization")
print("====================================")
num_tests += 1
if heap.get_heap_array() == []:
print("TEST PASSED")
num_tests_passed += 1
else:
print("TEST FAILED")
num_tests_failed += 1
print()
print("====================================")
print("Test Case 14")
print("Correct Insertion")
print("====================================")
num_tests += 1
for value in values:
heap.insert(value)
if heap.get_heap_array() == []:
print("TEST PASSED")
num_tests_passed += 1
else:
print("TEST FAILED")
num_tests_failed += 1
print()
print("====================================")
print("Test Case 15")
print("Correct Extraction")
print("====================================")
num_tests += 1
if heap.extract_min() == None:
if heap.get_heap_array() == []:
print("TEST PASSED")
num_tests_passed += 1
else:
print("TEST FAILED: Incorrect array output")
num_tests_failed += 1
else:
print("TEST FAILED: Incorrect value output")
num_tests_failed += 1
print()
length = 100
values = [3, 3, 8, 6, 4, 2, 8, 1, 9]
print("====================================")
print("PART 6")
print("Initializing with values", values)
print("====================================")
heap = MinHeap(length)
print("====================================")
print("Test Case 16")
print("Correct Initialization")
print("====================================")
num_tests += 1
if heap.get_heap_array() == [None] * length:
print("TEST PASSED")
num_tests_passed += 1
else:
print("TEST FAILED")
num_tests_failed += 1
print()
print("====================================")
print("Test Case 17")
print("Correct Insertion")
print("====================================")
num_tests += 1
for value in values:
heap.insert(value)
test_array = [1, 2, 3, 3, 4, 8, 8, 6, 9]
for i in range(length - len(test_array)):
test_array.append(None)
if heap.get_heap_array() == test_array:
print("TEST PASSED")
num_tests_passed += 1
else:
print("TEST FAILED")
print("Array Returned:")
print(heap.get_heap_array())
num_tests_failed += 1
print()
print("Correct Array:")
print(test_array)
print()
print("====================================")
print("Test Case 18")
print("Correct Extraction")
print("====================================")
num_tests += 1
test_array = [2, 3, 3, 6, 4, 8, 8, 9, None]
for i in range(length - len(test_array)):
test_array.append(None)
if heap.extract_min() == 1:
if heap.get_heap_array() == test_array:
print("TEST PASSED")
num_tests_passed += 1
else:
print("TEST FAILED: Incorrect array output")
print("Your array:")
print(heap.get_heap_array())
print("Correct array:")
print(test_array)
num_tests_failed += 1
num_tests_failed += 1
else:
print("TEST FAILED: Incorrect value output")
print()
print("====================================")
print("SUMMARY")
print("====================================")
print("Total Tests:", num_tests)
print("Total Tests Passed: {}/{}".format(num_tests_passed, num_tests))
print("Total Tests Failed: {}/{}".format(num_tests_failed, num_tests))
| 21.711982 | 77 | 0.50971 | 1,109 | 9,423 | 4.177638 | 0.056808 | 0.117419 | 0.093244 | 0.103605 | 0.925103 | 0.890136 | 0.864235 | 0.856464 | 0.853874 | 0.843082 | 0 | 0.027919 | 0.167569 | 9,423 | 433 | 78 | 21.762125 | 0.562723 | 0 | 0 | 0.84953 | 0 | 0 | 0.361108 | 0.191063 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.119122 | 0.003135 | 0 | 0.003135 | 0.53605 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 9 |
3208895c7c20bb567de8cb2e6de26a965306400c | 205 | py | Python | classroom/views.py | SilviaZeta/abclassroom | a8f3f6f70f8c10999a1716b19bdc5e5d1a682b65 | [
"Apache-2.0"
] | null | null | null | classroom/views.py | SilviaZeta/abclassroom | a8f3f6f70f8c10999a1716b19bdc5e5d1a682b65 | [
"Apache-2.0"
] | null | null | null | classroom/views.py | SilviaZeta/abclassroom | a8f3f6f70f8c10999a1716b19bdc5e5d1a682b65 | [
"Apache-2.0"
] | null | null | null | from django.shortcuts import render
from django.http import Http404
def home_view(request):
return render(request, 'home.html')
def custom_404_view(request):
return render(request, '404.html')
| 18.636364 | 39 | 0.756098 | 29 | 205 | 5.241379 | 0.517241 | 0.131579 | 0.223684 | 0.302632 | 0.394737 | 0 | 0 | 0 | 0 | 0 | 0 | 0.051429 | 0.146341 | 205 | 10 | 40 | 20.5 | 0.817143 | 0 | 0 | 0 | 0 | 0 | 0.082927 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
7a29452d5ebb719a827ee7c42eade9bf240b2486 | 11,204 | py | Python | python/dqcsim/tests/test_gatestream_misc.py | QE-Lab/dqcsim | 2257d1ec5dc9578929b1b0726d5ab299bcab04e9 | [
"Apache-2.0"
] | 8 | 2020-05-06T03:32:49.000Z | 2022-03-05T03:23:27.000Z | python/dqcsim/tests/test_gatestream_misc.py | mbrobbel/dqcsim | 2257d1ec5dc9578929b1b0726d5ab299bcab04e9 | [
"Apache-2.0"
] | 387 | 2019-05-21T09:17:59.000Z | 2020-03-10T19:05:12.000Z | python/dqcsim/tests/test_gatestream_misc.py | mbrobbel/dqcsim-rs | 2257d1ec5dc9578929b1b0726d5ab299bcab04e9 | [
"Apache-2.0"
] | 3 | 2020-04-04T08:26:08.000Z | 2022-01-29T16:06:42.000Z | import unittest, logging, os, sys, tempfile, re, math, cmath, pickle
from dqcsim.common import *
from dqcsim.host import *
from dqcsim.plugin import *
def catch_errors(fn, *args, **kwargs):
try:
return fn(*args, **kwargs)
except Exception as e:
return str(e)
@plugin("Test frontend plugin", "Test", "0.1")
class TestFrontend(Frontend):
def handle_init(self, _):
self.allocate(5)
def handle_host_cmd_measure(self, measures=[]):
self.measure(measures)
def handle_host_cmd_advance(self, cycles=0):
self.advance(cycles)
def handle_host_cmd_arb(self, iface='', op=''):
return ArbData(pickle.dumps(catch_errors(self.arb, iface, op)))
def handle_host_cmd_stats(self, qubit=1):
return ArbData(pickle.dumps((
catch_errors(self.get_measurement, qubit),
catch_errors(self.get_cycles_since_measure, qubit),
catch_errors(self.get_cycles_between_measures, qubit),
self.get_cycle()
)))
@plugin("Null operator plugin", "Test", "0.1")
class NullOperator(Operator):
def handle_host_cmd_stats(self, qubit=1):
return ArbData(pickle.dumps((
catch_errors(self.get_measurement, qubit),
catch_errors(self.get_cycles_since_measure, qubit),
catch_errors(self.get_cycles_between_measures, qubit),
self.get_cycle()
)))
@plugin("Test operator 1", "Test", "0.1")
class TestOperator1(NullOperator):
def handle_measurement_gate(self, measures, matrix, arb):
self.measure([q+1 for q in measures])
def handle_measurement(self, measurement):
measurement.qubit -= 1
measurement.value = True
return measurement
def handle_advance(self, cycles):
self.advance(cycles*2)
def handle_upstream_b_b(self):
return ArbData(b'oper')
@plugin("Test operator 2", "Test", "0.1")
class TestOperator2(NullOperator):
def handle_measurement(self, measurement):
measurement.value = True
return [measurement]
def handle_upstream_a_a(self):
return ArbData(b'oper')
@plugin("Test operator 3", "Test", "0.1")
class TestOperator3(NullOperator):
def handle_measurement(self, measurement):
pass
def handle_upstream_a_b(self):
return ArbData(b'oper')
@plugin("Null backend plugin", "Test", "0.1")
class NullBackend(Backend):
def handle_unitary_gate(self, targets, matrix, arb):
pass
def handle_measurement_gate(self, measures, matrix, arb):
return [Measurement(qubit, False) for qubit in measures]
def handle_prepare_gate(self, targets, matrix, arb):
pass
def handle_upstream_a_b(self):
return ArbData(b'back')
class Tests(unittest.TestCase):
def test_null_operator(self):
sim = Simulator(
TestFrontend(), NullOperator(), NullBackend(),
repro=None, stderr_verbosity=Loglevel.ERROR
)
sim.simulate()
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'stats', qubit=1)[0]), (
'Invalid argument: qubit 1 has not been measured yet',
'Invalid argument: qubit 1 has not been measured yet',
'Invalid argument: qubit 1 has not been measured yet',
0,
))
sim.arb('front', 'cmd', 'measure', measures=[1])
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'stats', qubit=1)[0]), (
Measurement(1, 0),
0,
'Invalid argument: qubit 1 has only been measured once',
0,
))
sim.arb('front', 'cmd', 'advance', cycles=10)
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'stats', qubit=1)[0]), (
Measurement(1, 0),
10,
'Invalid argument: qubit 1 has only been measured once',
10,
))
sim.arb('front', 'cmd', 'measure', measures=[1])
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'stats', qubit=1)[0]), (
Measurement(1, 0),
0,
10,
10,
))
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'arb', iface='a', op='b')[0]),
ArbData(b'back'))
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'arb', iface='a', op='a')[0]),
'Invalid operation ID a for interface ID a')
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'arb', iface='b', op='b')[0]),
ArbData())
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'arb', iface='b', op='a')[0]),
ArbData())
sim.stop()
def test_operator1(self):
sim = Simulator(
TestFrontend(), TestOperator1(), NullBackend(),
repro=None, stderr_verbosity=Loglevel.ERROR
)
sim.simulate()
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'stats', qubit=1)[0]), (
'Invalid argument: qubit 1 has not been measured yet',
'Invalid argument: qubit 1 has not been measured yet',
'Invalid argument: qubit 1 has not been measured yet',
0,
))
self.assertEqual(pickle.loads(sim.arb('op1', 'cmd', 'stats', qubit=2)[0]), (
'Invalid argument: qubit 2 has not been measured yet',
'Invalid argument: qubit 2 has not been measured yet',
'Invalid argument: qubit 2 has not been measured yet',
0,
))
sim.arb('front', 'cmd', 'measure', measures=[1])
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'stats', qubit=1)[0]), (
Measurement(1, 1),
0,
'Invalid argument: qubit 1 has only been measured once',
0,
))
self.assertEqual(pickle.loads(sim.arb('op1', 'cmd', 'stats', qubit=2)[0]), (
Measurement(2, 0),
0,
'Invalid argument: qubit 2 has only been measured once',
0,
))
sim.arb('front', 'cmd', 'advance', cycles=10)
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'stats', qubit=1)[0]), (
Measurement(1, 1),
10,
'Invalid argument: qubit 1 has only been measured once',
10,
))
self.assertEqual(pickle.loads(sim.arb('op1', 'cmd', 'stats', qubit=2)[0]), (
Measurement(2, 0),
20,
'Invalid argument: qubit 2 has only been measured once',
20,
))
sim.arb('front', 'cmd', 'measure', measures=[1])
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'stats', qubit=1)[0]), (
Measurement(1, 1),
0,
10,
10,
))
self.assertEqual(pickle.loads(sim.arb('op1', 'cmd', 'stats', qubit=2)[0]), (
Measurement(2, 0),
0,
20,
20,
))
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'arb', iface='a', op='b')[0]),
ArbData(b'back'))
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'arb', iface='a', op='a')[0]),
'Invalid operation ID a for interface ID a')
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'arb', iface='b', op='b')[0]),
ArbData(b'oper'))
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'arb', iface='b', op='a')[0]),
'Invalid operation ID a for interface ID b')
sim.stop()
def test_operator2(self):
sim = Simulator(
TestFrontend(), TestOperator2(), NullBackend(),
repro=None, stderr_verbosity=Loglevel.ERROR
)
sim.simulate()
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'stats', qubit=1)[0]), (
'Invalid argument: qubit 1 has not been measured yet',
'Invalid argument: qubit 1 has not been measured yet',
'Invalid argument: qubit 1 has not been measured yet',
0,
))
sim.arb('front', 'cmd', 'measure', measures=[1])
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'stats', qubit=1)[0]), (
Measurement(1, 1),
0,
'Invalid argument: qubit 1 has only been measured once',
0,
))
sim.arb('front', 'cmd', 'advance', cycles=10)
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'stats', qubit=1)[0]), (
Measurement(1, 1),
10,
'Invalid argument: qubit 1 has only been measured once',
10,
))
sim.arb('front', 'cmd', 'measure', measures=[1])
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'stats', qubit=1)[0]), (
Measurement(1, 1),
0,
10,
10,
))
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'arb', iface='a', op='b')[0]),
'Invalid operation ID b for interface ID a')
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'arb', iface='a', op='a')[0]),
ArbData(b'oper'))
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'arb', iface='b', op='b')[0]),
ArbData())
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'arb', iface='b', op='a')[0]),
ArbData())
sim.stop()
def test_operator3(self):
sim = Simulator(
TestFrontend(), TestOperator3(), NullBackend(),
repro=None, stderr_verbosity=Loglevel.ERROR
)
sim.simulate()
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'stats', qubit=1)[0]), (
'Invalid argument: qubit 1 has not been measured yet',
'Invalid argument: qubit 1 has not been measured yet',
'Invalid argument: qubit 1 has not been measured yet',
0,
))
sim.arb('front', 'cmd', 'measure', measures=[1])
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'stats', qubit=1)[0]), (
Measurement(1, None),
0,
'Invalid argument: qubit 1 has only been measured once',
0,
))
sim.arb('front', 'cmd', 'advance', cycles=10)
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'stats', qubit=1)[0]), (
Measurement(1, None),
10,
'Invalid argument: qubit 1 has only been measured once',
10,
))
sim.arb('front', 'cmd', 'measure', measures=[1])
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'stats', qubit=1)[0]), (
Measurement(1, None),
0,
10,
10,
))
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'arb', iface='a', op='b')[0]),
ArbData(b'oper'))
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'arb', iface='a', op='a')[0]),
'Invalid operation ID a for interface ID a')
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'arb', iface='b', op='b')[0]),
ArbData())
self.assertEqual(pickle.loads(sim.arb('front', 'cmd', 'arb', iface='b', op='a')[0]),
ArbData())
sim.stop()
if __name__ == '__main__':
unittest.main()
| 37.471572 | 92 | 0.555784 | 1,353 | 11,204 | 4.541759 | 0.087953 | 0.046867 | 0.078763 | 0.100244 | 0.830919 | 0.82262 | 0.799837 | 0.776729 | 0.740602 | 0.730024 | 0 | 0.027372 | 0.28588 | 11,204 | 298 | 93 | 37.597315 | 0.740657 | 0 | 0 | 0.781132 | 0 | 0 | 0.207961 | 0 | 0 | 0 | 0 | 0 | 0.135849 | 1 | 0.086792 | false | 0.011321 | 0.015094 | 0.030189 | 0.173585 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7a7233753b4dc7eea262c3817fcca13dc69de1de | 34,535 | py | Python | src/codespaces/azext_codespaces/vendored_sdks/codespaces/operations/plan_operations.py | Mannan2812/azure-cli-extensions | e2b34efe23795f6db9c59100534a40f0813c3d95 | [
"MIT"
] | 207 | 2017-11-29T06:59:41.000Z | 2022-03-31T10:00:53.000Z | src/codespaces/azext_codespaces/vendored_sdks/codespaces/operations/plan_operations.py | Mannan2812/azure-cli-extensions | e2b34efe23795f6db9c59100534a40f0813c3d95 | [
"MIT"
] | 4,061 | 2017-10-27T23:19:56.000Z | 2022-03-31T23:18:30.000Z | src/codespaces/azext_codespaces/vendored_sdks/codespaces/operations/plan_operations.py | Mannan2812/azure-cli-extensions | e2b34efe23795f6db9c59100534a40f0813c3d95 | [
"MIT"
] | 802 | 2017-10-11T17:36:26.000Z | 2022-03-31T22:24:32.000Z | # coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
import uuid
from msrest.pipeline import ClientRawResponse
from .. import models
class PlanOperations(object):
"""PlanOperations operations.
:param client: Client for service requests.
:param config: Configuration of service client.
:param serializer: An object model serializer.
:param deserializer: An object model deserializer.
:ivar api_version: The API version to be used with the HTTP request. Constant value: "2020-06-16".
"""
models = models
def __init__(self, client, config, serializer, deserializer):
self._client = client
self._serialize = serializer
self._deserialize = deserializer
self.api_version = config.api_version or "2020-06-16"
self.config = config
def get(
self, resource_group_name, plan_name, custom_headers=None, raw=False, **operation_config):
"""Retrieves information about a Codespaces Plan resource.
Retrieves the properties of a Codespaces Plan.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param plan_name: Name of the Codespaces Plan
:type plan_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: CodespacesPlan or ClientRawResponse if raw=true
:rtype: ~microsoft.codespaces.models.CodespacesPlan or
~msrest.pipeline.ClientRawResponse
:raises:
:class:`CodespacesPlanErrorResponseException<microsoft.codespaces.models.CodespacesPlanErrorResponseException>`
"""
# Construct URL
url = self.get.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'planName': self._serialize.url("plan_name", plan_name, 'str', pattern=r'^[a-zA-Z0-9]')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
raise models.CodespacesPlanErrorResponseException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('CodespacesPlan', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
get.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Codespaces/plans/{planName}'}
def delete(
self, resource_group_name, plan_name, custom_headers=None, raw=False, **operation_config):
"""Deletes a Codespaces Plan resource.
Deletes an existing Codespaces Plan.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param plan_name: Name of the Codespaces Plan
:type plan_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: None or ClientRawResponse if raw=true
:rtype: None or ~msrest.pipeline.ClientRawResponse
:raises:
:class:`CodespacesPlanErrorResponseException<microsoft.codespaces.models.CodespacesPlanErrorResponseException>`
"""
# Construct URL
url = self.delete.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'planName': self._serialize.url("plan_name", plan_name, 'str', pattern=r'^[a-zA-Z0-9]')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.delete(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200, 204]:
raise models.CodespacesPlanErrorResponseException(self._deserialize, response)
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
delete.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Codespaces/plans/{planName}'}
def create(
self, resource_group_name, plan_name, codespaces_plan, custom_headers=None, raw=False, **operation_config):
"""Creates a Codespaces Plan.
Creates a Codespaces Plan with the specified create parameters.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param plan_name: Name of the Codespaces Plan
:type plan_name: str
:param codespaces_plan: Codespaces Plan create parameters.
:type codespaces_plan: ~microsoft.codespaces.models.CodespacesPlan
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: CodespacesPlan or ClientRawResponse if raw=true
:rtype: ~microsoft.codespaces.models.CodespacesPlan or
~msrest.pipeline.ClientRawResponse
:raises:
:class:`CodespacesPlanErrorResponseException<microsoft.codespaces.models.CodespacesPlanErrorResponseException>`
"""
# Construct URL
url = self.create.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'planName': self._serialize.url("plan_name", plan_name, 'str', pattern=r'^[a-zA-Z0-9]')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(codespaces_plan, 'CodespacesPlan')
# Construct and send request
request = self._client.put(url, query_parameters, header_parameters, body_content)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
raise models.CodespacesPlanErrorResponseException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('CodespacesPlan', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
create.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Codespaces/plans/{planName}'}
def update(
self, resource_group_name, plan_name, codespaces_plan_update_parameters, custom_headers=None, raw=False, **operation_config):
"""Updates a Codespaces Plan.
Updates the properties of an existing Codespaces Plan with the
specified update parameters.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param plan_name: Name of the Codespaces Plan
:type plan_name: str
:param codespaces_plan_update_parameters: Parameters for updating the
Codespaces Plan.
:type codespaces_plan_update_parameters:
~microsoft.codespaces.models.CodespacesPlanUpdateParameters
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: CodespacesPlan or ClientRawResponse if raw=true
:rtype: ~microsoft.codespaces.models.CodespacesPlan or
~msrest.pipeline.ClientRawResponse
:raises:
:class:`CodespacesPlanErrorResponseException<microsoft.codespaces.models.CodespacesPlanErrorResponseException>`
"""
# Construct URL
url = self.update.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'planName': self._serialize.url("plan_name", plan_name, 'str', pattern=r'^[a-zA-Z0-9]')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(codespaces_plan_update_parameters, 'CodespacesPlanUpdateParameters')
# Construct and send request
request = self._client.patch(url, query_parameters, header_parameters, body_content)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
raise models.CodespacesPlanErrorResponseException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('CodespacesPlan', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
update.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Codespaces/plans/{planName}'}
def read_all_codespaces_action(
self, resource_group_name, plan_name, expiration=None, custom_headers=None, raw=False, **operation_config):
"""Get Codespaces Plan read codespaces access token.
Get Codespaces Plan access token which allows listing all codespaces in
the Plan.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param plan_name: Name of the Codespaces Plan
:type plan_name: str
:param expiration: The requested expiration of a Codespaces Plan
access token.
:type expiration: long
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: CodespacesPlanAccessToken or ClientRawResponse if raw=true
:rtype: ~microsoft.codespaces.models.CodespacesPlanAccessToken or
~msrest.pipeline.ClientRawResponse
:raises:
:class:`CodespacesPlanErrorResponseException<microsoft.codespaces.models.CodespacesPlanErrorResponseException>`
"""
# Construct URL
url = self.read_all_codespaces_action.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'planName': self._serialize.url("plan_name", plan_name, 'str', pattern=r'^[a-zA-Z0-9]')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
if expiration is not None:
query_parameters['expiration'] = self._serialize.query("expiration", expiration, 'long')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.post(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
raise models.CodespacesPlanErrorResponseException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('CodespacesPlanAccessToken', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
read_all_codespaces_action.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Codespaces/plans/{planName}/readAllCodespaces'}
def write_codespaces_action(
self, resource_group_name, plan_name, expiration=None, custom_headers=None, raw=False, **operation_config):
"""Get Codespaces Plan write codespaces access token.
Get Codespaces Plan access token which allows creating, updating,
deleting and connecting to codespaces owned by the user.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param plan_name: Name of the Codespaces Plan
:type plan_name: str
:param expiration: The requested expiration of a Codespaces Plan
access token.
:type expiration: long
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: CodespacesPlanAccessToken or ClientRawResponse if raw=true
:rtype: ~microsoft.codespaces.models.CodespacesPlanAccessToken or
~msrest.pipeline.ClientRawResponse
:raises:
:class:`CodespacesPlanErrorResponseException<microsoft.codespaces.models.CodespacesPlanErrorResponseException>`
"""
# Construct URL
url = self.write_codespaces_action.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'planName': self._serialize.url("plan_name", plan_name, 'str', pattern=r'^[a-zA-Z0-9]')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
if expiration is not None:
query_parameters['expiration'] = self._serialize.query("expiration", expiration, 'long')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.post(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
raise models.CodespacesPlanErrorResponseException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('CodespacesPlanAccessToken', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
write_codespaces_action.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Codespaces/plans/{planName}/writeCodespaces'}
def delete_all_codespaces_action(
self, resource_group_name, plan_name, expiration=None, custom_headers=None, raw=False, **operation_config):
"""Get Codespaces Plan read and delete codespaces access token.
Get Codespaces Plan access token which allows reading and deleting all
codespaces in the Plan.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param plan_name: Name of the Codespaces Plan
:type plan_name: str
:param expiration: The requested expiration of a Codespaces Plan
access token.
:type expiration: long
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: CodespacesPlanAccessToken or ClientRawResponse if raw=true
:rtype: ~microsoft.codespaces.models.CodespacesPlanAccessToken or
~msrest.pipeline.ClientRawResponse
:raises:
:class:`CodespacesPlanErrorResponseException<microsoft.codespaces.models.CodespacesPlanErrorResponseException>`
"""
# Construct URL
url = self.delete_all_codespaces_action.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'planName': self._serialize.url("plan_name", plan_name, 'str', pattern=r'^[a-zA-Z0-9]')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
if expiration is not None:
query_parameters['expiration'] = self._serialize.query("expiration", expiration, 'long')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.post(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
raise models.CodespacesPlanErrorResponseException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('CodespacesPlanAccessToken', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
delete_all_codespaces_action.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Codespaces/plans/{planName}/deleteAllCodespaces'}
def write_delegates_action(
self, resource_group_name, plan_name, delegate_request, custom_headers=None, raw=False, **operation_config):
"""Get Codespaces Plan delegated write codespaces access token.
Get Codespaces Plan delegated access token which allows creating,
updating, deleting and connecting to codespaces owned by the user.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param plan_name: Name of the Codespaces Plan
:type plan_name: str
:param delegate_request: Codespaces Plan delegate access token
parameters.
:type delegate_request:
~microsoft.codespaces.models.CodespacesDelegateAccessTokenRequestBody
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: CodespacesPlanAccessToken or ClientRawResponse if raw=true
:rtype: ~microsoft.codespaces.models.CodespacesPlanAccessToken or
~msrest.pipeline.ClientRawResponse
:raises:
:class:`CodespacesPlanErrorResponseException<microsoft.codespaces.models.CodespacesPlanErrorResponseException>`
"""
# Construct URL
url = self.write_delegates_action.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'planName': self._serialize.url("plan_name", plan_name, 'str', pattern=r'^[a-zA-Z0-9]')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(delegate_request, 'CodespacesDelegateAccessTokenRequestBody')
# Construct and send request
request = self._client.post(url, query_parameters, header_parameters, body_content)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
raise models.CodespacesPlanErrorResponseException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('CodespacesPlanAccessToken', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
write_delegates_action.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Codespaces/plans/{planName}/writeDelegates'}
def list_by_resource_group(
self, resource_group_name, custom_headers=None, raw=False, **operation_config):
"""Retrieves information about all Codespaces Plan resources under the
given subscription and resource group.
Retrieves the properties of all Codespaces Plans.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: An iterator like instance of CodespacesPlan
:rtype:
~microsoft.codespaces.models.CodespacesPlanPaged[~microsoft.codespaces.models.CodespacesPlan]
:raises:
:class:`CodespacesPlanErrorResponseException<microsoft.codespaces.models.CodespacesPlanErrorResponseException>`
"""
def internal_paging(next_link=None, raw=False):
if not next_link:
# Construct URL
url = self.list_by_resource_group.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1)
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
else:
url = next_link
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
raise models.CodespacesPlanErrorResponseException(self._deserialize, response)
return response
# Deserialize response
deserialized = models.CodespacesPlanPaged(internal_paging, self._deserialize.dependencies)
if raw:
header_dict = {}
client_raw_response = models.CodespacesPlanPaged(internal_paging, self._deserialize.dependencies, header_dict)
return client_raw_response
return deserialized
list_by_resource_group.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Codespaces/plans'}
def list_by_subscription(
self, custom_headers=None, raw=False, **operation_config):
"""Retrieves information about all Codespaces Plan resources under the
given subscription.
Retrieves the properties of all Codespaces Plans.
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: An iterator like instance of CodespacesPlan
:rtype:
~microsoft.codespaces.models.CodespacesPlanPaged[~microsoft.codespaces.models.CodespacesPlan]
:raises:
:class:`CodespacesPlanErrorResponseException<microsoft.codespaces.models.CodespacesPlanErrorResponseException>`
"""
def internal_paging(next_link=None, raw=False):
if not next_link:
# Construct URL
url = self.list_by_subscription.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
else:
url = next_link
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
raise models.CodespacesPlanErrorResponseException(self._deserialize, response)
return response
# Deserialize response
deserialized = models.CodespacesPlanPaged(internal_paging, self._deserialize.dependencies)
if raw:
header_dict = {}
client_raw_response = models.CodespacesPlanPaged(internal_paging, self._deserialize.dependencies, header_dict)
return client_raw_response
return deserialized
list_by_subscription.metadata = {'url': '/subscriptions/{subscriptionId}/providers/Microsoft.Codespaces/plans'}
| 48.640845 | 189 | 0.685015 | 3,620 | 34,535 | 6.335359 | 0.059116 | 0.043255 | 0.033357 | 0.031394 | 0.918811 | 0.912532 | 0.906645 | 0.894 | 0.886588 | 0.886588 | 0 | 0.004725 | 0.221746 | 34,535 | 709 | 190 | 48.70945 | 0.848569 | 0.301404 | 0 | 0.8 | 0 | 0 | 0.17136 | 0.092332 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038806 | false | 0 | 0.008955 | 0 | 0.116418 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7a8b4e269b5132aa27c1dc62077765375b8583a6 | 13,557 | py | Python | register/views.py | mejanvijay/Field-Management-Portal | 360357aaf519b87baee1931ce60b2f39df02e6eb | [
"MIT"
] | null | null | null | register/views.py | mejanvijay/Field-Management-Portal | 360357aaf519b87baee1931ce60b2f39df02e6eb | [
"MIT"
] | null | null | null | register/views.py | mejanvijay/Field-Management-Portal | 360357aaf519b87baee1931ce60b2f39df02e6eb | [
"MIT"
] | null | null | null | from django.shortcuts import render
from django.views.decorators.csrf import csrf_protect
from django.http import HttpResponseRedirect
from django.http import HttpResponse
from django.contrib.auth.decorators import login_required
from django.contrib.auth import logout
from django.db import connection
from FieldManagement import validations
from FieldManagement.validations import exists,validateEmail,validateFloatingNumber,isApproved
# Create your views here.
@csrf_protect
def student_registration(request) :
if request.method == 'POST' : # if its a post request
# getting data out of form
StudID = request.POST.get('StudID')
FirstName = request.POST.get('FirstName')
MiddleName = request.POST.get('MiddleName')
LastName = request.POST.get('LastName')
DeptID = request.POST.get('DeptID')
Sex = request.POST.get('Sex')
MobileNo = request.POST.get('MobileNo')
Course = request.POST.get('Course')
ProfID = request.POST.get('ProfID')
EmailID = request.POST.get('EmailID')
Password = request.POST.get('Password')
RePassword = request.POST.get('RePassword')
isValidateData = True
isStudIDValidate = (StudID != None ) and (not exists(StudID,"StudID","Advisor")) and (not exists(StudID,"StudID","Student")) and (StudID[0:2] == "ST") and validations.validateNumString(StudID[2:12]) and (len(StudID)==12)
isValidateData &= isStudIDValidate
isFirstNameValidate = (FirstName != None ) and validations.validateCharString(FirstName)
isValidateData &= isFirstNameValidate
isMiddleNameValidate = validations.validateCharString(MiddleName) or (MiddleName=="")
isValidateData &= isMiddleNameValidate
isLastNameValidate = validations.validateCharString(FirstName) or (LastName=="")
isValidateData &= isLastNameValidate
isDeptIDValidate = (DeptID != None ) and (exists(DeptID,"DeptID","Department",True)) and validations.validateNumString(DeptID)
isValidateData &= isDeptIDValidate
isMobileNoValid = (MobileNo != None ) and validations.validateNumString(MobileNo) and MobileNo[0]!="0" and (len(MobileNo) == 10) and (not exists(MobileNo,"MobileNo","Student"))
isValidateData &= isMobileNoValid
isEmailIDValid = validateEmail(EmailID) and (not exists(EmailID,"EmailID","Student"))
isValidateData &= isEmailIDValid
isProfIDValid = (ProfID != None) and exists(ProfID,"ProfID","Professor") and isApproved(ProfID,"ProfID","ProfApproved")
isValidateData &= isProfIDValid
isPasswordValid = (Password != None) and (len(Password)>=8 and len(Password)<=16)
isValidateData &= isPasswordValid
isRePasswordValid = (Password==RePassword)
isValidateData &= isRePasswordValid
if isValidateData :
with connection.cursor() as cursor :
try :
if MiddleName != "" and LastName != "" :
cursor.execute("INSERT INTO Student(StudID,FirstName,MiddleName,LastName,DeptID,Sex,MobileNo,Course,EmailID,Password) VALUES ('%s','%s','%s','%s',%s,'%s','%s','%s','%s','%s');" % (StudID,FirstName,MiddleName,LastName,DeptID,Sex,MobileNo,Course,EmailID,Password))
elif MiddleName == "" and LastName != "" :
cursor.execute("INSERT INTO Student(StudID,FirstName,LastName,DeptID,Sex,MobileNo,Course,EmailID,Password) VALUES ('%s','%s','%s',%s,'%s','%s','%s','%s','%s');" % (StudID,FirstName,LastName,DeptID,Sex,MobileNo,Course,EmailID,Password))
elif MiddleName != "" and LastName == "" :
cursor.execute("INSERT INTO Student(StudID,FirstName,MiddleName,DeptID,Sex,MobileNo,Course,EmailID,Password) VALUES ('%s','%s','%s',%s,'%s','%s','%s','%s','%s');" % (StudID,FirstName,MiddleName,DeptID,Sex,MobileNo,Course,EmailID,Password))
else :
cursor.execute("INSERT INTO Student(StudID,FirstName,DeptID,Sex,MobileNo,Course,EmailID,Password) VALUES ('%s','%s',%s,'%s','%s','%s','%s','%s');" % (StudID,FirstName,DeptID,Sex,MobileNo,Course,EmailID,Password))
cursor.execute("INSERT INTO Advisor VALUES('%s','%s',%s);" % (ProfID,StudID,0))
except Exception, e:
context = { 'DBerror' : str(e) ,}
return render(request,"student_register.html",context)
else :
context = { 'RegSuccess' : True ,}
return render(request,"home.html",context)
else :
context = {'isProfIDValid' : not isProfIDValid ,'isStudIDValidate' : not isStudIDValidate, 'isFirstNameValidate' : not isFirstNameValidate, 'isMiddleNameValidate' : not isMiddleNameValidate, 'isLastNameValidate' : not isLastNameValidate, 'isDeptIDValidate' : not isDeptIDValidate,'isMobileNoValid' : not isMobileNoValid, 'isEmailIDValid' : not isEmailIDValid, 'isPasswordValid' : not isPasswordValid, 'isRePasswordValid' : not isRePasswordValid}
return render(request,"student_register.html",context)
else : # if its a get request
return render(request,"student_register.html",{})
@csrf_protect
def professor_registration(request) :
if request.method == 'POST' : # if its a post request
# getting data out of form
ProfID = request.POST.get('ProfID')
FirstName = request.POST.get('FirstName')
MiddleName = request.POST.get('MiddleName')
LastName = request.POST.get('LastName')
DeptID = request.POST.get('DeptID')
Sex = request.POST.get('Sex')
MobileNo = request.POST.get('MobileNo')
Designation = request.POST.get('Designation')
AdminID = request.POST.get('AdminID')
EmailID = request.POST.get('EmailID')
Password = request.POST.get('Password')
RePassword = request.POST.get('RePassword')
isValidateData = True
isProfIDValidate = (ProfID != None ) and (not exists(ProfID,"ProfID","ProfApproved")) and (not exists(ProfID,"ProfID","Professor")) and (ProfID[0:2] == "PF") and validations.validateNumString(ProfID[2:12]) and (len(ProfID)==12)
isValidateData &= isProfIDValidate
isFirstNameValidate = (FirstName != None ) and validations.validateCharString(FirstName)
isValidateData &= isFirstNameValidate
isMiddleNameValidate = validations.validateCharString(MiddleName) or (MiddleName=="")
isValidateData &= isMiddleNameValidate
isLastNameValidate = validations.validateCharString(FirstName) or (LastName=="")
isValidateData &= isLastNameValidate
isDeptIDValidate = (DeptID != None ) and (exists(DeptID,"DeptID","Department",True)) and validations.validateNumString(DeptID)
isValidateData &= isDeptIDValidate
isMobileNoValid = (MobileNo != None ) and validations.validateNumString(MobileNo) and MobileNo[0]!="0" and (len(MobileNo) == 10) and (not exists(MobileNo,"MobileNo","Professor"))
isValidateData &= isMobileNoValid
isEmailIDValid = validateEmail(EmailID) and (not exists(EmailID,"EmailID","Professor"))
isValidateData &= isEmailIDValid
isAdminIDValidate = (AdminID != None) and exists(AdminID,"AdminID","Admin")
isValidateData &= isAdminIDValidate
isPasswordValid = (Password != None) and (len(Password)>=8 and len(Password)<=16)
isValidateData &= isPasswordValid
print "Password : ",Password
print "Re-Password : ",RePassword
isRePasswordValid = (Password==RePassword)
isValidateData &= isRePasswordValid
if isValidateData :
with connection.cursor() as cursor :
try :
if MiddleName != "" and LastName != "" :
cursor.execute("INSERT INTO Professor(ProfID,FirstName,MiddleName,LastName,DeptID,Sex,MobileNo,Designation,EmailID,Password) VALUES ('%s','%s','%s','%s',%s,'%s','%s','%s','%s','%s');" % (ProfID,FirstName,MiddleName,LastName,DeptID,Sex,MobileNo,Designation,EmailID,Password))
elif MiddleName == "" and LastName != "" :
cursor.execute("INSERT INTO Professor(ProfID,FirstName,LastName,DeptID,Sex,MobileNo,Designation,EmailID,Password) VALUES ('%s','%s','%s',%s,'%s','%s','%s','%s','%s');" % (ProfID,FirstName,LastName,DeptID,Sex,MobileNo,Designation,EmailID,Password))
elif MiddleName != "" and LastName == "" :
cursor.execute("INSERT INTO Professor(ProfID,FirstName,MiddleName,DeptID,Sex,MobileNo,Designation,EmailID,Password) VALUES ('%s','%s','%s',%s,'%s','%s','%s','%s','%s');" % (ProfID,FirstName,MiddleName,DeptID,Sex,MobileNo,Designation,EmailID,Password))
else :
cursor.execute("INSERT INTO Professor(ProfID,FirstName,DeptID,Sex,MobileNo,Designation,EmailID,Password) VALUES ('%s','%s',%s,'%s','%s','%s','%s','%s');" % (ProfID,FirstName,DeptID,Sex,MobileNo,Designation,EmailID,Password))
cursor.execute("INSERT INTO ProfApproved VALUES('%s','%s',%s);" % (ProfID,AdminID,0))
except Exception, e:
context = { 'DBerror' : str(e) ,}
return render(request,"professor_register.html",context)
else :
context = { 'RegSuccess' : True ,}
return render(request,"home.html",context)
else :
context = {'isProfIDValidate' : not isProfIDValidate ,'isAdminIDValidate' : not isAdminIDValidate, 'isFirstNameValidate' : not isFirstNameValidate, 'isMiddleNameValidate' : not isMiddleNameValidate, 'isLastNameValidate' : not isLastNameValidate, 'isDeptIDValidate' : not isDeptIDValidate,'isMobileNoValid' : not isMobileNoValid, 'isEmailIDValid' : not isEmailIDValid, 'isPasswordValid' : not isPasswordValid, 'isRePasswordValid' : not isRePasswordValid}
return render(request,"professor_register.html",context)
else : # if its a get request
return render(request,"professor_register.html",{})
@csrf_protect
def plotmanager_registration(request) :
if request.method == 'POST' : # if its a post request
# getting data out of form
MangID = request.POST.get('MangID')
FirstName = request.POST.get('FirstName')
MiddleName = request.POST.get('MiddleName')
LastName = request.POST.get('LastName')
Sex = request.POST.get('Sex')
MobileNo = request.POST.get('MobileNo')
Experience = request.POST.get('Experience')
AdminID = request.POST.get('AdminID')
EmailID = request.POST.get('EmailID')
Password = request.POST.get('Password')
RePassword = request.POST.get('RePassword')
isValidateData = True
isMangIDValidate = (MangID != None ) and (not exists(MangID,"MangID","MangApproved")) and (not exists(MangID,"MangID","PlotManager")) and (MangID[0:2] == "PM") and validations.validateNumString(MangID[2:12]) and (len(MangID)==12)
isValidateData &= isMangIDValidate
isFirstNameValidate = (FirstName != None ) and validations.validateCharString(FirstName)
isValidateData &= isFirstNameValidate
isMiddleNameValidate = validations.validateCharString(MiddleName) or (MiddleName=="")
isValidateData &= isMiddleNameValidate
isLastNameValidate = validations.validateCharString(FirstName) or (LastName=="")
isValidateData &= isLastNameValidate
isMobileNoValid = (MobileNo != None ) and validations.validateNumString(MobileNo) and MobileNo[0]!="0" and (len(MobileNo) == 10) and (not exists(MobileNo,"MobileNo","PlotManager"))
isValidateData &= isMobileNoValid
isEmailIDValid = validateEmail(EmailID) and (not exists(EmailID,"EmailID","PlotManager"))
isValidateData &= isEmailIDValid
isExperienceValid = (Experience!=None) and validateFloatingNumber(Experience) and len(Experience)<=5 and (0.0<=float(Experience)<=50.0) and len(str((float(Experience)-int(float(Experience)))))==3
isValidateData &= isExperienceValid
isAdminIDValidate = (AdminID != None) and exists(AdminID,"AdminID","Admin")
isValidateData &= isAdminIDValidate
isPasswordValid = (Password != None) and (len(Password)>=8 and len(Password)<=16)
isValidateData &= isPasswordValid
# print "Password : ",Password
# print "Re-Password : ",RePassword
isRePasswordValid = (Password==RePassword)
isValidateData &= isRePasswordValid
if isValidateData :
with connection.cursor() as cursor :
try :
if MiddleName != "" and LastName != "" :
cursor.execute("INSERT INTO PlotManager(MangID,FirstName,MiddleName,LastName,Sex,MobileNo,Experience,EmailID,Password) VALUES ('%s','%s','%s','%s','%s','%s',%s,'%s','%s');" % (MangID,FirstName,MiddleName,LastName,Sex,MobileNo,Experience,EmailID,Password))
elif MiddleName == "" and LastName != "" :
cursor.execute("INSERT INTO PlotManager(MangID,FirstName,LastName,Sex,MobileNo,Experience,EmailID,Password) VALUES ('%s','%s','%s','%s','%s',%s,'%s','%s');" % (MangID,FirstName,LastName,Sex,MobileNo,Experience,EmailID,Password))
elif MiddleName != "" and LastName == "" :
cursor.execute("INSERT INTO PlotManager(MangID,FirstName,MiddleName,Sex,MobileNo,Experience,EmailID,Password) VALUES ('%s','%s','%s','%s','%s',%s,'%s','%s');" % (MangID,FirstName,MiddleName,Sex,MobileNo,Experience,EmailID,Password))
else :
cursor.execute("INSERT INTO PlotManager(MangID,FirstName,Sex,MobileNo,Experience,EmailID,Password) VALUES ('%s','%s','%s','%s',%s,'%s','%s');" % (MangID,FirstName,Sex,MobileNo,Experience,EmailID,Password))
cursor.execute("INSERT INTO MangApproved VALUES('%s','%s',%s);" % (MangID,AdminID,0))
except Exception, e:
context = { 'DBerror' : str(e) ,}
return render(request,"plotmanager_register.html",context)
else :
context = { 'RegSuccess' : True ,}
return render(request,"home.html",context)
else :
context = {'isExperienceValid' : not isExperienceValid,'isMangValidate' : not isMangIDValidate ,'isAdminIDValidate' : not isAdminIDValidate, 'isFirstNameValidate' : not isFirstNameValidate, 'isMiddleNameValidate' : not isMiddleNameValidate, 'isLastNameValidate' : not isLastNameValidate, 'isMobileNoValid' : not isMobileNoValid, 'isEmailIDValid' : not isEmailIDValid, 'isPasswordValid' : not isPasswordValid, 'isRePasswordValid' : not isRePasswordValid}
return render(request,"plotmanager_register.html",context)
else : # if its a get request
return render(request,"plotmanager_register.html",{})
| 55.790123 | 456 | 0.723538 | 1,441 | 13,557 | 6.795281 | 0.088827 | 0.020016 | 0.025429 | 0.027778 | 0.833333 | 0.803105 | 0.789624 | 0.773386 | 0.752859 | 0.743056 | 0 | 0.004372 | 0.122741 | 13,557 | 242 | 457 | 56.020661 | 0.818969 | 0.021539 | 0 | 0.700535 | 0 | 0.064171 | 0.245484 | 0.133777 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.187166 | 0.048128 | null | null | 0.010695 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 9 |
7aaae63539dae9c21c8fba3060a81d13c71e2827 | 233 | py | Python | tccli/services/cloudhsm/__init__.py | zyh911/tencentcloud-cli | dfc5dbd660d4c60d265921c4edc630091478fc41 | [
"Apache-2.0"
] | null | null | null | tccli/services/cloudhsm/__init__.py | zyh911/tencentcloud-cli | dfc5dbd660d4c60d265921c4edc630091478fc41 | [
"Apache-2.0"
] | null | null | null | tccli/services/cloudhsm/__init__.py | zyh911/tencentcloud-cli | dfc5dbd660d4c60d265921c4edc630091478fc41 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
from tccli.services.cloudhsm.cloudhsm_client import register_arg
from tccli.services.cloudhsm.cloudhsm_client import get_actions_info
from tccli.services.cloudhsm.cloudhsm_client import AVAILABLE_VERSION_LIST
| 46.6 | 74 | 0.849785 | 32 | 233 | 5.9375 | 0.53125 | 0.142105 | 0.268421 | 0.394737 | 0.710526 | 0.710526 | 0.710526 | 0 | 0 | 0 | 0 | 0.00463 | 0.072961 | 233 | 4 | 75 | 58.25 | 0.875 | 0.090129 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
8903f4999209c73d98bde6bde3ef4cbc6352a192 | 122 | py | Python | BQN/layers/dropout.py | umd-huang-lab/Bayesian-Quantized-Networks | eb56fa1cb142cf235dde9cec7badea86009c3fcb | [
"MIT"
] | null | null | null | BQN/layers/dropout.py | umd-huang-lab/Bayesian-Quantized-Networks | eb56fa1cb142cf235dde9cec7badea86009c3fcb | [
"MIT"
] | null | null | null | BQN/layers/dropout.py | umd-huang-lab/Bayesian-Quantized-Networks | eb56fa1cb142cf235dde9cec7badea86009c3fcb | [
"MIT"
] | null | null | null | import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.nn.modules.utils as utils
import math
| 12.2 | 38 | 0.786885 | 22 | 122 | 4.363636 | 0.409091 | 0.458333 | 0.40625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163934 | 122 | 9 | 39 | 13.555556 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
8f37c0c6f0b2de04dbc92173bb0e196d65624bc0 | 10,281 | py | Python | random_peptide_sampling/databases_stats/data_stats.py | IdoSpringer/TCR-PEP-Classification | 1cca1551ca71359239a5f5caea7f13ec01f4982b | [
"MIT"
] | 1 | 2019-04-30T12:31:44.000Z | 2019-04-30T12:31:44.000Z | random_peptide_sampling/databases_stats/data_stats.py | IdoSpringer/TCR-PEP-Classification | 1cca1551ca71359239a5f5caea7f13ec01f4982b | [
"MIT"
] | null | null | null | random_peptide_sampling/databases_stats/data_stats.py | IdoSpringer/TCR-PEP-Classification | 1cca1551ca71359239a5f5caea7f13ec01f4982b | [
"MIT"
] | null | null | null | import matplotlib.pyplot as plt
import csv
import numpy as np
weizmann = "Weizmann complete database.csv"
shugay = "Shugay complete database.tsv"
def tcr_per_peptide_w():
peptides = {}
with open(weizmann, 'r') as data:
next(data)
for line in data:
line = line.split(',')
print(line)
cdr_beta = line[2]
if cdr_beta == 'NA':
continue
peptide = line[12]
if peptide == 'NA':
continue
try:
peptides[peptide] += 1
except KeyError:
peptides[peptide] = 1
list = sorted(peptides, key=lambda k: peptides[k], reverse=True)
return list, peptides
def tcr_per_peptide_s():
peptides = {}
with open(shugay, 'r') as data:
next(data)
for line in data:
line = line.split('\t')
print(line)
cdr_type = line[1]
if cdr_type != "TRB":
continue
cdr_beta = line[2]
peptide = line[9]
try:
peptides[peptide] += 1
except KeyError:
peptides[peptide] = 1
list = sorted(peptides, key=lambda k: peptides[k], reverse=True)
return list, peptides
def tcr_per_peptide_u():
peptides = {}
with open(weizmann, 'r') as csv_file:
csv_reader = csv.reader(csv_file, delimiter=',')
next(csv_reader)
for line in csv_reader:
cdr_beta = line[2]
if cdr_beta == 'NA':
continue
peptide = line[12]
print(cdr_beta, peptide, 'w')
if peptide == 'NA':
continue
try:
peptides[peptide] += 1
except KeyError:
peptides[peptide] = 1
with open(shugay, 'r') as data:
next(data)
for line in data:
line = line.split('\t')
cdr_type = line[1]
if cdr_type != "TRB":
continue
cdr_beta = line[2]
peptide = line[9]
print(cdr_beta, peptide, 's')
try:
peptides[peptide] += 1
except KeyError:
peptides[peptide] = 1
list = sorted(peptides, key=lambda k: peptides[k], reverse=True)
return list, peptides
pass
# Get number of TCR per peptide
'''
# list, peptides = tcr_per_peptide_w()
# list, peptides = tcr_per_peptide_s()
# list, peptides = tcr_per_peptide_u()
with open("20tcr_per_peptide_union.csv", 'w+') as file:
file.write('"Number of TCR","Peptide"'+'\n')
for peptide in list[:20]:
file.write('"' + str(peptides[peptide]) + '"' + "," + '"'+peptide+'"' + '\n')
'''
def length_distribution_w():
tcr_len = {}
pep_len = {}
with open(weizmann, 'r') as data:
next(data)
for line in data:
line = line.split(',')
print(line)
tcr_beta = line[2]
if tcr_beta == 'NA':
continue
peptide = line[12]
if peptide == 'NA':
continue
try:
tcr_len[len(tcr_beta)] += 1
except KeyError:
tcr_len[len(tcr_beta)] = 1
try:
pep_len[len(peptide)] += 1
except KeyError:
pep_len[len(peptide)] = 1
lens_tcr = sorted(list(tcr_len.keys()))
lens_pep = sorted(list(pep_len.keys()))
num_len_tcr = [tcr_len[length] for length in lens_tcr]
num_len_pep = [pep_len[length] for length in lens_pep]
fig, ax = plt.subplots()
ax.bar(lens_tcr, num_len_tcr, color='SkyBlue', label='TCR')
ax.bar(lens_pep, num_len_pep, color='IndianRed', label='peptide')
ax.legend()
plt.xticks(lens_tcr)
plt.title("TCR and peptide length distribution, Weizmann database")
plt.show()
def length_distribution_s():
tcr_len = {}
pep_len = {}
with open(shugay, 'r') as data:
next(data)
for line in data:
line = line.split('\t')
print(line)
cdr_type = line[1]
if cdr_type != "TRB":
continue
tcr_beta = line[2]
peptide = line[9]
if len(peptide) > 26 or len(peptide) < 7:
continue
if len(tcr_beta) > 26 or len(tcr_beta) < 7:
continue
try:
tcr_len[len(tcr_beta)] += 1
except KeyError:
tcr_len[len(tcr_beta)] = 1
try:
pep_len[len(peptide)] += 1
except KeyError:
pep_len[len(peptide)] = 1
lens_tcr = sorted(list(tcr_len.keys()))
lens_pep = sorted(list(pep_len.keys()))
num_len_tcr = [tcr_len[length] for length in lens_tcr]
num_len_pep = [pep_len[length] for length in lens_pep]
fig, ax = plt.subplots()
ax.bar(lens_tcr, num_len_tcr, color='SkyBlue', label='TCR')
ax.bar(lens_pep, num_len_pep, color='IndianRed', label='peptide')
ax.legend()
plt.xticks(range(7, 27))
plt.title("TCR and peptide length distribution, Shugay database")
plt.show()
def length_distribution_u():
tcr_len = {}
pep_len = {}
with open(weizmann, 'r') as csv_file:
csv_reader = csv.reader(csv_file, delimiter=',')
next(csv_reader)
for line in csv_reader:
tcr_beta = line[2]
if tcr_beta == 'NA':
continue
peptide = line[12]
if peptide == 'NA':
continue
try:
tcr_len[len(tcr_beta)] += 1
except KeyError:
tcr_len[len(tcr_beta)] = 1
try:
pep_len[len(peptide)] += 1
except KeyError:
pep_len[len(peptide)] = 1
with open(shugay, 'r') as data:
next(data)
for line in data:
line = line.split('\t')
cdr_type = line[1]
if cdr_type != "TRB":
continue
tcr_beta = line[2]
peptide = line[9]
if len(peptide) > 26 or len(peptide) < 7:
continue
if len(tcr_beta) > 26 or len(tcr_beta) < 7:
continue
try:
tcr_len[len(tcr_beta)] += 1
except KeyError:
tcr_len[len(tcr_beta)] = 1
try:
pep_len[len(peptide)] += 1
except KeyError:
pep_len[len(peptide)] = 1
lens_tcr = sorted(list(tcr_len.keys()))
lens_pep = sorted(list(pep_len.keys()))
num_len_tcr = [tcr_len[length] for length in lens_tcr]
num_len_pep = [pep_len[length] for length in lens_pep]
fig, ax = plt.subplots()
ax.bar(lens_tcr, num_len_tcr, color='SkyBlue', label='TCR')
ax.bar(lens_pep, num_len_pep, color='IndianRed', label='peptide')
ax.legend()
plt.xticks(range(7, 21))
plt.title("TCR and peptide length distribution, Union database")
plt.show()
# length_distribution_w()
# length_distribution_s()
# length_distribution_u()
# list, peptides = tcr_per_peptide_w()
# list, peptides = tcr_per_peptide_s()
list, peptides = tcr_per_peptide_u()
print(list)
print(peptides)
print(len([pep for pep in peptides.keys()]))
print(len([pep for pep in peptides.keys() if peptides[pep] > 125]))
print(len([pep for pep in peptides.keys() if peptides[pep] > 600]))
def pep_and_tcr_per_disease_w():
diseases_tcr = {}
diseases_pep = {}
with open(weizmann, 'r') as data:
next(data)
for line in data:
line = line.split(',')
print(line)
cdr_beta = line[2]
if cdr_beta == 'NA':
continue
peptide = line[12]
if peptide == 'NA':
continue
disease = line[5]
try:
diseases_tcr[disease].append(cdr_beta)
except KeyError:
diseases_tcr[disease] = [cdr_beta]
try:
diseases_pep[disease].add(peptide)
except KeyError:
diseases_pep[disease] = set()
diseases_pep[disease].add(peptide)
list_tcr = sorted(diseases_tcr, key=lambda k: len(diseases_tcr[k]), reverse=True)
with open('disease_tcr_w.csv', 'w+') as file:
file.write('"Number of TCR", "Disease"'+'\n')
for disease in list_tcr[:20]:
file.write('"'+str(len(diseases_tcr[disease]))+'"'+","+disease+'\n')
print(disease ,len(diseases_tcr[disease]))
list_pep = sorted(diseases_pep, key=lambda k: len(diseases_pep[k]), reverse=True)
with open('disease_pep_w.csv', 'w+') as file:
file.write('"Number of peptides", "Disease"'+'\n')
for disease in list_pep[:15]:
file.write('"' + str(len(diseases_pep[disease])) + '"' + "," + disease + '\n')
print(disease, len(diseases_pep[disease]))
return diseases_tcr, diseases_pep
def pep_and_tcr_per_disease_s():
diseases_tcr = {}
diseases_pep = {}
with open(shugay, 'r') as data:
next(data)
for line in data:
line = line.split('\t')
print(line)
cdr_type = line[1]
if cdr_type != "TRB":
continue
cdr_beta = line[2]
peptide = line[9]
disease = line[11]
try:
diseases_tcr[disease].append(cdr_beta)
except KeyError:
diseases_tcr[disease] = [cdr_beta]
try:
diseases_pep[disease].add(peptide)
except KeyError:
diseases_pep[disease] = set()
diseases_pep[disease].add(peptide)
list_tcr = sorted(diseases_tcr, key=lambda k: len(diseases_tcr[k]), reverse=True)
for disease in list_tcr[:20]:
print(disease ,len(diseases_tcr[disease]))
list_pep = sorted(diseases_pep, key=lambda k: len(diseases_pep[k]), reverse=True)
for disease in list_pep[:15]:
print(disease, len(diseases_pep[disease]))
#print(list_tcr[:20])
return diseases_tcr, diseases_pep
#list = sorted(peptides, key=lambda k: peptides[k], reverse=True)
'''
# print(pep_and_tcr_per_disease_w())
# print(pep_and_tcr_per_disease_s())
''' | 32.028037 | 90 | 0.538761 | 1,279 | 10,281 | 4.145426 | 0.080532 | 0.023765 | 0.033949 | 0.016598 | 0.903055 | 0.883817 | 0.811958 | 0.773859 | 0.768012 | 0.746699 | 0 | 0.013973 | 0.338683 | 10,281 | 321 | 91 | 32.028037 | 0.765848 | 0.025192 | 0 | 0.867159 | 0 | 0 | 0.048399 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.02952 | false | 0.00369 | 0.01107 | 0 | 0.059041 | 0.062731 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8f40ea815a66b038135b7efa73f846da4e0f72a4 | 10,620 | py | Python | batch_export_addon.py | ziyeshanwai/blender_batch_render_addon | 2aaaa86a57b9e68b5a7ad81565378b338768a8a3 | [
"Apache-2.0"
] | null | null | null | batch_export_addon.py | ziyeshanwai/blender_batch_render_addon | 2aaaa86a57b9e68b5a7ad81565378b338768a8a3 | [
"Apache-2.0"
] | null | null | null | batch_export_addon.py | ziyeshanwai/blender_batch_render_addon | 2aaaa86a57b9e68b5a7ad81565378b338768a8a3 | [
"Apache-2.0"
] | null | null | null | bl_info = {
"name": "Batch Export",
"author": "wangliyou",
"version": (1, 2),
"blender": (2, 83, 0),
"location": "View3D > Toolbar > Batch Export",
"description": "batch export abc or video",
"warning": "",
"wiki_url": "",
"category": "Add Mesh",
}
import bpy
import os
#This is the Main Panel (Parent of Panel A and B)
class MAINUI(bpy.types.Panel):
bl_label = "Batch Export Tool"
bl_idname = "VIEW_PT_MainUI"
bl_space_type = 'VIEW_3D'
bl_region_type = 'UI'
bl_category = 'Batch Export'
def draw(self, context):
layout = self.layout
row = layout.row()
row.label(text= "Batch ExPORT TOOL", icon= 'OBJECT_ORIGIN')
row = layout.row()
row.operator("wm.batch_render", icon= 'CUBE', text= "batch render aifa animaition")
row = layout.row()
row.operator("wm.batch_render_bone_animation", icon= 'CUBE', text= "batch render bone animaition")
row = layout.row()
row.operator("wm.batch_export_abc", icon= 'CUBE', text= "batch export abc")
class WM_OT_batch_export_abc(bpy.types.Operator):
bl_label = "batch export abc box"
bl_idname = "wm.batch_export_abc"
input_dir = bpy.props.StringProperty(name= "input dir", default= "")
output_dir = bpy.props.StringProperty(name= "output dir:", default= "")
scale = bpy.props.FloatProperty(name="scale", description="scale", default=1.0, min=0.01, max=100.0)
def execute(self, context):
input_dir = self.input_dir
print("input_dir is {}".format(input_dir))
output_dir = self.output_dir
print("output_dir is {}".format(output_dir))
names = os.listdir(input_dir)
names = list(filter(lambda x: x.endswith(".fbx"), names))
number = 0
for name in names:
input_path = os.path.join(input_dir, name)
self.import_fbx(input_path)
output_path = os.path.join(output_dir, name[:-3]+'abc')
self.export(output_path)
print("-" * 100)
number += 1
print("{} export finish".format(name))
print("{}/{}".format(number, len(names)))
return {'FINISHED'}
def invoke(self, context, event):
return context.window_manager.invoke_props_dialog(self)
def set_frame_number(self):
start_number = bpy.data.objects['head_geo'].animation_data.action.frame_range[0]
end_number = bpy.data.objects['head_geo'].animation_data.action.frame_range[1]
bpy.context.scene.frame_start = start_number
bpy.context.scene.frame_end = end_number
def import_fbx(self, file):
bpy.ops.import_scene.fbx(filepath=file, global_scale=self.scale) # bas
bpy.context.selected_objects[0].name ='head_geo'
def export(self, output_path):
for i in range(1, len(bpy.data.objects['head_geo'].data.shape_keys.key_blocks)):
bpy.data.objects['head_geo'].data.shape_keys.key_blocks[i].slider_min = -5
bpy.data.objects['head_geo'].data.shape_keys.key_blocks[i].slider_max = 5
self.set_frame_number()
bpy.data.objects['head_geo'].select_set(True)
bpy.ops.wm.alembic_export(filepath=output_path, selected=True)
bpy.ops.object.delete(use_global=False)
print("export {}".format(output_path))
class WM_OT_batch_render(bpy.types.Operator):
bl_label = "batch export render box"
bl_idname = "wm.batch_render"
input_dir = bpy.props.StringProperty(name= "input dir", default= "")
output_dir = bpy.props.StringProperty(name= "output dir:", default= "")
def execute(self, context):
input_dir = self.input_dir
print("input_dir is {}".format(input_dir))
output_dir = self.output_dir
print("output_dir is {}".format(output_dir))
self.ini_render_settings()
names = os.listdir(input_dir)
names = list(filter(lambda x: x.endswith(".fbx"), names))
number = 0
for name in names:
input_path = os.path.join(input_dir, name)
self.import_fbx(input_path)
output_path = os.path.join(output_dir, name[:-3]+'mp4')
self.render_animation(output_path)
number += 1
print("-" * 100)
print("{} render finish".format(name))
print("{}/{}".format(number, len(names)))
return {'FINISHED'}
def invoke(self, context, event):
return context.window_manager.invoke_props_dialog(self)
def ini_render_settings(self):
print("initial render setting")
bpy.context.scene.unit_settings.scale_length = 0.01
bpy.context.scene.render.image_settings.file_format = 'FFMPEG'
bpy.context.scene.render.ffmpeg.format = 'MPEG4'
bpy.context.scene.render.ffmpeg.constant_rate_factor = 'MEDIUM'
bpy.context.scene.render.ffmpeg.codec = 'H264'
bpy.context.scene.render.fps = 60
def set_frame_number(self):
start_number = bpy.data.objects['head_geo'].animation_data.action.frame_range[0]
end_number = bpy.data.objects['head_geo'].animation_data.action.frame_range[1]
bpy.context.scene.frame_start = start_number
bpy.context.scene.frame_end = end_number
def adjust_view(self):
for area in bpy.context.screen.areas:
if area.type == "VIEW_3D":
break
for region in area.regions:
if region.type == "WINDOW":
break
space = area.spaces[0]
context = bpy.context.copy()
context['area'] = area
context['region'] = region
context['space_data'] = space
bpy.data.objects['head_geo'].select_set(True)
bpy.ops.view3d.view_selected(context)
bpy.ops.view3d.view_axis(context, type='FRONT')
#context['space_data'].overlay.show_overlays = False
def render_animation(self, output_path):
for i in range(1, len(bpy.data.objects['head_geo'].data.shape_keys.key_blocks)):
bpy.data.objects['head_geo'].data.shape_keys.key_blocks[i].slider_min = -5
bpy.data.objects['head_geo'].data.shape_keys.key_blocks[i].slider_max = 5
self.set_frame_number()
bpy.context.scene.render.filepath = output_path
self.adjust_view()
bpy.ops.render.opengl(animation=True)
bpy.data.objects['head_geo'].select_set(True)
bpy.ops.object.delete(use_global=False)
def import_fbx(self, file):
bpy.ops.import_scene.fbx(filepath=file, global_scale=0.01) # bas
bpy.context.selected_objects[0].name ='head_geo'
class WM_OT_batch_render_bone_animation(bpy.types.Operator):
bl_label = "batch export render box"
bl_idname = "wm.batch_render_bone_animation"
input_dir = bpy.props.StringProperty(name= "input dir", default= "")
output_dir = bpy.props.StringProperty(name= "output dir:", default= "")
def execute(self, context):
input_dir = self.input_dir
print("input_dir is {}".format(input_dir))
output_dir = self.output_dir
print("output_dir is {}".format(output_dir))
self.ini_render_settings()
names = os.listdir(input_dir)
names = list(filter(lambda x: x.endswith(".fbx"), names))
number = 0
for name in names:
input_path = os.path.join(input_dir, name)
self.import_fbx(input_path)
output_path = os.path.join(output_dir, name[:-3]+'mp4')
self.render_animation(output_path)
number += 1
print("-" * 100)
print("{} render finish".format(name))
print("{}/{}".format(number, len(names)))
return {'FINISHED'}
def invoke(self, context, event):
return context.window_manager.invoke_props_dialog(self)
def ini_render_settings(self):
print("initial render setting")
bpy.context.scene.unit_settings.scale_length = 0.01
bpy.context.scene.render.image_settings.file_format = 'FFMPEG'
bpy.context.scene.render.ffmpeg.format = 'MPEG4'
bpy.context.scene.render.ffmpeg.constant_rate_factor = 'MEDIUM'
bpy.context.scene.render.ffmpeg.codec = 'H264'
bpy.context.scene.render.fps = 60
def set_frame_number(self):
start_number = bpy.data.objects['head_geo_rig'].animation_data.action.frame_range[0]
end_number = bpy.data.objects['head_geo_rig'].animation_data.action.frame_range[1]
bpy.context.scene.frame_start = start_number
bpy.context.scene.frame_end = end_number
def adjust_view(self):
for area in bpy.context.screen.areas:
if area.type == "VIEW_3D":
break
for region in area.regions:
if region.type == "WINDOW":
break
space = area.spaces[0]
context = bpy.context.copy()
context['area'] = area
context['region'] = region
context['space_data'] = space
bpy.data.objects['head_geo_rig'].select_set(False)
bpy.data.objects['head_geo'].select_set(True)
bpy.ops.view3d.view_selected(context)
bpy.ops.view3d.view_axis(context, type='FRONT')
#context['space_data'].overlay.show_overlays = False
def render_animation(self, output_path):
self.set_frame_number()
bpy.context.scene.render.filepath = output_path
self.adjust_view()
bpy.ops.render.opengl(animation=True)
bpy.data.objects['head_geo_rig'].select_set(True)
bpy.ops.object.delete(use_global=False)
def import_fbx(self, file):
bpy.ops.import_scene.fbx(filepath=file, global_scale=0.01) # bas
bpy.context.selected_objects[0].name ='head_geo_rig'
#Here we are Registering the Classes
def register():
bpy.utils.register_class(MAINUI)
bpy.utils.register_class(WM_OT_batch_render)
bpy.utils.register_class(WM_OT_batch_export_abc)
bpy.utils.register_class(WM_OT_batch_render_bone_animation)
#Here we are UnRegistering the Classes
def unregister():
bpy.utils.unregister_class(MAINUI)
bpy.utils.unregister_class(WM_OT_batch_render)
bpy.utils.unregister_class(WM_OT_batch_export_abc)
bpy.utils.unregister_class(WM_OT_batch_render_bone_animation)
#This is required in order for the script to run in the text editor
if __name__ == "__main__":
register() | 38.33935 | 106 | 0.632298 | 1,386 | 10,620 | 4.629149 | 0.136364 | 0.042082 | 0.046758 | 0.050499 | 0.860817 | 0.846789 | 0.843672 | 0.820293 | 0.763092 | 0.756858 | 0 | 0.010458 | 0.243691 | 10,620 | 277 | 107 | 38.33935 | 0.788347 | 0.029379 | 0 | 0.736111 | 0 | 0 | 0.111888 | 0.005828 | 0 | 0 | 0 | 0 | 0 | 1 | 0.101852 | false | 0 | 0.050926 | 0.013889 | 0.282407 | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8f723aba7350ac3627881419dc81423f234ecb32 | 9,849 | py | Python | python/tests/generated/errors/validation/test_value_error.py | eno-lang/enolib | 4175f7c1e8246493b6758c29bddc80d20eaf15f7 | [
"MIT"
] | 17 | 2019-04-15T21:03:37.000Z | 2022-01-24T11:03:34.000Z | python/tests/generated/errors/validation/test_value_error.py | eno-lang/enolib | 4175f7c1e8246493b6758c29bddc80d20eaf15f7 | [
"MIT"
] | 20 | 2019-03-13T23:23:40.000Z | 2022-03-29T13:40:57.000Z | python/tests/generated/errors/validation/test_value_error.py | eno-lang/enolib | 4175f7c1e8246493b6758c29bddc80d20eaf15f7 | [
"MIT"
] | 4 | 2019-04-15T21:18:03.000Z | 2019-09-21T16:18:10.000Z | import enolib
def test_querying_a_value_from_a_field_with_a_loader_that_always_produces_an_error_raises_the_expected_validationerror():
error = None
input = ("field: value")
try:
def loader(value):
raise ValueError(f"my error for '{value}'")
enolib.parse(input).field('field').required_value(loader)
except enolib.ValidationError as _error:
if isinstance(_error, enolib.ValidationError):
error = _error
else:
raise _error
assert type(error) is enolib.ValidationError
text = ("There is a problem with the value of this element: my error for 'value'")
assert error.text == text
snippet = (" Line | Content\n"
" > 1 | field: value")
assert error.snippet == snippet
assert error.selection['from']['line'] == 0
assert error.selection['from']['column'] == 7
assert error.selection['to']['line'] == 0
assert error.selection['to']['column'] == 12
def test_requesting_a_value_error_from_a_field_with_a_static_message_raises_the_expected_validationerror():
error = None
input = ("field: value")
try:
raise enolib.parse(input).field('field').value_error('my static message')
except enolib.ValidationError as _error:
if isinstance(_error, enolib.ValidationError):
error = _error
else:
raise _error
assert type(error) is enolib.ValidationError
text = ("There is a problem with the value of this element: my static message")
assert error.text == text
snippet = (" Line | Content\n"
" > 1 | field: value")
assert error.snippet == snippet
assert error.selection['from']['line'] == 0
assert error.selection['from']['column'] == 7
assert error.selection['to']['line'] == 0
assert error.selection['to']['column'] == 12
def test_requesting_a_value_error_from_a_field_with_a_dynamically_generated_message_raises_the_expected_validationerror():
error = None
input = ("field: value")
try:
raise enolib.parse(input).field('field').value_error(lambda value: f"my generated message for '{value}'")
except enolib.ValidationError as _error:
if isinstance(_error, enolib.ValidationError):
error = _error
else:
raise _error
assert type(error) is enolib.ValidationError
text = ("There is a problem with the value of this element: my generated message for 'value'")
assert error.text == text
snippet = (" Line | Content\n"
" > 1 | field: value")
assert error.snippet == snippet
assert error.selection['from']['line'] == 0
assert error.selection['from']['column'] == 7
assert error.selection['to']['line'] == 0
assert error.selection['to']['column'] == 12
def test_requesting_a_value_error_from_a_multiline_field_with_a_static_message_raises_the_expected_validationerror():
error = None
input = ("-- multiline_field\n"
"value\n"
"-- multiline_field")
try:
raise enolib.parse(input).field('multiline_field').value_error('my static message')
except enolib.ValidationError as _error:
if isinstance(_error, enolib.ValidationError):
error = _error
else:
raise _error
assert type(error) is enolib.ValidationError
text = ("There is a problem with the value of this element: my static message")
assert error.text == text
snippet = (" Line | Content\n"
" 1 | -- multiline_field\n"
" > 2 | value\n"
" 3 | -- multiline_field")
assert error.snippet == snippet
assert error.selection['from']['line'] == 1
assert error.selection['from']['column'] == 0
assert error.selection['to']['line'] == 1
assert error.selection['to']['column'] == 5
def test_requesting_a_value_error_from_a_multiline_field_with_a_dynamically_generated_message_raises_the_expected_validationerror():
error = None
input = ("-- multiline_field\n"
"value\n"
"-- multiline_field")
try:
raise enolib.parse(input).field('multiline_field').value_error(lambda value: f"my generated message for '{value}'")
except enolib.ValidationError as _error:
if isinstance(_error, enolib.ValidationError):
error = _error
else:
raise _error
assert type(error) is enolib.ValidationError
text = ("There is a problem with the value of this element: my generated message for 'value'")
assert error.text == text
snippet = (" Line | Content\n"
" 1 | -- multiline_field\n"
" > 2 | value\n"
" 3 | -- multiline_field")
assert error.snippet == snippet
assert error.selection['from']['line'] == 1
assert error.selection['from']['column'] == 0
assert error.selection['to']['line'] == 1
assert error.selection['to']['column'] == 5
def test_requesting_a_value_error_from_an_empty_multiline_field_with_a_static_message_raises_the_expected_validationerror():
error = None
input = ("-- multiline_field\n"
"-- multiline_field")
try:
raise enolib.parse(input).field('multiline_field').value_error('my static message')
except enolib.ValidationError as _error:
if isinstance(_error, enolib.ValidationError):
error = _error
else:
raise _error
assert type(error) is enolib.ValidationError
text = ("There is a problem with the value of this element: my static message")
assert error.text == text
snippet = (" Line | Content\n"
" > 1 | -- multiline_field\n"
" * 2 | -- multiline_field")
assert error.snippet == snippet
assert error.selection['from']['line'] == 0
assert error.selection['from']['column'] == 18
assert error.selection['to']['line'] == 0
assert error.selection['to']['column'] == 18
def test_requesting_a_value_error_from_an_empty_multiline_field_with_a_dynamically_generated_message_raises_the_expected_validationerror():
error = None
input = ("-- multiline_field\n"
"-- multiline_field")
try:
raise enolib.parse(input).field('multiline_field').value_error(lambda _value: f"my generated message")
except enolib.ValidationError as _error:
if isinstance(_error, enolib.ValidationError):
error = _error
else:
raise _error
assert type(error) is enolib.ValidationError
text = ("There is a problem with the value of this element: my generated message")
assert error.text == text
snippet = (" Line | Content\n"
" > 1 | -- multiline_field\n"
" * 2 | -- multiline_field")
assert error.snippet == snippet
assert error.selection['from']['line'] == 0
assert error.selection['from']['column'] == 18
assert error.selection['to']['line'] == 0
assert error.selection['to']['column'] == 18
def test_requesting_a_value_error_from_a_field_with_continuations_with_a_static_message_raises_the_expected_validationerror():
error = None
input = ("field: value\n"
"\\ continuation\n"
"\\ continuation\n"
"|\n"
"\n"
"|")
try:
raise enolib.parse(input).field('field').value_error('my static message')
except enolib.ValidationError as _error:
if isinstance(_error, enolib.ValidationError):
error = _error
else:
raise _error
assert type(error) is enolib.ValidationError
text = ("There is a problem with the value of this element: my static message")
assert error.text == text
snippet = (" Line | Content\n"
" > 1 | field: value\n"
" * 2 | \\ continuation\n"
" * 3 | \\ continuation\n"
" * 4 | |\n"
" * 5 | \n"
" * 6 | |")
assert error.snippet == snippet
assert error.selection['from']['line'] == 0
assert error.selection['from']['column'] == 7
assert error.selection['to']['line'] == 5
assert error.selection['to']['column'] == 1
def test_requesting_a_value_error_from_a_field_with_continuations_with_a_dynamically_generated_message_raises_the_expected_validationerror():
error = None
input = ("field: value\n"
"\\ continuation\n"
"\\ continuation\n"
"|\n"
"\n"
"|")
try:
raise enolib.parse(input).field('field').value_error(lambda value: f"my generated message for '{value}'")
except enolib.ValidationError as _error:
if isinstance(_error, enolib.ValidationError):
error = _error
else:
raise _error
assert type(error) is enolib.ValidationError
text = ("There is a problem with the value of this element: my generated message for 'value continuation continuation'")
assert error.text == text
snippet = (" Line | Content\n"
" > 1 | field: value\n"
" * 2 | \\ continuation\n"
" * 3 | \\ continuation\n"
" * 4 | |\n"
" * 5 | \n"
" * 6 | |")
assert error.snippet == snippet
assert error.selection['from']['line'] == 0
assert error.selection['from']['column'] == 7
assert error.selection['to']['line'] == 5
assert error.selection['to']['column'] == 1 | 33.161616 | 141 | 0.598436 | 1,105 | 9,849 | 5.136652 | 0.064253 | 0.104651 | 0.12685 | 0.07611 | 0.97234 | 0.965116 | 0.965116 | 0.965116 | 0.965116 | 0.965116 | 0 | 0.009666 | 0.285714 | 9,849 | 297 | 142 | 33.161616 | 0.797157 | 0 | 0 | 0.916279 | 0 | 0 | 0.235025 | 0 | 0 | 0 | 0 | 0 | 0.293023 | 1 | 0.046512 | false | 0 | 0.004651 | 0 | 0.051163 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
56af7513389474ae14ddead1ed07c8bf9d7f5504 | 13,251 | py | Python | tests/integration/test_zookeeper_config_load_balancing/test.py | chalice19/ClickHouse | 2f38e7bc5c2113935ab86260439bb543a1737291 | [
"Apache-2.0"
] | 8,629 | 2016-06-14T21:03:01.000Z | 2019-09-23T07:46:38.000Z | tests/integration/test_zookeeper_config_load_balancing/test.py | chalice19/ClickHouse | 2f38e7bc5c2113935ab86260439bb543a1737291 | [
"Apache-2.0"
] | 4,335 | 2016-06-15T12:58:31.000Z | 2019-09-23T11:18:43.000Z | tests/integration/test_zookeeper_config_load_balancing/test.py | chalice19/ClickHouse | 2f38e7bc5c2113935ab86260439bb543a1737291 | [
"Apache-2.0"
] | 1,700 | 2016-06-15T09:25:11.000Z | 2019-09-23T11:16:38.000Z | import pytest
from helpers.cluster import ClickHouseCluster
from helpers.network import PartitionManager
cluster = ClickHouseCluster(
__file__, zookeeper_config_path="configs/zookeeper_load_balancing.xml"
)
# use 3-letter hostnames, so getHostNameDifference("nod1", "zoo1") will work as expected
node1 = cluster.add_instance(
"nod1", with_zookeeper=True, main_configs=["configs/zookeeper_load_balancing.xml"]
)
node2 = cluster.add_instance(
"nod2", with_zookeeper=True, main_configs=["configs/zookeeper_load_balancing.xml"]
)
node3 = cluster.add_instance(
"nod3", with_zookeeper=True, main_configs=["configs/zookeeper_load_balancing.xml"]
)
def change_balancing(old, new, reload=True):
line = "<zookeeper_load_balancing>{}<"
old_line = line.format(old)
new_line = line.format(new)
for node in [node1, node2, node3]:
node.replace_in_config(
"/etc/clickhouse-server/config.d/zookeeper_load_balancing.xml",
old_line,
new_line,
)
if reload:
node.query("select '{}', '{}'".format(old, new))
node.query("system reload config")
@pytest.fixture(scope="module")
def started_cluster():
try:
cluster.start()
yield cluster
finally:
cluster.shutdown()
def test_first_or_random(started_cluster):
try:
change_balancing("random", "first_or_random")
print(
str(
node1.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep ':2181' | grep ESTABLISHED",
],
privileged=True,
user="root",
)
)
)
assert (
"1"
== str(
node1.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep 'testzookeeperconfigloadbalancing_zoo1_1.*testzookeeperconfigloadbalancing_default:2181' | grep ESTABLISHED | wc -l",
],
privileged=True,
user="root",
)
).strip()
)
print(
str(
node2.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep ':2181' | grep ESTABLISHED",
],
privileged=True,
user="root",
)
)
)
assert (
"1"
== str(
node2.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep 'testzookeeperconfigloadbalancing_zoo1_1.*testzookeeperconfigloadbalancing_default:2181' | grep ESTABLISHED | wc -l",
],
privileged=True,
user="root",
)
).strip()
)
print(
str(
node3.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep ':2181' | grep ESTABLISHED",
],
privileged=True,
user="root",
)
)
)
assert (
"1"
== str(
node3.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep 'testzookeeperconfigloadbalancing_zoo1_1.*testzookeeperconfigloadbalancing_default:2181' | grep ESTABLISHED | wc -l",
],
privileged=True,
user="root",
)
).strip()
)
finally:
change_balancing("first_or_random", "random", reload=False)
def test_in_order(started_cluster):
try:
change_balancing("random", "in_order")
print(
str(
node1.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep ':2181' | grep ESTABLISHED",
],
privileged=True,
user="root",
)
)
)
assert (
"1"
== str(
node1.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep 'testzookeeperconfigloadbalancing_zoo1_1.*testzookeeperconfigloadbalancing_default:2181' | grep ESTABLISHED | wc -l",
],
privileged=True,
user="root",
)
).strip()
)
print(
str(
node2.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep ':2181' | grep ESTABLISHED",
],
privileged=True,
user="root",
)
)
)
assert (
"1"
== str(
node2.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep 'testzookeeperconfigloadbalancing_zoo1_1.*testzookeeperconfigloadbalancing_default:2181' | grep ESTABLISHED | wc -l",
],
privileged=True,
user="root",
)
).strip()
)
print(
str(
node3.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep ':2181' | grep ESTABLISHED",
],
privileged=True,
user="root",
)
)
)
assert (
"1"
== str(
node3.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep 'testzookeeperconfigloadbalancing_zoo1_1.*testzookeeperconfigloadbalancing_default:2181' | grep ESTABLISHED | wc -l",
],
privileged=True,
user="root",
)
).strip()
)
finally:
change_balancing("in_order", "random", reload=False)
def test_nearest_hostname(started_cluster):
try:
change_balancing("random", "nearest_hostname")
print(
str(
node1.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep ':2181' | grep ESTABLISHED",
],
privileged=True,
user="root",
)
)
)
assert (
"1"
== str(
node1.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep 'testzookeeperconfigloadbalancing_zoo1_1.*testzookeeperconfigloadbalancing_default:2181' | grep ESTABLISHED | wc -l",
],
privileged=True,
user="root",
)
).strip()
)
print(
str(
node2.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep ':2181' | grep ESTABLISHED",
],
privileged=True,
user="root",
)
)
)
assert (
"1"
== str(
node2.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep 'testzookeeperconfigloadbalancing_zoo2_1.*testzookeeperconfigloadbalancing_default:2181' | grep ESTABLISHED | wc -l",
],
privileged=True,
user="root",
)
).strip()
)
print(
str(
node3.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep ':2181' | grep ESTABLISHED",
],
privileged=True,
user="root",
)
)
)
assert (
"1"
== str(
node3.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep 'testzookeeperconfigloadbalancing_zoo3_1.*testzookeeperconfigloadbalancing_default:2181' | grep ESTABLISHED | wc -l",
],
privileged=True,
user="root",
)
).strip()
)
finally:
change_balancing("nearest_hostname", "random", reload=False)
def test_round_robin(started_cluster):
pm = PartitionManager()
try:
pm._add_rule(
{
"source": node1.ip_address,
"destination": cluster.get_instance_ip("zoo1"),
"action": "REJECT --reject-with tcp-reset",
}
)
pm._add_rule(
{
"source": node2.ip_address,
"destination": cluster.get_instance_ip("zoo1"),
"action": "REJECT --reject-with tcp-reset",
}
)
pm._add_rule(
{
"source": node3.ip_address,
"destination": cluster.get_instance_ip("zoo1"),
"action": "REJECT --reject-with tcp-reset",
}
)
change_balancing("random", "round_robin")
print(
str(
node1.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep ':2181' | grep ESTABLISHED",
],
privileged=True,
user="root",
)
)
)
assert (
"1"
== str(
node1.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep 'testzookeeperconfigloadbalancing_zoo2_1.*testzookeeperconfigloadbalancing_default:2181' | grep ESTABLISHED | wc -l",
],
privileged=True,
user="root",
)
).strip()
)
print(
str(
node2.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep ':2181' | grep ESTABLISHED",
],
privileged=True,
user="root",
)
)
)
assert (
"1"
== str(
node2.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep 'testzookeeperconfigloadbalancing_zoo2_1.*testzookeeperconfigloadbalancing_default:2181' | grep ESTABLISHED | wc -l",
],
privileged=True,
user="root",
)
).strip()
)
print(
str(
node3.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep ':2181' | grep ESTABLISHED",
],
privileged=True,
user="root",
)
)
)
assert (
"1"
== str(
node3.exec_in_container(
[
"bash",
"-c",
"lsof -a -i4 -i6 -itcp -w | grep 'testzookeeperconfigloadbalancing_zoo2_1.*testzookeeperconfigloadbalancing_default:2181' | grep ESTABLISHED | wc -l",
],
privileged=True,
user="root",
)
).strip()
)
finally:
pm.heal_all()
change_balancing("round_robin", "random", reload=False)
| 30.96028 | 174 | 0.380877 | 933 | 13,251 | 5.231511 | 0.12433 | 0.029502 | 0.073755 | 0.093423 | 0.811104 | 0.789797 | 0.766441 | 0.766441 | 0.766441 | 0.766441 | 0 | 0.034834 | 0.519055 | 13,251 | 427 | 175 | 31.032787 | 0.731053 | 0.00649 | 0 | 0.641791 | 0 | 0.029851 | 0.252146 | 0.097926 | 0 | 0 | 0 | 0 | 0.029851 | 1 | 0.014925 | false | 0 | 0.007463 | 0 | 0.022388 | 0.029851 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
71090c8ca1aa16a4d926cc77f5b61862cfc42dd5 | 36,713 | py | Python | lasagne_wrapper/lasagne_wrapper/network.py | SCCH-KVS/NuclearSegmentationPipeline | 5b4a37b74890e0a6fb767061c60a9f2a880d370d | [
"MIT"
] | 13 | 2019-05-22T08:41:17.000Z | 2022-03-08T03:09:52.000Z | lasagne_wrapper/lasagne_wrapper/network.py | SCCH-KVS/NuclearSegmentationPipeline | 5b4a37b74890e0a6fb767061c60a9f2a880d370d | [
"MIT"
] | 15 | 2020-01-28T22:53:59.000Z | 2022-03-12T00:55:09.000Z | lasagne_wrapper/lasagne_wrapper/network.py | SCCH-KVS/NuclearSegmentationPipeline | 5b4a37b74890e0a6fb767061c60a9f2a880d370d | [
"MIT"
] | 3 | 2020-06-18T09:35:54.000Z | 2022-02-17T03:55:25.000Z |
from __future__ import print_function
import os
import sys
import time
import pickle
import itertools
import numpy as np
import theano
import lasagne
from lasagne.utils import floatX
from lasagne_wrapper.utils import BColors, print_net_architecture
import theano.tensor as T
from lasagne_wrapper.data_pool import DataPool
from lasagne_wrapper.batch_iterators import threaded_generator_from_iterator
class BaseNetwork_(object):
"""
Neural Network
"""
def __init__(self, net, print_architecture=True):
"""
Constructor
"""
self.net = net
self.compute_output = None
self.compute_output_dict = dict()
# get input shape of network
l_in = lasagne.layers.helper.get_all_layers(self.net)[0]
self.input_shape = l_in.output_shape
if print_architecture:
print_net_architecture(net)
def fit(self, data, training_strategy, dump_file=None, log_file=None):
""" Train model """
print("Training neural network...")
col = BColors()
# create data pool if raw data is given
if "X_train" in data:
data_pools = dict()
data_pools['train'] = DataPool(data['X_train'], data['y_train'])
data_pools['valid'] = DataPool(data['X_valid'], data['y_valid'])
else:
data_pools = data
# check if out_path exists
if dump_file is not None:
out_path = os.path.dirname(dump_file)
if out_path != '' and not os.path.exists(out_path):
os.mkdir(out_path)
# log model evolution
if log_file is not None:
out_path = os.path.dirname(log_file)
if out_path != '' and not os.path.exists(out_path):
os.mkdir(out_path)
# adaptive learning rate
learn_rate = training_strategy.ini_learning_rate
learning_rate = theano.shared(floatX(learn_rate))
learning_rate.set_value(training_strategy.adapt_learn_rate(training_strategy.ini_learning_rate, 0))
# initialize evaluation output
pred_tr_err, pred_val_err, overfitting = [], [], []
tr_accs, va_accs = [], []
print("Compiling theano train functions...")
iter_funcs = self._create_iter_functions(y_tensor_type=training_strategy.y_tensor_type,
objective=training_strategy.objective, learning_rate=learning_rate,
l_2=training_strategy.L2,
compute_updates=training_strategy.update_parameters,
use_weights=training_strategy.use_weights)
print("Starting training...")
now = time.time()
try:
# initialize early stopping
last_improvement = 0
best_model = lasagne.layers.get_all_param_values(self.net)
# iterate training epochs
prev_tr_loss, prev_va_loss = np.inf, np.inf
for epoch in self._train(iter_funcs, data_pools, training_strategy.build_train_batch_iterator(),
training_strategy.build_valid_batch_iterator()):
print("Epoch {} of {} took {:.3f}s".format(epoch['number'], training_strategy.max_epochs,
time.time() - now))
now = time.time()
# --- collect train output ---
tr_loss, va_loss = epoch['train_loss'], epoch['valid_loss']
overfit = epoch['overfitting']
# prepare early stopping
improvement = va_loss < prev_va_loss
if improvement:
last_improvement = 0
best_model = lasagne.layers.get_all_param_values(self.net)
best_epoch = epoch['number']
# dump net parameters during training
if dump_file is not None:
with open(dump_file, 'wb') as fp:
pickle.dump(best_model, fp)
last_improvement += 1
# print train output
txt_tr = 'costs_tr %.5f ' % tr_loss
if tr_loss < prev_tr_loss:
txt_tr = col.print_colored(txt_tr, BColors.OKGREEN)
prev_tr_loss = tr_loss
txt_val = 'costs_val %.5f ' % va_loss
if va_loss < prev_va_loss:
txt_val = col.print_colored(txt_val, BColors.OKGREEN)
prev_va_loss = va_loss
print(' lr: %.5f' % learn_rate)
print(' ' + txt_tr + txt_val + 'tr/val %.3f' % overfit)
# collect model evolution data
pred_tr_err.append(tr_loss)
pred_val_err.append(va_loss)
overfitting.append(overfit)
# save results
exp_res = dict()
exp_res['pred_tr_err'] = pred_tr_err
exp_res['pred_val_err'] = pred_val_err
exp_res['overfitting'] = overfitting
if log_file is not None:
with open(log_file, 'w') as fp:
pickle.dump(exp_res, fp)
# --- early stopping: preserve best model ---
if last_improvement > training_strategy.patience:
print(col.print_colored("Early Stopping!", BColors.WARNING))
status = "Epoch: %d, Best Validation Loss: %.5f" % (
best_epoch, prev_va_loss)
print(col.print_colored(status, BColors.WARNING))
break
# maximum number of epochs reached
if epoch['number'] >= training_strategy.max_epochs:
break
# update learning rate
learn_rate = training_strategy.adapt_learn_rate(learn_rate, epoch['number'])
learning_rate.set_value(learn_rate)
except KeyboardInterrupt:
pass
# set net to best weights
lasagne.layers.set_all_param_values(self.net, best_model)
def predict_proba(self, input):
"""
Predict on test samples
"""
# prepare input for prediction
if not isinstance(input, list):
input = [input]
# reshape to network input
if input[0].ndim < len(self.input_shape):
input[0] = input[0].reshape([1] + list(input[0].shape))
if self.compute_output is None:
self.compute_output = self._compile_prediction_function()
return self.compute_output(*input)
def predict(self, input):
"""
Predict class labels on test samples
"""
# prepare input for prediction
if not isinstance(input, list):
input = [input]
return np.argmax(self.predict_proba(*input), axis=-1)
def compute_layer_output(self, input, layer):
"""
Compute output of given layer
layer: either a string (name of layer) or a layer object
"""
# prepare input for prediction
if not isinstance(input, list):
input = [input]
# reshape to network input
if input[0].ndim < len(self.input_shape):
input[0] = input[0].reshape([1] + list(input[0].shape))
# get layer by name
if not isinstance(layer, lasagne.layers.Layer):
for l in lasagne.layers.helper.get_all_layers(self.net):
if l.name == layer:
layer = l
break
# compile prediction function for target layer
if layer not in self.compute_output_dict:
self.compute_output_dict[layer] = self._compile_prediction_function(target_layer=layer)
return self.compute_output_dict[layer](*input)
def save(self, file_path):
"""
Save model to disk
"""
with open(file_path, 'w') as fp:
params = lasagne.layers.get_all_param_values(self.net)
pickle.dump(params, fp, -1)
def load(self, file_path):
"""
load model from disk
"""
with open(file_path, 'r') as fp:
params = pickle.load(fp)
lasagne.layers.set_all_param_values(self.net, params)
def _compile_prediction_function(self, target_layer=None):
"""
Compile theano prediction function
"""
# collect input vars
all_layers = lasagne.layers.helper.get_all_layers(self.net)
input_vars = []
for l in all_layers:
if isinstance(l, lasagne.layers.InputLayer):
input_vars.append(l.input_var)
# get network output nad compile function
if target_layer is None:
target_layer = self.net
net_output = lasagne.layers.get_output(target_layer, deterministic=True)
return theano.function(inputs=input_vars, outputs=net_output)
def _create_iter_functions(self, y_tensor_type, objective, learning_rate, l_2, compute_updates, use_weights):
""" Create functions for training, validation and testing to iterate one epoch. """
# init target tensor
targets = y_tensor_type('y')
weights = y_tensor_type('w')
# get input layer
all_layers = lasagne.layers.helper.get_all_layers(self.net)
# collect input vars
input_vars = []
for l in all_layers:
if isinstance(l, lasagne.layers.InputLayer):
input_vars.append(l.input_var)
# compute train costs
tr_output = lasagne.layers.get_output(self.net, deterministic=False)
if use_weights:
tr_cost = objective(tr_output, targets, weights)
tr_input = input_vars + [targets, weights]
else:
tr_cost = objective(tr_output, targets)
tr_input = input_vars + [targets]
# regularizer for RNNs
for l in all_layers:
if l.name == "norm_reg_rnn":
H = lasagne.layers.get_output(l, deterministic=False)
H_l2 = T.sqrt(T.sum(H**2, axis=-1))
norm_diffs = (H_l2[:, 1:] - H_l2[:, :-1])**2
norm_preserving_loss = T.mean(norm_diffs)
beta = 1.0
tr_cost += beta * norm_preserving_loss
else:
pass
# compute validation costs
va_output = lasagne.layers.get_output(self.net, deterministic=True)
va_cost = objective(va_output, targets)
# collect all parameters of net and compute updates
all_params = lasagne.layers.get_all_params(self.net, trainable=True)
# add weight decay
if l_2 is not None:
all_layers = lasagne.layers.get_all_layers(self.net)
tr_cost += l_2 * lasagne.regularization.regularize_layer_params(all_layers, lasagne.regularization.l2)
# compute updates
all_grads = lasagne.updates.get_or_compute_grads(tr_cost, all_params)
updates = compute_updates(all_grads, all_params, learning_rate)
# compile iter functions
tr_outputs = [tr_cost, tr_output]
iter_train = theano.function(tr_input, tr_outputs, updates=updates)
va_inputs = input_vars + [targets]
va_outputs = [va_cost, va_output]
iter_valid = theano.function(va_inputs, va_outputs)
return dict(train=iter_train, valid=iter_valid, test=iter_valid)
def _train(self, iter_funcs, data_pools, train_batch_iter, valid_batch_iter):
"""
Train the model with `dataset` with mini-batch training.
Each mini-batch has `batch_size` recordings.
"""
col = BColors()
for epoch in itertools.count(1):
# iterate train batches
batch_train_losses = []
iterator = train_batch_iter(data_pools['train'])
generator = threaded_generator_from_iterator(iterator)
batch_times = np.zeros(5, dtype=np.float32)
start, after = time.time(), time.time()
for i_batch, train_input in enumerate(generator):
batch_res = iter_funcs['train'](*train_input)
batch_train_losses.append(batch_res[0])
# train time
batch_time = time.time() - after
after = time.time()
train_time = (after - start)
# estimate updates per second (running avg)
batch_times[0:4] = batch_times[1:5]
batch_times[4] = batch_time
ups = 1.0 / batch_times.mean()
# report loss during training
perc = 100 * (float(i_batch) / train_batch_iter.n_batches)
dec = int(perc // 4)
progbar = "|" + dec * "#" + (25 - dec) * "-" + "|"
vals = (perc, progbar, train_time, ups, np.mean(batch_train_losses))
loss_str = " (%d%%) %s time: %.2fs, ups: %.2f, loss: %.5f" % vals
print(col.print_colored(loss_str, col.WARNING), end="\r")
sys.stdout.flush()
# print("\x1b[K", end="\r")
print(' ')
print(' ')
avg_train_loss = np.mean(batch_train_losses)
# evaluate classification power of data set
# iterate validation batches
batch_valid_losses = []
iterator = valid_batch_iter(data_pools['valid'])
generator = threaded_generator_from_iterator(iterator)
for va_input in generator:
batch_res = iter_funcs['valid'](*va_input)
batch_valid_losses.append(batch_res[0])
avg_valid_loss = np.mean(batch_valid_losses)
# collect results
yield {
'number': epoch,
'train_loss': avg_train_loss,
'valid_loss': avg_valid_loss,
'overfitting': avg_train_loss / avg_valid_loss,
}
class Network(object):
"""
Neural Network
"""
def __init__(self, net, print_architecture=True):
"""
Constructor
"""
self.net = net
self.compute_output = None
self.compute_output_dict = dict()
# get input shape of network
l_in = lasagne.layers.helper.get_all_layers(self.net)[0]
self.input_shape = l_in.output_shape
if print_architecture:
print_net_architecture(net)
def fit(self, data, training_strategy, dump_file=None, log_file=None):
""" Train model """
print("Training neural network...")
col = BColors()
# create data pool if raw data is given
if "X_train" in data:
data_pools = dict()
data_pools['train'] = DataPool(data['X_train'], data['y_train'])
data_pools['valid'] = DataPool(data['X_valid'], data['y_valid'])
else:
data_pools = data
# check if out_path exists
if dump_file is not None:
out_path = os.path.dirname(dump_file)
if out_path != '' and not os.path.exists(out_path):
os.mkdir(out_path)
# log model evolution
if log_file is not None:
out_path = os.path.dirname(log_file)
if out_path != '' and not os.path.exists(out_path):
os.mkdir(out_path)
# adaptive learning rate
learn_rate = training_strategy.ini_learning_rate
learning_rate = theano.shared(floatX(learn_rate))
learning_rate.set_value(training_strategy.adapt_learn_rate(training_strategy.ini_learning_rate, 0))
# initialize evaluation output
pred_tr_err, pred_val_err, overfitting = [], [], []
tr_accs, va_accs = [], []
print("Compiling theano train functions...")
iter_funcs = self._create_iter_functions(y_tensor_type=training_strategy.y_tensor_type,
objective=training_strategy.objective, learning_rate=learning_rate,
l_2=training_strategy.L2,
compute_updates=training_strategy.update_parameters,
use_weights=training_strategy.use_weights,
use_mask=training_strategy.use_mask)
print("Starting training...")
now = time.time()
try:
# initialize early stopping
last_improvement = 0
best_model = lasagne.layers.get_all_param_values(self.net)
# iterate training epochs
best_va_dice = 0.0
prev_tr_loss, prev_va_loss = 1e7, 1e7
prev_acc_tr, prev_acc_va = 0.0, 0.0
for epoch in self._train(iter_funcs, data_pools, training_strategy.build_train_batch_iterator(),
training_strategy.build_valid_batch_iterator(), training_strategy.report_dices):
print("Epoch {} of {} took {:.3f}s".format(epoch['number'], training_strategy.max_epochs, time.time() - now))
now = time.time()
# --- collect train output ---
tr_loss, va_loss = epoch['train_loss'], epoch['valid_loss']
train_acc, valid_acc = epoch['train_acc'], epoch['valid_acc']
train_dices, valid_dices = epoch['train_dices'], epoch['valid_dices']
overfit = epoch['overfitting']
# prepare early stopping
improvement = va_loss < prev_va_loss
if improvement:
last_improvement = 0
best_model = lasagne.layers.get_all_param_values(self.net)
best_epoch = epoch['number']
# dump net parameters during training
if dump_file is not None:
with open(dump_file, 'wb') as fp:
pickle.dump(best_model, fp)
last_improvement += 1
# print train output
txt_tr = 'costs_tr %.5f ' % tr_loss
if tr_loss < prev_tr_loss:
txt_tr = col.print_colored(txt_tr, BColors.OKGREEN)
prev_tr_loss = tr_loss
txt_tr_acc = '(%.3f)' % train_acc
if train_acc > prev_acc_tr:
txt_tr_acc = col.print_colored(txt_tr_acc, BColors.OKGREEN)
prev_acc_tr = train_acc
txt_tr += txt_tr_acc + ', '
txt_val = 'costs_val %.5f ' % va_loss
if va_loss < prev_va_loss:
txt_val = col.print_colored(txt_val, BColors.OKGREEN)
prev_va_loss = va_loss
txt_va_acc = '(%.3f)' % valid_acc
if valid_acc > prev_acc_va:
txt_va_acc = col.print_colored(txt_va_acc, BColors.OKGREEN)
prev_acc_va = valid_acc
txt_val += txt_va_acc + ', '
print(' lr: %.5f' % learn_rate)
print(' ' + txt_tr + txt_val + 'tr/val %.3f' % overfit)
# report dice coefficients
if training_strategy.report_dices:
train_str = ' train |'
for key in np.sort(train_dices.keys()):
train_str += ' %.2f: %.3f |' % (key, train_dices[key])
print(train_str)
train_acc = np.max(train_dices.values())
valid_str = ' valid |'
for key in np.sort(valid_dices.keys()):
txt_va_dice = ' %.2f: %.3f |' % (key, valid_dices[key])
if valid_dices[key] > best_va_dice and valid_dices[key] == np.max(valid_dices.values()):
best_va_dice = valid_dices[key]
txt_va_dice = col.print_colored(txt_va_dice, BColors.OKGREEN)
valid_str += txt_va_dice
print(valid_str)
valid_acc = np.max(valid_dices.values())
# collect model evolution data
tr_accs.append(train_acc)
va_accs.append(valid_acc)
pred_tr_err.append(tr_loss)
pred_val_err.append(va_loss)
overfitting.append(overfit)
# save results
exp_res = dict()
exp_res['pred_tr_err'] = pred_tr_err
exp_res['tr_accs'] = tr_accs
exp_res['pred_val_err'] = pred_val_err
exp_res['va_accs'] = va_accs
exp_res['overfitting'] = overfitting
if log_file is not None:
with open(log_file, 'wb') as fp:
pickle.dump(exp_res, fp)
# --- early stopping: preserve best model ---
if last_improvement > training_strategy.patience:
print(col.print_colored("Early Stopping!", BColors.WARNING))
status = "Epoch: %d, Best Validation Loss: %.5f: Acc: %.5f" % (
best_epoch, prev_va_loss, prev_acc_va)
print(col.print_colored(status, BColors.WARNING))
if training_strategy.refinement_strategy.n_refinement_steps <= 0:
break
else:
status = "Resetting to best model so far and refining with adopted learn rate."
print(col.print_colored(status, BColors.WARNING))
# reset net to best weights
lasagne.layers.set_all_param_values(self.net, best_model)
# update learn rate
learn_rate = training_strategy.refinement_strategy.adapt_learn_rate(learn_rate)
last_improvement = 0
training_strategy.patience = training_strategy.refinement_strategy.refinement_patience
# maximum number of epochs reached
if epoch['number'] >= training_strategy.max_epochs:
break
# update learning rate
learn_rate = training_strategy.adapt_learn_rate(learn_rate, epoch['number'])
learning_rate.set_value(learn_rate)
except KeyboardInterrupt:
pass
# set net to best weights
lasagne.layers.set_all_param_values(self.net, best_model)
def predict_proba(self, input):
"""
Predict on test samples
"""
# prepare input for prediction
if not isinstance(input, list):
input = [input]
# reshape to network input
if input[0].ndim < len(self.input_shape):
input[0] = input[0].reshape([1] + list(input[0].shape))
if self.compute_output is None:
self.compute_output = self._compile_prediction_function()
return self.compute_output(*input)
def predict(self, input):
"""
Predict class labels on test samples
"""
return np.argmax(self.predict_proba(input), axis=1)
def compute_layer_output(self, input, layer):
"""
Compute output of given layer
layer: either a string (name of layer) or a layer object
"""
# prepare input for prediction
if not isinstance(input, list):
input = [input]
# reshape to network input
if input[0].ndim < len(self.input_shape):
input[0] = input[0].reshape([1] + list(input[0].shape))
# get layer by name
if not isinstance(layer, lasagne.layers.Layer):
for l in lasagne.layers.helper.get_all_layers(self.net):
if l.name == layer:
layer = l
break
# compile prediction function for target layer
if layer not in self.compute_output_dict:
self.compute_output_dict[layer] = self._compile_prediction_function(target_layer=layer)
return self.compute_output_dict[layer](*input)
def save(self, file_path):
"""
Save model to disk
"""
with open(file_path, 'w') as fp:
params = lasagne.layers.get_all_param_values(self.net)
pickle.dump(params, fp, -1)
def load(self, file_path):
"""
load model from disk
"""
with open(file_path, 'rb') as fp:
params = pickle.load(fp)
lasagne.layers.set_all_param_values(self.net, params)
def _compile_prediction_function(self, target_layer=None):
"""
Compile theano prediction function
"""
# collect input vars
all_layers = lasagne.layers.helper.get_all_layers(self.net)
input_vars = []
for l in all_layers:
if isinstance(l, lasagne.layers.InputLayer):
input_vars.append(l.input_var)
# get network output nad compile function
if target_layer is None:
target_layer = self.net
net_output = lasagne.layers.get_output(target_layer, deterministic=True)
return theano.function(inputs=input_vars, outputs=net_output)
def _create_iter_functions(self, y_tensor_type, objective, learning_rate, l_2, compute_updates, use_weights, use_mask):
""" Create functions for training, validation and testing to iterate one epoch. """
# init target tensor
targets = y_tensor_type('y')
weights = y_tensor_type('w').astype("float32")
# get input layer
all_layers = lasagne.layers.helper.get_all_layers(self.net)
# collect input vars
input_vars = []
for l in all_layers:
if isinstance(l, lasagne.layers.InputLayer):
input_vars.append(l.input_var)
# compute train costs
tr_output = lasagne.layers.get_output(self.net, deterministic=False)
if use_weights or use_mask:
tr_cost = objective(tr_output, targets, weights)
tr_input = input_vars + [targets, weights]
else:
tr_cost = objective(tr_output, targets)
tr_input = input_vars + [targets]
# regularize RNNs
for l in all_layers:
# if l.name == "norm_reg_rnn":
#
# H = lasagne.layers.get_output(l, deterministic=False)
# H_l2 = T.sqrt(T.sum(H ** 2, axis=-1))
# norm_diffs = (H_l2[:, 1:] - H_l2[:, :-1]) ** 2
# norm_preserving_loss = T.mean(norm_diffs)
#
# beta = 1.0
# tr_cost += beta * norm_preserving_loss
if l.name == "norm_reg_rnn":
H = lasagne.layers.get_output(l, deterministic=False)
steps = T.arange(1, l.output_shape[1])
def compute_norm_diff(k, H):
n0 = ((H[:, k - 1, :]) ** 2).sum(1).sqrt()
n1 = ((H[:, k, :]) ** 2).sum(1).sqrt()
return (n1 - n0) ** 2
norm_diffs, _ = theano.scan(fn=compute_norm_diff, outputs_info=None,
non_sequences=[H], sequences=[steps])
beta = 1.0
norm_preserving_loss = T.mean(norm_diffs)
tr_cost += beta * norm_preserving_loss
else:
pass
# compute validation costs
va_output = lasagne.layers.get_output(self.net, deterministic=True)
#va_output_stochastic = lasagne.layers.get_output(self.net, deterministic=False)
# estimate accuracy
if y_tensor_type == T.ivector:
va_acc = 100.0 * T.mean(T.eq(T.argmax(va_output, axis=1), targets), dtype=theano.config.floatX)
tr_acc = 100.0 * T.mean(T.eq(T.argmax(tr_output, axis=1), targets), dtype=theano.config.floatX)
else:
va_acc, tr_acc = None, None
# collect all parameters of net and compute updates
all_params = lasagne.layers.get_all_params(self.net, trainable=True)
# add weight decay
if l_2 is not None:
all_layers = lasagne.layers.get_all_layers(self.net)
tr_cost += l_2 * lasagne.regularization.regularize_layer_params(all_layers, lasagne.regularization.l2)
# compute updates
all_grads = lasagne.updates.get_or_compute_grads(tr_cost, all_params)
updates = compute_updates(all_grads, all_params, learning_rate)
# compile iter functions
tr_outputs = [tr_cost, tr_output]
if tr_acc is not None:
tr_outputs.append(tr_acc)
iter_train = theano.function(tr_input, tr_outputs, updates=updates)
if use_mask:
va_inputs = input_vars + [targets, weights]
va_cost = objective(va_output, targets, weights)
else:
va_inputs = input_vars + [targets]
va_cost = objective(va_output, targets)
va_outputs = [va_cost, va_output]
if va_acc is not None:
va_outputs.append(va_acc)
iter_valid = theano.function(va_inputs, va_outputs )
return dict(train=iter_train, valid=iter_valid, test=iter_valid)
def _train(self, iter_funcs, data_pools, train_batch_iter, valid_batch_iter, estimate_dices):
"""
Train the model with `dataset` with mini-batch training.
Each mini-batch has `batch_size` recordings.
"""
col = BColors()
from lasagne_wrapper.segmentation_utils import dice
for epoch in itertools.count(1):
# evaluate various thresholds
if estimate_dices:
threshs = [0.3, 0.4, 0.5, 0.6, 0.7]
tr_dices = dict()
for thr in threshs:
tr_dices[thr] = []
va_dices = dict()
for thr in threshs:
va_dices[thr] = []
else:
tr_dices = None
va_dices = None
# iterate train batches
batch_train_losses, batch_train_accs = [], []
iterator = train_batch_iter(data_pools['train'])
generator = threaded_generator_from_iterator(iterator)
batch_times = np.zeros(5, dtype=np.float32)
start, after = time.time(), time.time()
for i_batch, train_input in enumerate(generator):
batch_res = iter_funcs['train'](*train_input)
batch_train_losses.append(batch_res[0])
# collect classification accuracies
if len(batch_res) > 2:
batch_train_accs.append(batch_res[2])
# estimate dices for various thresholds
if estimate_dices:
y_b = train_input[1]
pred = batch_res[1]
for thr in threshs:
for i in xrange(pred.shape[0]):
seg = pred[i, 0] > thr
tr_dices[thr].append(100 * dice(seg, y_b[i, 0]))
# train time
batch_time = time.time() - after
after = time.time()
train_time = (after - start)
# estimate updates per second (running avg)
batch_times[0:4] = batch_times[1:5]
batch_times[4] = batch_time
ups = 1.0 / batch_times.mean()
# report loss during training
perc = 100 * (float(i_batch) / train_batch_iter.n_batches)
dec = int(perc // 4)
progbar = "|" + dec * "#" + (25 - dec) * "-" + "|"
vals = (perc, progbar, train_time, ups, np.mean(batch_train_losses))
loss_str = " (%d%%) %s time: %.2fs, ups: %.2f, loss: %.5f" % vals
print(col.print_colored(loss_str, col.WARNING), end="\r")
sys.stdout.flush()
# print("\x1b[K", end="\r")
print(' ')
print(' ')
avg_train_loss = np.mean(batch_train_losses)
avg_train_acc = np.mean(batch_train_accs) if len(batch_train_accs) > 0 else 0.0
if estimate_dices:
for thr in threshs:
tr_dices[thr] = np.mean(tr_dices[thr])
# evaluate classification power of data set
# iterate validation batches
batch_valid_losses, batch_valid_accs = [], []
iterator = valid_batch_iter(data_pools['valid'])
generator = threaded_generator_from_iterator(iterator)
for va_input in generator:
batch_res = iter_funcs['valid'](*va_input)
batch_valid_losses.append(batch_res[0])
# collect classification accuracies
if len(batch_res) > 2:
batch_valid_accs.append(batch_res[2])
# estimate dices for various thresholds
if estimate_dices:
y_b = va_input[1]
pred = batch_res[1]
for thr in threshs:
for i in xrange(pred.shape[0]):
seg = pred[i, 0] > thr
va_dices[thr].append(100 * dice(seg, y_b[i, 0]))
avg_valid_loss = np.mean(batch_valid_losses)
avg_valid_accs = np.mean(batch_valid_accs) if len(batch_valid_accs) > 0 else 0.0
if estimate_dices:
for thr in threshs:
va_dices[thr] = np.mean(va_dices[thr])
# collect results
yield {
'number': epoch,
'train_loss': avg_train_loss,
'train_acc': avg_train_acc,
'valid_loss': avg_valid_loss,
'valid_acc': avg_valid_accs,
'valid_dices': va_dices,
'train_dices': tr_dices,
'overfitting': avg_train_loss / avg_valid_loss,
}
class SegmentationNetwork(Network):
"""
Segmentation Neural Network
"""
def predict_proba(self, input, squeeze=True):
"""
Predict on test samples
"""
if self.compute_output is None:
self.compute_output = self._compile_prediction_function()
# get network input shape
l_in = lasagne.layers.helper.get_all_layers(self.net)[0]
in_shape = l_in.output_shape[-2::]
# standard prediction
if input.shape[-2::] == in_shape:
proba = self.compute_output(input)
# sliding window prediction if images do not match
else:
proba = self._predict_proba_sliding_window(input)
if squeeze:
proba = proba.squeeze()
return proba
def predict(self, input, thresh=0.5):
"""
Predict label map on test samples
"""
P = self.predict_proba(input, squeeze=False)
# binary segmentation
if P.shape[1] == 1:
return (P > thresh).squeeze()
# categorical segmentation
else:
return np.argmax(P, axis=1).squeeze()
def _predict_proba_sliding_window(self, images):
"""
Sliding window prediction for images larger than the input layer
"""
n_images = images.shape[0]
h, w = images.shape[2:4]
_, Nc, sh, sw = self.net.output_shape
step_h = sh // 2
step_w = sw // 2
row_0 = np.arange(0, h - step_h, step_h)
row_1 = row_0 + sh
shift = h - row_1[-1]
row_0[-1] += shift
row_1[-1] += shift
col_0 = np.arange(0, w - step_w, step_w)
col_1 = col_0 + sw
shift = w - col_1[-1]
col_0[-1] += shift
col_1[-1] += shift
# initialize result image
R = np.zeros((n_images, Nc, h, w))
V = np.zeros((n_images, Nc, h, w))
for ir in xrange(len(row_0)):
for ic in xrange(len(col_0)):
I = images[:, :, row_0[ir]:row_1[ir], col_0[ic]:col_1[ic]]
# predict on test image
P = self.predict_proba(I, squeeze=False)
R[:, :, row_0[ir]:row_1[ir], col_0[ic]:col_1[ic]] += P
V[:, :, row_0[ir]:row_1[ir], col_0[ic]:col_1[ic]] += 1
# normalize predictions
R /= V
return R
| 37.196555 | 125 | 0.551766 | 4,279 | 36,713 | 4.474877 | 0.082496 | 0.027157 | 0.016712 | 0.009192 | 0.83873 | 0.820242 | 0.800762 | 0.784155 | 0.77324 | 0.767391 | 0 | 0.010702 | 0.356059 | 36,713 | 986 | 126 | 37.23428 | 0.799247 | 0.11421 | 0 | 0.711604 | 0 | 0 | 0.037471 | 0 | 0.003413 | 0 | 0 | 0 | 0 | 1 | 0.040956 | false | 0.006826 | 0.025597 | 0 | 0.09727 | 0.068259 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
855b87e90d7767aa432b88650d3ba038a52c8ab6 | 2,195 | py | Python | src/ConnectSignal/SelectModeRadioAndCombo.py | Lovely-XPP/tkzgeom | bf68e139dc05f759542d6611f4dc07f4f2727b92 | [
"MIT"
] | 41 | 2021-11-24T05:54:08.000Z | 2022-03-26T10:19:30.000Z | src/ConnectSignal/SelectModeRadioAndCombo.py | Lovely-XPP/tkzgeom | bf68e139dc05f759542d6611f4dc07f4f2727b92 | [
"MIT"
] | 1 | 2022-02-28T04:34:51.000Z | 2022-03-07T10:49:27.000Z | src/ConnectSignal/SelectModeRadioAndCombo.py | Lovely-XPP/tkzgeom | bf68e139dc05f759542d6611f4dc07f4f2727b92 | [
"MIT"
] | 10 | 2021-11-24T07:35:17.000Z | 2022-03-25T18:42:14.000Z |
def point_radio_func(main_window):
"""Connect point radio button."""
main_window.scene.select_mode.set_mode(0, main_window.point_combo.currentIndex(), True)
print(main_window.scene.select_mode.get_type())
def segment_radio_func(main_window):
"""Connect segment radio button."""
main_window.scene.select_mode.set_mode(1, 0, True)
print(main_window.scene.select_mode.get_type())
def circle_radio_func(main_window):
"""Connect circle radio button."""
main_window.scene.select_mode.set_mode(2, main_window.circle_combo.currentIndex(), True)
print(main_window.scene.select_mode.get_type(), True)
def polygon_radio_func(main_window):
"""Connect polygon radio button."""
main_window.scene.select_mode.set_mode(1, 1, True)
print(main_window.scene.select_mode.get_type())
def linestring_radio_func(main_window):
"""Connect linestring radio button."""
main_window.scene.select_mode.set_mode(1, 2, True)
print(main_window.scene.select_mode.get_type())
def angle_radio_func(main_window):
"""Connect angle radio button."""
main_window.scene.select_mode.set_mode(4, 0, True)
print(main_window.scene.select_mode.get_type())
def right_angle_radio_func(main_window):
"""Connect right angle radio button."""
main_window.scene.select_mode.set_mode(4, 1, True)
print(main_window.scene.select_mode.get_type())
def cloud_radio_func(main_window):
"""Connect point cloud radio button."""
main_window.scene.select_mode.set_mode(6, main_window.cloud_combo.currentIndex(), True)
print(main_window.scene.select_mode.get_type(), True)
def point_combo_func(value, main_window):
"""Connect point comboBox."""
main_window.scene.select_mode.set_mode(0, value, False)
print(main_window.scene.select_mode.get_type())
def circle_combo_func(value, main_window):
"""Connect circle comboBox."""
main_window.scene.select_mode.set_mode(2, value, False)
print(main_window.scene.select_mode.get_type())
def cloud_combo_func(value, main_window):
"""Connect point cloud comboBox."""
main_window.scene.select_mode.set_mode(6, value, False)
print(main_window.scene.select_mode.get_type())
| 33.257576 | 92 | 0.749886 | 326 | 2,195 | 4.723926 | 0.101227 | 0.233766 | 0.214286 | 0.3 | 0.915584 | 0.841558 | 0.740909 | 0.694156 | 0.612338 | 0.528571 | 0 | 0.008303 | 0.122096 | 2,195 | 65 | 93 | 33.769231 | 0.790867 | 0.147608 | 0 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.333333 | 0.333333 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a457e611146932e8bf2c7117e78e5aedd91a2ebe | 7,360 | py | Python | cubepy/test/test_factory.py | jorgedouglas71/pyplan-ide | 5ad0e4a2592b5f2716ff680018f717c65de140f5 | [
"MIT"
] | 17 | 2019-12-04T19:22:19.000Z | 2021-07-28T11:17:05.000Z | cubepy/test/test_factory.py | jorgedouglas71/pyplan-ide | 5ad0e4a2592b5f2716ff680018f717c65de140f5 | [
"MIT"
] | 9 | 2019-12-13T15:34:43.000Z | 2022-02-10T11:43:00.000Z | cubepy/test/test_factory.py | jorgedouglas71/pyplan-ide | 5ad0e4a2592b5f2716ff680018f717c65de140f5 | [
"MIT"
] | 5 | 2019-12-04T15:57:06.000Z | 2021-08-20T19:59:26.000Z | import unittest
import cubepy.factory as cp
class FactoryTests(unittest.TestCase):
indexA = cp.index("IndexA", ['Item A '+str(xx) for xx in range(1,21)])
indexB = cp.index("IndexB", ['Item B '+str(xx) for xx in range(1,31)])
indexC = cp.index("IndexC", ['Item C '+str(xx) for xx in range(1,41)])
cube1 = cp.cube([indexA, indexB],[[42,78,96,48,86,42,32,84, 4,26,61,55,85,72,48,96,97,82, 5,28,47,99,94,58, 11,87,97,20,30,52], [37,83,28, 2,75,56,72,24,72,21,71,93,83,43,44, 3,34,19,73,16,58, 9,46,33, 81,44,67,83,69,20], [61,41,55,31,78,43,43,86,71,44,64,79,98,31,78, 1,69,76,26,37,48,66,51,37, 57,50,66,35,43,88], [16,51,28,78,80, 9,57,11,19,69,29,52,29,45, 8,28,90,18,64,33,21,36,31,21, 97,90, 7,20,29,21], [18,73, 4,84,13,73,85,65,27,14,68, 8,21,49,19,64,37, 3,19,33,48, 6,24,57, 65,88,24,67,65,42], [49, 4,80, 6,71, 8,15,52,27,11,80,36, 4,33,77,49,77,89,16, 1,15,67,88,64, 23, 0,78,39,24,45], [62,53,49,50,56,19,44,70, 5,45,35,95,93,21,13,42,53,86,78,64,86,31,14,28, 33,23,81,44,56,58], [ 4,17,89, 6,36,87, 9,13,82, 2,21,69,16,40,96, 1,75, 2,19,41,34, 9,40,57, 4,17,34,37,97,74], [32,97, 2,20,61,38,25,90,19,60,68,54,85,78,46,36,21,64,17,98,82,85,40,81, 91,84,28,79,85,72], [60,25,56,26, 2,13,74,45,10,97,75,73,40,32,73,22,50,22, 4,19,66,78,64,14, 11, 2,99, 9,52,31], [83,42,36,45,28,88,74,82,34,49,93,59,88,59,25,91,29,93,68,34, 9,90,18,48, 57,16,58,67,81,48], [23,81, 6,85,96,98,92,28,18, 9,15,65,75,54,22,13,32,38,69,56,39,90,43,34, 93,45,74,80, 9,64], [94,28,63,77,96,33,75,23, 1, 9,61, 2,47,28,19,41,71, 5,93,68,79,39,51,51, 18,32,94,92,19,63], [28,28,10,75,30,49,10,64,17,51,28,26,41, 5,34,29,50,92,37, 8,29,89,49,76, 5,53,63,96,86,19], [74,67, 8,59,78,27,70,30,15,17,46,36,41,45, 3,25, 1,73,23,77,24, 4,28,89, 96,47,63,28, 8,89], [96,97,97,14,31,69,91,71,13,23,87,57,50,14, 7,34,17,63,51,88,46,82,27,43, 58,29,11,16, 5,22], [ 9, 3,68,93,18,71,53,10,54,40,88,14,54,67,15,71,28,45,74,63,21,50,63,78, 95,15,34,83,56,95], [56,53,30,37, 0,72,85,71,77, 3,30,34, 9,15,69,44,91, 5,43,13,72,41,45,21, 84,92,46, 0,60,15], [13,40,97,37,76,93,18,88,51,93, 8,83,97,78,22,42,73,51,16, 6,70,15, 3,49, 43, 4,60,88,28,65], [98,62,12,83,56,33,59,13,10,17,37, 7,84,10,87,28, 2,43, 8,80,69,60,43, 5, 40,54,33,71,13,74]])
cube2 = cp.cube([indexB, indexC],[[60, 9, 0,28,60,24,58,64,76,84,59,26,79,82, 7,58,85,57,13,69,87,14,23,53, 10,24,16, 0,32, 4,89,75,10,96,20,91,40,71,17,60], [86,52,66,37,62,12,11,77,51,13,98,32,65,76,28,85,17, 4,18,74,44,58,36,17, 63,92,20,91,64,41, 5,67,66,84,16,17,37,68,92,85], [69,40,22,14,85,35,97,46,48,72, 3,80, 0,26,68,35,79,61,83,87,10,87,46,86, 2,18, 7,92,86,87,21,37,65,11,35,18,26,16,23,55], [ 2,86,88,38,87,70,12, 4,85,95,36,12,57,64,98,60,62,25,90,73,53,75,12,21, 54,34,43,59,67,58,90,71,50,84,64,83,33,52,49,68], [71,63,41,91,46,67,47,31,45,59,64,92,88, 4,31,15,29,59,94,45,31,52,57,26, 58,98,97,24,29, 2,87,48,61,74,76,92,57,73,22,10], [27,97,93, 6,98,24,63,72,36,31,38, 4,11,76,53,73,52, 6,54,46, 9,31,73,73, 13,95,64,43,10,73, 6,23,40,47,32,36,71, 2,19,50], [ 8,54,76,25,38,13,75,92,67,42,68,52,99,47,18,28,97,93,68, 4, 7, 5,47, 9, 65,33,65,89,67,85,97,91,85,14,42,71,25,90,24,38], [31,32,47,56,45,86,45,90,87,77,11,76,46, 4,74,73,77,39,81,84,58,28,23,51, 48,15,64,20,16,66,19, 0,93,65,37,77,74,97,52,85], [86,11,21,28,53,77,35, 5,42,11,19,77, 4,15,76,82,62, 9, 0,45,21,30,47,87, 38,75,58,51,99,27,31,47,49,26,69,39,19,12,77,16], [ 6, 7,99,45,12,36,91,23,42, 6,10,82,80,81,18,24,28,75,82,67,20,41,69,41, 88,83,72,40,21, 9, 6,27,21,79, 7, 9,44,36,94,54], [88,35,67,31,97,33,13,48,44,34, 7,37,46,98, 6,62,34,26, 5,68,72,17,59,84, 68, 2, 0,72, 3, 0,88,19,38,80, 2,50, 8,94,23,81], [55,30,83,96,60,12,56,60,35,79,26,34,56,63,49,60,87,63,90,42,95,71,48,89, 85,29,87,60,33, 5,24,38,43, 5,21,27,81,87, 8,83], [67,96,29,72,94,25,45,12,54,24,39, 7,37,65,52,98,56,14,33,88,88,73,43,53, 34,94, 5,18,65,51,31,69,38, 2,84,46,66,16,13, 0], [23,99,45,46,54,46,80,36,94,34,94,60,56,92, 2,15,75,29, 8,93,11,79,21, 6, 38,67, 4,13, 6,14,64,47,46,63,67,27,10,44,83, 8], [44, 2,62,31,88,39, 4, 7,45,75,14,15,60, 1,63,93,96,74,83,88,22,22,13,28, 76,72, 9,28,22, 4,36,43,34,36,74,10, 1,70,41,74], [30,75,69,75,83,54,76,90, 9,56, 9,46,25, 2,53,89,76,98,81,99,11,59,19,30, 99,48,23,30,28,56,74,58,95,71,35,40,56,81, 0,30], [41,93,67,11,67,66,32,42,85,43,85,21, 2,88,65,33,84,81,96,39,42,61,80,18, 56,45,60,11,49, 4, 9,33,57,37,43,84,29,42,88, 1], [22,58,72, 5,93,24,72,36,10,14,62,86,63,27,66,95,73,17,80,14, 9,67,42,98, 27,92,55,85,12,16,40,36,20,91,95,51,74,43, 1,37], [64,79, 3,95,25, 5,31,84,27,86,29,46,83,88,17,33,93,91,10,83, 4,14,73,73, 5,91,33,24, 0,97,28,70,66,29,46,40, 2,25,90,20], [70,71,85,37, 9,32,17,43,65,41,11,16,73,24,44,50,51,13,70,78,90,53,95,49, 39,86,78,44,19,87,99,29,44,38,31,12,18,42,50,66], [16,47,12,91,43,58,85,66,34, 0,16, 9,26,33,50,94,93,74,97,12,30,70,15, 1, 19, 9,98,55, 1,44,53,47,35,48,18,19,17,29,62, 1], [63,19,84,27, 0,26,99,82,81,27,73,44,27,95, 9,75,32,36,18, 6,14,74,98,32, 15,21, 9,46,62,32,21,37,80,71,34,40,53,25,33, 9], [58,23,92,59,43,69,80,41,26,74,16,47, 0,92, 4,11,91, 0,39, 5,11,42,34,60, 84,13,49,67,33,21,37,37, 8,45,79,15,37,15,37,84], [64,62,97,26,51,94,90,79,10,81,35,57,69,60, 7,69,13,42,95,47,93,61,54,60, 32,27,25,94,16,86,16,39,57, 6,79,50,54,63,21,19], [80,87,99, 8,69,90,50,93,84,91,56,80,10,70,85,37,17,23,46,76,97,54, 5,40, 88,87, 5,50,28,50,16,42,50,13, 1,96, 7,63,29,69], [ 7,68,89,12,42,98,68,83,86,23,31,69,84,69,21, 6,76,70, 6,53,44, 2,13,21, 8,40,35,66,34,17,83,18,25,95,49,33,33,25,97,49], [66,81,43,26,44,53,82,78,69,61,31,85,13,73,54,54,87,53,96,82,47,39,70,84, 75,67,32,13,62, 1,78,58,92,67,15,23,98,93,31,60], [84,53,72,54,12,44, 0,62,46,75,90,91,85,37,45,63,23,74,36,41,44,17,80,68, 76,57,40,14,64,13,92,75,70,76,60,65,91,86,23,99], [87,21,25,76,97,76,75,55,62,15,76,40,86,26, 5,21,92, 7,75,96,17,55,28,49, 51,16,40,48,71,74, 7,55,33,61, 2,48,76,33,21,46], [33, 6,32,32, 8,65,34,12,13,23,55,25,89, 5,22,83,40,55,48,46,18,21,31,23, 64,68,78,86,66,77,27,61,66,89,71,29,95,73,79,79]])
def test_find(self):
tmp = cp.cube([self.indexA],[False, False, False, False, False, False, False, False, False, True, False, False, False, False,False, False, False, False, False, True])
res = cp.find("0", self.indexA, cp.end_with) == tmp
self.assertTrue( res.any())
subA = cp.index("SubA",['Item A '+str(xx) for xx in range(10,16)])
tmp= cp.cube([self.indexA, subA],[[ True, True, True, True, True, True], [False,False,False,False,False,False], [False,False,False,False,False,False], [False,False,False,False,False,False], [False,False,False,False,False,False], [False,False,False,False,False,False], [False,False,False,False,False,False], [False,False,False,False,False,False], [False,False,False,False,False,False], [ True,False,False,False,False,False], [False, True,False,False,False,False], [False,False, True,False,False,False], [False,False,False, True,False,False], [False,False,False,False, True,False], [False,False,False,False,False, True], [False,False,False,False,False,False], [False,False,False,False,False,False], [False,False,False,False,False,False], [False,False,False,False,False,False], [False,False,False,False,False,False]])
res = cp.find(self.indexA, subA, cp.end_with) == tmp
self.assertTrue( res.any())
| 230 | 3,788 | 0.618614 | 2,060 | 7,360 | 2.208738 | 0.06699 | 0.257143 | 0.356044 | 0.435165 | 0.190989 | 0.178242 | 0.178242 | 0.17011 | 0.146374 | 0.146374 | 0 | 0.505457 | 0.078804 | 7,360 | 31 | 3,789 | 237.419355 | 0.165634 | 0 | 0 | 0.125 | 0 | 0 | 0.006931 | 0 | 0 | 0 | 0 | 0 | 0.125 | 1 | 0.0625 | false | 0 | 0.125 | 0 | 0.5625 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a464c0552ece56a828b7fabe69ac0521055c3227 | 10,575 | py | Python | crm/migrations/0001_initial.py | ondrejsika/webtime_crm | 71fa61ae43307f99491ac65c751b9e1a521e5480 | [
"MIT"
] | 1 | 2016-10-26T12:13:05.000Z | 2016-10-26T12:13:05.000Z | crm/migrations/0001_initial.py | ondrejsika/webtime_crm | 71fa61ae43307f99491ac65c751b9e1a521e5480 | [
"MIT"
] | 9 | 2016-11-01T11:17:41.000Z | 2017-04-13T12:32:00.000Z | crm/migrations/0001_initial.py | ondrejsika/sikacrm | 71fa61ae43307f99491ac65c751b9e1a521e5480 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.10.3 on 2016-12-22 08:06
from __future__ import unicode_literals
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='Account',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created', models.DateTimeField(auto_now_add=True)),
('name', models.CharField(max_length=32)),
('company_id', models.CharField(blank=True, max_length=32, null=True)),
('vat_id', models.CharField(blank=True, max_length=32, null=True)),
('contact_address', models.TextField(blank=True, null=True)),
('billing_address', models.TextField(blank=True, null=True)),
('email', models.EmailField(blank=True, max_length=254, null=True)),
('phone', models.CharField(blank=True, max_length=16, null=True)),
('www', models.URLField(blank=True, null=True)),
('note', models.TextField(blank=True, null=True)),
('last_activity', models.DateField(blank=True, null=True)),
('next_activity', models.DateField(blank=True, null=True)),
('owner', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='Case',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created', models.DateTimeField(auto_now_add=True)),
('state', models.CharField(choices=[('new', 'New'), ('in_progress', 'In progress'), ('approved', 'Approved'), ('done', 'Done'), ('cancelled', 'Cancelled')], default='new', max_length=16)),
('name', models.CharField(blank=True, max_length=32, null=True)),
('description', models.TextField(blank=True, null=True)),
('note', models.TextField(blank=True, null=True)),
('account', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='crm.Account')),
('owner', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='Contact',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created', models.DateTimeField(auto_now_add=True)),
('name', models.CharField(max_length=32)),
('email', models.EmailField(blank=True, max_length=254, null=True)),
('phone', models.CharField(blank=True, max_length=16, null=True)),
('www', models.URLField(blank=True, null=True)),
('note', models.TextField(blank=True, null=True)),
('account', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='crm.Account')),
('owner', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='Contract',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created', models.DateTimeField(auto_now_add=True)),
('start_at', models.DateField()),
('price', models.IntegerField()),
('length', models.IntegerField(help_text='Length in hours, day=8h')),
('name', models.CharField(blank=True, max_length=32, null=True)),
('description', models.TextField(blank=True, null=True)),
('note', models.TextField(blank=True, null=True)),
('account', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='crm.Account')),
('case', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='crm.Case')),
('owner', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='Email',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created', models.DateTimeField(auto_now_add=True)),
('message_id', models.CharField(max_length=256)),
('date', models.DateTimeField()),
('email_from', models.CharField(blank=True, max_length=256, null=True)),
('email_to', models.CharField(blank=True, max_length=256, null=True)),
('folder', models.CharField(choices=[('inbox', 'Inbox'), ('outbox', 'Outbox')], max_length=16)),
('subject', models.CharField(blank=True, max_length=256, null=True)),
('body', models.TextField(blank=True, null=True)),
],
options={
'ordering': ('-date',),
},
),
migrations.CreateModel(
name='EmailAccount',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created', models.DateTimeField(auto_now_add=True)),
('email', models.EmailField(max_length=254)),
('smtp_host', models.CharField(blank=True, max_length=64, null=True)),
('smtp_port', models.IntegerField(blank=True, null=True)),
('smtp_user', models.CharField(blank=True, max_length=64, null=True)),
('smtp_password', models.CharField(blank=True, max_length=64, null=True)),
('smtp_tls', models.BooleanField(default=True)),
('imap_host', models.CharField(blank=True, max_length=64, null=True)),
('imap_port', models.IntegerField(blank=True, null=True)),
('imap_user', models.CharField(blank=True, max_length=64, null=True)),
('imap_password', models.CharField(blank=True, max_length=64, null=True)),
('imap_tls', models.BooleanField(default=True)),
('user', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='EmailAccountFolder',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created', models.DateTimeField(auto_now_add=True)),
('folder', models.CharField(max_length=64)),
('last_id', models.IntegerField(blank=True, null=True)),
('email_account', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='folder_set', to='crm.EmailAccount')),
],
),
migrations.CreateModel(
name='EmailConversation',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created', models.DateTimeField(auto_now_add=True)),
('name', models.CharField(blank=True, max_length=64, null=True)),
('case', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='crm.Case')),
('email_account', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='email_conversation_set', to='crm.EmailAccount')),
],
),
migrations.CreateModel(
name='EmailConversationReference',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created', models.DateTimeField(auto_now_add=True)),
('reference', models.CharField(db_index=True, max_length=256)),
('email_conversation', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='crm.EmailConversation')),
],
),
migrations.CreateModel(
name='Tag',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=32)),
],
),
migrations.AddField(
model_name='emailconversation',
name='tag',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='crm.Tag'),
),
migrations.AddField(
model_name='email',
name='email_account',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='email_set', to='crm.EmailAccount'),
),
migrations.AddField(
model_name='email',
name='email_conversation',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='crm.EmailConversation'),
),
migrations.AddField(
model_name='contract',
name='tag',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='crm.Tag'),
),
migrations.AddField(
model_name='contact',
name='tag',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='crm.Tag'),
),
migrations.AddField(
model_name='case',
name='tag',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='crm.Tag'),
),
migrations.AddField(
model_name='account',
name='tag',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='crm.Tag'),
),
]
| 55.952381 | 204 | 0.592435 | 1,127 | 10,575 | 5.420586 | 0.1189 | 0.073662 | 0.068096 | 0.089049 | 0.813063 | 0.799312 | 0.783434 | 0.721067 | 0.721067 | 0.694876 | 0 | 0.010164 | 0.255697 | 10,575 | 188 | 205 | 56.25 | 0.765976 | 0.00643 | 0 | 0.616667 | 1 | 0 | 0.105484 | 0.008568 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.011111 | 0.022222 | 0 | 0.044444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f11721650b9d3752c3eefb7fd4baee3075d26bdf | 106 | py | Python | vendor-local/lib/python/django_browserid/__init__.py | Koenkk/popcorn_maker | 0978b9f98dacd4e8eb753404b24eb584f410aa11 | [
"BSD-3-Clause"
] | 15 | 2015-03-23T02:55:20.000Z | 2021-01-12T12:42:30.000Z | vendor-local/lib/python/django_browserid/__init__.py | Koenkk/popcorn_maker | 0978b9f98dacd4e8eb753404b24eb584f410aa11 | [
"BSD-3-Clause"
] | null | null | null | vendor-local/lib/python/django_browserid/__init__.py | Koenkk/popcorn_maker | 0978b9f98dacd4e8eb753404b24eb584f410aa11 | [
"BSD-3-Clause"
] | 16 | 2015-02-18T21:43:31.000Z | 2021-11-09T22:50:03.000Z | from django_browserid.auth import BrowserIDBackend
from django_browserid.base import get_audience, verify
| 35.333333 | 54 | 0.886792 | 14 | 106 | 6.5 | 0.714286 | 0.21978 | 0.417582 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.084906 | 106 | 2 | 55 | 53 | 0.938144 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
74dcf024d43a34377d6f55e552a2a3c65824cc72 | 89 | py | Python | backend/src/pox/ext/gini/core/info_packet_dump.py | anrl/gini4 | d26649c8c02a1737159e48732cf1ee15ba2a604d | [
"MIT"
] | 11 | 2019-03-02T20:39:34.000Z | 2021-09-02T19:47:38.000Z | backend/src/pox/ext/gini/core/info_packet_dump.py | anrl/gini4 | d26649c8c02a1737159e48732cf1ee15ba2a604d | [
"MIT"
] | 29 | 2019-01-17T15:44:48.000Z | 2021-06-02T00:19:40.000Z | backend/src/pox/ext/gini/core/info_packet_dump.py | anrl/gini4 | d26649c8c02a1737159e48732cf1ee15ba2a604d | [
"MIT"
] | 11 | 2019-01-28T05:00:55.000Z | 2021-11-12T03:08:32.000Z | #!/usr/bin/python2
from info import packet_dump
def launch():
packet_dump.launch()
| 12.714286 | 28 | 0.719101 | 13 | 89 | 4.769231 | 0.769231 | 0.322581 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013333 | 0.157303 | 89 | 6 | 29 | 14.833333 | 0.813333 | 0.191011 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
776c07ddfb277b6e527e08c405691d275819738a | 5,874 | py | Python | global_cache_control/ac_control.py | LBNL-ETA/LPDM-Drivers | 0190ecb1348b10d5fb7c5b60ca30ebbbbebe094e | [
"BSD-3-Clause-LBNL"
] | null | null | null | global_cache_control/ac_control.py | LBNL-ETA/LPDM-Drivers | 0190ecb1348b10d5fb7c5b60ca30ebbbbebe094e | [
"BSD-3-Clause-LBNL"
] | null | null | null | global_cache_control/ac_control.py | LBNL-ETA/LPDM-Drivers | 0190ecb1348b10d5fb7c5b60ca30ebbbbebe094e | [
"BSD-3-Clause-LBNL"
] | null | null | null |
################################################################################################################################
# *** Copyright Notice ***
#
# "Price Based Local Power Distribution Management System (Local Power Distribution Manager) v1.0"
# Copyright (c) 2016, The Regents of the University of California, through Lawrence Berkeley National Laboratory
# (subject to receipt of any required approvals from the U.S. Dept. of Energy). All rights reserved.
#
# If you have questions about your rights to use or distribute this software, please contact
# Berkeley Lab's Innovation & Partnerships Office at IPO@lbl.gov.
################################################################################################################################
from global_cache_controls import GlobalCacheBridge
#Not all codes are trimmed properly. They all work, but some are extremely slow.
class ACController:
def __init__(self, bridge, emitter=1, on=False, mode='off', temp=72, speed='low', timer=0):
self.bridge = bridge
self.emitter = 1
self.on = on
self.mode = mode
self.temp = temp
self.speed = speed
self.timer = timer
def power(self):
"Sends the power function"
self.on = not self.on
return self.bridge.sendir(1, '1,1,38343,1,1,345,171,22,21,22,21,22,21,22,64,22,21,22,21,22,21,22,21,22,64,22,21,22,64,22,21,22,64,22,64,22,64,22,64,22,64,22,21,22,21,22,21,22,64,22,21,22,21,22,21,22,21,22,64,22,64,22,64,22,21,22,64,22,64,22,63,22,1561,342,86,22,3800')
def temp_timer_up(self):
"Sends the up function"
if self.mode == 'timer':
self.timer += 1
else:
self.temp += 1
return self.bridge.sendir(1, '1,1,38343,1,1,345,172,22,21,22,21,22,21,22,64,22,21,22,21,22,21,22,21,22,64,22,21,22,64,22,21,22,64,22,64,22,64,22,64,22,21,22,64,22,64,22,64,22,21,22,21,22,21,22,21,22,64,22,21,22,21,22,21,22,64,22,64,22,64,22,63,22,1562,342,86,22,3670,340,86,22,3670,340,86,22,3670,340,86,22,3800')
def temp_timer_down(self):
"Sends the down function"
if self.mode == 'timer':
self.timer -= 1
else:
self.temp -= 1
return self.bridge.sendir(1, '1,4,38343,1,1,342,172,22,21,22,21,22,21,22,64,22,21,22,21,22,21,22,21,22,64,22,21,22,64,22,21,22,64,22,64,22,64,22,64,22,64,22,21,22,64,22,64,22,21,22,21,22,21,22,21,22,21,22,64,22,21,22,21,22,64,22,64,22,64,22,63,22,1562,342,86,22,3670,340,86,22,3670,340,86,22,3670,340,86,22,3670,340,86,22,3800')
def fan_slower(self):
"Sends the fan slower function"
if self.speed == 'high':
self.speed = 'low'
return self.bridge.sendir(1, '1,1,38343,1,1,343,171,22,21,22,21,22,21,22,64,22,21,22,21,22,21,22,21,22,64,22,21,22,64,22,21,22,64,22,64,22,64,22,64,22,21,22,21,22,64,22,21,22,21,22,21,22,21,22,21,22,64,22,64,22,21,22,64,22,64,22,64,22,64,22,63,22,3800')
def fan_faster(self):
"Sends the fan faster function"
if self.speed == 'low':
self.speed = 'high'
return self.bridge.sendir(1, '1,1,38343,1,1,341,171,22,21,22,21,22,21,22,64,22,21,22,21,22,21,22,21,22,64,22,21,22,64,22,21,22,64,22,64,22,64,22,64,22,64,22,21,22,21,22,21,22,21,22,21,22,21,22,21,22,21,22,64,22,64,22,64,22,64,22,64,22,64,22,63,22,1563,342,86,22,3672,340,86,22,3671,340,86,22,3671,340,86,22,3671,340,86,22,3800')
def cool(self):
"Sends the cool setting function"
self.mode = 'cool'
return self.bridge.sendir(1, '1,1,38343,1,1,343,172,22,21,22,21,22,21,22,64,22,21,22,21,22,21,22,21,22,64,22,21,22,64,22,21,22,64,22,64,22,64,22,64,22,64,22,21,22,21,22,64,22,21,22,21,22,21,22,21,22,21,22,64,22,64,22,21,22,64,22,64,22,64,22,63,22,1563,342,86,22,3671,340,86,22,3671,340,86,22,3800')
def energy_saver(self):
"Sends the energy saver setting function"
self.mode = 'energy_saver'
return self.bridge.sendir(1, '1,1,38343,1,1,344,172,22,21,22,21,22,21,22,64,22,21,22,21,22,21,22,21,22,64,22,21,22,64,22,21,22,64,22,64,22,64,22,64,22,21,22,64,22,21,22,21,22,21,22,21,22,21,22,21,22,64,22,21,22,64,22,64,22,64,22,64,22,64,22,63,22,1563,342,86,22,3671,340,86,22,3671,340,86,22,3671,340,86,22,3800')
def fan_only(self):
"Sends the fan only setting function"
self.mode = 'fan_only'
return self.bridge.sendir(1, '1,4,38343,1,1,342,171,22,21,22,21,22,21,22,64,22,21,22,21,22,21,22,21,22,64,22,21,22,64,22,21,22,64,22,64,22,64,22,64,22,64,22,64,22,64,22,21,22,21,22,21,22,21,22,21,22,21,22,21,22,21,22,64,22,64,22,64,22,64,22,63,22,1563,342,86,22,3671,340,86,22,3671,340,86,22,3671,340,86,22,3800')
def sleep(self):
"Sends the sleep setting function"
self.mode = 'sleep'
return self.bridge.sendir(1, '1,5,38343,1,1,345,171,22,21,22,21,22,21,22,64,22,21,22,21,22,21,22,21,22,64,22,21,22,64,22,21,22,64,22,64,22,64,22,64,22,21,22,21,22,21,22,21,22,21,22,21,22,21,22,21,22,64,22,64,22,64,22,64,22,64,22,64,22,64,22,63,22,1562,342,86,22,3671,340,86,22,3671,340,86,22,3670,340,86,22,3670,340,86,22,3800')
def auto_fan(self):
"Sends the auto fan setting function"
self.mode = 'auto_fan'
return self.bridge.sendir(1, '1,6,38343,1,1,342,171,22,21,22,21,22,21,22,64,22,21,22,21,22,21,22,21,22,64,22,21,22,64,22,21,22,64,22,64,22,64,22,64,22,64,22,64,22,64,22,64,22,21,22,21,22,21,22,21,22,21,22,21,22,21,22,21,22,64,22,64,22,64,22,63,22,1562,342,86,22,3670,340,86,22,3670,340,86,22,3670,340,86,22,3800')
def timer(self):
"Sends the timer function"
self.mode = 'timer'
return self.bridge.sendir(1, '1,2,38343,1,1,342,172,22,21,22,21,22,21,22,64,22,21,22,21,22,21,22,21,22,64,22,21,22,64,22,21,22,64,22,64,22,64,22,64,22,21,22,64,22,64,22,21,22,21,22,21,22,21,22,21,22,64,22,21,22,21,22,64,22,64,22,64,22,64,22,63,22,3800')
| 64.549451 | 336 | 0.618148 | 1,287 | 5,874 | 2.807304 | 0.104118 | 0.20703 | 0.310545 | 0.261279 | 0.664268 | 0.664268 | 0.634653 | 0.633546 | 0.633546 | 0.633546 | 0 | 0.40204 | 0.131937 | 5,874 | 90 | 337 | 65.266667 | 0.306531 | 0.153899 | 0 | 0.064516 | 0 | 0.177419 | 0.66521 | 0.586974 | 0 | 0 | 0 | 0 | 0 | 1 | 0.193548 | false | 0 | 0.016129 | 0 | 0.403226 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
776ca3bb94d4d4ec408cedacf97d12f824b35428 | 2,735 | py | Python | scripts/mailgun.py | dtekluva/djangodashboard | 42e308d583e5c5fcaafd6d847ced5f1573e35d81 | [
"MIT"
] | null | null | null | scripts/mailgun.py | dtekluva/djangodashboard | 42e308d583e5c5fcaafd6d847ced5f1573e35d81 | [
"MIT"
] | null | null | null | scripts/mailgun.py | dtekluva/djangodashboard | 42e308d583e5c5fcaafd6d847ced5f1573e35d81 | [
"MIT"
] | 1 | 2022-01-26T13:24:33.000Z | 2022-01-26T13:24:33.000Z | import os
import requests, wyre.settings
class Mailer:
@staticmethod
def send_simple_message( sender, title, message, receievers):
"""
SEND SIMPLE MAIL REQUIRES :
- SENDER - NUMERIC CODE
- TITLE -> STRING
- MESSAGE -> STRING
- RECEIVERS -> LIST
POSSIBLE SENDERS:
1. WYRE-MONITOR
2. WYRE-ALERTS
"""
api_key = os.environ.get("api_key")
mailgun_url = os.environ.get("mailgun_url")
response = requests.post(
mailgun_url,
auth=("api", api_key),
data={"from": f"{wyre.settings.POSSIBLE_SENDERS[sender]} <mailer@wyreng.com>",
"to": receievers,
"subject": title,
"text": message})
if response.ok:
print("Sending successful ")
open("maillogs.txt", "a").write("Sending Successful")
return {
"status": True,
"message": "E-mail successfully sent"
}
else:
print("Sending failed ")
open("maillogs.txt", "a").write(f"Sending failed ({response.content})")
return {
"status": True,
"message": response.content
}
@staticmethod
def send_simple_message_with_attachment( sender, title, message, receievers, attachment: str):
"""
SEND SIMPLE MAIL REQUIRES :
- SENDER - NUMERIC CODE
- TITLE -> STRING
- MESSAGE -> STRING
- RECEIVERS -> LIST
POSSIBLE SENDERS:
1. WYRE-MONITOR
2. WYRE-ALERTS
3. WYRE-GENIUS
"""
api_key = os.environ.get("api_key")
mailgun_url = os.environ.get("mailgun_url")
response = requests.post(
mailgun_url,
auth=("api", api_key),
files=[("attachment",("Report.pdf", open(attachment, "rb").read())) ],
data={"from": f"{wyre.settings.POSSIBLE_SENDERS[sender]} <mailer@wyreng.com>",
"to": receievers,
"subject": title,
"text": message})
if response.ok:
print("Sending successful ")
open("maillogs.txt", "a").write("Sending Successful")
return {
"status": True,
"message": "E-mail successfully sent"
}
else:
print("Sending failed ")
open("maillogs.txt", "a").write(f"Sending failed ({response.content})")
return {
"status": False,
"message": response.content
} | 31.079545 | 98 | 0.486289 | 245 | 2,735 | 5.346939 | 0.306122 | 0.027481 | 0.036641 | 0.048855 | 0.819847 | 0.770992 | 0.770992 | 0.770992 | 0.770992 | 0.770992 | 0 | 0.003012 | 0.393053 | 2,735 | 88 | 99 | 31.079545 | 0.786145 | 0.129433 | 0 | 0.796296 | 0 | 0 | 0.253377 | 0.037261 | 0 | 0 | 0 | 0 | 0 | 1 | 0.037037 | false | 0 | 0.037037 | 0 | 0.166667 | 0.074074 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
777cbf8f3c65323c6ca1ff42cbe9d489ae239079 | 42 | py | Python | src/__init__.py | Niger-Volta-LTI/iranlowo | 0046b61105ffadfff21dd8b37754b9d95177fbf8 | [
"MIT"
] | 17 | 2019-07-05T20:30:35.000Z | 2022-02-28T10:00:24.000Z | src/__init__.py | Olamyy/iranlowo | 1feb123988a8afac3ac53c7acfb72df862c4bc18 | [
"MIT"
] | 17 | 2019-07-06T09:10:10.000Z | 2020-11-13T08:30:37.000Z | src/__init__.py | ruohoruotsi/iranlowo | 0046b61105ffadfff21dd8b37754b9d95177fbf8 | [
"MIT"
] | 7 | 2019-07-01T01:59:07.000Z | 2020-11-27T17:12:46.000Z | from . import onmt
from . import torchtext | 21 | 23 | 0.785714 | 6 | 42 | 5.5 | 0.666667 | 0.606061 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 42 | 2 | 23 | 21 | 0.942857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
7acb7e946080ea2f071818dbb07c725bd7f1792c | 153 | py | Python | _main_decision/_measurement_iteration/_helper_functions/__init__.py | oxquantum/CVAE | 0352ddc51fbfd8d57b155e6de66b4c34e010beac | [
"MIT"
] | null | null | null | _main_decision/_measurement_iteration/_helper_functions/__init__.py | oxquantum/CVAE | 0352ddc51fbfd8d57b155e6de66b4c34e010beac | [
"MIT"
] | null | null | null | _main_decision/_measurement_iteration/_helper_functions/__init__.py | oxquantum/CVAE | 0352ddc51fbfd8d57b155e6de66b4c34e010beac | [
"MIT"
] | null | null | null | from .estimate_error import estimate_error
from .get_observaiton_mask import get_observation_mask
from .update_log_prob_mask import update_log_prob_mask
| 38.25 | 54 | 0.901961 | 24 | 153 | 5.25 | 0.458333 | 0.206349 | 0.206349 | 0.269841 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078431 | 153 | 3 | 55 | 51 | 0.893617 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
7ad92fa444030b0ddf1cbcbd4e6f419d3ee91d0d | 494 | py | Python | codechef/aug18a/inmat-map-gen.py | Ashindustry007/competitive-programming | 2eabd3975c029d235abb7854569593d334acae2f | [
"WTFPL"
] | 506 | 2018-08-22T10:30:38.000Z | 2022-03-31T10:01:49.000Z | codechef/aug18a/inmat-map-gen.py | Ashindustry007/competitive-programming | 2eabd3975c029d235abb7854569593d334acae2f | [
"WTFPL"
] | 13 | 2019-08-07T18:31:18.000Z | 2020-12-15T21:54:41.000Z | codechef/aug18a/inmat-map-gen.py | Ashindustry007/competitive-programming | 2eabd3975c029d235abb7854569593d334acae2f | [
"WTFPL"
] | 234 | 2018-08-06T17:11:41.000Z | 2022-03-26T10:56:42.000Z | #!/usr/bin/env python3
n = 1000
c = 1
for i in range(n//3):
x = []
for j in range(n):
x.append(c)
c += 1
print(' '.join([str(i) for i in x]))
x = []
for j in range(n):
x.append(c)
c += 1
print(' '.join([str(i) for i in x]))
x = [0] * n
for j in range(n - 1, -1, -1):
x[j] = c
c += 1
print(' '.join([str(i) for i in x]))
x = []
for j in range(n):
x.append(c)
c += 1
print(' '.join([str(i) for i in x]))
| 19.76 | 40 | 0.421053 | 95 | 494 | 2.189474 | 0.210526 | 0.048077 | 0.144231 | 0.211538 | 0.793269 | 0.735577 | 0.735577 | 0.735577 | 0.735577 | 0.735577 | 0 | 0.047771 | 0.364372 | 494 | 24 | 41 | 20.583333 | 0.61465 | 0.04251 | 0 | 0.73913 | 0 | 0 | 0.008475 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.173913 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
bb3faa044a836cb16f7e89bae5e92aad782e360d | 156 | py | Python | keras/engine/base_layer.py | ikingye/keras | 1a3ee8441933fc007be6b2beb47af67998d50737 | [
"MIT"
] | 5 | 2020-11-30T22:26:03.000Z | 2020-12-01T22:34:25.000Z | keras/engine/base_layer.py | ikingye/keras | 1a3ee8441933fc007be6b2beb47af67998d50737 | [
"MIT"
] | 10 | 2020-12-01T22:55:29.000Z | 2020-12-11T18:31:46.000Z | keras/engine/base_layer.py | ikingye/keras | 1a3ee8441933fc007be6b2beb47af67998d50737 | [
"MIT"
] | 15 | 2020-11-30T22:12:22.000Z | 2020-12-09T01:32:48.000Z | """Contains the base Layer class, from which all layers inherit."""
from tensorflow.keras.layers import Layer
from tensorflow.keras.layers import InputSpec
| 39 | 67 | 0.807692 | 22 | 156 | 5.727273 | 0.636364 | 0.222222 | 0.301587 | 0.396825 | 0.492063 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115385 | 156 | 3 | 68 | 52 | 0.913043 | 0.391026 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
2494610560d9b2db41aa326e0a92c72280358ff9 | 10,346 | py | Python | tests/gems/arch/test_aur_data_mapper.py | DN-debug/bauh | 83aeccae87d7fe26f6c5bf24be005288d5d54d84 | [
"Zlib"
] | 507 | 2019-08-12T16:15:55.000Z | 2022-03-28T15:49:39.000Z | tests/gems/arch/test_aur_data_mapper.py | DN-debug/bauh | 83aeccae87d7fe26f6c5bf24be005288d5d54d84 | [
"Zlib"
] | 176 | 2019-08-14T02:35:21.000Z | 2022-03-31T21:43:56.000Z | tests/gems/arch/test_aur_data_mapper.py | DN-debug/bauh | 83aeccae87d7fe26f6c5bf24be005288d5d54d84 | [
"Zlib"
] | 57 | 2019-09-02T04:09:22.000Z | 2022-03-21T21:37:16.000Z | import warnings
from unittest import TestCase
from unittest.mock import Mock
from bauh.gems.arch.mapper import AURDataMapper
from bauh.gems.arch.model import ArchPackage
class ArchDataMapperTest(TestCase):
@classmethod
def setUpClass(cls):
warnings.filterwarnings('ignore', category=DeprecationWarning)
def test_check_version_update(self):
self.assertTrue(AURDataMapper.check_version_update('1.0.0-1', '1.0.0-2'))
self.assertFalse(AURDataMapper.check_version_update('1.0.0-2', '1.0.0-1'))
self.assertTrue(AURDataMapper.check_version_update('1.0.0-5', '1.0.1-1'))
self.assertFalse(AURDataMapper.check_version_update('1.0.1-1', '1.0.0-1'))
self.assertTrue(AURDataMapper.check_version_update('1.0.5-5', '1.1.0-2'))
self.assertFalse(AURDataMapper.check_version_update('1.1.0-2', '1.0.5-5'))
self.assertTrue(AURDataMapper.check_version_update('1.5.0-2', '1.5.1-1'))
self.assertFalse(AURDataMapper.check_version_update('1.5.1-1', '1.5.0-2'))
self.assertTrue(AURDataMapper.check_version_update('1.5.1-1', '1.5.1-2'))
self.assertTrue(AURDataMapper.check_version_update('1.5.1-1', '2.0.0-1'))
self.assertFalse(AURDataMapper.check_version_update('2.0.0-1', '1.5.1-1'))
self.assertTrue(AURDataMapper.check_version_update('77.0.3865.90-1', '77.0.3865.120-1'))
self.assertTrue(AURDataMapper.check_version_update('77.0.3865.90-1', '77.0.3865.90-2'))
self.assertFalse(AURDataMapper.check_version_update('77.0.3865.900-1', '77.0.3865.120-1'))
self.assertTrue(AURDataMapper.check_version_update('77.0.3865.120-1', '77.0.3865.900-1'))
self.assertFalse(AURDataMapper.check_version_update('77.0.3865.120-1', '77.0.3865.90-1'))
self.assertTrue(AURDataMapper.check_version_update('77.0.3865.a-1', '77.0.3865.b-1'))
self.assertFalse(AURDataMapper.check_version_update('77.0.b.0-1', '77.0.a.1-1'))
self.assertFalse(AURDataMapper.check_version_update('r25.e22697c-1', 'r8.19fe011-1'))
self.assertTrue(AURDataMapper.check_version_update('0.9.7.RC-9', '0.9.7.RC-10'))
self.assertFalse(AURDataMapper.check_version_update('1.1.0.r11.caacf30-1', 'r65.4c7144a-1'))
self.assertFalse(AURDataMapper.check_version_update('1.2.16.r688.8b2c199-1', 'r2105.e91f0e9-3'))
def test_check_version_update__versions_with_epics(self):
self.assertTrue(AURDataMapper.check_version_update('1.2-1', '1:1.1-1'))
self.assertFalse(AURDataMapper.check_version_update('1:1.1-1', '1.2-1'))
self.assertTrue(AURDataMapper.check_version_update('1:1.2-1', '2:0.1-1'))
self.assertFalse(AURDataMapper.check_version_update('2:0.1-1', '1:1.2-1'))
self.assertTrue(AURDataMapper.check_version_update('10:1.1-1', '10:1.2-1'))
self.assertFalse(AURDataMapper.check_version_update('10:1.2-1', '10:1.2-1'))
self.assertTrue(AURDataMapper.check_version_update('9:1.2-1', '10:0.1-1'))
self.assertTrue(AURDataMapper.check_version_update('9:1.1.1.1-2', '10:0.0'))
self.assertFalse(AURDataMapper.check_version_update('10:0.0', '9:1.1.1.1-2'))
def test_check_update__pkg_no_last_modified_and_same_versions(self):
mapper = AURDataMapper(i18n=Mock(), logger=Mock(), http_client=Mock())
pkg = ArchPackage(name='test')
pkg.last_modified = None
pkg.version = '1.0.0'
pkg.latest_version = pkg.version
self.assertFalse(mapper.check_update(pkg=pkg, last_modified=1608143812))
def test_check_update__pkg_no_last_modified_and_latest_version_higher_than_version(self):
mapper = AURDataMapper(i18n=Mock(), logger=Mock(), http_client=Mock())
pkg = ArchPackage(name='test')
pkg.last_modified = None
pkg.version = '1.0.0'
pkg.latest_version = '1.1.0'
self.assertTrue(mapper.check_update(pkg=pkg, last_modified=1608143812))
def test_check_update__pkg_no_last_modified_and_no_install_date_and_version_higher_than_latest_version(self):
mapper = AURDataMapper(i18n=Mock(), logger=Mock(), http_client=Mock())
pkg = ArchPackage(name='test')
pkg.last_modified = None
pkg.install_date = None
pkg.version = '1.1.0'
pkg.latest_version = '1.0.0'
self.assertFalse(mapper.check_update(pkg=pkg, last_modified=1608143812))
def test_check_update__none_last_modified_and_latest_version_higher_than_version(self):
mapper = AURDataMapper(i18n=Mock(), logger=Mock(), http_client=Mock())
pkg = ArchPackage(name='test')
pkg.last_modified = 1608143812
pkg.version = '1.0.0'
pkg.latest_version = '1.1.0'
self.assertTrue(mapper.check_update(pkg=pkg, last_modified=None))
def test_check_update__none_last_modified_and_version_equal_latest_version(self):
mapper = AURDataMapper(i18n=Mock(), logger=Mock(), http_client=Mock())
pkg = ArchPackage(name='test')
pkg.last_modified = 1608143812
pkg.version = '1.0.0'
pkg.latest_version = pkg.version
self.assertFalse(mapper.check_update(pkg=pkg, last_modified=None))
def test_check_update__string_last_modified_and_latest_version_higher_than_version(self):
mapper = AURDataMapper(i18n=Mock(), logger=Mock(), http_client=Mock())
pkg = ArchPackage(name='test')
pkg.last_modified = 1608143812
pkg.version = '1.0.0'
pkg.latest_version = '1.1.0'
self.assertTrue(mapper.check_update(pkg=pkg, last_modified='abc'))
def test_check_update__pkg_last_modified_equal_last_modified_and_version_equal_latest_version(self):
mapper = AURDataMapper(i18n=Mock(), logger=Mock(), http_client=Mock())
pkg = ArchPackage(name='test')
pkg.last_modified = 1608143812
pkg.version = '1.0.0'
pkg.latest_version = pkg.version
self.assertFalse(mapper.check_update(pkg=pkg, last_modified=pkg.last_modified))
def test_check_update__pkg_last_modified_higher_than_last_modified_and_latest_version_higher_than_version(self):
mapper = AURDataMapper(i18n=Mock(), logger=Mock(), http_client=Mock())
pkg = ArchPackage(name='test')
pkg.last_modified = 1608143812
pkg.version = '1.0.0'
pkg.latest_version = '1.1.0'
self.assertTrue(mapper.check_update(pkg=pkg, last_modified=pkg.last_modified - 100))
def test_check_update__pkg_last_modified_less_than_last_modified_and_version_higher_than_latest_version(self):
mapper = AURDataMapper(i18n=Mock(), logger=Mock(), http_client=Mock())
pkg = ArchPackage(name='test')
pkg.last_modified = 1608143812
pkg.version = '2.0.0'
pkg.latest_version = '1.0.0'
# in this case, last modified is more relevant than the string version
self.assertTrue(mapper.check_update(pkg=pkg, last_modified=pkg.last_modified + 100))
def test_check_update__pkg_no_last_modified_and_install_date_less_than_last_modified_and_version_higher_than_latest(self):
mapper = AURDataMapper(i18n=Mock(), logger=Mock(), http_client=Mock())
pkg = ArchPackage(name='test')
pkg.last_modified = None
pkg.install_date = 1608143812
pkg.version = '3.0.0'
pkg.latest_version = '2.0.0'
# in this case, install_date will be considered instead of package's last_modified.
# even that 'version' is higher than 'latest_version', 'last_modified' is greater than 'install_date'
self.assertTrue(mapper.check_update(pkg=pkg, last_modified=pkg.install_date + 100))
def test_check_update__pkg_no_last_modified_and_install_date_higher_than_last_modified_and_version_equal_latest(self):
mapper = AURDataMapper(i18n=Mock(), logger=Mock(), http_client=Mock())
pkg = ArchPackage(name='test')
pkg.last_modified = None
pkg.install_date = 1608143812
pkg.version = '2.0.0'
pkg.latest_version = pkg.version
# in this case, install_date will be considered instead of package's last_modified.
# as 'install_date' is higher, only the string versions will be compared
self.assertFalse(mapper.check_update(pkg=pkg, last_modified=pkg.install_date - 100))
def test_check_update__pkg_no_last_modified_and_install_date_higher_than_last_modified_and_latest_higher(self):
mapper = AURDataMapper(i18n=Mock(), logger=Mock(), http_client=Mock())
pkg = ArchPackage(name='test')
pkg.last_modified = None
pkg.install_date = 1608143812
pkg.version = '1.0.0'
pkg.latest_version = '1.1.0'
# in this case, install_date will be considered instead of package's last_modified.
# as 'install_date' is higher, only the string versions will be compared
self.assertTrue(mapper.check_update(pkg=pkg, last_modified=pkg.install_date - 100))
def test_check_update__pkg_no_last_modified_and_install_date_and_no_last_modified_and_latest_higher(self):
mapper = AURDataMapper(i18n=Mock(), logger=Mock(), http_client=Mock())
pkg = ArchPackage(name='test')
pkg.last_modified = None
pkg.install_date = 1608143812
pkg.version = '1.0.0'
pkg.latest_version = '1.1.0'
# in this case, install_date will be considered instead of package's last_modified.
# as 'install_date' is higher, only the string versions will be compared
self.assertTrue(mapper.check_update(pkg=pkg, last_modified=None))
def test_check_version_update__one_version_contain_mixed_letters_and_symbols(self):
self.assertTrue(AURDataMapper.check_version_update('2.2.6', '2.3.3op2'))
self.assertFalse(AURDataMapper.check_version_update('2.3', '2.2.a'))
self.assertTrue(AURDataMapper.check_version_update('2.2.a.123', '2.3'))
def test_check_update__only_installed_version_with_release_number(self):
self.assertTrue(AURDataMapper.check_version_update('2.2.6-1', '2.3'))
self.assertTrue(AURDataMapper.check_version_update('2.2', '2.2-2'))
self.assertFalse(AURDataMapper.check_version_update('2.2', '2.2-1'))
def test_check_update__versions_with_epoch(self):
self.assertTrue(AURDataMapper.check_version_update('2.3', '1:2.1'))
self.assertFalse(AURDataMapper.check_version_update('1:1.0', '2.1'))
| 52.252525 | 126 | 0.706263 | 1,489 | 10,346 | 4.64137 | 0.078576 | 0.0955 | 0.109391 | 0.174939 | 0.892635 | 0.876574 | 0.869773 | 0.830415 | 0.773839 | 0.662422 | 0 | 0.07422 | 0.163928 | 10,346 | 197 | 127 | 52.517767 | 0.72474 | 0.068529 | 0 | 0.469799 | 0 | 0 | 0.087151 | 0.002181 | 0 | 0 | 0 | 0 | 0.348993 | 1 | 0.127517 | false | 0 | 0.033557 | 0 | 0.167785 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
24fde2aa039bbef26eb7d45c70d30b894d055507 | 4,941 | py | Python | api/tests/test_match_project_posting.py | matchd-ch/matchd-backend | 84be4aab1b4708cae50a8988301b15df877c8db0 | [
"Apache-2.0"
] | 1 | 2022-03-03T09:55:57.000Z | 2022-03-03T09:55:57.000Z | api/tests/test_match_project_posting.py | matchd-ch/matchd-backend | 84be4aab1b4708cae50a8988301b15df877c8db0 | [
"Apache-2.0"
] | 7 | 2022-02-09T10:44:53.000Z | 2022-03-28T03:29:43.000Z | api/tests/test_match_project_posting.py | matchd-ch/matchd-backend | 84be4aab1b4708cae50a8988301b15df877c8db0 | [
"Apache-2.0"
] | null | null | null | import pytest
from django.contrib.auth.models import AnonymousUser
from django.core import mail
from db.models import Match
@pytest.mark.django_db
def test_company_match_student_project_posting(user_student, user_employee,
company_project_posting_object,
match_project_posting, login):
company_project_posting_object.employee = user_employee.employee
company_project_posting_object.save()
login(user_student)
data, errors = match_project_posting(user_student, company_project_posting_object)
assert errors is None
assert data is not None
match_project_posting_data = data.get('matchProjectPosting')
assert match_project_posting_data is not None
assert match_project_posting_data.get('success') is True
assert match_project_posting_data.get('errors') is None
match_obj_exists = Match.objects.filter(student=user_student.student,
project_posting=company_project_posting_object,
initiator=user_student.type,
student_confirmed=True,
company_confirmed=True).exists()
assert match_obj_exists is True
mail_to_company = mail.outbox[0]
assert user_employee.email in mail_to_company.recipients()
@pytest.mark.django_db
def test_company_match_student_project_posting_with_invalid_project_posting(
user_employee, student_project_posting_object, match_project_posting, login):
login(user_employee)
student_project_posting_object.id = 1337
data, errors = match_project_posting(user_employee, student_project_posting_object)
assert errors is None
assert data is not None
match_project_posting_data = data.get('matchProjectPosting')
assert match_project_posting_data is not None
assert match_project_posting_data.get('success') is False
assert match_project_posting_data.get('errors') is not None
@pytest.mark.django_db
def test_company_match_student_project_posting_without_login(company_project_posting_object,
match_project_posting):
data, errors = match_project_posting(AnonymousUser(), company_project_posting_object)
assert errors is not None
assert data is not None
match_project_posting_data = data.get('matchProjectPosting')
assert match_project_posting_data is None
@pytest.mark.django_db
def test_student_match_company_project_posting(user_student, user_employee,
student_project_posting_object,
match_project_posting, login):
login(user_employee)
data, errors = match_project_posting(user_employee, student_project_posting_object)
assert errors is None
assert data is not None
match_project_posting_data = data.get('matchProjectPosting')
assert match_project_posting_data is not None
assert match_project_posting_data.get('success') is True
assert match_project_posting_data.get('errors') is None
match_obj_exists = Match.objects.filter(company=user_employee.company,
project_posting=student_project_posting_object,
initiator=user_employee.type,
student_confirmed=True,
company_confirmed=True).exists()
assert match_obj_exists is True
mail_to_student = mail.outbox[0]
assert user_student.email in mail_to_student.recipients()
@pytest.mark.django_db
def test_student_match_company_project_posting_with_invalid_project_posting(
user_student, user_employee, company_project_posting_object, match_project_posting, login):
company_project_posting_object.employee = user_employee.employee
company_project_posting_object.save()
login(user_student)
company_project_posting_object.id = 1337
data, errors = match_project_posting(user_student, company_project_posting_object)
assert errors is None
assert data is not None
match_project_posting_data = data.get('matchProjectPosting')
assert match_project_posting_data is not None
assert match_project_posting_data.get('success') is False
assert match_project_posting_data.get('errors') is not None
@pytest.mark.django_db
def test_student_match_company_project_posting_without_login(company_project_posting_object,
match_project_posting):
data, errors = match_project_posting(AnonymousUser(), company_project_posting_object)
assert errors is not None
assert data is not None
match_project_posting_data = data.get('matchProjectPosting')
assert match_project_posting_data is None
| 42.230769 | 99 | 0.707144 | 585 | 4,941 | 5.567521 | 0.090598 | 0.266503 | 0.186675 | 0.155358 | 0.923549 | 0.880258 | 0.872582 | 0.859687 | 0.855388 | 0.855388 | 0 | 0.002674 | 0.243068 | 4,941 | 116 | 100 | 42.594828 | 0.868182 | 0 | 0 | 0.704545 | 0 | 0 | 0.033596 | 0 | 0 | 0 | 0 | 0 | 0.340909 | 1 | 0.068182 | false | 0 | 0.045455 | 0 | 0.113636 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
7023d63d0ccc292d0e5f5b42f02f07608512b429 | 164 | py | Python | pytorch-research/tutorial/db-connect/adhoc.py | ryansimpsonuk/GAN-202005 | 6cfe226f642f222d6320a8630acf2d086e20276e | [
"BSD-3-Clause"
] | null | null | null | pytorch-research/tutorial/db-connect/adhoc.py | ryansimpsonuk/GAN-202005 | 6cfe226f642f222d6320a8630acf2d086e20276e | [
"BSD-3-Clause"
] | null | null | null | pytorch-research/tutorial/db-connect/adhoc.py | ryansimpsonuk/GAN-202005 | 6cfe226f642f222d6320a8630acf2d086e20276e | [
"BSD-3-Clause"
] | null | null | null | # Databricks notebook source
dbutils.fs.ls('/tmp/ryansimpson/dataset/runs/')
# COMMAND ----------
dbutils.tensorboard.start('/dbfs/tmp/ryansimpson/dataset/runs/') | 27.333333 | 64 | 0.72561 | 19 | 164 | 6.263158 | 0.736842 | 0.235294 | 0.352941 | 0.420168 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.060976 | 164 | 6 | 64 | 27.333333 | 0.772727 | 0.27439 | 0 | 0 | 0 | 0 | 0.555556 | 0.555556 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
706101e6a21f5a067529bff984e25684bf373e35 | 96 | py | Python | src/__init__.py | gannon93/python_challenge_work | a13465b2640562e438133f5c5fe60208d897654a | [
"MIT"
] | null | null | null | src/__init__.py | gannon93/python_challenge_work | a13465b2640562e438133f5c5fe60208d897654a | [
"MIT"
] | null | null | null | src/__init__.py | gannon93/python_challenge_work | a13465b2640562e438133f5c5fe60208d897654a | [
"MIT"
] | null | null | null | try:
from . import requests_helper as rh
except Exception:
import requests_helper as rh
| 19.2 | 39 | 0.75 | 14 | 96 | 5 | 0.642857 | 0.4 | 0.571429 | 0.628571 | 0.685714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.21875 | 96 | 4 | 40 | 24 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
7081c7a5f94f2423518187fe6e9b65ac1fd83a20 | 14,455 | py | Python | sdk/python/pulumi_lxd/publish_image.py | soupdiver/pulumi-lxd | 258395aefd6a4cf138d470d7de70babed3310063 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_lxd/publish_image.py | soupdiver/pulumi-lxd | 258395aefd6a4cf138d470d7de70babed3310063 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_lxd/publish_image.py | soupdiver/pulumi-lxd | 258395aefd6a4cf138d470d7de70babed3310063 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
__all__ = ['PublishImageArgs', 'PublishImage']
@pulumi.input_type
class PublishImageArgs:
def __init__(__self__, *,
container: pulumi.Input[str],
aliases: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
compression_algorithm: Optional[pulumi.Input[str]] = None,
filename: Optional[pulumi.Input[str]] = None,
properties: Optional[pulumi.Input[Mapping[str, Any]]] = None,
public: Optional[pulumi.Input[bool]] = None,
triggers: Optional[pulumi.Input[Mapping[str, Any]]] = None):
"""
The set of arguments for constructing a PublishImage resource.
:param pulumi.Input[Mapping[str, Any]] triggers: A map of arbitrary strings that, when changed, will force the resource to be replaced.
"""
pulumi.set(__self__, "container", container)
if aliases is not None:
pulumi.set(__self__, "aliases", aliases)
if compression_algorithm is not None:
pulumi.set(__self__, "compression_algorithm", compression_algorithm)
if filename is not None:
pulumi.set(__self__, "filename", filename)
if properties is not None:
pulumi.set(__self__, "properties", properties)
if public is not None:
pulumi.set(__self__, "public", public)
if triggers is not None:
pulumi.set(__self__, "triggers", triggers)
@property
@pulumi.getter
def container(self) -> pulumi.Input[str]:
return pulumi.get(self, "container")
@container.setter
def container(self, value: pulumi.Input[str]):
pulumi.set(self, "container", value)
@property
@pulumi.getter
def aliases(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
return pulumi.get(self, "aliases")
@aliases.setter
def aliases(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "aliases", value)
@property
@pulumi.getter(name="compressionAlgorithm")
def compression_algorithm(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "compression_algorithm")
@compression_algorithm.setter
def compression_algorithm(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_algorithm", value)
@property
@pulumi.getter
def filename(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "filename")
@filename.setter
def filename(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "filename", value)
@property
@pulumi.getter
def properties(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
return pulumi.get(self, "properties")
@properties.setter
def properties(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "properties", value)
@property
@pulumi.getter
def public(self) -> Optional[pulumi.Input[bool]]:
return pulumi.get(self, "public")
@public.setter
def public(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "public", value)
@property
@pulumi.getter
def triggers(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
A map of arbitrary strings that, when changed, will force the resource to be replaced.
"""
return pulumi.get(self, "triggers")
@triggers.setter
def triggers(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "triggers", value)
@pulumi.input_type
class _PublishImageState:
def __init__(__self__, *,
aliases: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
compression_algorithm: Optional[pulumi.Input[str]] = None,
container: Optional[pulumi.Input[str]] = None,
filename: Optional[pulumi.Input[str]] = None,
properties: Optional[pulumi.Input[Mapping[str, Any]]] = None,
public: Optional[pulumi.Input[bool]] = None,
triggers: Optional[pulumi.Input[Mapping[str, Any]]] = None):
"""
Input properties used for looking up and filtering PublishImage resources.
:param pulumi.Input[Mapping[str, Any]] triggers: A map of arbitrary strings that, when changed, will force the resource to be replaced.
"""
if aliases is not None:
pulumi.set(__self__, "aliases", aliases)
if compression_algorithm is not None:
pulumi.set(__self__, "compression_algorithm", compression_algorithm)
if container is not None:
pulumi.set(__self__, "container", container)
if filename is not None:
pulumi.set(__self__, "filename", filename)
if properties is not None:
pulumi.set(__self__, "properties", properties)
if public is not None:
pulumi.set(__self__, "public", public)
if triggers is not None:
pulumi.set(__self__, "triggers", triggers)
@property
@pulumi.getter
def aliases(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
return pulumi.get(self, "aliases")
@aliases.setter
def aliases(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "aliases", value)
@property
@pulumi.getter(name="compressionAlgorithm")
def compression_algorithm(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "compression_algorithm")
@compression_algorithm.setter
def compression_algorithm(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_algorithm", value)
@property
@pulumi.getter
def container(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "container")
@container.setter
def container(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "container", value)
@property
@pulumi.getter
def filename(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "filename")
@filename.setter
def filename(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "filename", value)
@property
@pulumi.getter
def properties(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
return pulumi.get(self, "properties")
@properties.setter
def properties(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "properties", value)
@property
@pulumi.getter
def public(self) -> Optional[pulumi.Input[bool]]:
return pulumi.get(self, "public")
@public.setter
def public(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "public", value)
@property
@pulumi.getter
def triggers(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
A map of arbitrary strings that, when changed, will force the resource to be replaced.
"""
return pulumi.get(self, "triggers")
@triggers.setter
def triggers(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "triggers", value)
class PublishImage(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
aliases: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
compression_algorithm: Optional[pulumi.Input[str]] = None,
container: Optional[pulumi.Input[str]] = None,
filename: Optional[pulumi.Input[str]] = None,
properties: Optional[pulumi.Input[Mapping[str, Any]]] = None,
public: Optional[pulumi.Input[bool]] = None,
triggers: Optional[pulumi.Input[Mapping[str, Any]]] = None,
__props__=None):
"""
Create a PublishImage resource with the given unique name, props, and options.
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[Mapping[str, Any]] triggers: A map of arbitrary strings that, when changed, will force the resource to be replaced.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: PublishImageArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Create a PublishImage resource with the given unique name, props, and options.
:param str resource_name: The name of the resource.
:param PublishImageArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(PublishImageArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
aliases: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
compression_algorithm: Optional[pulumi.Input[str]] = None,
container: Optional[pulumi.Input[str]] = None,
filename: Optional[pulumi.Input[str]] = None,
properties: Optional[pulumi.Input[Mapping[str, Any]]] = None,
public: Optional[pulumi.Input[bool]] = None,
triggers: Optional[pulumi.Input[Mapping[str, Any]]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = PublishImageArgs.__new__(PublishImageArgs)
__props__.__dict__["aliases"] = aliases
__props__.__dict__["compression_algorithm"] = compression_algorithm
if container is None and not opts.urn:
raise TypeError("Missing required property 'container'")
__props__.__dict__["container"] = container
__props__.__dict__["filename"] = filename
__props__.__dict__["properties"] = properties
__props__.__dict__["public"] = public
__props__.__dict__["triggers"] = triggers
super(PublishImage, __self__).__init__(
'lxd:index/publishImage:PublishImage',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
aliases: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
compression_algorithm: Optional[pulumi.Input[str]] = None,
container: Optional[pulumi.Input[str]] = None,
filename: Optional[pulumi.Input[str]] = None,
properties: Optional[pulumi.Input[Mapping[str, Any]]] = None,
public: Optional[pulumi.Input[bool]] = None,
triggers: Optional[pulumi.Input[Mapping[str, Any]]] = None) -> 'PublishImage':
"""
Get an existing PublishImage resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[Mapping[str, Any]] triggers: A map of arbitrary strings that, when changed, will force the resource to be replaced.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _PublishImageState.__new__(_PublishImageState)
__props__.__dict__["aliases"] = aliases
__props__.__dict__["compression_algorithm"] = compression_algorithm
__props__.__dict__["container"] = container
__props__.__dict__["filename"] = filename
__props__.__dict__["properties"] = properties
__props__.__dict__["public"] = public
__props__.__dict__["triggers"] = triggers
return PublishImage(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def aliases(self) -> pulumi.Output[Optional[Sequence[str]]]:
return pulumi.get(self, "aliases")
@property
@pulumi.getter(name="compressionAlgorithm")
def compression_algorithm(self) -> pulumi.Output[Optional[str]]:
return pulumi.get(self, "compression_algorithm")
@property
@pulumi.getter
def container(self) -> pulumi.Output[str]:
return pulumi.get(self, "container")
@property
@pulumi.getter
def filename(self) -> pulumi.Output[Optional[str]]:
return pulumi.get(self, "filename")
@property
@pulumi.getter
def properties(self) -> pulumi.Output[Optional[Mapping[str, Any]]]:
return pulumi.get(self, "properties")
@property
@pulumi.getter
def public(self) -> pulumi.Output[Optional[bool]]:
return pulumi.get(self, "public")
@property
@pulumi.getter
def triggers(self) -> pulumi.Output[Optional[Mapping[str, Any]]]:
"""
A map of arbitrary strings that, when changed, will force the resource to be replaced.
"""
return pulumi.get(self, "triggers")
| 41.182336 | 143 | 0.64054 | 1,587 | 14,455 | 5.620038 | 0.089477 | 0.098666 | 0.127817 | 0.059199 | 0.80435 | 0.791793 | 0.754008 | 0.728669 | 0.718242 | 0.698172 | 0 | 0.000092 | 0.246212 | 14,455 | 350 | 144 | 41.3 | 0.818465 | 0.134071 | 0 | 0.789272 | 1 | 0 | 0.07771 | 0.018323 | 0 | 0 | 0 | 0 | 0 | 1 | 0.16092 | false | 0.003831 | 0.019157 | 0.068966 | 0.275862 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
56389f3c13fe184ae909eef73d08c0900b0d7e50 | 100 | py | Python | djangocms_versioning_filer/monkeypatch/admin/tools.py | jonathan-s/djangocms-versioning-filer | 664a65ecbe08ae0808d60e07f6b5168d6073ab65 | [
"BSD-3-Clause"
] | null | null | null | djangocms_versioning_filer/monkeypatch/admin/tools.py | jonathan-s/djangocms-versioning-filer | 664a65ecbe08ae0808d60e07f6b5168d6073ab65 | [
"BSD-3-Clause"
] | 10 | 2019-08-13T13:50:59.000Z | 2022-03-11T13:00:05.000Z | djangocms_versioning_filer/monkeypatch/admin/tools.py | jonathan-s/djangocms-versioning-filer | 664a65ecbe08ae0808d60e07f6b5168d6073ab65 | [
"BSD-3-Clause"
] | 9 | 2018-09-28T12:43:27.000Z | 2020-10-13T09:06:47.000Z | from filer.admin import tools
tools.ALLOWED_PICK_TYPES = tools.ALLOWED_PICK_TYPES + ('grouper', )
| 20 | 67 | 0.78 | 14 | 100 | 5.285714 | 0.642857 | 0.324324 | 0.432432 | 0.567568 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12 | 100 | 4 | 68 | 25 | 0.840909 | 0 | 0 | 0 | 0 | 0 | 0.07 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
563a991437e22f280874994c6047f95e60a20afa | 11,758 | py | Python | Few-shot_Learning/algorithms.py | LanGe-Chen/AutoML | 227c1082b8ced47022104808958006a43aa90335 | [
"Apache-2.0"
] | null | null | null | Few-shot_Learning/algorithms.py | LanGe-Chen/AutoML | 227c1082b8ced47022104808958006a43aa90335 | [
"Apache-2.0"
] | null | null | null | Few-shot_Learning/algorithms.py | LanGe-Chen/AutoML | 227c1082b8ced47022104808958006a43aa90335 | [
"Apache-2.0"
] | null | null | null | ####################################################################
# Implement the two techniques Finetuning and Reptile
####################################################################
import matplotlib.pyplot as plt
import torch
import numpy as np
from copy import deepcopy
from data_loader import SineLoader
from networks import SineNetwork
#Plotting Parameters
HORIZON = 10
TRAIN_C = (255/255, 165/255, 0)
TRAIN_T = (TRAIN_C[0],TRAIN_C[1],TRAIN_C[2],0.5)
TEST_C = (0, 1, 1)
TEST_T = (TEST_C[0],TEST_C[1],TEST_C[2],0.5)
ITER_S = ':'
MEAN_S = '-'
class Finetuning:
def __init__(self, iterations, samples, batch_size, support_size, query_size, step):
self.iterations = iterations
self.samples = samples
self.batch_size = batch_size
self.support_size = support_size
self.query_size = query_size
self.step = step
def plot_fn(self, ax, style, label, color=None):
fx = np.arange(-5.0, 5.0, 0.1)
fy = self.model(torch.Tensor(np.asarray(fx)).reshape(len(fx), 1)).detach().numpy()
if color == None:
ax.plot(fx, fy, style, label=label)
else:
ax.plot(fx, fy, style, label=label, color=color)
def train_batch(self, x, y):
# Updates the model on a single batch using SGD
self.model.zero_grad()
pred = self.model(x)
loss = self.model.criterion(pred, y)
loss.backward()
for param in self.model.parameters():
param.data -= self.step * param.grad.data
def freeze_model(self, state):
for layer in self.model.model['features'].children():
for param in layer.parameters():
param.requires_grad = state
if state:
self.model.model['out'].reset_parameters()
def run(self, eval=100):
self.model = SineNetwork()
rng = np.random.RandomState(0)
# Data Loaders
train_loader = SineLoader(k=self.samples, k_test=0).generator(episodic=False, batch_size=self.samples, mode="train", reset_ptr=True)
test_loader = SineLoader(k=self.support_size, k_test=self.query_size).generator(episodic=True, batch_size=None, mode="test", reset_ptr=True)
_, (ax1, ax2) = plt.subplots(1, 2)
iters = []
train_losses = []
train_means = []
test_losses = []
test_means = []
for iteration in range(self.iterations):
# Train on a batch of sine data
x_batch, y_batch, _, _ = next(train_loader)
# Train over the entire distribution of sine wave tasks for fine-tuning
indices = rng.permutation(len(x_batch))
for start in range(0, len(x_batch), self.batch_size):
gather = indices[start:start+self.batch_size]
self.train_batch(x_batch[gather], y_batch[gather])
# Evaluate on a new unseen task and plot results
if iteration==0 or (iteration+1) % eval == 0:
ax1.cla()
ax2.cla()
iters.append(iteration)
# Keep weights for restoring later
weights_before = deepcopy(self.model.state_dict())
self.freeze_model(True)
# Test the model for a fixed number of training epochs on the new task and evaluate against query set
x_support, y_support, x_query, y_query = next(test_loader)
train_losses.append(self.model.criterion(self.model(x_query), y_query).item())
self.plot_fn(ax1, "--", "Initial Model", (0,0,0))
for inneriter in range(32):
self.train_batch(x_support, y_support)
if (inneriter+1) % 8 == 0:
frac = (inneriter+1) / 32
self.plot_fn(ax1, "-", "Model at %i"%(inneriter+1), (1-frac, frac, 0, frac))
test_losses.append(self.model.criterion(self.model(x_query), y_query).item())
train_means.append(np.mean(train_losses[-HORIZON:]))
test_means.append(np.mean(test_losses[-HORIZON:]))
ax1.plot(x_support, y_support, "*", label="Support", color=(1,0,0))
ax1.plot(x_query, y_query, "*", label="Query", color=(0,0,1))
ax1.set_ylim(-5,5)
ax1.legend(loc='upper right', fancybox=True, shadow=True)
ax1.title.set_text("Fine-Tuning Transfer")
ax1.set_xlabel("X")
ax1.set_ylabel("Y")
ax2.plot(iters[1:], train_losses[1:], ITER_S, label="Train Loss", color=TRAIN_T)
ax2.plot(iters[1:], train_means[1:], MEAN_S, label="Train Mean", color=TRAIN_C)
ax2.plot(iters[1:], test_losses[1:], ITER_S, label="Test Loss", color=TEST_T)
ax2.plot(iters[1:], test_means[1:], MEAN_S, label="Test Mean", color=TEST_C)
ax2.legend(loc="upper right", fancybox=True, shadow=True)
ax2.title.set_text("Learning Curves")
ax2.set_xlabel("Iterations")
ax2.set_ylabel("MSE")
plt.pause(0.01)
print(f"-----------------------------")
print(f"Iteration {iteration+1}")
print("Before Transfer |", f"Train Loss: {train_losses[-1]:.3f}", f"Mean: {train_means[-1]:.3f}")
print("After Transfer |", f"Test Loss: {test_losses[-1]:.3f}", f"Mean: {test_means[-1]:.3f}")
# Restore weights from before testing
self.freeze_model(False)
self.model.load_state_dict(weights_before)
class Reptile:
def __init__(self, iterations, samples, batch_size, support_size, query_size, outerstep, innerstep, innerepochs):
self.iterations = iterations
self.samples = samples
self.batch_size = batch_size
self.support_size = support_size
self.query_size = query_size
self.outerstep = outerstep
self.innerstep = innerstep
self.innerepochs = innerepochs
def plot_fn(self, ax, style, label, color=None):
fx = np.arange(-5.0, 5.0, 0.1)
fy = self.model(torch.Tensor(np.asarray(fx)).reshape(len(fx), 1)).detach().numpy()
if color == None:
ax.plot(fx, fy, style, label=label)
else:
ax.plot(fx, fy, style, label=label, color=color)
def train_batch(self, x, y):
# Updates the model on a single batch using SGD
self.model.zero_grad()
pred = self.model(x)
loss = self.model.criterion(pred, y)
loss.backward()
for param in self.model.parameters():
param.data -= self.innerstep * param.grad.data
def run(self, eval=100):
self.model = SineNetwork()
rng = np.random.RandomState(0)
# Data Loaders
train_loader = SineLoader(k=self.samples, k_test=0).generator(episodic=True, batch_size=None, mode="train", reset_ptr=True)
test_loader = SineLoader(k=self.support_size, k_test=self.query_size).generator(episodic=True, batch_size=None, mode="test", reset_ptr=True)
_, (ax1, ax2) = plt.subplots(1, 2)
iters = []
train_losses = []
train_means = []
test_losses = []
test_means = []
for iteration in range(self.iterations):
# Train on a batch of sine data
# Keep current weights for outerloop update
weights_before = deepcopy(self.model.state_dict())
# Train over a single task of sine waves at each inner epoch
for _ in range(self.innerepochs):
x_batch, y_batch, _, _ = next(train_loader)
indices = rng.permutation(len(x_batch))
# get minibatchs for inner loop
for start in range(0, len(x_batch), self.batch_size):
gather = indices[start:start+self.batch_size]
self.train_batch(x_batch[gather], y_batch[gather])
# Get new weights for outerloop update
weights_after = self.model.state_dict()
# performs an outerloop weight update separately from the innerloop training
# re-scale the learning update
outerstepsize = self.outerstep * (1 - iteration/self.iterations)
# interpolating old and new parameters
self.model.load_state_dict({name :
weights_before[name] + (weights_after[name] - weights_before[name]) * outerstepsize
for name in weights_before
})
# Evaluate on a new unseen task and plot results
if iteration==0 or (iteration+1) % eval == 0:
ax1.cla()
ax2.cla()
iters.append(iteration)
# Keep weights for restoring later
weights_before = deepcopy(self.model.state_dict())
# Test the model for a fixed number of training epochs on the new task and evaluate against query set
x_support, y_support, x_query, y_query = next(test_loader)
train_losses.append(self.model.criterion(self.model(x_query), y_query).item())
self.plot_fn(ax1, "--", "Initial Model", (0,0,0))
for inneriter in range(32):
self.train_batch(x_support, y_support)
if (inneriter+1) % 8 == 0:
frac = (inneriter+1) / 32
self.plot_fn(ax1, "-", "Model at %i"%(inneriter+1), (1-frac, frac, 0, frac))
test_losses.append(self.model.criterion(self.model(x_query), y_query).item())
train_means.append(np.mean(train_losses[-HORIZON:]))
test_means.append(np.mean(test_losses[-HORIZON:]))
ax1.plot(x_support, y_support, "*", label="Support", color=(1,0,0))
ax1.plot(x_query, y_query, "*", label="Query", color=(0,0,1))
ax1.set_ylim(-5,5)
ax1.legend(loc='upper right', fancybox=True, shadow=True)
ax1.title.set_text("Reptile Transfer")
ax1.set_xlabel("X")
ax1.set_ylabel("Y")
ax2.plot(iters[1:], train_losses[1:], ITER_S, label="Train Loss", color=TRAIN_T)
ax2.plot(iters[1:], train_means[1:], MEAN_S, label="Train Mean", color=TRAIN_C)
ax2.plot(iters[1:], test_losses[1:], ITER_S, label="Test Loss", color=TEST_T)
ax2.plot(iters[1:], test_means[1:], MEAN_S, label="Test Mean", color=TEST_C)
ax2.legend(loc="upper right", fancybox=True, shadow=True)
ax2.title.set_text("Learning Curves")
ax2.set_xlabel("Iterations")
ax2.set_ylabel("MSE")
plt.pause(0.01)
print(f"-----------------------------")
print(f"Iteration {iteration+1}")
print("Before Transfer |", f"Train Loss: {train_losses[-1]:.3f}", f"Mean: {train_means[-1]:.3f}")
print("After Transfer |", f"Test Loss: {test_losses[-1]:.3f}", f"Mean: {test_means[-1]:.3f}")
# Restore weights from before testing
self.model.load_state_dict(weights_before)
| 47.991837 | 149 | 0.543545 | 1,436 | 11,758 | 4.29805 | 0.143454 | 0.042288 | 0.009073 | 0.015554 | 0.813513 | 0.800713 | 0.790992 | 0.762476 | 0.762476 | 0.762476 | 0 | 0.024034 | 0.32412 | 11,758 | 244 | 150 | 48.188525 | 0.752611 | 0.091852 | 0 | 0.770492 | 0 | 0 | 0.071741 | 0.018106 | 0 | 0 | 0 | 0 | 0 | 1 | 0.04918 | false | 0 | 0.032787 | 0 | 0.092896 | 0.043716 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
563b3a2d668c0c7d0d8a10b2adfbcfb74031197e | 162 | py | Python | register.py | carryking1988/heimatest | 62e7c43651cf58ee9f41469fb8e76892f581af43 | [
"MIT"
] | null | null | null | register.py | carryking1988/heimatest | 62e7c43651cf58ee9f41469fb8e76892f581af43 | [
"MIT"
] | null | null | null | register.py | carryking1988/heimatest | 62e7c43651cf58ee9f41469fb8e76892f581af43 | [
"MIT"
] | null | null | null |
def register(func):
def wrapper(*args,**kwargs):
print('hello')
return wrapper(*args,**kwargs)
return wrapper
def register2():
pass
| 16.2 | 38 | 0.598765 | 18 | 162 | 5.388889 | 0.611111 | 0.226804 | 0.350515 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008333 | 0.259259 | 162 | 9 | 39 | 18 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0.031056 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0.142857 | 0 | 0 | 0.714286 | 0.142857 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 7 |
564650e4c3812ee2e6fca260281d3cfe6ffffe49 | 6,234 | py | Python | nailgun/nailgun/orchestrator/graph_configuration.py | prmtl/fuel-web | 3577169e209596a8e4a95d1c41d2dde099a3945f | [
"Apache-2.0"
] | null | null | null | nailgun/nailgun/orchestrator/graph_configuration.py | prmtl/fuel-web | 3577169e209596a8e4a95d1c41d2dde099a3945f | [
"Apache-2.0"
] | null | null | null | nailgun/nailgun/orchestrator/graph_configuration.py | prmtl/fuel-web | 3577169e209596a8e4a95d1c41d2dde099a3945f | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Copyright 2014 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#(dshulyak) temporary, this config will be moved to fuel-library
#until we will stabilize our api
DEPLOYMENT_51_60 = """
- id: deploy_start
type: stage
- id: deploy_end
type: stage
requires: [deploy_start]
- id: primary-controller
type: group
role: [primary-controller]
required_for: [deploy_end]
requires: [deploy_start]
parameters:
strategy:
type: one_by_one
- id: controller
type: group
role: [controller]
requires: [primary-controller]
required_for: [deploy_end]
parameters:
strategy:
type: parallel
amount: 6
- id: cinder
type: group
role: [cinder]
requires: [controller]
required_for: [deploy_end]
parameters:
strategy:
type: parallel
- id: compute
type: group
role: [compute]
requires: [controller]
required_for: [deploy_end]
parameters:
strategy:
type: parallel
- id: zabbix-server
type: group
role: [zabbix-server]
required_for: [deploy_end]
requires: [deploy_start]
parameters:
strategy:
type: one_by_one
- id: mongo
type: group
role: [mongo]
requires: [zabbix-server]
required_for: [deploy_end, primary-controller, controller]
parameters:
strategy:
type: parallel
- id: primary-mongo
type: group
role: [primary-mongo]
requires: [mongo]
required_for: [deploy_end, primary-controller, controller]
parameters:
strategy:
type: one_by_one
- id: ceph-osd
type: group
role: [ceph-osd]
requires: [controller]
required_for: [deploy_end]
parameters:
strategy:
type: parallel
- id: base-os
type: group
role: [base-os]
required_for: [deploy_end]
parameters:
strategy:
type: parallel
- id: deploy_legacy
type: puppet
groups: [primary-controller, controller,
cinder, compute, ceph-osd,
zabbix-server, primary-mongo, mongo]
required_for: [deploy_end]
requires: [deploy_start]
parameters:
puppet_manifest: /etc/puppet/manifests/site.pp
puppet_modules: /etc/puppet/modules
timeout: 3600
"""
DEPLOYMENT_50 = """
- id: deploy_start
type: stage
- id: deploy_end
type: stage
requires: [deploy_start]
- id: primary-controller
type: group
role: [primary-controller]
required_for: [deploy_end]
requires: [deploy_start]
parameters:
strategy:
type: one_by_one
- id: controller
type: group
role: [controller]
requires: [primary-controller]
required_for: [deploy_end]
parameters:
strategy:
type: one_by_one
- id: cinder
type: group
role: [cinder]
requires: [controller]
required_for: [deploy_end]
parameters:
strategy:
type: parallel
- id: compute
type: group
role: [compute]
requires: [controller]
required_for: [deploy_end]
parameters:
strategy:
type: parallel
- id: zabbix-server
type: group
role: [zabbix-server]
required_for: [deploy_end]
requires: [deploy_start]
parameters:
strategy:
type: one_by_one
- id: mongo
type: group
role: [mongo]
requires: [zabbix-server]
required_for: [deploy_end, primary-controller, controller]
parameters:
strategy:
type: one_by_one
- id: primary-mongo
type: group
role: [primary-mongo]
requires: [mongo]
required_for: [deploy_end, primary-controller, controller]
parameters:
strategy:
type: one_by_one
- id: ceph-osd
type: group
role: [ceph-osd]
requires: [controller]
required_for: [deploy_end]
parameters:
strategy:
type: parallel
- id: deploy_legacy
type: puppet
groups: [primary-controller, controller,
cinder, compute, ceph-osd,
zabbix-server, primary-mongo, mongo]
required_for: [deploy_end]
requires: [deploy_start]
parameters:
puppet_manifest: /etc/puppet/manifests/site.pp
puppet_modules: /etc/puppet/modules
timeout: 3600
"""
PATCHING = """
- id: deploy_start
type: stage
- id: deploy_end
type: stage
requires: [deploy_start]
- id: primary-controller
type: group
role: [primary-controller]
required_for: [deploy_end]
requires: [deploy_start]
parameters:
strategy:
type: one_by_one
- id: controller
type: group
role: [controller]
requires: [primary-controller]
required_for: [deploy_end]
parameters:
strategy:
type: one_by_one
- id: cinder
type: group
role: [cinder]
requires: [controller]
required_for: [deploy_end]
parameters:
strategy:
type: one_by_one
- id: compute
type: group
role: [compute]
requires: [controller]
required_for: [deploy_end]
parameters:
strategy:
type: one_by_one
- id: zabbix-server
type: group
role: [zabbix-server]
required_for: [deploy_end]
requires: [deploy_start]
parameters:
strategy:
type: one_by_one
- id: mongo
type: group
role: [mongo]
requires: [zabbix-server]
required_for: [deploy_end, primary-controller, controller]
parameters:
strategy:
type: one_by_one
- id: primary-mongo
type: group
role: [primary-mongo]
requires: [mongo]
required_for: [deploy_end, primary-controller, controller]
parameters:
strategy:
type: one_by_one
- id: ceph-osd
type: group
role: [ceph-osd]
requires: [controller]
required_for: [deploy_end]
parameters:
strategy:
type: one_by_one
- id: deploy_legacy
type: puppet
groups: [primary-controller, controller,
cinder, compute, ceph-osd,
zabbix-server, primary-mongo, mongo]
required_for: [deploy_end]
requires: [deploy_start]
parameters:
puppet_manifest: /etc/puppet/manifests/site.pp
puppet_modules: /etc/puppet/modules
timeout: 3600
"""
| 21.645833 | 78 | 0.685435 | 762 | 6,234 | 5.452756 | 0.158793 | 0.067148 | 0.114561 | 0.134777 | 0.85728 | 0.855355 | 0.855355 | 0.855355 | 0.855355 | 0.85343 | 0 | 0.005711 | 0.213507 | 6,234 | 287 | 79 | 21.721254 | 0.84173 | 0.111967 | 0 | 0.976378 | 0 | 0 | 0.984236 | 0.015764 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
569312dfa205c092a4296503316a2a0ca850603a | 5,613 | py | Python | example/takeaway/tests.py | Hedde/django-networth | 496311e31f3b49202bda9c0b2997ee4508cc9477 | [
"MIT"
] | null | null | null | example/takeaway/tests.py | Hedde/django-networth | 496311e31f3b49202bda9c0b2997ee4508cc9477 | [
"MIT"
] | 3 | 2020-02-12T00:02:35.000Z | 2021-06-10T19:38:41.000Z | example/takeaway/tests.py | Hedde/django-networth | 496311e31f3b49202bda9c0b2997ee4508cc9477 | [
"MIT"
] | null | null | null | __author__ = 'heddevanderheide'
# Django specific
from django import test
# App specific
from takeaway.models import Pizza, Topping
class TestNetworthModel(test.TestCase):
def setUp(self):
self.obj_1 = Pizza.objects.create(name='Margherita')
self.obj_2 = Pizza.objects.create(name='Funghi')
self.obj_3 = Pizza.objects.create(name='Double Dutch')
self.topping_1 = Topping.objects.create(name='Mushrooms')
self.topping_2 = Topping.objects.create(name='Cheese')
self.topping_3 = Topping.objects.create(name='Onions')
self.obj_2.toppings.add(self.topping_1)
self.obj_3.toppings.add(self.topping_2, self.topping_3)
def test_obj_1(self):
# no commit
self.assertEqual(self.obj_1.networth(realtime=True), 1)
self.assertEqual(self.obj_1._networth, 1)
# commit
self.assertEqual(self.obj_1.networth(commit=True), 1)
self.assertEqual(self.obj_1._networth, 1)
# relative
self.assertEqual(self.obj_1.relative_networth(realtime=True, commit=True), 100)
self.assertEqual(self.obj_1._relative_networth, 100)
def test_obj_2(self):
# no commit
self.assertEqual(self.obj_2.networth(realtime=True), 2)
self.assertEqual(self.obj_2._networth, 1)
# commit
self.assertEqual(self.obj_2.networth(realtime=True, commit=True), 2)
self.assertEqual(self.obj_2._networth, 2)
# relative
self.assertEqual(self.obj_2.relative_networth(realtime=True, commit=True), 100)
self.assertEqual(self.obj_2._relative_networth, 100)
def test_obj_3(self):
# no commit
self.assertEqual(self.obj_3.networth(realtime=True), 3)
self.assertEqual(self.obj_3._networth, 1)
# commit
self.assertEqual(self.obj_3.networth(realtime=True, commit=True), 3)
self.assertEqual(self.obj_3._networth, 3)
# relative
self.assertEqual(self.obj_3.relative_networth(realtime=True, commit=True), 100)
self.assertEqual(self.obj_3._relative_networth, 100)
def test_relative_networth_multiple_objects(self):
# no commit
self.assertEqual(self.obj_1.networth(realtime=True), 1)
self.assertEqual(self.obj_2.networth(realtime=True), 2)
self.assertEqual(self.obj_3.networth(realtime=True), 3)
# OBJ 1
# commit
self.assertEqual(self.obj_1.networth(commit=True), 1)
self.assertEqual(self.obj_1._networth, 1)
# relative
self.assertEqual(self.obj_1.relative_networth(realtime=True, commit=True), 100)
self.assertEqual(self.obj_1._relative_networth, 100)
# OBJ 2
# commit
self.assertEqual(self.obj_2.networth(realtime=True, commit=True), 2)
self.assertEqual(self.obj_2._networth, 2)
# relative
self.assertEqual(self.obj_2.relative_networth(realtime=True, commit=True), 100)
self.assertEqual(self.obj_2._relative_networth, 100)
# re-test previous object(s)
self.assertEqual(self.obj_1.relative_networth(realtime=True, commit=True), 50)
self.assertEqual(self.obj_1._relative_networth, 50)
# OBJ 3
# commit
self.assertEqual(self.obj_3.networth(realtime=True, commit=True), 3)
self.assertEqual(self.obj_3._networth, 3)
# relative
self.assertEqual(self.obj_3.relative_networth(realtime=True, commit=True), 100)
self.assertEqual(self.obj_3._relative_networth, 100)
# re-test previous object(s)
self.assertEqual(self.obj_1.relative_networth(realtime=True, commit=True), 33)
self.assertEqual(self.obj_1._relative_networth, 33)
self.assertEqual(self.obj_2.relative_networth(realtime=True, commit=True), 66)
self.assertEqual(self.obj_2._relative_networth, 66)
# OBJ 2 UPGRADE
self.obj_2.toppings.add(self.topping_2, self.topping_3)
# commit
self.assertEqual(self.obj_2.networth(realtime=True, commit=True), 4)
self.assertEqual(self.obj_2._networth, 4)
# relative
self.assertEqual(self.obj_2.relative_networth(realtime=True, commit=True), 100)
self.assertEqual(self.obj_2._relative_networth, 100)
# re-test previous object(s)
self.assertEqual(self.obj_1.relative_networth(realtime=True, commit=True), 25)
self.assertEqual(self.obj_1._relative_networth, 25)
self.assertEqual(self.obj_3.relative_networth(realtime=True, commit=True), 75)
self.assertEqual(self.obj_3._relative_networth, 75)
# OBJ 2 DOWNGRADE
self.obj_2.toppings.remove(self.topping_2, self.topping_3)
# commit
self.assertEqual(self.obj_2.networth(realtime=True, commit=True), 2)
self.assertEqual(self.obj_2._networth, 2)
# relative
self.assertEqual(self.obj_2.relative_networth(realtime=True, commit=True), 66)
self.assertEqual(self.obj_2._relative_networth, 66)
# re-test previous object(s)
# self.assertEqual(self.obj_1.networth(realtime=True, commit=True), 1)
# self.assertEqual(self.obj_1._networth, 1)
# self.assertEqual(self.obj_3.networth(realtime=True, commit=True), 3)
# self.assertEqual(self.obj_3._networth, 3)
#
# self.assertEqual(self.obj_1.relative_networth(realtime=True, commit=True), 33)
# self.assertEqual(self.obj_1._relative_networth, 33)
# self.assertEqual(self.obj_3.relative_networth(realtime=True, commit=True), 100)
# self.assertEqual(self.obj_3._relative_networth, 100) | 38.183673 | 89 | 0.682166 | 742 | 5,613 | 4.960916 | 0.075472 | 0.125509 | 0.304537 | 0.352622 | 0.860364 | 0.857376 | 0.843249 | 0.797338 | 0.768541 | 0.744906 | 0 | 0.042169 | 0.201497 | 5,613 | 147 | 90 | 38.183673 | 0.779116 | 0.149296 | 0 | 0.557143 | 0 | 0 | 0.013728 | 0 | 0 | 0 | 0 | 0 | 0.728571 | 1 | 0.071429 | false | 0 | 0.028571 | 0 | 0.114286 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
3b4ed99a466cfa1abfeb469aee245dbcdffb73a6 | 89,631 | py | Python | localnet/excluded_tests/test_staking.py | 3100/harmony-test | 4bd83656447248ae94a758ea49d3b7e83367d45b | [
"MIT"
] | 6 | 2020-07-10T13:39:39.000Z | 2021-04-27T16:17:34.000Z | localnet/excluded_tests/test_staking.py | 3100/harmony-test | 4bd83656447248ae94a758ea49d3b7e83367d45b | [
"MIT"
] | 4 | 2020-10-30T20:42:18.000Z | 2021-05-06T16:27:54.000Z | localnet/excluded_tests/test_staking.py | 3100/harmony-test | 4bd83656447248ae94a758ea49d3b7e83367d45b | [
"MIT"
] | 9 | 2021-04-03T07:59:13.000Z | 2022-01-14T05:13:29.000Z | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Tests here are related to staking functions & require a feedback loop with the chain.
TODO: negative test cases
As with all tests, there are 2 JSON-RPC versions/namespaces (v1 & v2) where their difference
is only suppose to be in the types of their params & returns. v1 keeps everything in hex and
v2 uses decimal when possible. However, there are some (legacy) discrepancies that some tests
enforce. These tests are noted and should NOT be broken.
"""
import json
import time
import random
import traceback
import pytest
from flaky import flaky
from pyhmy import (
blockchain,
staking
)
from pyhmy.rpc.request import (
base_request
)
import txs
from txs import (
tx_timeout,
beacon_shard_id,
initial_funding,
endpoints,
send_and_confirm_staking_transaction,
send_staking_transaction,
get_staking_transaction
)
from utils import (
check_and_unpack_rpc_response,
assert_valid_json_structure,
mutually_exclusive_test,
rerun_delay_filter,
assert_no_null_in_list
)
_mutex_scope = "staking"
def _assert_validator_info(validator_data, validator_info):
"""
Helper function to check `validator_info` with the given `validator_data`.
Validator data is expected to follow `stx` in s0_validator & s1_validator
"""
val = validator_info["validator"]
for attr in ["name", "identity", "website", "security-contact", "details"]:
assert validator_data[attr] == val[attr], f"Expected {validator_data[attr]}, got {val[attr]}"
for attr in ["rate", "max-rate", "max-change-rate"]:
assert validator_data[attr] == float(val[attr]), f"Expected {validator_data[attr]}, got {val[attr]}"
for attr in ["min-self-delegation", "max-total-delegation"]:
assert validator_data[attr] * 1e18 == float(val[attr]), f"Expected {validator_data[attr]}, got {val[attr]}"
assert validator_data["pub-bls-key"] in val[
"bls-public-keys"], f"Expected pub-bls-key {validator_data['pub-bls-key']} " \
f"in {val['bls-public-keys']}"
@pytest.fixture(scope="module")
@txs.staking
def s0_validator():
"""
Fixture for the shard 0 validator (with a running external node).
Returns the validator's create validator transaction (`stx`)
"""
stx = {
"validator-addr": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"delegator-addr": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"name": "test",
"identity": "test0",
"website": "test",
"security-contact": "test",
"details": "test",
"rate": 0.1,
"max-rate": 0.9,
"max-change-rate": 0.05,
"min-self-delegation": 10000,
"max-total-delegation": 10000000,
"amount": 10000,
"pub-bls-key": "4f41a37a3a8d0695dd6edcc58142c6b7d98e74da5c90e79b587b3b960b6a4f5e048e6d8b8a000d77a478d44cd640270c",
"hash": "0xf80460f1ad041a0a0e841da717fc5b7959b1a7e9a0ce9a25cd70c0ce40d5ff26",
"nonce": "0x0",
"signed-raw-tx": "0xf9015780f90106947946f5ce1eeb290965deef936cd9154c22173efeda8474657374857465737430847465737484746573748474657374ddc988016345785d8a0000c9880c7d713b49da0000c887b1a2bc2ec500008a021e19e0c9bab24000008b084595161401484a000000f1b04f41a37a3a8d0695dd6edcc58142c6b7d98e74da5c90e79b587b3b960b6a4f5e048e6d8b8a000d77a478d44cd640270cf862b8606e1204740c90329827178361b635109e515a2334d970f44f29f3a98ff10bb351d8dd7fa03ceadcbe3e53be7b1bd0940c1e1fc58d2725e4bacf06831974edaf3291dfd5a0aa1e81c8a078e7e5e6cb9e58c750d6005afdd7b1548823804039a2118a021e19e0c9bab240000080843b9aca00835121c427a02348daabe696c4370379b9102dd85da6d4fed52f0f511ff0448a21c001ee75a7a01a67f9f40e0de02b50d5d7295f200fea7f950c1b59aa7efa8d225294c4fdbc5e"
}
in_initially_funded = False
for tx in initial_funding:
if tx["to"] == stx["validator-addr"] and tx["to-shard"] == beacon_shard_id:
in_initially_funded = True
break
if not in_initially_funded:
raise AssertionError(f"Test staking transaction from address {stx['validator-addr']} "
f"not found in set of initially funded accounts (or not founded on s{beacon_shard_id})")
if get_staking_transaction(stx["hash"]) is None:
tx = send_and_confirm_staking_transaction(stx)
assert tx["hash"] == stx["hash"], f"Expected create validator transaction hash to be {stx['hash']}, " \
f"got {tx['hash']}"
assert get_staking_transaction(stx["hash"]) is not None, f"Transaction (hash {stx['hash']}) not found on chain."
return stx
@pytest.fixture(scope="module")
@txs.staking
def s1_validator():
"""
Fixture for the shard 1 validator (with a running external node).
Returns the validator's create validator transaction (`stx`)
"""
stx = {
"validator-addr": "one1nmy8quw0924fss4r9km640pldzqegjk4wv4wts",
"delegator-addr": "one1nmy8quw0924fss4r9km640pldzqegjk4wv4wts",
"name": "test",
"identity": "test1",
"website": "test",
"security-contact": "test",
"details": "test",
"rate": 0.1,
"max-rate": 0.9,
"max-change-rate": 0.05,
"min-self-delegation": 10000,
"max-total-delegation": 10000000,
"amount": 10000,
#"pub-bls-key": "5e2f14abeadf0e759beb1286ed6095d9d1b2d64ad394316991161c6f95237710e0a4beda8adeaefde4844ab4c4b2bf98",
"pub-bls-key": "5e2f14abeadf0e759beb1286ed6095d9d1b2d64ad394316991161c6f95237710e0a4beda8adeaefde4844ab4c4b2bf98",
"hash": "0x37743ed5a112e54134d610b18284ab8967c926a2d53eaf23ba836431cf9bd96a",
"nonce": "0x0",
"signed-raw-tx": "0xf9015780f90106949ec87071cf2aaa9842a32db7aabc3f6881944ad5da8474657374857465737431847465737484746573748474657374ddc988016345785d8a0000c9880c7d713b49da0000c887b1a2bc2ec500008a021e19e0c9bab24000008b084595161401484a000000f1b05e2f14abeadf0e759beb1286ed6095d9d1b2d64ad394316991161c6f95237710e0a4beda8adeaefde4844ab4c4b2bf98f862b860e8bc184c4d5779ab7ab9fb8902b157b1257b1c4fa7e39649b2d900f0415f3aec0701f89e6840d42854559620627e871862b7b5075fad456fb43bc9eb5811c5b305d1d82838332623b109fbc033fd144387bb402e3bd1626a640b58d0b3ae66098a021e19e0c9bab240000080843b9aca008351220427a0d9d4bfabdc1dd7c63c951e0353d0fdee583e9cf55dcd0253aa6eb2d1066ccb2aa0202841a6ebc536d04ca7ae2ea1d83d4d2c5d1ef1af879202613b60ee2304b27b"
}
in_initially_funded = False
for tx in initial_funding:
if tx["to"] == stx["validator-addr"] and tx["to-shard"] == beacon_shard_id:
in_initially_funded = True
break
if not in_initially_funded:
raise AssertionError(f"Test staking transaction from address {stx['validator-addr']} "
f"not found in set of initially funded accounts (or not founded on s{beacon_shard_id})")
if get_staking_transaction(stx["hash"]) is None:
tx = send_and_confirm_staking_transaction(stx)
assert tx["hash"] == stx["hash"], f"Expected create validator transaction hash to be {stx['hash']}, " \
f"got {tx['hash']}"
assert get_staking_transaction(stx["hash"]) is not None, f"Transaction (hash {stx['hash']}) not found on chain."
return stx
@txs.staking
@mutually_exclusive_test(scope=_mutex_scope)
@pytest.mark.run(after="test_get_validator_information")
def test_delegation(s1_validator):
"""
Note that this is not an explicit RPC test. It just tests that delegation works.
"""
stx = {
"validator-addr": "one1nmy8quw0924fss4r9km640pldzqegjk4wv4wts",
"delegator-addr": "one1v895jcvudcktswcmg2sldvmxvtvvdj2wuxj3hx",
# web topple now acid repeat inspire tomato inside nominee reflect latin salmon garbage negative liberty win royal faith hammer lawsuit west toddler payment coffee
"amount": 10000,
"hash": "0x832e5af2305167d5d9a891c51eafc6510c89bbc76c01818e4ce02de0fc8c854e",
"nonce": "0x0",
"signed-raw-tx": "0xf88302f59461cb49619c6e2cb83b1b42a1f6b36662d8c6c94e949ec87071cf2aaa9842a32db7aabc3f6881944ad58a021e19e0c9bab240000080843b9aca00825fe027a0d8912da6a925af17701a2600df60e90fa4a61858b51758a03f57ac9d2797dc0ca004313c6865bde8704594be44d3ebbadfa6420922eef73d38ffba8ec42d8d3550"
}
assert stx["validator-addr"] == s1_validator["validator-addr"], f"Sanity check: Expected validator address " \
f"to be {s1_validator['validator-addr']}"
submitted_tx = False
if get_staking_transaction(stx["hash"]) is None:
tx = send_and_confirm_staking_transaction(stx)
submitted_tx = True
assert tx["hash"] == stx["hash"], f"Expected contract transaction hash to be {stx['hash']}, " \
f"got {tx['hash']}"
assert get_staking_transaction(stx["hash"]) is not None, f"Transaction (hash {stx['hash']}) not found on chain."
validator_info = staking.get_validator_information(stx["validator-addr"], endpoint=endpoints[beacon_shard_id])
for delegation in validator_info["validator"]["delegations"]:
if delegation["delegator-address"] == stx["delegator-addr"]:
if submitted_tx:
assert delegation["amount"] == stx[
"amount"] * 1e18, f"Expected delegated amount to be {stx['amount']} ONE"
return
raise AssertionError(f"New delegation from {stx['delegator-addr']} not found on validator {stx['validator-addr']}")
@txs.staking
@mutually_exclusive_test(scope=_mutex_scope)
@pytest.mark.run(after="test_delegation")
@flaky(max_runs=6)
def test_undelegation(s1_validator):
"""
Note that this is not an explicit RPC test. It just tests that undelegation works.
"""
stx = {
"validator-addr": "one1nmy8quw0924fss4r9km640pldzqegjk4wv4wts",
"delegator-addr": "one1v895jcvudcktswcmg2sldvmxvtvvdj2wuxj3hx",
# web topple now acid repeat inspire tomato inside nominee reflect latin salmon garbage negative liberty win royal faith hammer lawsuit west toddler payment coffee
"amount": 10000,
"hash": "0x79d27d042c157a4c1cdcbe931155515bfdd78d3162be79d348bab33113d8e08e",
"nonce": "0x1",
"signed-raw-tx": "0xf88203f49461cb49619c6e2cb83b1b42a1f6b36662d8c6c94e949ec87071cf2aaa9842a32db7aabc3f6881944ad5891b1ae4d6e2ef50000001843b9aca00825f9c28a0d12eb6e84a48356e079319642902b5c203806cba6960a3da2b5c43cf8021f510a00a73e7315c3ef773995eb3ebac033ab6c6b60b032f294af944f161d8e9ca2d4e"
}
assert stx["validator-addr"] == s1_validator["validator-addr"], f"Sanity check: Expected validator address " \
f"to be {s1_validator['validator-addr']}"
submitted_tx = False
if get_staking_transaction(stx["hash"]) is None:
tx = send_and_confirm_staking_transaction(stx)
submitted_tx = True
assert tx["hash"] == stx["hash"], f"Expected contract transaction hash to be {stx['hash']}, " \
f"got {tx['hash']}"
assert get_staking_transaction(stx["hash"]) is not None, f"Transaction (hash {stx['hash']}) not found on chain."
validator_info = staking.get_validator_information(stx["validator-addr"], endpoint=endpoints[beacon_shard_id])
for delegation in validator_info["validator"]["delegations"]:
if delegation["delegator-address"] == stx["delegator-addr"]:
if submitted_tx:
assert len(
delegation["undelegations"]) > 0, f"Expected undelegations on validator {stx['validator-addr']}"
return
raise AssertionError(f"New delegation from {stx['delegator-addr']} not found on validator {stx['validator-addr']}")
@txs.staking
@pytest.mark.run('first')
def test_get_all_validator_addresses(s0_validator, s1_validator):
"""
Note that v1 & v2 have the same responses.
"""
reference_response = [
'one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e',
'one1nmy8quw0924fss4r9km640pldzqegjk4wv4wts'
]
# Check v1
raw_response = base_request("hmy_getAllValidatorAddresses", params=[], endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert s0_validator["validator-addr"] in response, f"Expected validator {s0_validator['validator-addr']} " \
f"in validator list {response}"
assert s1_validator["validator-addr"] in response, f"Expected validator {s1_validator['validator-addr']} " \
f"in validator list {response}"
# Check v2
raw_response = base_request("hmyv2_getAllValidatorAddresses", params=[], endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert s0_validator["validator-addr"] in response, f"Expected validator {s0_validator['validator-addr']} " \
f"in validator list {response}"
assert s1_validator["validator-addr"] in response, f"Expected validator {s1_validator['validator-addr']} " \
f"in validator list {response}"
@txs.staking
def test_get_transaction_receipt_v1(s0_validator):
reference_response = {
"blockHash": "0x5890ceb902713f4f32f80764359e5b2ffec1fd84ad6f0bf75d5c22a6f1530d1d",
"blockNumber": "0x7",
"contractAddress": None,
"cumulativeGasUsed": "0x5121c4",
"gasUsed": "0x5121c4",
"logs": [],
"logsBloom": "0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000",
"sender": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"status": "0x1",
"transactionHash": "0xf80460f1ad041a0a0e841da717fc5b7959b1a7e9a0ce9a25cd70c0ce40d5ff26",
"transactionIndex": "0x0",
"type": 0
}
raw_response = base_request("hmy_getTransactionReceipt",
params=[s0_validator["hash"]],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert response["transactionHash"] == s0_validator["hash"], f"Expected transaction {s0_validator['hash']}, " \
f"got {response['transactionHash']}"
@txs.staking
def test_get_transaction_receipt_v2(s0_validator):
reference_response = {
"blockHash": "0x5890ceb902713f4f32f80764359e5b2ffec1fd84ad6f0bf75d5c22a6f1530d1d",
"blockNumber": 7,
"contractAddress": None,
"cumulativeGasUsed": 5317060,
"gasUsed": 5317060,
"logs": [],
"logsBloom": "0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000",
"sender": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"status": 1,
"transactionHash": "0xf80460f1ad041a0a0e841da717fc5b7959b1a7e9a0ce9a25cd70c0ce40d5ff26",
"transactionIndex": 0,
"type": 0
}
raw_response = base_request("hmyv2_getTransactionReceipt",
params=[s0_validator["hash"]],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert response["transactionHash"] == s0_validator["hash"], f"Expected transaction {s0_validator['hash']}, " \
f"got {response['transactionHash']}"
@txs.staking
def test_get_staking_transactions_count(s0_validator):
"""
Note that v1 & v2 have the same responses.
"""
reference_response = 0
# Check v1, SENT
raw_response = base_request("hmy_getStakingTransactionsCount",
params=[s0_validator["validator-addr"], "SENT"],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert response == 1, f"Expected account {s0_validator['validator-addr']} to have 1 sent transactions"
# Check v1, SENT
raw_response = base_request("hmyv2_getStakingTransactionsCount",
params=[s0_validator["validator-addr"], "SENT"],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert response == 1, f"Expected account {s0_validator['validator-addr']} to have 1 sent transactions"
# Check v1, RECEIVED
raw_response = base_request("hmy_getStakingTransactionsCount",
params=[s0_validator["validator-addr"], "RECEIVED"],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert response == 0, f"Expected account {s0_validator['validator-addr']} to have 0 received transactions"
# Check v1, RECEIVED
raw_response = base_request("hmyv2_getStakingTransactionsCount",
params=[s0_validator["validator-addr"], "RECEIVED"],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert response == 0, f"Expected account {s0_validator['validator-addr']} to have 0 received transactions"
# Check v1, ALL
raw_response = base_request("hmy_getStakingTransactionsCount",
params=[s0_validator["validator-addr"], "ALL"],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert response == 1, f"Expected account {s0_validator['validator-addr']} to have 1 total transactions"
# Check v1, ALL
raw_response = base_request("hmyv2_getStakingTransactionsCount",
params=[s0_validator["validator-addr"], "ALL"],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert response == 1, f"Expected account {s0_validator['validator-addr']} to have 1 total transactions"
@txs.staking
def test_get_all_validator_information(s0_validator, s1_validator):
"""
Note that v1 & v2 have the same responses.
"""
reference_response = [
{
"validator": {
"bls-public-keys": [
"4f41a37a3a8d0695dd6edcc58142c6b7d98e74da5c90e79b587b3b960b6a4f5e048e6d8b8a000d77a478d44cd640270c"
],
"last-epoch-in-committee": 0,
"min-self-delegation": 10000000000000000000000,
"max-total-delegation": 10000000000000000000000000,
"rate": "0.100000000000000000",
"max-rate": "0.900000000000000000",
"max-change-rate": "0.050000000000000000",
"update-height": 4,
"name": "test",
"identity": "test0",
"website": "test",
"security-contact": "test",
"details": "test",
"creation-height": 4,
"address": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"delegations": [
{
"delegator-address": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"amount": 10000000000000000000000,
"reward": 0,
"undelegations": []
}
]
},
"current-epoch-performance": {
"current-epoch-signing-percent": {
"current-epoch-signed": 0,
"current-epoch-to-sign": 0,
"current-epoch-signing-percentage": "0.000000000000000000"
}
},
"metrics": None,
"total-delegation": 10000000000000000000000,
"currently-in-committee": True,
"epos-status": "currently elected",
"epos-winning-stake": None,
"booted-status": "not booted",
"active-status": "active",
"lifetime": {
"reward-accumulated": 0,
"blocks": {
"to-sign": 0,
"signed": 0
},
"apr": "0.000000000000000000",
"epoch-apr": None
}
}
]
# Check v1
raw_response = base_request("hmy_getAllValidatorInformation", params=[0], endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
found_s0, found_s1 = False, False
for validator in response:
if validator["validator"]["address"] == s0_validator["validator-addr"]:
found_s0 = True
_assert_validator_info(s0_validator, validator)
elif validator["validator"]["address"] == s1_validator["validator-addr"]:
found_s1 = True
_assert_validator_info(s1_validator, validator)
for delegation in validator["validator"]["delegations"]:
assert_no_null_in_list(delegation["undelegations"])
assert found_s0 and found_s1, f"Expected to find validator information for " \
f"{s0_validator['validator-addr']} and {s1_validator['validator-addr']}"
# Check v2
raw_response = base_request("hmyv2_getAllValidatorInformation", params=[0], endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
found_s0, found_s1 = False, False
for validator in response:
if validator["validator"]["address"] == s0_validator["validator-addr"]:
found_s0 = True
_assert_validator_info(s0_validator, validator)
elif validator["validator"]["address"] == s1_validator["validator-addr"]:
found_s1 = True
_assert_validator_info(s1_validator, validator)
for delegation in validator["validator"]["delegations"]:
assert_no_null_in_list(delegation["undelegations"])
assert found_s0 and found_s1, f"Expected to found validator information for " \
f"{s0_validator['validator-addr']} and {s1_validator['validator-addr']}"
@txs.staking
def test_get_validator_information(s0_validator):
"""
Note that v1 & v2 have the same responses.
"""
reference_response = {
"validator": {
"bls-public-keys": [
"4f41a37a3a8d0695dd6edcc58142c6b7d98e74da5c90e79b587b3b960b6a4f5e048e6d8b8a000d77a478d44cd640270c"
],
"last-epoch-in-committee": 0,
"min-self-delegation": 10000000000000000000000,
"max-total-delegation": 10000000000000000000000000,
"rate": "0.100000000000000000",
"max-rate": "0.900000000000000000",
"max-change-rate": "0.050000000000000000",
"update-height": 4,
"name": "test",
"identity": "test0",
"website": "test",
"security-contact": "test",
"details": "test",
"creation-height": 4,
"address": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"delegations": [
{
"delegator-address": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"amount": 10000000000000000000000,
"reward": 0,
"undelegations": []
}
]
},
"current-epoch-performance": {
"current-epoch-signing-percent": {
"current-epoch-signed": 0,
"current-epoch-to-sign": 0,
"current-epoch-signing-percentage": "0.000000000000000000"
}
},
"metrics": None,
"total-delegation": 10000000000000000000000,
"currently-in-committee": True,
"epos-status": "currently elected",
"epos-winning-stake": None,
"booted-status": "not booted",
"active-status": "active",
"lifetime": {
"reward-accumulated": 0,
"blocks": {
"to-sign": 0,
"signed": 0
},
"apr": "0.000000000000000000",
"epoch-apr": None
}
}
# Check v1
raw_response = base_request("hmy_getValidatorInformation", params=[s0_validator["validator-addr"]],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
_assert_validator_info(s0_validator, response)
for delegation in response["validator"]["delegations"]:
assert_no_null_in_list(delegation["undelegations"])
# Check v2
raw_response = base_request("hmyv2_getValidatorInformation", params=[s0_validator["validator-addr"]],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
_assert_validator_info(s0_validator, response)
for delegation in response["validator"]["delegations"]:
assert_no_null_in_list(delegation["undelegations"])
@txs.staking
def test_get_validator_information_by_block_number(s0_validator):
"""
Note that v1 & v2 have the same responses.
"""
reference_response = {
"validator": {
"bls-public-keys": [
"4f41a37a3a8d0695dd6edcc58142c6b7d98e74da5c90e79b587b3b960b6a4f5e048e6d8b8a000d77a478d44cd640270c"
],
"last-epoch-in-committee": 0,
"min-self-delegation": 10000000000000000000000,
"max-total-delegation": 10000000000000000000000000,
"rate": "0.100000000000000000",
"max-rate": "0.900000000000000000",
"max-change-rate": "0.050000000000000000",
"update-height": 4,
"name": "test",
"identity": "test0",
"website": "test",
"security-contact": "test",
"details": "test",
"creation-height": 4,
"address": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"delegations": [
{
"delegator-address": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"amount": 10000000000000000000000,
"reward": 0,
"undelegations": []
}
]
},
"current-epoch-performance": {
"current-epoch-signing-percent": {
"current-epoch-signed": 0,
"current-epoch-to-sign": 0,
"current-epoch-signing-percentage": "0.000000000000000000"
}
},
"metrics": None,
"total-delegation": 10000000000000000000000,
"currently-in-committee": True,
"epos-status": "currently elected",
"epos-winning-stake": None,
"booted-status": "not booted",
"active-status": "active",
"lifetime": {
"reward-accumulated": 0,
"blocks": {
"to-sign": 0,
"signed": 0
},
"apr": "0.000000000000000000",
"epoch-apr": None
}
}
curr_block = blockchain.get_block_number(endpoint=endpoints[beacon_shard_id])
# Check v1
raw_response = base_request("hmy_getValidatorInformationByBlockNumber",
params=[s0_validator["validator-addr"], hex(curr_block)],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
_assert_validator_info(s0_validator, response)
# Check v2
raw_response = base_request("hmyv2_getValidatorInformationByBlockNumber",
params=[s0_validator["validator-addr"], curr_block],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
_assert_validator_info(s0_validator, response)
@txs.staking
def test_get_all_validator_information_by_block_number(s0_validator, s1_validator):
"""
Note that v1 & v2 have the same responses.
"""
reference_response = [
{
"validator": {
"bls-public-keys": [
"4f41a37a3a8d0695dd6edcc58142c6b7d98e74da5c90e79b587b3b960b6a4f5e048e6d8b8a000d77a478d44cd640270c"
],
"last-epoch-in-committee": 0,
"min-self-delegation": 10000000000000000000000,
"max-total-delegation": 10000000000000000000000000,
"rate": "0.100000000000000000",
"max-rate": "0.900000000000000000",
"max-change-rate": "0.050000000000000000",
"update-height": 4,
"name": "test",
"identity": "test0",
"website": "test",
"security-contact": "test",
"details": "test",
"creation-height": 4,
"address": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"delegations": [
{
"delegator-address": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"amount": 10000000000000000000000,
"reward": 0,
"undelegations": []
}
]
},
"current-epoch-performance": {
"current-epoch-signing-percent": {
"current-epoch-signed": 0,
"current-epoch-to-sign": 0,
"current-epoch-signing-percentage": "0.000000000000000000"
}
},
"metrics": None,
"total-delegation": 10000000000000000000000,
"currently-in-committee": True,
"epos-status": "currently elected",
"epos-winning-stake": None,
"booted-status": "not booted",
"active-status": "active",
"lifetime": {
"reward-accumulated": 0,
"blocks": {
"to-sign": 0,
"signed": 0
},
"apr": "0.000000000000000000",
"epoch-apr": None
}
}
]
curr_block = blockchain.get_block_number(endpoint=endpoints[beacon_shard_id])
# Check v1
raw_response = base_request("hmy_getAllValidatorInformationByBlockNumber", params=[0, hex(curr_block)],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
found_s0, found_s1 = False, False
for validator in response:
if validator["validator"]["address"] == s0_validator["validator-addr"]:
found_s0 = True
_assert_validator_info(s0_validator, validator)
elif validator["validator"]["address"] == s1_validator["validator-addr"]:
found_s1 = True
_assert_validator_info(s1_validator, validator)
assert found_s0 and found_s1, f"Expected to found validator information for " \
f"{s0_validator['validator-addr']} and {s0_validator['validator-addr']}"
# Check v2
raw_response = base_request("hmyv2_getAllValidatorInformationByBlockNumber", params=[0, curr_block],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
found_s0, found_s1 = False, False
for validator in response:
if validator["validator"]["address"] == s0_validator["validator-addr"]:
found_s0 = True
_assert_validator_info(s0_validator, validator)
elif validator["validator"]["address"] == s1_validator["validator-addr"]:
found_s1 = True
_assert_validator_info(s1_validator, validator)
assert found_s0 and found_s1, f"Expected to found validator information for " \
f"{s0_validator['validator-addr']} and {s0_validator['validator-addr']}"
@txs.staking
@flaky(max_runs=6, rerun_filter=rerun_delay_filter(delay=8))
@pytest.mark.run(after="test_get_validator_information")
def test_get_elected_validator_addresses(s0_validator, s1_validator):
"""
Note that v1 & v2 have the same responses.
"""
reference_response = [
'one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e',
'one1nmy8quw0924fss4r9km640pldzqegjk4wv4wts'
]
staking_epoch = blockchain.get_staking_epoch(endpoints[beacon_shard_id])
curr_epoch = blockchain.get_latest_header(endpoint=endpoints[beacon_shard_id])["epoch"]
val_0_info = staking.get_validator_information(s0_validator["validator-addr"], endpoint=endpoints[beacon_shard_id])
val_1_info = staking.get_validator_information(s1_validator["validator-addr"], endpoint=endpoints[beacon_shard_id])
s0_creation_epoch = int(blockchain.get_block_by_number(val_0_info["validator"]["creation-height"])["epoch"], 16)
s1_creation_epoch = int(blockchain.get_block_by_number(val_1_info["validator"]["creation-height"])["epoch"], 16)
while curr_epoch <= s0_creation_epoch or curr_epoch <= s1_creation_epoch or curr_epoch < staking_epoch:
time.sleep(random.uniform(0.5, 1.5)) # Random to stop burst spam of RPC calls.
curr_epoch = blockchain.get_latest_header(endpoint=endpoints[beacon_shard_id])["epoch"]
# Check v1
raw_response = base_request("hmy_getElectedValidatorAddresses", params=[], endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert s0_validator["validator-addr"] in response, f"Expected validator {s0_validator['validator-addr']} " \
f"in elected validator list {response}"
#assert s1_validator["validator-addr"] in response, f"Expected validator {s1_validator['validator-addr']} " \
# f"in elected validator list {response}"
# Check v2
raw_response = base_request("hmyv2_getElectedValidatorAddresses", params=[], endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert s0_validator["validator-addr"] in response, f"Expected validator {s0_validator['validator-addr']} " \
f"in elected validator list {response}"
#assert s1_validator["validator-addr"] in response, f"Expected validator {s1_validator['validator-addr']} " \
# f"in elected validator list {response}"
@txs.staking
@pytest.mark.run(after="test_delegation")
def test_get_delegations_by_delegator(s1_validator):
"""
Note that v1 & v2 have the same responses.
"""
reference_response = [
{
"validator_address": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"delegator_address": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"amount": 10000000000000000000000,
"reward": 0,
"Undelegations": []
},
]
val_addr = s1_validator["validator-addr"]
validator_info = staking.get_validator_information(val_addr, endpoint=endpoints[beacon_shard_id])
for delegator in validator_info["validator"]["delegations"]:
# Check v1
del_addr = delegator["delegator-address"]
raw_response = base_request("hmy_getDelegationsByDelegator", params=[del_addr],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert_no_null_in_list(response)
found_validator = False
for del_delegator in response:
assert_no_null_in_list(del_delegator["Undelegations"])
if del_delegator["validator_address"] == val_addr:
found_validator = True
assert del_addr == del_delegator["delegator_address"], f"Expected delegator address {del_addr}, " \
f"got {del_delegator['delegator_address']}"
assert found_validator, f"Expected to found validator {val_addr} in {json.dumps(response, indent=2)}"
# Check v2
raw_response = base_request("hmyv2_getDelegationsByDelegator", params=[del_addr],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert_no_null_in_list(response)
found_validator = False
for del_delegator in response:
assert_no_null_in_list(del_delegator["Undelegations"])
if del_delegator["validator_address"] == val_addr:
found_validator = True
assert del_addr == del_delegator["delegator_address"], f"Expected delegator address {del_addr}, " \
f"got {del_delegator['delegator_address']}"
assert found_validator, f"Expected to found validator {val_addr} in {json.dumps(response, indent=2)}"
@txs.staking
@mutually_exclusive_test(scope=_mutex_scope)
@pytest.mark.run(after="test_delegation")
def test_get_delegations_by_delegator_by_block_number(s1_validator):
"""
Note that v1 & v2 have the same responses.
"""
reference_response = [
{
"validator_address": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"delegator_address": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"amount": 10000000000000000000000,
"reward": 0,
"Undelegations": []
},
]
curr_block = blockchain.get_block_number(endpoint=endpoints[beacon_shard_id])
val_addr = s1_validator["validator-addr"]
validator_info = staking.get_validator_information(val_addr, endpoint=endpoints[beacon_shard_id])
for delegator in validator_info["validator"]["delegations"]:
# Check v1
del_addr = delegator["delegator-address"]
raw_response = base_request("hmy_getDelegationsByDelegatorByBlockNumber", params=[del_addr, hex(curr_block)],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert_no_null_in_list(response)
found_validator = False
for del_delegator in response:
assert_no_null_in_list(del_delegator["Undelegations"])
if del_delegator["validator_address"] == val_addr:
found_validator = True
assert del_addr == del_delegator["delegator_address"], f"Expected delegator address {del_addr}, " \
f"got {del_delegator['delegator_address']}"
assert found_validator, f"Expected to found validator {val_addr} in {json.dumps(response, indent=2)}"
# Check v2
raw_response = base_request("hmyv2_getDelegationsByDelegatorByBlockNumber", params=[del_addr, curr_block],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert_no_null_in_list(response)
found_validator = False
for del_delegator in response:
assert_no_null_in_list(del_delegator["Undelegations"])
if del_delegator["validator_address"] == val_addr:
found_validator = True
assert del_addr == del_delegator["delegator_address"], f"Expected delegator address {del_addr}, " \
f"got {del_delegator['delegator_address']}"
assert found_validator, f"Expected to found validator {val_addr} in {json.dumps(response, indent=2)}"
@txs.staking
@mutually_exclusive_test(scope=_mutex_scope)
@pytest.mark.run(after="test_delegation")
def test_get_delegations_by_validator(s1_validator):
"""
Note that v1 & v2 have the same responses.
"""
reference_response = [
{
"validator_address": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"delegator_address": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"amount": 10000000000000000000000,
"reward": 0,
"Undelegations": []
},
]
val_addr = s1_validator["validator-addr"]
validator_info = staking.get_validator_information(val_addr, endpoint=endpoints[beacon_shard_id])
val_del_addrs = {d["delegator-address"] for d in validator_info["validator"]["delegations"]}
# Check v1
raw_response = base_request("hmy_getDelegationsByValidator", params=[val_addr],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert_no_null_in_list(response)
for del_delegator in response:
assert_no_null_in_list(del_delegator["Undelegations"])
del_val_addr, del_del_addr = del_delegator["validator_address"], del_delegator["delegator_address"]
assert del_val_addr == val_addr, f"Expected validator addr {val_addr}, got {del_val_addr}"
assert del_del_addr in val_del_addrs, f"Expected delegator addr {val_addr} in {val_del_addrs}"
# Check v2
raw_response = base_request("hmyv2_getDelegationsByValidator", params=[val_addr],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
for del_delegator in response:
del_val_addr, del_del_addr = del_delegator["validator_address"], del_delegator["delegator_address"]
assert del_val_addr == val_addr, f"Expected validator addr {val_addr}, got {del_val_addr}"
assert del_del_addr in val_del_addrs, f"Expected delegator addr {val_addr} in {val_del_addrs}"
@txs.staking
def test_get_current_utility_metrics(s0_validator):
"""
Note that v1 & v2 have the same responses.
"""
reference_response = {
"AccumulatorSnapshot": 5768000000000000000000,
"CurrentStakedPercentage": "0.000004311108610723",
"Deviation": "0.349995688891389277",
"Adjustment": "13999827555655571080.000000000000000000"
}
# Check v1
raw_response = base_request("hmy_getCurrentUtilityMetrics", params=[],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
# Check v2
raw_response = base_request("hmyv2_getCurrentUtilityMetrics", params=[],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
@txs.staking
@flaky(max_runs=6, rerun_filter=rerun_delay_filter(delay=8))
@pytest.mark.run(after="test_get_validator_information")
def test_get_median_raw_stake_snapshot(s0_validator):
"""
Note that v1 & v2 have the same responses.
Use shard 0 endpoint, NOT beacon endpoint as we are checking with `s0_validator`
"""
reference_response = {
"epos-median-stake": "10000000000000000000000.000000000000000000",
"max-external-slots": 6,
"epos-slot-winners": [
{
"slot-owner": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"bls-public-key": "4f41a37a3a8d0695dd6edcc58142c6b7d98e74da5c90e79b587b3b960b6a4f5e048e6d8b8a000d77a478d44cd640270c",
"raw-stake": "10000000000000000000000.000000000000000000",
"eposed-stake": "10000000000000000000000.000000000000000000"
}
],
"epos-slot-candidates": [
{
"stake": 10000000000000000000000,
"keys-at-auction": [
"4f41a37a3a8d0695dd6edcc58142c6b7d98e74da5c90e79b587b3b960b6a4f5e048e6d8b8a000d77a478d44cd640270c"
],
"percentage-of-total-auction-stake": "1.000000000000000000",
"stake-per-key": 10000000000000000000000,
"validator": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e"
}
]
}
staking_epoch = blockchain.get_staking_epoch(endpoints[beacon_shard_id])
curr_epoch = blockchain.get_latest_header(endpoint=endpoints[0])["epoch"]
val_0_info = staking.get_validator_information(s0_validator["validator-addr"], endpoint=endpoints[0])
s0_creation_epoch = int(blockchain.get_block_by_number(val_0_info["validator"]["creation-height"])["epoch"], 16)
while curr_epoch <= s0_creation_epoch or curr_epoch < staking_epoch:
time.sleep(random.uniform(0.5, 1.5)) # Random to stop burst spam of RPC calls.
curr_epoch = blockchain.get_latest_header(endpoint=endpoints[beacon_shard_id])["epoch"]
# First block of an epoch does not have correct snapshot, wait for next block.
curr_block = blockchain.get_latest_header(endpoint=endpoints[0])["blockNumber"]
prev_block_epoch = int(blockchain.get_block_by_number(curr_block - 1)["epoch"], 16)
while prev_block_epoch != curr_epoch:
time.sleep(random.uniform(0.5, 1.5)) # Random to stop burst spam of RPC calls.
curr_block = blockchain.get_latest_header(endpoint=endpoints[0])["blockNumber"]
prev_block_epoch = int(blockchain.get_block_by_number(curr_block - 1)["epoch"], 16)
# Check v1
raw_response = base_request("hmy_getMedianRawStakeSnapshot", params=[], endpoint=endpoints[0])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
found_s0_winner, found_s0_candidate = False, False
for val in response["epos-slot-winners"]:
if val["slot-owner"] == s0_validator["validator-addr"]:
found_s0_winner = True
break
assert found_s0_winner, f"Expected validator {s0_validator['validator-addr']} to win election"
for val in response["epos-slot-candidates"]:
if val["validator"] == s0_validator["validator-addr"]:
found_s0_candidate = True
break
assert found_s0_candidate, f"Expected validator {s0_validator['validator-addr']} to be candidate for next epoch"
# Check v2
raw_response = base_request("hmyv2_getMedianRawStakeSnapshot", params=[], endpoint=endpoints[0])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
found_s0_winner, found_s0_candidate = False, False
for val in response["epos-slot-winners"]:
if val["slot-owner"] == s0_validator["validator-addr"]:
found_s0_winner = True
break
assert found_s0_winner, f"Expected validator {s0_validator['validator-addr']} to win election"
for val in response["epos-slot-candidates"]:
if val["validator"] == s0_validator["validator-addr"]:
found_s0_candidate = True
break
assert found_s0_candidate, f"Expected validator {s0_validator['validator-addr']} to be candidate for next epoch"
@txs.staking
@flaky(max_runs=6, rerun_filter=rerun_delay_filter(delay=8))
@pytest.mark.run(after="test_get_median_raw_stake_snapshot")
def test_get_super_committees(s0_validator):
"""
Note that v1 & v2 have the same responses.
"""
reference_response = {
"previous": {
"quorum-deciders": {
"shard-0": {
"policy": "SuperMajorityStake",
"count": 7,
"external-validator-slot-count": 1,
"committee-members": [
{
"is-harmony-slot": True,
"earning-account": "one1spshr72utf6rwxseaz339j09ed8p6f8ke370zj",
"bls-public-key": "2d61379e44a772e5757e27ee2b3874254f56073e6bd226eb8b160371cc3c18b8c4977bd3dcb71fd57dc62bf0e143fd08",
"voting-power-unnormalized": "0.166666666666666666",
"voting-power-%": "0.113333333333333333"
},
],
"hmy-voting-power": "0.679999999999999998",
"staked-voting-power": "0.320000000000000002",
"total-raw-stake": "10000000000000000000000.000000000000000000",
"total-effective-stake": "10000000000000000000000.000000000000000000"
},
"shard-1": {
"policy": "SuperMajorityStake",
"count": 6,
"external-validator-slot-count": 0,
"committee-members": [
{
"is-harmony-slot": True,
"earning-account": "one1m6m0ll3q7ljdqgmth2t5j7dfe6stykucpj2nr5",
"bls-public-key": "40379eed79ed82bebfb4310894fd33b6a3f8413a78dc4d43b98d0adc9ef69f3285df05eaab9f2ce5f7227f8cb920e809",
"voting-power-unnormalized": "0.166666666666666666",
"voting-power-%": "0.113333333333333333"
},
],
"hmy-voting-power": "0.679999999999999998",
"staked-voting-power": "0.000000000000000000",
"total-raw-stake": "0.000000000000000000",
"total-effective-stake": "0.000000000000000000"
}
},
"external-slot-count": 6,
"epos-median-stake": "10000000000000000000000.000000000000000000"
},
"current": {
"quorum-deciders": {
"shard-0": {
"policy": "SuperMajorityStake",
"count": 7,
"external-validator-slot-count": 1,
"committee-members": [
{
"is-harmony-slot": True,
"earning-account": "one1pdv9lrdwl0rg5vglh4xtyrv3wjk3wsqket7zxy",
"bls-public-key": "65f55eb3052f9e9f632b2923be594ba77c55543f5c58ee1454b9cfd658d25e06373b0f7d42a19c84768139ea294f6204",
"voting-power-unnormalized": "0.166666666666666666",
"voting-power-%": "0.113333333333333333"
},
],
"hmy-voting-power": "0.679999999999999998",
"staked-voting-power": "0.320000000000000002",
"total-raw-stake": "10000000000000000000000.000000000000000000",
"total-effective-stake": "10000000000000000000000.000000000000000000"
},
"shard-1": {
"policy": "SuperMajorityStake",
"count": 6,
"external-validator-slot-count": 0,
"committee-members": [
{
"is-harmony-slot": True,
"earning-account": "one1m6m0ll3q7ljdqgmth2t5j7dfe6stykucpj2nr5",
"bls-public-key": "40379eed79ed82bebfb4310894fd33b6a3f8413a78dc4d43b98d0adc9ef69f3285df05eaab9f2ce5f7227f8cb920e809",
"voting-power-unnormalized": "0.166666666666666666",
"voting-power-%": "0.113333333333333333"
},
],
"hmy-voting-power": "0.679999999999999998",
"staked-voting-power": "0.000000000000000000",
"total-raw-stake": "0.000000000000000000",
"total-effective-stake": "0.000000000000000000"
}
},
"external-slot-count": 6,
"epos-median-stake": "10000000000000000000000.000000000000000000"
}
}
staking_epoch = blockchain.get_staking_epoch(endpoints[beacon_shard_id])
curr_epoch = blockchain.get_latest_header(endpoint=endpoints[beacon_shard_id])["epoch"]
val_0_info = staking.get_validator_information(s0_validator["validator-addr"], endpoint=endpoints[beacon_shard_id])
s0_creation_epoch = int(blockchain.get_block_by_number(val_0_info["validator"]["creation-height"])["epoch"], 16)
while curr_epoch <= s0_creation_epoch or curr_epoch < staking_epoch:
time.sleep(random.uniform(0.5, 1.5)) # Random to stop burst spam of RPC calls.
curr_epoch = blockchain.get_latest_header(endpoint=endpoints[0])["epoch"]
# Check v1
raw_response = base_request("hmy_getSuperCommittees", params=[], endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
found_validator, found_key = False, False
for member in response["current"]["quorum-deciders"]["shard-0"]["committee-members"]:
if member["earning-account"] == s0_validator["validator-addr"]:
found_validator = True
if member["bls-public-key"] == s0_validator["pub-bls-key"]:
found_key = True
assert found_validator, f"Expected to find validator {s0_validator['validator-addr']} in current committee"
assert found_key, f"Expected to pub bls key {s0_validator['bls-public-key']} in current committee"
# Check v2
raw_response = base_request("hmyv2_getSuperCommittees", params=[], endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
found_validator, found_key = False, False
for member in response["current"]["quorum-deciders"]["shard-0"]["committee-members"]:
if member["earning-account"] == s0_validator["validator-addr"]:
found_validator = True
if member["bls-public-key"] == s0_validator["pub-bls-key"]:
found_key = True
assert found_validator, f"Expected to find validator {s0_validator['validator-addr']} in current committee"
assert found_key, f"Expected to pub bls key {s0_validator['bls-public-key']} in current committee"
@txs.staking
def test_get_staking_network_info(s0_validator):
"""
Note that v1 & v2 have the same responses.
"""
reference_response = {
"total-supply": "12600000000.000000000000000000",
"circulating-supply": "6842781705.882339000000000000",
"epoch-last-block": 59,
"total-staking": 10000000000000000000000,
"median-raw-stake": "10000000000000000000000.000000000000000000"
}
# Check v1
raw_response = base_request("hmy_getStakingNetworkInfo", params=[],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
# Check v2
raw_response = base_request("hmyv2_getStakingNetworkInfo", params=[],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
@txs.staking
@flaky(max_runs=6, rerun_filter=rerun_delay_filter(delay=8))
def test_get_validator_keys(s0_validator):
"""
Note that v1 & v2 have the same responses.
Use shard 0 endpoint, NOT beacon endpoint as we are checking with `s0_validator`
"""
reference_response = [
"65f55eb3052f9e9f632b2923be594ba77c55543f5c58ee1454b9cfd658d25e06373b0f7d42a19c84768139ea294f6204",
]
staking_epoch = blockchain.get_staking_epoch(endpoints[beacon_shard_id])
curr_epoch = blockchain.get_latest_header(endpoint=endpoints[0])["epoch"]
val_0_info = staking.get_validator_information(s0_validator["validator-addr"], endpoint=endpoints[0])
s0_creation_epoch = int(blockchain.get_block_by_number(val_0_info["validator"]["creation-height"])["epoch"], 16)
while curr_epoch <= s0_creation_epoch or curr_epoch < staking_epoch:
time.sleep(random.uniform(0.5, 1.5)) # Random to stop burst spam of RPC calls.
curr_epoch = blockchain.get_latest_header(endpoint=endpoints[beacon_shard_id])["epoch"]
# Check v1
raw_response = base_request("hmy_getValidatorKeys", params=[curr_epoch],
endpoint=endpoints[0])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert s0_validator["pub-bls-key"] in response, f"Expected pub bls key {s0_validator['pub-bls-key']} in {response}"
# Check v1
raw_response = base_request("hmyv2_getValidatorKeys", params=[curr_epoch],
endpoint=endpoints[0])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
assert s0_validator["pub-bls-key"] in response, f"Expected pub bls key {s0_validator['pub-bls-key']} in {response}"
@txs.staking
@flaky(max_runs=6, rerun_filter=rerun_delay_filter(delay=8))
def test_get_validators_v1(s0_validator, s1_validator):
reference_response = {
"shardID": 0,
"validators": [
{
"address": "one1pdv9lrdwl0rg5vglh4xtyrv3wjk3wsqket7zxy",
"balance": "0x252c53eaca3b23bb3"
},
]
}
staking_epoch = blockchain.get_staking_epoch(endpoints[beacon_shard_id])
curr_epoch = blockchain.get_latest_header(endpoint=endpoints[beacon_shard_id])["epoch"]
val_0_info = staking.get_validator_information(s0_validator["validator-addr"], endpoint=endpoints[beacon_shard_id])
val_1_info = staking.get_validator_information(s1_validator["validator-addr"], endpoint=endpoints[beacon_shard_id])
s0_creation_epoch = int(blockchain.get_block_by_number(val_0_info["validator"]["creation-height"])["epoch"], 16)
s1_creation_epoch = int(blockchain.get_block_by_number(val_1_info["validator"]["creation-height"])["epoch"], 16)
while curr_epoch <= s0_creation_epoch or curr_epoch <= s1_creation_epoch or curr_epoch < staking_epoch:
time.sleep(random.uniform(0.5, 1.5)) # Random to stop burst spam of RPC calls.
curr_epoch = blockchain.get_latest_header(endpoint=endpoints[beacon_shard_id])["epoch"]
raw_response = base_request("hmy_getValidators", params=[curr_epoch],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
found_s0, found_s1 = False, False
for val in response["validators"]:
if val["address"] == s0_validator["validator-addr"]:
found_s0 = True
if val["address"] == s1_validator["validator-addr"]:
found_s1 = True
assert found_s0 and found_s1, f"Expected to find validator information for " \
f"{s0_validator['validator-addr']} and {s0_validator['validator-addr']}"
@txs.staking
@flaky(max_runs=6, rerun_filter=rerun_delay_filter(delay=8))
def test_get_validators_v2(s0_validator, s1_validator):
reference_response = {
"shardID": 0,
"validators": [
{
"address": "one1pdv9lrdwl0rg5vglh4xtyrv3wjk3wsqket7zxy",
"balance": 42857730340142857139
},
]
}
staking_epoch = blockchain.get_staking_epoch(endpoints[beacon_shard_id])
curr_epoch = blockchain.get_latest_header(endpoint=endpoints[beacon_shard_id])["epoch"]
val_0_info = staking.get_validator_information(s0_validator["validator-addr"], endpoint=endpoints[beacon_shard_id])
val_1_info = staking.get_validator_information(s1_validator["validator-addr"], endpoint=endpoints[beacon_shard_id])
s0_creation_epoch = int(blockchain.get_block_by_number(val_0_info["validator"]["creation-height"])["epoch"], 16)
s1_creation_epoch = int(blockchain.get_block_by_number(val_1_info["validator"]["creation-height"])["epoch"], 16)
while curr_epoch <= s0_creation_epoch or curr_epoch <= s1_creation_epoch or curr_epoch < staking_epoch:
time.sleep(random.uniform(0.5, 1.5)) # Random to stop burst spam of RPC calls.
curr_epoch = blockchain.get_latest_header(endpoint=endpoints[beacon_shard_id])["epoch"]
raw_response = base_request("hmyv2_getValidators", params=[curr_epoch],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
found_s0, found_s1 = False, False
for val in response["validators"]:
if val["address"] == s0_validator["validator-addr"]:
found_s0 = True
if val["address"] == s1_validator["validator-addr"]:
found_s1 = True
assert found_s0 and found_s1, f"Expected to find validator information for " \
f"{s0_validator['validator-addr']} and {s0_validator['validator-addr']}"
@txs.staking
@pytest.mark.run('first')
def test_pending_staking_transactions_v1():
stx = { # Create validator tx
"validator-addr": "one13v9m45m6yk9qmmcgyq603ucy0wdw9lfsxzsj9d",
"delegator-addr": "one13v9m45m6yk9qmmcgyq603ucy0wdw9lfsxzsj9d",
"name": "test",
"identity": "test1",
"website": "test",
"security-contact": "test",
"details": "test",
"rate": 0.1,
"max-rate": 0.9,
"max-change-rate": 0.05,
"min-self-delegation": 10000,
"max-total-delegation": 10000000,
"amount": 10000,
"pub-bls-key": "8596e18ce463e4d5faa62d669dd959101ca408f757489bc9bdb2f95c2cc7a521b4eeb6d55ff2befd8b220fce6939b408",
"hash": "0xf16668d7e39f01fd15c40e515ece370af1c80f7588bffd7c53932768a0ebba2e",
"nonce": "0x0",
"signed-raw-tx": "0xf9015780f90106948b0bbad37a258a0def082034f8f3047b9ae2fd30da8474657374857465737432847465737484746573748474657374ddc988016345785d8a0000c9880c7d713b49da0000c887b1a2bc2ec500008a021e19e0c9bab24000008b084595161401484a000000f1b08596e18ce463e4d5faa62d669dd959101ca408f757489bc9bdb2f95c2cc7a521b4eeb6d55ff2befd8b220fce6939b408f862b860cc0dbe1c9ba4f352e44420af0bfa9019b604d46b9afb38eebf599e89c9da76101852426bcc3845075fe7f460f639e308b738ef904036fee9c95bfc8888d6eeabd5365d005f20f76fbbe165b4bc1452bb5f50dd91a8422c6d9a94bc846f3754968a021e19e0c9bab240000080843b9aca008351220427a02eeadff25df33d13eb95288006435e06a65ad979bf24b9cbd151c696df5b84e3a016e9fa32ddad438936ba2ac837cc8ac102aeec519198fa4516cfac7032df313c"
}
reference_response = [
{
"blockHash": "0x0000000000000000000000000000000000000000000000000000000000000000",
"blockNumber": None,
"from": "one13v9m45m6yk9qmmcgyq603ucy0wdw9lfsxzsj9d",
"timestamp": "0x0",
"gas": "0x512204",
"gasPrice": "0x3b9aca00",
"hash": "0xf16668d7e39f01fd15c40e515ece370af1c80f7588bffd7c53932768a0ebba2e",
"nonce": "0x0",
"transactionIndex": "0x0",
"v": "0x27",
"r": "0x2eeadff25df33d13eb95288006435e06a65ad979bf24b9cbd151c696df5b84e3",
"s": "0x16e9fa32ddad438936ba2ac837cc8ac102aeec519198fa4516cfac7032df313c",
"type": "CreateValidator",
"msg": None
}
]
reference_create_validator_msg = {
"amount": "0x21e19e0c9bab2400000",
"commissionRate": "0x16345785d8a0000",
"details": "test",
"identity": "test2",
"maxChangeRate": "0xb1a2bc2ec50000",
"maxCommissionRate": "0xc7d713b49da0000",
"maxTotalDelegation": "0x84595161401484a000000",
"minSelfDelegation": "0x21e19e0c9bab2400000",
"name": "test",
"securityContact": "test",
"slotPubKeys": [
"8596e18ce463e4d5faa62d669dd959101ca408f757489bc9bdb2f95c2cc7a521b4eeb6d55ff2befd8b220fce6939b408"
],
"validatorAddress": "one13v9m45m6yk9qmmcgyq603ucy0wdw9lfsxzsj9d",
"website": "test"
}
in_initially_funded = False
for tx in initial_funding:
if tx["to"] == stx["validator-addr"] and tx["to-shard"] == beacon_shard_id:
in_initially_funded = True
break
if not in_initially_funded:
raise AssertionError(f"Test staking transaction from address {stx['validator-addr']} "
f"not found in set of initially funded accounts (or not founded on s{beacon_shard_id})")
if get_staking_transaction(stx["hash"]) is not None:
pytest.skip(f"Test staking transaction (hash {stx['hash']}) already present on chain...")
send_staking_transaction(stx, confirm_submission=True)
start_time = time.time()
while time.time() - start_time <= tx_timeout:
raw_response = base_request("hmy_pendingStakingTransactions", endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
for pending_tx in response:
if pending_tx["hash"] == stx["hash"]:
assert pending_tx["type"] == "CreateValidator"
assert_valid_json_structure(reference_create_validator_msg, pending_tx["msg"])
return
raise AssertionError(f"Timeout! Pending transaction not found for {json.dumps(stx, indent=2)}")
@txs.staking
@pytest.mark.run('first')
def test_pending_staking_transactions_v2():
stx = { # Create validator tx
"validator-addr": "one13muqj27fcd59gfrv7wzvuaupgkkwvwzlxun0ce",
"delegator-addr": "one13muqj27fcd59gfrv7wzvuaupgkkwvwzlxun0ce",
"name": "test",
"identity": "test3",
"website": "test",
"security-contact": "test",
"details": "test",
"rate": 0.1,
"max-rate": 0.9,
"max-change-rate": 0.05,
"min-self-delegation": 10000,
"max-total-delegation": 10000000,
"amount": 10000,
"pub-bls-key": "29cdd2ea5ef25bfee0bbc649065ceb2d0e19cc25f42541154eca69c0ff923971e20352fbfeeac5d17f8f6c6fc5871e88",
"hash": "0x6e54fc7102daa31372027912b7f441ab9b9acafb9fa93b72dc9380321bacdbe2",
"nonce": "0x0",
"signed-raw-tx": "0xf9015780f90106948ef8092bc9c36854246cf384ce778145ace6385fda8474657374857465737433847465737484746573748474657374ddc988016345785d8a0000c9880c7d713b49da0000c887b1a2bc2ec500008a021e19e0c9bab24000008b084595161401484a000000f1b029cdd2ea5ef25bfee0bbc649065ceb2d0e19cc25f42541154eca69c0ff923971e20352fbfeeac5d17f8f6c6fc5871e88f862b860413befdd8895ade3cadaf121cac888f47b73c0986a38dda3198f3821532278b992e413009c014bef52c59264d7b2eb13054377146a540751b3c3c6c5a21a2c7fac9639ef72d613167315df1ea6455cde42e53157d4b7cac0b3c8975e5d5eb2828a021e19e0c9bab240000080843b9aca008351220427a0e03993350ed72c70198bbb9b0c962eba1ba08c6c46f66c50a878f84970120941a0421342afa7dd527edadfb8fc0b3b80c41ba3fcd390cc2ff95bc18b89c58850ca"
}
reference_response = [
{
"blockHash": "0x0000000000000000000000000000000000000000000000000000000000000000",
"blockNumber": None,
"from": "one13muqj27fcd59gfrv7wzvuaupgkkwvwzlxun0ce",
"timestamp": 0,
"gas": 5317124,
"gasPrice": 1000000000,
"hash": "0x6e54fc7102daa31372027912b7f441ab9b9acafb9fa93b72dc9380321bacdbe2",
"nonce": 0,
"transactionIndex": 0,
"v": "0x27",
"r": "0xe03993350ed72c70198bbb9b0c962eba1ba08c6c46f66c50a878f84970120941",
"s": "0x421342afa7dd527edadfb8fc0b3b80c41ba3fcd390cc2ff95bc18b89c58850ca",
"type": "CreateValidator",
"msg": None
}
]
reference_create_validator_msg = {
"amount": 10000000000000000000000,
"commissionRate": 100000000000000000,
"details": "test",
"identity": "test3",
"maxChangeRate": 50000000000000000,
"maxCommissionRate": 900000000000000000,
"maxTotalDelegation": 10000000000000000000000000,
"minSelfDelegation": 10000000000000000000000,
"name": "test",
"securityContact": "test",
"slotPubKeys": [
"29cdd2ea5ef25bfee0bbc649065ceb2d0e19cc25f42541154eca69c0ff923971e20352fbfeeac5d17f8f6c6fc5871e88"
],
"validatorAddress": "one13muqj27fcd59gfrv7wzvuaupgkkwvwzlxun0ce",
"website": "test"
}
in_initially_funded = False
for tx in initial_funding:
if tx["to"] == stx["validator-addr"] and tx["to-shard"] == beacon_shard_id:
in_initially_funded = True
break
if not in_initially_funded:
raise AssertionError(f"Test staking transaction from address {stx['validator-addr']} "
f"not found in set of initially funded accounts (or not founded on s{beacon_shard_id})")
if get_staking_transaction(stx["hash"]) is not None:
pytest.skip(f"Test staking transaction (hash {stx['hash']}) already present on chain...")
send_staking_transaction(stx, confirm_submission=True)
start_time = time.time()
while time.time() - start_time <= tx_timeout:
raw_response = base_request("hmyv2_pendingStakingTransactions", endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
for pending_tx in response:
if pending_tx["hash"] == stx["hash"]:
assert pending_tx["type"] == "CreateValidator"
assert_valid_json_structure(reference_create_validator_msg, pending_tx["msg"])
return
raise AssertionError(f"Timeout! Pending transaction not found for {json.dumps(stx, indent=2)}")
@txs.staking
@mutually_exclusive_test(scope=_mutex_scope)
def test_get_blocks_v1(s0_validator):
"""
Note: param options for 'withSigners' will NOT return any sensical data
in staking epoch (since it returns ONE addresses) and is subject to removal, thus is not tested here.
"""
reference_response_blk = {
"difficulty": 0,
"epoch": "0x1",
"extraData": "0x",
"gasLimit": "0x4c4b400",
"gasUsed": "0x5121c4",
"hash": "0xc0438fb59641cf000ddede158cf3707b6b96f2fbf7eaf40386eb91a0dc4305a4",
"logsBloom": "0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000",
"miner": "one1pdv9lrdwl0rg5vglh4xtyrv3wjk3wsqket7zxy",
"mixHash": "0x0000000000000000000000000000000000000000000000000000000000000000",
"nonce": 0,
"number": "0xb",
"parentHash": "0x57b4221951b61025eccea748c3a67dc2f1dafa9db278ac4d67135061432de6d0",
"receiptsRoot": "0x37f9bea40135162a9eb2164266b2152a3909ee94dd2f908cdb091afb90724e1e",
"size": "0x3fd",
"stakingTransactions": [],
"stateRoot": "0x33109119529b1d282909975ce846a3eeb1b76681d7beebfa5cf79adfe4a1c4d7",
"timestamp": "0x5f11a7a2",
"transactions": [],
"transactionsRoot": "0xf4ab626bfc3bf9781ddef818f85cc81c345010b7b6abaeb27d0237c8a1ee1ac5",
"uncles": [],
"viewID": "0xb"
}
reference_staking_response = {
"blockHash": "0xc0438fb59641cf000ddede158cf3707b6b96f2fbf7eaf40386eb91a0dc4305a4",
"blockNumber": "0xb",
"from": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"timestamp": "0x5f11a7a2",
"gas": "0x5121c4",
"gasPrice": "0x3b9aca00",
"hash": "0xf80460f1ad041a0a0e841da717fc5b7959b1a7e9a0ce9a25cd70c0ce40d5ff26",
"nonce": "0x0",
"transactionIndex": "0x0",
"v": "0x27",
"r": "0x2348daabe696c4370379b9102dd85da6d4fed52f0f511ff0448a21c001ee75a7",
"s": "0x1a67f9f40e0de02b50d5d7295f200fea7f950c1b59aa7efa8d225294c4fdbc5e",
"type": "CreateValidator",
"msg": None
}
reference_create_validator_msg = {
"amount": "0x21e19e0c9bab2400000",
"commissionRate": "0x16345785d8a0000",
"details": "test",
"identity": "test0",
"maxChangeRate": "0xb1a2bc2ec50000",
"maxCommissionRate": "0xc7d713b49da0000",
"maxTotalDelegation": "0x84595161401484a000000",
"minSelfDelegation": "0x21e19e0c9bab2400000",
"name": "test",
"securityContact": "test",
"slotPubKeys": [
"4f41a37a3a8d0695dd6edcc58142c6b7d98e74da5c90e79b587b3b960b6a4f5e048e6d8b8a000d77a478d44cd640270c"
],
"validatorAddress": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"website": "test"
}
init_tx = get_staking_transaction(s0_validator["hash"])
start_blk, end_blk = hex(max(0, int(init_tx["blockNumber"], 16) - 2)), init_tx["blockNumber"]
raw_response = base_request("hmy_getBlocks",
params=[start_blk, end_blk, {
"fullTx": True,
"inclStaking": True
}],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
for blk in response:
assert_valid_json_structure(reference_response_blk, blk)
for stx in blk["stakingTransactions"]:
assert_valid_json_structure(reference_staking_response, stx)
if stx["hash"] == s0_validator["hash"]:
assert stx["type"] == "CreateValidator"
assert_valid_json_structure(reference_create_validator_msg, stx["msg"])
assert len(response[-1]["stakingTransactions"]) > 0, "Expected staking transactions on last block"
start_num, end_num = int(start_blk, 16), int(end_blk, 16)
for blk in response:
blk_num = int(blk["number"], 16)
assert start_num <= blk_num <= end_num, f"Got block number {blk_num}, which is not in range [{start_num},{end_num}]"
@txs.staking
@mutually_exclusive_test(scope=_mutex_scope)
def test_get_blocks_v2(s0_validator):
"""
Only difference in param of RPC is hex string in v1 and decimal in v2.
Note: param options for 'withSigners' will NOT return any sensical data
in staking epoch (since it returns ONE addresses) and is subject to removal, thus is not tested here.
"""
reference_response_blk = {
"difficulty": 0,
"epoch": 1,
"extraData": "0x",
"gasLimit": 80000000,
"gasUsed": 5317060,
"hash": "0xc0438fb59641cf000ddede158cf3707b6b96f2fbf7eaf40386eb91a0dc4305a4",
"logsBloom": "0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000",
"miner": "one1pdv9lrdwl0rg5vglh4xtyrv3wjk3wsqket7zxy",
"mixHash": "0x0000000000000000000000000000000000000000000000000000000000000000",
"nonce": 0,
"number": 11,
"parentHash": "0x57b4221951b61025eccea748c3a67dc2f1dafa9db278ac4d67135061432de6d0",
"receiptsRoot": "0x37f9bea40135162a9eb2164266b2152a3909ee94dd2f908cdb091afb90724e1e",
"size": 1021,
"stakingTransactions": [],
"stateRoot": "0x33109119529b1d282909975ce846a3eeb1b76681d7beebfa5cf79adfe4a1c4d7",
"timestamp": 1594992546,
"transactions": [],
"transactionsRoot": "0xf4ab626bfc3bf9781ddef818f85cc81c345010b7b6abaeb27d0237c8a1ee1ac5",
"uncles": [],
"viewID": 11
}
reference_staking_response = {
"blockHash": "0x0000000000000000000000000000000000000000000000000000000000000000",
"blockNumber": None,
"from": "one13muqj27fcd59gfrv7wzvuaupgkkwvwzlxun0ce",
"timestamp": 0,
"gas": 5317124,
"gasPrice": 1000000000,
"hash": "0x6e54fc7102daa31372027912b7f441ab9b9acafb9fa93b72dc9380321bacdbe2",
"nonce": 0,
"transactionIndex": 0,
"v": "0x27",
"r": "0xe03993350ed72c70198bbb9b0c962eba1ba08c6c46f66c50a878f84970120941",
"s": "0x421342afa7dd527edadfb8fc0b3b80c41ba3fcd390cc2ff95bc18b89c58850ca",
"type": "CreateValidator",
"msg": None
}
reference_create_validator_msg = {
"amount": 10000000000000000000000,
"commissionRate": 100000000000000000,
"details": "test",
"identity": "test3",
"maxChangeRate": 50000000000000000,
"maxCommissionRate": 900000000000000000,
"maxTotalDelegation": 10000000000000000000000000,
"minSelfDelegation": 10000000000000000000000,
"name": "test",
"securityContact": "test",
"slotPubKeys": [
"29cdd2ea5ef25bfee0bbc649065ceb2d0e19cc25f42541154eca69c0ff923971e20352fbfeeac5d17f8f6c6fc5871e88"
],
"validatorAddress": "one13muqj27fcd59gfrv7wzvuaupgkkwvwzlxun0ce",
"website": "test"
}
init_tx = get_staking_transaction(s0_validator["hash"])
start_blk, end_blk = max(0, int(init_tx["blockNumber"], 16) - 2), int(init_tx["blockNumber"], 16)
raw_response = base_request("hmyv2_getBlocks",
params=[start_blk, end_blk, {
"fullTx": True,
"inclStaking": True
}],
endpoint=endpoints[beacon_shard_id])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
for blk in response:
assert_valid_json_structure(reference_response_blk, blk)
for stx in blk["stakingTransactions"]:
assert_valid_json_structure(reference_staking_response, stx)
if stx["hash"] == s0_validator["hash"]:
assert stx["type"] == "CreateValidator"
assert_valid_json_structure(reference_create_validator_msg, stx["msg"])
assert len(response[-1]["stakingTransactions"]) > 0, "Expected staking transactions on last block"
for blk in response:
assert start_blk <= blk[
"number"] <= end_blk, f"Got block number {blk['number']}, which is not in range [{start_blk},{end_blk}]"
@txs.staking
def test_get_staking_transaction_history_v1(s0_validator):
"""
No staking transactions for the 'to' account of `account_test_tx`.
This method may not be implemented, skip if this is the case
"""
reference_response = {
"staking_transactions": [
{
"blockHash": "0xc0438fb59641cf000ddede158cf3707b6b96f2fbf7eaf40386eb91a0dc4305a4",
"blockNumber": "0xb",
"from": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"timestamp": "0x5f11a7a2",
"gas": "0x5121c4",
"gasPrice": "0x3b9aca00",
"hash": "0xf80460f1ad041a0a0e841da717fc5b7959b1a7e9a0ce9a25cd70c0ce40d5ff26",
"nonce": "0x0",
"transactionIndex": "0x0",
"v": "0x27",
"r": "0x2348daabe696c4370379b9102dd85da6d4fed52f0f511ff0448a21c001ee75a7",
"s": "0x1a67f9f40e0de02b50d5d7295f200fea7f950c1b59aa7efa8d225294c4fdbc5e",
"type": "CreateValidator",
"msg": None
},
]
}
reference_create_validator_msg = {
"amount": "0x21e19e0c9bab2400000",
"commissionRate": "0x16345785d8a0000",
"details": "test",
"identity": "test0",
"maxChangeRate": "0xb1a2bc2ec50000",
"maxCommissionRate": "0xc7d713b49da0000",
"maxTotalDelegation": "0x84595161401484a000000",
"minSelfDelegation": "0x21e19e0c9bab2400000",
"name": "test",
"securityContact": "test",
"slotPubKeys": [
"4f41a37a3a8d0695dd6edcc58142c6b7d98e74da5c90e79b587b3b960b6a4f5e048e6d8b8a000d77a478d44cd640270c"
],
"validatorAddress": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"website": "test"
}
reference_response_short = {
"staking_transactions": [
"0x5718a2fda967f051611ccfaf2230dc544c9bdd388f5759a42b2fb0847fc8d759",
]
}
try:
raw_response = base_request("hmy_getStakingTransactionsHistory",
params=[{
"address": s0_validator["validator-addr"],
"pageIndex": 0,
"pageSize": 1000,
"fullTx": False,
"txType": "ALL",
"order": "ASC"
}],
endpoint=endpoints[initial_funding[0]["from-shard"]])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response_short, response)
raw_response = base_request("hmy_getStakingTransactionsHistory",
params=[{
"address": s0_validator["validator-addr"],
"pageIndex": 0,
"pageSize": 1000,
"fullTx": True,
"txType": "ALL",
"order": "ASC"
}],
endpoint=endpoints[initial_funding[0]["from-shard"]])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
for stx in response["staking_transactions"]:
if stx["hash"] == s0_validator["hash"]:
assert stx["type"] == "CreateValidator"
assert_valid_json_structure(reference_create_validator_msg, stx["msg"])
except Exception as e:
pytest.skip(traceback.format_exc())
pytest.skip(f"Exception: {e}")
@txs.staking
def test_get_staking_transaction_history_v2(s0_validator):
"""
No staking transactions for the 'to' account of `account_test_tx`.
"""
reference_response = {
"staking_transactions": [
{
"blockHash": "0xc0438fb59641cf000ddede158cf3707b6b96f2fbf7eaf40386eb91a0dc4305a4",
"blockNumber": 11,
"from": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"timestamp": 1594992546,
"gas": 5317060,
"gasPrice": 1000000000,
"hash": "0xf80460f1ad041a0a0e841da717fc5b7959b1a7e9a0ce9a25cd70c0ce40d5ff26",
"nonce": 0,
"transactionIndex": 0,
"v": "0x27",
"r": "0x2348daabe696c4370379b9102dd85da6d4fed52f0f511ff0448a21c001ee75a7",
"s": "0x1a67f9f40e0de02b50d5d7295f200fea7f950c1b59aa7efa8d225294c4fdbc5e",
"type": "CreateValidator",
"msg": None
},
]
}
reference_create_validator_msg = {
"amount": 10000000000000000000000,
"commissionRate": 100000000000000000,
"details": "test",
"identity": "test0",
"maxChangeRate": 50000000000000000,
"maxCommissionRate": 900000000000000000,
"maxTotalDelegation": 10000000000000000000000000,
"minSelfDelegation": 10000000000000000000000,
"name": "test",
"securityContact": "test",
"slotPubKeys": [
"4f41a37a3a8d0695dd6edcc58142c6b7d98e74da5c90e79b587b3b960b6a4f5e048e6d8b8a000d77a478d44cd640270c"
],
"validatorAddress": "one109r0tns7av5sjew7a7fkekg4fs3pw0h76pp45e",
"website": "test"
}
reference_response_short = {
"staking_transactions": [
"0x5718a2fda967f051611ccfaf2230dc544c9bdd388f5759a42b2fb0847fc8d759",
]
}
raw_response = base_request("hmyv2_getStakingTransactionsHistory",
params=[{
"address": s0_validator["validator-addr"],
"pageIndex": 0,
"pageSize": 1000,
"fullTx": False,
"txType": "ALL",
"order": "ASC"
}],
endpoint=endpoints[initial_funding[0]["from-shard"]])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response_short, response)
raw_response = base_request("hmyv2_getStakingTransactionsHistory",
params=[{
"address": s0_validator["validator-addr"],
"pageIndex": 0,
"pageSize": 1000,
"fullTx": True,
"txType": "ALL",
"order": "ASC"
}],
endpoint=endpoints[initial_funding[0]["from-shard"]])
response = check_and_unpack_rpc_response(raw_response, expect_error=False)
assert_valid_json_structure(reference_response, response)
for stx in response["staking_transactions"]:
if stx["hash"] == s0_validator["hash"]:
assert stx["type"] == "CreateValidator"
assert_valid_json_structure(reference_create_validator_msg, stx["msg"])
| 49.989403 | 721 | 0.655186 | 7,822 | 89,631 | 7.263488 | 0.06878 | 0.022072 | 0.034076 | 0.026331 | 0.836452 | 0.815929 | 0.787169 | 0.768107 | 0.760627 | 0.760627 | 0 | 0.173312 | 0.2465 | 89,631 | 1,792 | 722 | 50.017299 | 0.66793 | 0.045386 | 0 | 0.782205 | 0 | 0.001328 | 0.381249 | 0.208819 | 0 | 0 | 0.107006 | 0.000558 | 0.105578 | 1 | 0.01992 | false | 0 | 0.007304 | 0 | 0.031209 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3b5a1b0306bcd6a3c70e200ce49f437236a314b4 | 50,242 | py | Python | plugins/son-mano-placement/test/test_placement.py | sonata-nfv/son-mano-framework | d58dc5b473429b245d0a03f339fcda7829549f0c | [
"Apache-2.0"
] | 16 | 2016-07-20T15:07:38.000Z | 2021-11-11T16:33:18.000Z | plugins/son-mano-placement/test/test_placement.py | sonata-nfv/son-mano-framework | d58dc5b473429b245d0a03f339fcda7829549f0c | [
"Apache-2.0"
] | 44 | 2016-07-22T10:36:39.000Z | 2019-07-29T11:41:23.000Z | plugins/son-mano-placement/test/test_placement.py | sonata-nfv/son-mano-framework | d58dc5b473429b245d0a03f339fcda7829549f0c | [
"Apache-2.0"
] | 42 | 2016-07-20T14:09:51.000Z | 2019-05-23T14:35:13.000Z | # Copyright (c) 2015 SONATA-NFV, 2017 5GTANGO
# ALL RIGHTS RESERVED.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Neither the name of the SONATA-NFV, 5GTANGO
# nor the names of its contributors may be used to endorse or promote
# products derived from this software without specific prior written
# permission.
#
# This work has been performed in the framework of the SONATA project,
# funded by the European Commission under Grant number 671517 through
# the Horizon 2020 and 5G-PPP programmes. The authors would like to
# acknowledge the contributions of their colleagues of the SONATA
# partner consortium (www.sonata-nfv.eu).
#
# This work has been performed in the framework of the 5GTANGO project,
# funded by the European Commission under Grant number 761493 through
# the Horizon 2020 and 5G-PPP programmes. The authors would like to
# acknowledge the contributions of their colleagues of the 5GTANGO
# partner consortium (www.5gtango.eu).
import unittest
import time
import json
import yaml
import threading
import logging
import uuid
import son_mano_placement.placement_helpers as tools
from unittest import mock
from multiprocessing import Process
from son_mano_placement.placement import PlacementPlugin
from sonmanobase.messaging import ManoBrokerRequestResponseConnection
from collections import namedtuple
logging.basicConfig(level=logging.INFO)
logging.getLogger('amqp-storm').setLevel(logging.INFO)
LOG = logging.getLogger("son-mano-plugins:slm_test")
logging.getLogger("son-mano-base:messaging").setLevel(logging.INFO)
logging.getLogger("son-mano-base:plugin").setLevel(logging.INFO)
LOG.setLevel(logging.INFO)
class testPlacementPluginFunctionality(unittest.TestCase):
"""
Test the different methods of the placement plugin.
"""
pp = None
########################
# SETUP
########################
def setUp(self):
# Create a new placement plugin
self.pp = PlacementPlugin(auto_register=False, start_running=False)
# Some threading events that can be used during the tests
self.wait_for_first_event = threading.Event()
self.wait_for_first_event.clear()
def tearDown(self):
# Clean up
try:
del self.pp
except:
pass
########################
# TESTS
########################
def test_Placement_load_balanced_1_vnf_1_vdu_a(self):
"""
This method tests whether a load_balanced operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_1.yml', 'rb'))
vnfd = yaml.load(open(path_to_artefacts + 'vnfd_1_1.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd
vnf_1['id'] = vnf_1_id
vnfs = []
vnfs.append(vnf_1)
top = yaml.load(open(path_to_artefacts + 'infrastructure_1.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'load balanced'
operator_policy['policy_list'] = []
operator_policy['weights'] = {'operator': 1.0, 'developer': '0.0'}
customer_policy = {}
ingress = [{'nap': '10.100.10.100', 'location': 'bar'}]
egress = [{'nap': '8.8.8.8', 'location': 'bar'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number VNFs doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-6666',
msg="VNF mapped on wrong PoP.")
def test_Placement_load_balanced_1_vnf_1_vdu_b(self):
"""
This method tests whether a load_balanced operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_2.yml', 'rb'))
vnfd = yaml.load(open(path_to_artefacts + 'vnfd_1_2.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd
vnf_1['id'] = vnf_1_id
vnfs = []
vnfs.append(vnf_1)
top = yaml.load(open(path_to_artefacts + 'infrastructure_2.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'load balanced'
operator_policy['policy_list'] = []
operator_policy['weights'] = {'operator': 1.0, 'developer': '0.0'}
customer_policy = {}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '8.8.8.8', 'location': 'foo'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number VNFs doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-4444',
msg="VNF mapped on wrong PoP.")
def test_Placement_load_balanced_1_vnf_2_vdu(self):
"""
This method tests whether a load_balanced operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_3.yml', 'rb'))
vnfd = yaml.load(open(path_to_artefacts + 'vnfd_1_3.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd
vnf_1['id'] = vnf_1_id
vnfs = []
vnfs.append(vnf_1)
top = yaml.load(open(path_to_artefacts + 'infrastructure_3.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'load balanced'
operator_policy['policy_list'] = []
operator_policy['weights'] = {'operator': 1.0, 'developer': '0.0'}
customer_policy = {}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '8.8.8.8', 'location': 'foo'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number images doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-5555',
msg="VNF mapped on wrong PoP.")
def test_Placement_load_balanced_2_vnf_2_vdu(self):
"""
This method tests whether a load_balanced operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_4.yml', 'rb'))
vnfd1 = yaml.load(open(path_to_artefacts + 'vnfd_1_4.yml', 'rb'))
vnfd2 = yaml.load(open(path_to_artefacts + 'vnfd_2_4.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd1
vnf_1['id'] = vnf_1_id
vnf_2_id = str(uuid.uuid4())
vnf_2 = {}
vnf_2['vnfd'] = vnfd2
vnf_2['id'] = vnf_2_id
vnfs = []
vnfs.append(vnf_1)
vnfs.append(vnf_2)
top = yaml.load(open(path_to_artefacts + 'infrastructure_4.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'load balanced'
operator_policy['policy_list'] = []
operator_policy['weights'] = {'operator': 1.0, 'developer': '0.0'}
customer_policy = {}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '8.8.8.8', 'location': 'foo'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number images doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if correct VNF id is used.
self.assertIn(vnf_2_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-5555',
msg="VNF mapped on wrong PoP.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_2_id],
'1111-22222222-33333333-4444',
msg="VNF mapped on wrong PoP.")
def test_Placement_fill_first_2_vnf_2_vdu_a(self):
"""
This method tests whether a fill_first operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_5.yml', 'rb'))
vnfd1 = yaml.load(open(path_to_artefacts + 'vnfd_1_5.yml', 'rb'))
vnfd2 = yaml.load(open(path_to_artefacts + 'vnfd_2_5.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd1
vnf_1['id'] = vnf_1_id
vnf_2_id = str(uuid.uuid4())
vnf_2 = {}
vnf_2['vnfd'] = vnfd2
vnf_2['id'] = vnf_2_id
vnfs = []
vnfs.append(vnf_1)
vnfs.append(vnf_2)
top = yaml.load(open(path_to_artefacts + 'infrastructure_5.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'fill first'
operator_policy['policy_list'] = []
operator_policy['weights'] = {'operator': 1.0, 'developer': '0.0'}
customer_policy = {}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '8.8.8.8', 'location': 'foo'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number images doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if correct VNF id is used.
self.assertIn(vnf_2_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-6666',
msg="VNF mapped on wrong PoP.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_2_id],
'1111-22222222-33333333-6666',
msg="VNF mapped on wrong PoP.")
def test_Placement_fill_first_2_vnf_2_vdu_b(self):
"""
This method tests whether a fill_first operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_6.yml', 'rb'))
vnfd1 = yaml.load(open(path_to_artefacts + 'vnfd_1_6.yml', 'rb'))
vnfd2 = yaml.load(open(path_to_artefacts + 'vnfd_2_6.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd1
vnf_1['id'] = vnf_1_id
vnf_2_id = str(uuid.uuid4())
vnf_2 = {}
vnf_2['vnfd'] = vnfd2
vnf_2['id'] = vnf_2_id
vnfs = []
vnfs.append(vnf_1)
vnfs.append(vnf_2)
top = yaml.load(open(path_to_artefacts + 'infrastructure_6.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'fill first'
operator_policy['policy_list'] = []
operator_policy['weights'] = {'operator': 1.0, 'developer': '0.0'}
customer_policy = {}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '8.8.8.8', 'location': 'foo'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number images doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if correct VNF id is used.
self.assertIn(vnf_2_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-5555',
msg="VNF mapped on wrong PoP.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_2_id],
'1111-22222222-33333333-6666',
msg="VNF mapped on wrong PoP.")
def test_Placement_fill_first_2_vnf_2_vdu_c(self):
"""
This method tests whether a fill_first operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_7.yml', 'rb'))
vnfd1 = yaml.load(open(path_to_artefacts + 'vnfd_1_7.yml', 'rb'))
vnfd2 = yaml.load(open(path_to_artefacts + 'vnfd_2_7.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd1
vnf_1['id'] = vnf_1_id
vnf_2_id = str(uuid.uuid4())
vnf_2 = {}
vnf_2['vnfd'] = vnfd2
vnf_2['id'] = vnf_2_id
vnfs = []
vnfs.append(vnf_1)
vnfs.append(vnf_2)
top = yaml.load(open(path_to_artefacts + 'infrastructure_7.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'fill first'
operator_policy['policy_list'] = []
operator_policy['weights'] = {'operator': 1.0, 'developer': '0.0'}
customer_policy = {}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '8.8.8.8', 'location': 'foo'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number images doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if correct VNF id is used.
self.assertIn(vnf_2_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-6666',
msg="VNF mapped on wrong PoP.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_2_id],
'1111-22222222-33333333-5555',
msg="VNF mapped on wrong PoP.")
def test_Placement_fill_first_2_vnf_2_vdu_d(self):
"""
This method tests whether a fill_first operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_8.yml', 'rb'))
vnfd1 = yaml.load(open(path_to_artefacts + 'vnfd_1_8.yml', 'rb'))
vnfd2 = yaml.load(open(path_to_artefacts + 'vnfd_2_8.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd1
vnf_1['id'] = vnf_1_id
vnf_2_id = str(uuid.uuid4())
vnf_2 = {}
vnf_2['vnfd'] = vnfd2
vnf_2['id'] = vnf_2_id
vnfs = []
vnfs.append(vnf_1)
vnfs.append(vnf_2)
top = yaml.load(open(path_to_artefacts + 'infrastructure_8.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'fill first'
operator_policy['policy_list'] = []
operator_policy['weights'] = {'operator': 1.0, 'developer': '0.0'}
customer_policy = {}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '8.8.8.8', 'location': 'foo'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number images doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if correct VNF id is used.
self.assertIn(vnf_2_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-4444',
msg="VNF mapped on wrong PoP.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_2_id],
'1111-22222222-33333333-4444',
msg="VNF mapped on wrong PoP.")
def test_Placement_fill_first_2_vnf_2_vdu_e(self):
"""
This method tests whether a fill_first operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_9.yml', 'rb'))
vnfd1 = yaml.load(open(path_to_artefacts + 'vnfd_1_9.yml', 'rb'))
vnfd2 = yaml.load(open(path_to_artefacts + 'vnfd_2_9.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd1
vnf_1['id'] = vnf_1_id
vnf_2_id = str(uuid.uuid4())
vnf_2 = {}
vnf_2['vnfd'] = vnfd2
vnf_2['id'] = vnf_2_id
vnfs = []
vnfs.append(vnf_1)
vnfs.append(vnf_2)
top = yaml.load(open(path_to_artefacts + 'infrastructure_9.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'fill first'
operator_policy['policy_list'] = []
operator_policy['weights'] = {'operator': 1.0, 'developer': '0.0'}
customer_policy = {}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '8.8.8.8', 'location': 'foo'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number images doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if correct VNF id is used.
self.assertIn(vnf_2_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-5555',
msg="VNF mapped on wrong PoP.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_2_id],
'1111-22222222-33333333-5555',
msg="VNF mapped on wrong PoP.")
def test_Placement_priority_2_vnf_2_vdu_a(self):
"""
This method tests whether a fill_first operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_10.yml', 'rb'))
vnfd1 = yaml.load(open(path_to_artefacts + 'vnfd_1_10.yml', 'rb'))
vnfd2 = yaml.load(open(path_to_artefacts + 'vnfd_2_10.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd1
vnf_1['id'] = vnf_1_id
vnf_2_id = str(uuid.uuid4())
vnf_2 = {}
vnf_2['vnfd'] = vnfd2
vnf_2['id'] = vnf_2_id
vnfs = []
vnfs.append(vnf_1)
vnfs.append(vnf_2)
top = yaml.load(open(path_to_artefacts + 'infrastructure_10.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'priority'
operator_policy['policy_list'] = ['Athens', 'Ghent', 'Aveiro']
operator_policy['weights'] = {'operator': 1.0, 'developer': '0.0'}
customer_policy = {}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '8.8.8.8', 'location': 'foo'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number images doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if correct VNF id is used.
self.assertIn(vnf_2_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-4444',
msg="VNF mapped on wrong PoP.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_2_id],
'1111-22222222-33333333-4444',
msg="VNF mapped on wrong PoP.")
def test_Placement_priority_2_vnf_2_vdu_b(self):
"""
This method tests whether a fill_first operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_11.yml', 'rb'))
vnfd1 = yaml.load(open(path_to_artefacts + 'vnfd_1_11.yml', 'rb'))
vnfd2 = yaml.load(open(path_to_artefacts + 'vnfd_2_11.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd1
vnf_1['id'] = vnf_1_id
vnf_2_id = str(uuid.uuid4())
vnf_2 = {}
vnf_2['vnfd'] = vnfd2
vnf_2['id'] = vnf_2_id
vnfs = []
vnfs.append(vnf_1)
vnfs.append(vnf_2)
top = yaml.load(open(path_to_artefacts + 'infrastructure_11.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'priority'
operator_policy['policy_list'] = ['Athens', 'Ghent', 'Aveiro']
operator_policy['weights'] = {'operator': 1.0, 'developer': '0.0'}
customer_policy = {}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '8.8.8.8', 'location': 'foo'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number images doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if correct VNF id is used.
self.assertIn(vnf_2_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-4444',
msg="VNF mapped on wrong PoP.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_2_id],
'1111-22222222-33333333-6666',
msg="VNF mapped on wrong PoP.")
def test_Placement_load_balanced_blacklist_2_vnf_2_vdu(self):
"""
This method tests whether a fill_first operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_12.yml', 'rb'))
vnfd1 = yaml.load(open(path_to_artefacts + 'vnfd_1_12.yml', 'rb'))
vnfd2 = yaml.load(open(path_to_artefacts + 'vnfd_2_12.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd1
vnf_1['id'] = vnf_1_id
vnf_2_id = str(uuid.uuid4())
vnf_2 = {}
vnf_2['vnfd'] = vnfd2
vnf_2['id'] = vnf_2_id
vnfs = []
vnfs.append(vnf_1)
vnfs.append(vnf_2)
top = yaml.load(open(path_to_artefacts + 'infrastructure_12.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'load balanced'
operator_policy['policy_list'] = []
operator_policy['weights'] = {'operator': 1.0, 'developer': '0.0'}
blacklist = ['Aveiro']
customer_policy = {'blacklist': blacklist}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '8.8.8.8', 'location': 'foo'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number images doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if correct VNF id is used.
self.assertIn(vnf_2_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-4444',
msg="VNF mapped on wrong PoP.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_2_id],
'1111-22222222-33333333-6666',
msg="VNF mapped on wrong PoP.")
def test_Placement_fill_first_blacklist_2_vnf_2_vdu(self):
"""
This method tests whether a fill_first operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_13.yml', 'rb'))
vnfd1 = yaml.load(open(path_to_artefacts + 'vnfd_1_13.yml', 'rb'))
vnfd2 = yaml.load(open(path_to_artefacts + 'vnfd_2_13.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd1
vnf_1['id'] = vnf_1_id
vnf_2_id = str(uuid.uuid4())
vnf_2 = {}
vnf_2['vnfd'] = vnfd2
vnf_2['id'] = vnf_2_id
vnfs = []
vnfs.append(vnf_1)
vnfs.append(vnf_2)
top = yaml.load(open(path_to_artefacts + 'infrastructure_13.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'fill first'
operator_policy['policy_list'] = []
operator_policy['weights'] = {'operator': 1.0, 'developer': '0.0'}
blacklist = ['Athens']
customer_policy = {'blacklist': blacklist}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '8.8.8.8', 'location': 'foo'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number images doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if correct VNF id is used.
self.assertIn(vnf_2_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-6666',
msg="VNF mapped on wrong PoP.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_2_id],
'1111-22222222-33333333-5555',
msg="VNF mapped on wrong PoP.")
def test_Placement_priority_blacklist_2_vnf_2_vdu(self):
"""
This method tests whether a fill_first operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_14.yml', 'rb'))
vnfd1 = yaml.load(open(path_to_artefacts + 'vnfd_1_14.yml', 'rb'))
vnfd2 = yaml.load(open(path_to_artefacts + 'vnfd_2_14.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd1
vnf_1['id'] = vnf_1_id
vnf_2_id = str(uuid.uuid4())
vnf_2 = {}
vnf_2['vnfd'] = vnfd2
vnf_2['id'] = vnf_2_id
vnfs = []
vnfs.append(vnf_1)
vnfs.append(vnf_2)
top = yaml.load(open(path_to_artefacts + 'infrastructure_14.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'priority'
operator_policy['policy_list'] = ['Ghent', 'Athens', 'Aveiro']
operator_policy['weights'] = {'operator': 1.0, 'developer': '0.0'}
blacklist = ['Athens']
customer_policy = {'blacklist': blacklist}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '8.8.8.8', 'location': 'foo'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number images doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if correct VNF id is used.
self.assertIn(vnf_2_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-6666',
msg="VNF mapped on wrong PoP.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_2_id],
'1111-22222222-33333333-5555',
msg="VNF mapped on wrong PoP.")
def test_Placement_proximity_blacklist_2_vnf_2_vdu(self):
"""
This method tests whether a fill_first operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_15.yml', 'rb'))
vnfd1 = yaml.load(open(path_to_artefacts + 'vnfd_1_15.yml', 'rb'))
vnfd2 = yaml.load(open(path_to_artefacts + 'vnfd_2_15.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd1
vnf_1['id'] = vnf_1_id
vnf_2_id = str(uuid.uuid4())
vnf_2 = {}
vnf_2['vnfd'] = vnfd2
vnf_2['id'] = vnf_2_id
vnfs = []
vnfs.append(vnf_1)
vnfs.append(vnf_2)
top = yaml.load(open(path_to_artefacts + 'infrastructure_15.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'priority'
operator_policy['policy_list'] = ['Ghent', 'Athens', 'Aveiro']
operator_weight = 0.0
developer_weight = 1.0
blacklist = ['Athens']
customer_policy = {'blacklist': blacklist}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '8.8.8.8', 'location': 'foo'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, operator_weight=operator_weight, developer_weight=developer_weight, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number images doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if correct VNF id is used.
self.assertIn(vnf_2_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-5555',
msg="VNF mapped on wrong PoP.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_2_id],
'1111-22222222-33333333-6666',
msg="VNF mapped on wrong PoP.")
def test_Placement_proximity_2_vnf_2_vdu(self):
"""
This method tests whether a fill_first operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_16.yml', 'rb'))
vnfd1 = yaml.load(open(path_to_artefacts + 'vnfd_1_16.yml', 'rb'))
vnfd2 = yaml.load(open(path_to_artefacts + 'vnfd_2_16.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd1
vnf_1['id'] = vnf_1_id
vnf_2_id = str(uuid.uuid4())
vnf_2 = {}
vnf_2['vnfd'] = vnfd2
vnf_2['id'] = vnf_2_id
vnfs = []
vnfs.append(vnf_1)
vnfs.append(vnf_2)
top = yaml.load(open(path_to_artefacts + 'infrastructure_16.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'priority'
operator_policy['policy_list'] = ['Ghent', 'Athens', 'Aveiro']
operator_weight = 0.0
developer_weight = 1.0
blacklist = []
customer_policy = {'blacklist': blacklist}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '10.1.10.1', 'location': 'foo'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, operator_weight=operator_weight, developer_weight=developer_weight, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number images doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if correct VNF id is used.
self.assertIn(vnf_2_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-5555',
msg="VNF mapped on wrong PoP.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_2_id],
'1111-22222222-33333333-5555',
msg="VNF mapped on wrong PoP.")
def test_Placement_affinity_2_vnf_2_vdu_a(self):
"""
This method tests whether a fill_first operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_17.yml', 'rb'))
vnfd1 = yaml.load(open(path_to_artefacts + 'vnfd_1_17.yml', 'rb'))
vnfd2 = yaml.load(open(path_to_artefacts + 'vnfd_2_17.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd1
vnf_1['id'] = vnf_1_id
vnf_2_id = str(uuid.uuid4())
vnf_2 = {}
vnf_2['vnfd'] = vnfd2
vnf_2['id'] = vnf_2_id
vnfs = []
vnfs.append(vnf_1)
vnfs.append(vnf_2)
top = yaml.load(open(path_to_artefacts + 'infrastructure_17.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'priority'
operator_policy['policy_list'] = ['Ghent', 'Athens', 'Aveiro']
operator_weight = 0.0
developer_weight = 1.0
blacklist = []
customer_policy = {'blacklist': blacklist}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '10.1.10.1', 'location': 'foo'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, operator_weight=operator_weight, developer_weight=developer_weight, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number images doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if correct VNF id is used.
self.assertIn(vnf_2_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-5555',
msg="VNF mapped on wrong PoP.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_2_id],
'1111-22222222-33333333-5555',
msg="VNF mapped on wrong PoP.")
def test_Placement_affinity_blacklist_2_vnf_2_vdu_a(self):
"""
This method tests whether a fill_first operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_18.yml', 'rb'))
vnfd1 = yaml.load(open(path_to_artefacts + 'vnfd_1_18.yml', 'rb'))
vnfd2 = yaml.load(open(path_to_artefacts + 'vnfd_2_18.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd1
vnf_1['id'] = vnf_1_id
vnf_2_id = str(uuid.uuid4())
vnf_2 = {}
vnf_2['vnfd'] = vnfd2
vnf_2['id'] = vnf_2_id
vnfs = []
vnfs.append(vnf_1)
vnfs.append(vnf_2)
top = yaml.load(open(path_to_artefacts + 'infrastructure_18.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'priority'
operator_policy['policy_list'] = ['Ghent', 'Athens', 'Aveiro']
operator_weight = 0.0
developer_weight = 1.0
blacklist = ['Aveiro']
customer_policy = {'blacklist': blacklist}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '10.1.10.1', 'location': 'foo'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, operator_weight=operator_weight, developer_weight=developer_weight, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number images doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if correct VNF id is used.
self.assertIn(vnf_2_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-6666',
msg="VNF mapped on wrong PoP.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_2_id],
'1111-22222222-33333333-6666',
msg="VNF mapped on wrong PoP.")
def test_Placement_affinity_proximity_blacklist_2_vnf_2_vdu(self):
"""
This method tests whether a fill_first operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_19.yml', 'rb'))
vnfd1 = yaml.load(open(path_to_artefacts + 'vnfd_1_19.yml', 'rb'))
vnfd2 = yaml.load(open(path_to_artefacts + 'vnfd_2_19.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd1
vnf_1['id'] = vnf_1_id
vnf_2_id = str(uuid.uuid4())
vnf_2 = {}
vnf_2['vnfd'] = vnfd2
vnf_2['id'] = vnf_2_id
vnfs = []
vnfs.append(vnf_1)
vnfs.append(vnf_2)
top = yaml.load(open(path_to_artefacts + 'infrastructure_19.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'priority'
operator_policy['policy_list'] = ['Ghent', 'Athens', 'Aveiro']
operator_weight = 0.0
developer_weight = 1.0
blacklist = ['Aveiro']
customer_policy = {'blacklist': blacklist}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '10.1.10.1', 'location': 'foo'}]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, operator_weight=operator_weight, developer_weight=developer_weight, vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number images doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if correct VNF id is used.
self.assertIn(vnf_2_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-4444',
msg="VNF mapped on wrong PoP.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_2_id],
'1111-22222222-33333333-4444',
msg="VNF mapped on wrong PoP.")
def test_Placement_load_balanced_proximity_2_vnf_2_vdu_a(self):
"""
This method tests whether a fill_first operator constraint
is correctly executed.
"""
path_to_artefacts = '/plugins/son-mano-placement/test/artefacts/'
nsd = yaml.load(open(path_to_artefacts + 'nsd_20.yml', 'rb'))
vnfd1 = yaml.load(open(path_to_artefacts + 'vnfd_1_20.yml', 'rb'))
vnfd2 = yaml.load(open(path_to_artefacts + 'vnfd_2_20.yml', 'rb'))
serv_id = str(uuid.uuid4())
vnf_1_id = str(uuid.uuid4())
vnf_1 = {}
vnf_1['vnfd'] = vnfd1
vnf_1['id'] = vnf_1_id
vnf_2_id = str(uuid.uuid4())
vnf_2 = {}
vnf_2['vnfd'] = vnfd2
vnf_2['id'] = vnf_2_id
vnfs = []
vnfs.append(vnf_1)
vnfs.append(vnf_2)
top = yaml.load(open(path_to_artefacts + 'infrastructure_20.yml', 'rb'))
operator_policy = {}
operator_policy['policy'] = 'load balanced'
operator_policy['policy_list'] = ['Ghent', 'Athens', 'Aveiro']
operator_weight = 0.50
developer_weight = 0.50
blacklist = []
customer_policy = {'blacklist': blacklist}
ingress = [{'nap': '10.100.10.100', 'location': 'foo'}]
egress = [{'nap': '10.1.10.1', 'location': 'foo'}]
op_corr = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, operator_weight=1.0, developer_weight=0.0, vnf_single_pop=True)[2]
de_corr = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, operator_weight=0.0, developer_weight=1.0, vnf_single_pop=True)[2]
mapping = self.pp.placement(serv_id, nsd, vnfs, top, operator_policy, customer_policy, ingress, egress, operator_weight=abs(operator_weight/op_corr), developer_weight=abs(developer_weight/de_corr), vnf_single_pop=True)
# Check if every VNF is mapped.
self.assertEqual(len(mapping[0].keys()),
len(vnfs),
msg="Number images doesn't match number of mappings.")
# Check if correct VNF id is used.
self.assertIn(vnf_1_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if correct VNF id is used.
self.assertIn(vnf_2_id,
mapping[0].keys(),
msg="Function ID in mapping incorrect.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_1_id],
'1111-22222222-33333333-5555',
msg="VNF mapped on wrong PoP.")
# Check if VNF is mapped on PoP with lowest load.
self.assertEqual(mapping[0][vnf_2_id],
'1111-22222222-33333333-4444',
msg="VNF mapped on wrong PoP.")
if __name__ == '__main__':
unittest.main()
| 36.942647 | 226 | 0.567533 | 6,368 | 50,242 | 4.28392 | 0.045069 | 0.023754 | 0.021994 | 0.045161 | 0.933578 | 0.929399 | 0.928116 | 0.927346 | 0.926576 | 0.921298 | 0 | 0.057987 | 0.305959 | 50,242 | 1,359 | 227 | 36.969831 | 0.724348 | 0.135325 | 0 | 0.836735 | 0 | 0 | 0.201822 | 0.050209 | 0 | 0 | 0 | 0 | 0.112845 | 1 | 0.026411 | false | 0.0012 | 0.015606 | 0 | 0.044418 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3b9145cdae0d8c4da066bf32ee468f96086eeeb8 | 102 | py | Python | sakurajima/db/backends/base/__init__.py | jefersonmontalvao/Sakurajima-BOT | 0e34739f884c601edab4cd9720dba87395ab638a | [
"MIT"
] | null | null | null | sakurajima/db/backends/base/__init__.py | jefersonmontalvao/Sakurajima-BOT | 0e34739f884c601edab4cd9720dba87395ab638a | [
"MIT"
] | null | null | null | sakurajima/db/backends/base/__init__.py | jefersonmontalvao/Sakurajima-BOT | 0e34739f884c601edab4cd9720dba87395ab638a | [
"MIT"
] | null | null | null | import db.backends.base.operations
import db.backends.base.schema
__all__ = ['operations', 'schema']
| 20.4 | 34 | 0.77451 | 13 | 102 | 5.769231 | 0.538462 | 0.213333 | 0.426667 | 0.533333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088235 | 102 | 4 | 35 | 25.5 | 0.806452 | 0 | 0 | 0 | 0 | 0 | 0.156863 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
8ecec9de8ba42dae7cf13b0e5f2c0820c26ac099 | 22,068 | py | Python | loader_representations.py | PRamoneda/ICASSP22 | a7f798d09290734fa94761558e003f35567cdee8 | [
"MIT"
] | 2 | 2022-03-26T10:12:57.000Z | 2022-03-26T23:55:52.000Z | loader_representations.py | PRamoneda/ICASSP22 | a7f798d09290734fa94761558e003f35567cdee8 | [
"MIT"
] | null | null | null | loader_representations.py | PRamoneda/ICASSP22 | a7f798d09290734fa94761558e003f35567cdee8 | [
"MIT"
] | null | null | null | """
File name: approach_deepgru.py
Author: Pedro Ramoneda
Python Version: 3.7
"""
import csv
import os
import sys
import numpy as np
from utils import load_xmls, load_json, save_json
def get_path(alias):
if alias == "mikro1":
path = "mikrokosmos1"
if alias == "mikro2":
path = "pianoplayer"
if alias == "nak":
path = "nakamura"
return path
def rep_raw(alias):
path_alias = get_path(alias)
rep = {}
for grade, path, xml in load_xmls():
rep[path] = {
'grade': grade,
'right_velocity': [],
'left_velocity': [],
'right_fingers': [],
'left_fingers': []
}
r_h_cost = '/'.join(["Fingers", path_alias, os.path.basename(xml[:-4]) + '_rh.txt'])
l_h_cost = '/'.join(["Fingers", path_alias, os.path.basename(xml[:-4]) + '_lh.txt'])
for path_txt, hand in zip([r_h_cost, l_h_cost], ["right_", "left_"]):
with open(path_txt) as tsv_file:
read_tsv = csv.reader(tsv_file, delimiter="\t")
for l in read_tsv:
rep[path][hand + 'velocity'] = rep[path][hand + 'velocity'] + [float(l[8])]
rep[path][hand + 'fingers'] = rep[path][hand + 'fingers'] + [abs(int(l[7]))]
save_json(rep, os.path.join('representations', path_alias, 'rep_raw.json'))
def merge_chord_onsets(time_series):
new_time_series = [list(a) for a in time_series]
for ii in range(len(time_series)):
if ii + 1 < len(time_series) and time_series[ii][0] + 0.05 == time_series[ii + 1][0]:
if ii + 2 < len(time_series) and time_series[ii][0] + 0.1 == time_series[ii + 2][0]:
if ii + 3 < len(time_series) and time_series[ii][0] + 0.15 == time_series[ii + 3][0]:
if ii + 4 < len(time_series) and time_series[ii][0] + 0.2 == time_series[ii + 4][0]:
new_time_series[ii][0] = time_series[ii + 4][0]
else:
new_time_series[ii][0] = time_series[ii + 3][0]
else:
new_time_series[ii][0] = time_series[ii + 2][0]
else:
new_time_series[ii][0] = time_series[ii + 1][0]
else:
new_time_series[ii][0] = time_series[ii][0]
return [tuple(a) for a in new_time_series]
def finger2index(f):
if f > 0:
index = int(f) + 4
elif f < 0:
index = int(f) - 5
else: # == 0
index = -1000
return index
def velocity_piece(path, alias, xml):
path_alias = get_path(alias)
print(path)
r_h_cost = '/'.join(["Fingers", path_alias, os.path.basename(xml[:-4]) + '_rh.txt'])
l_h_cost = '/'.join(["Fingers", path_alias, os.path.basename(xml[:-4]) + '_lh.txt'])
intermediate_rep = []
for path_txt, hand in zip([r_h_cost, l_h_cost], ["right_", "left_"]):
time_series = []
with open(path_txt) as tsv_file:
read_tsv = csv.reader(tsv_file, delimiter="\t")
for l in read_tsv:
if int(l[7]) != 0:
time_series.append((round(float(l[1]), 2), int(l[7]), abs(float(l[8]))))
time_series = time_series[:-9]
intermediate_rep.extend(time_series)
# order by onset and create matrix
matrix = []
intermediate_rep = [on for on in sorted(intermediate_rep, key=(lambda a: a[0]))]
idx = 0
onsets = []
while idx < len(intermediate_rep):
onsets.append(intermediate_rep[idx][0])
t = [0] * 10
index = finger2index(intermediate_rep[idx][1])
t[index] = intermediate_rep[idx][2]
j = idx + 1
while j < len(intermediate_rep) and intermediate_rep[idx][0] == intermediate_rep[j][0]:
index = finger2index(intermediate_rep[j][1])
t[index] = intermediate_rep[j][2]
j += 1
idx = j
# print(t)
matrix.append(t)
return matrix, onsets
def rep_velocity(alias):
rep = {}
for grade, path, xml in load_xmls():
matrix, _ = velocity_piece(path, alias, xml)
rep[path] = {
'grade': grade,
'matrix': matrix
}
save_json(rep, os.path.join('representations', get_path(alias), 'rep_velocity.json'))
def prob_piece(path, alias, xml):
path_alias = get_path(alias)
print(path)
PIG_cost = '/'.join(["Fingers", path_alias, os.path.basename(xml[:-4]) + '.txt'])
time_series = []
with open(PIG_cost) as tsv_file:
read_tsv = csv.reader(tsv_file, delimiter="\t")
for l in list(read_tsv)[1:]:
if int(l[7]) != 0:
time_series.append((round(float(l[1]), 2), int(l[7]), abs(abs(float(l[8])))))
time_series = time_series[:-3]
# order by onset and create matrix
matrix = []
intermediate_rep = [on for on in sorted(time_series, key=(lambda a: a[0]))]
onsets = []
idx = 0
while idx < len(intermediate_rep):
onsets.append(intermediate_rep[idx][0])
t = [0] * 10
index = finger2index(intermediate_rep[idx][1])
t[index] = intermediate_rep[idx][2]
j = idx + 1
while j < len(intermediate_rep) and intermediate_rep[idx][0] == intermediate_rep[j][0]:
index = finger2index(intermediate_rep[j][1])
t[index] = intermediate_rep[j][2]
j += 1
idx = j
# print(t)
matrix.append(t)
return matrix, onsets
def rep_prob(alias):
rep = {}
for grade, path, xml in load_xmls():
matrix, _ = prob_piece(path, alias, xml)
rep[path] = {
'grade': grade,
'matrix': matrix
}
save_json(rep, os.path.join('representations', get_path(alias), 'rep_nakamura.json'))
def rep_d_nakamura(alias):
path_alias = get_path(alias)
rep = {}
for grade, path, xml in load_xmls():
print(path, grade)
PIG_cost = '/'.join(["Fingers", path_alias, os.path.basename(xml[:-4]) + '.txt'])
time_series = []
with open(PIG_cost) as tsv_file:
read_tsv = csv.reader(tsv_file, delimiter="\t")
for l in list(read_tsv)[1:]:
if int(l[7]) != 0:
time_series.append((round(float(l[1]), 2), int(l[7]), abs(abs(float(l[8]))), round(float(l[2]), 2)))
time_series = time_series[:-3]
# order by onset and create matrix
matrix = []
intermediate_rep = [on for on in sorted(time_series, key=(lambda a: a[0]))]
idx = 0
while idx < len(intermediate_rep):
t = [0] * 10
index = finger2index(intermediate_rep[idx][1])
t[index] = intermediate_rep[idx][2] / (intermediate_rep[idx][3] - intermediate_rep[idx][0])
j = idx + 1
while j < len(intermediate_rep) and intermediate_rep[idx][0] == intermediate_rep[j][0]:
index = finger2index(intermediate_rep[j][1])
t[index] = intermediate_rep[j][2] / (intermediate_rep[j][3] - (intermediate_rep[j][0]))
j += 1
idx = j
# print(t)
matrix.append(t)
rep[path] = {
'grade': grade,
'matrix': matrix
}
save_json(rep, os.path.join('representations', path_alias, 'rep_d_nakamura.json'))
def finger_piece(path, alias, xml):
path_alias = get_path(alias)
print(path)
r_h_cost = '/'.join(["Fingers", path_alias, os.path.basename(xml[:-4]) + '_rh.txt'])
l_h_cost = '/'.join(["Fingers", path_alias, os.path.basename(xml[:-4]) + '_lh.txt'])
intermediate_rep = []
for path_txt, hand in zip([r_h_cost, l_h_cost], ["right_", "left_"]):
time_series = []
with open(path_txt) as tsv_file:
read_tsv = csv.reader(tsv_file, delimiter="\t")
for l in read_tsv:
if int(l[7]) != 0:
time_series.append((round(float(l[1]), 2), int(l[7]), abs(float(l[8]))))
time_series = time_series[:-1]
intermediate_rep.extend(time_series)
# order by onset and create matrix
matrix = []
intermediate_rep = [on for on in sorted(intermediate_rep, key=(lambda a: a[0]))]
idx = 0
onsets = []
while idx < len(intermediate_rep):
onsets.append(intermediate_rep[idx][0])
t = [0] * 10
index = finger2index(intermediate_rep[idx][1])
t[index] = 1.0
j = idx + 1
while j < len(intermediate_rep) and intermediate_rep[idx][0] == intermediate_rep[j][0]:
index = finger2index(intermediate_rep[j][1])
t[index] = 1.0
j += 1
idx = j
# print(t)
matrix.append(t)
return matrix, onsets
def rep_finger(alias):
rep = {}
for grade, path, xml in load_xmls():
matrix, _ = finger_piece(path, alias, xml)
rep[path] = {
'grade': grade,
'matrix': matrix
}
save_json(rep, os.path.join('representations', get_path(alias), 'rep_finger.json'))
def finger_nakamura_piece(path, alias, xml):
path_alias = get_path(alias)
print(path)
PIG_cost = '/'.join(["Fingers", path_alias, os.path.basename(xml[:-4]) + '.txt'])
time_series = []
with open(PIG_cost) as tsv_file:
read_tsv = csv.reader(tsv_file, delimiter="\t")
for l in list(read_tsv)[1:]:
if int(l[7]) != 0:
time_series.append((round(float(l[1]), 2), int(l[7]), abs(abs(float(l[8])))))
time_series = time_series[:-3]
# order by onset and create matrix
matrix = []
intermediate_rep = [on for on in sorted(time_series, key=(lambda a: a[0]))]
idx = 0
onsets = []
while idx < len(intermediate_rep):
onsets.append(intermediate_rep[idx][0])
t = [0] * 10
index = finger2index(intermediate_rep[idx][1])
t[index] = 1.0
j = idx + 1
while j < len(intermediate_rep) and intermediate_rep[idx][0] == intermediate_rep[j][0]:
index = finger2index(intermediate_rep[j][1])
t[index] = 1.0
j += 1
idx = j
# print(t)
matrix.append(t)
return matrix, onsets
def rep_finger_nakamura(alias):
rep = {}
for grade, path, xml in load_xmls():
matrix, _ = finger_nakamura_piece(path, alias, xml)
rep[path] = {
'grade': grade,
'matrix': matrix
}
save_json(rep, os.path.join('representations', get_path(alias), 'rep_finger_nakamura.json'))
def notes_piece(path, alias, xml):
path_alias = get_path(alias)
print(path)
r_h_cost = '/'.join(["Fingers", path_alias, os.path.basename(xml[:-4]) + '_rh.txt'])
l_h_cost = '/'.join(["Fingers", path_alias, os.path.basename(xml[:-4]) + '_lh.txt'])
intermediate_rep = []
for path_txt, hand in zip([r_h_cost, l_h_cost], ["right_", "left_"]):
time_series = []
with open(path_txt) as tsv_file:
read_tsv = csv.reader(tsv_file, delimiter="\t")
for l in read_tsv:
if int(l[7]) != 0:
# (onset, note)
time_series.append((round(float(l[1]), 2), int(l[3]) - 21))
intermediate_rep.extend(time_series)
# order by onset and create matrix
matrix = []
intermediate_rep = [on for on in sorted(intermediate_rep, key=(lambda a: a[0]))]
idx = 0
onsets = []
while idx < len(intermediate_rep):
onsets.append(intermediate_rep[idx][0])
t = [0.0] * 88
index = intermediate_rep[idx][1]
t[index] = 1.0
j = idx + 1
while j < len(intermediate_rep) and intermediate_rep[idx][0] == intermediate_rep[j][0]:
index = intermediate_rep[j][1]
t[index] = 1.0
j += 1
idx = j
# print(t)
matrix.append(t)
return matrix, onsets
def rep_notes(alias):
rep = {}
for grade, path, xml in load_xmls():
matrix, _ = notes_piece(path, alias, xml)
rep[path] = {
'grade': grade,
'matrix': matrix
}
save_json(rep, os.path.join('representations', get_path(alias), 'rep_note.json'))
def visualize_note_representation(alias, score='mikrokosmos/musicxml/69.xml'):
data = load_json(os.path.join('representations', get_path(alias), 'rep_note.json'))
matrix = data[score]['matrix']
for row in np.array(matrix).transpose():
for c in row:
print(c, end="|")
print()
def visualize_finger_representation(alias, score='mikrokosmos/musicxml/69.xml'):
data = load_json(os.path.join('representations', get_path(alias), 'rep_finger.json'))
matrix = data[score]['matrix']
for row in np.array(matrix).transpose():
for c in row:
print(c, end="|")
print()
def visualize_finger_representation_nakamura(alias="nak", score='mikrokosmos/musicxml/69.xml'):
data = load_json(os.path.join('representations', get_path(alias), 'rep_finger_nakamura.json'))
matrix = data[score]['matrix']
for row in np.array(matrix).transpose():
for c in row:
print(c, end="|")
print()
def visualize_velocity_representation(alias, score='mikrokosmos/musicxml/69.xml'):
data = load_json(os.path.join('representations', get_path(alias), 'rep_velocity.json'))
matrix = data[score]['matrix']
for row in np.array(matrix).transpose():
for c in row:
print(c, end="|")
print()
def visualize_prob_representation(alias="nak", score='mikrokosmos/musicxml/5.xml'):
data = load_json(os.path.join('representations', get_path(alias), 'rep_nakamura.json'))
matrix = data[score]['matrix']
for row in np.array(matrix).transpose():
for c in row:
print('%02.1f' % c, end="|")
print()
def visualize_d_nakamura(alias="nak", score='mikrokosmos/musicxml/69.xml'):
data = load_json(os.path.join('representations', get_path(alias), 'rep_d_nakamura.json'))
matrix = data[score]['matrix']
for row in np.array(matrix).transpose():
for c in row:
print(c, end="|")
print()
def get_distance_type(last_semitone, current_semitone):
last_black = (last_semitone % 12) in [1, 3, 6, 8, 10]
current_black = (current_semitone % 12) in [1, 3, 6, 8, 10]
if not last_black and not current_black:
distance_type = 1
elif last_black and not current_black:
distance_type = 2
elif not last_black and current_black:
distance_type = 3
else: # bb
distance_type = 4
return distance_type
def rep_distances(alias):
path_alias = get_path(alias)
rep = {}
for grade, path, r_h, l_h in load_xmls():
print(path, grade)
r_h_cost = '/'.join(["Fingers", path_alias, r_h[:-11] + '_rh.txt'])
l_h_cost = '/'.join(["Fingers", path_alias, l_h[:-11] + '_lh.txt'])
intermediate_rep = []
for path_txt, hand in zip([r_h_cost, l_h_cost], ["right_", "left_"]):
time_series = []
with open(path_txt) as tsv_file:
read_tsv = csv.reader(tsv_file, delimiter="\t")
for l in read_tsv:
if int(l[7]) != 0:
time_series.append((round(float(l[1]), 2), int(l[7]), abs(float(l[8])), abs(float(l[3]))))
if alias == 'version_1.0':
time_series = merge_chord_onsets(time_series[:-10])
else:
time_series = time_series[:-10]
intermediate_rep.extend(time_series)
# order by onset and create matrix
matrix = []
intermediate_rep = [on for on in sorted(intermediate_rep, key=(lambda a: a[0]))]
# initial semitone: at the beginning the distance is 0
last_semitone_rh = next(x[3] for x in intermediate_rep if x[1] > 0)
last_semitone_lh = next(x[3] for x in intermediate_rep if x[1] < 0)
idx = 0
while idx < len(intermediate_rep):
d, dt, t = [0] * 10, [0] * 10, [0] * 10
index = finger2index(intermediate_rep[idx][1])
is_r_h = index >= 5
last_semitone = last_semitone_rh if is_r_h else last_semitone_lh
t[index] = intermediate_rep[idx][2]
d[index] = last_semitone - intermediate_rep[idx][3]
dt[index] = get_distance_type(last_semitone, intermediate_rep[idx][3])
if is_r_h:
last_semitone_rh = last_semitone
else:
last_semitone_lh = last_semitone
j = idx + 1
while j < len(intermediate_rep) and intermediate_rep[idx][0] == intermediate_rep[j][0]:
index = finger2index(intermediate_rep[j][1])
is_r_h = index >= 5
last_semitone = last_semitone_rh if is_r_h else last_semitone_lh
t[index] = intermediate_rep[j][2]
d[index] = last_semitone - intermediate_rep[idx][3]
dt[index] = get_distance_type(last_semitone, intermediate_rep[idx][3])
if is_r_h:
last_semitone_rh = last_semitone
else:
last_semitone_lh = last_semitone
j += 1
idx = j
# print(t)
# matrix.append([t, d , dt])
matrix.append(t + d + dt)
rep[path] = {
'grade': grade,
'matrix': matrix
}
save_json(rep, os.path.join('representations', path_alias, 'rep_distance.json'))
def rep_fing_vel_time(alias):
get_path(alias)
def rep_distances_time(alias):
get_path(alias)
def rep_merged_time(alias):
get_path(alias)
def load_rep(klass):
if klass == "rep_velocity":
path_alias = get_path("mikro2")
path = os.path.join('representations', path_alias, 'rep_velocity.json')
data = load_json(path)
ans = ([np.array(x['matrix']) for k, x in data.items()], np.array([x['grade'] for k, x in data.items()]))
elif klass == "rep_finger":
path_alias = get_path("mikro2")
path = os.path.join('representations', path_alias, 'rep_finger.json')
data = load_json(path)
ans = ([np.array(x['matrix']) for k, x in data.items()], np.array([x['grade'] for k, x in data.items()]))
elif klass == "rep_finger_nakamura":
path_alias = get_path("nak")
path = os.path.join('representations', path_alias, 'rep_finger_nakamura.json')
data = load_json(path)
ans = ([np.array(x['matrix']) for k, x in data.items()], np.array([x['grade'] for k, x in data.items()]))
elif klass == "rep_prob":
path_alias = get_path("nak")
path = os.path.join('representations', path_alias, 'rep_nakamura.json')
data = load_json(path)
ans = ([np.array(x['matrix']) for k, x in data.items()], np.array([x['grade'] for k, x in data.items()]))
elif klass == "rep_d_nakamura":
path_alias = get_path("nak")
path = os.path.join('representations', path_alias, 'rep_d_nakamura.json')
data = load_json(path)
ans = ([np.array(x['matrix']) for k, x in data.items()], np.array([x['grade'] for k, x in data.items()]))
elif klass == "rep_note":
path_alias = get_path("mikro2")
path = os.path.join('representations', path_alias, 'rep_note.json')
data = load_json(path)
ans = ([np.array(x['matrix']) for k, x in data.items()], np.array([x['grade'] for k, x in data.items()]))
elif klass == "rep_distance":
path_alias = get_path("mikro2")
path = os.path.join('representations', path_alias, 'rep_distance.json')
data = load_json(path)
ans = ([np.array(x['matrix']) for k, x in data.items()], np.array([x['grade'] for k, x in data.items()]))
return ans
def load_rep_info(klass):
if klass == "rep_velocity":
path_alias = get_path("mikro2")
path = os.path.join('representations', path_alias, 'rep_velocity.json')
data = load_json(path)
ans = np.array([k for k, x in data.items()])
elif klass == "rep_finger":
path_alias = get_path("mikro2")
path = os.path.join('representations', path_alias, 'rep_finger.json')
data = load_json(path)
ans = np.array([k for k, x in data.items()])
elif klass == "rep_finger_nakamura":
path_alias = get_path("nak")
path = os.path.join('representations', path_alias, 'rep_finger_nakamura.json')
data = load_json(path)
ans = np.array([k for k, x in data.items()])
elif klass == "rep_prob":
path_alias = get_path("nak")
path = os.path.join('representations', path_alias, 'rep_nakamura.json')
data = load_json(path)
ans = np.array([k for k, x in data.items()])
elif klass == "rep_d_nakamura":
path_alias = get_path("nak")
path = os.path.join('representations', path_alias, 'rep_d_nakamura.json')
data = load_json(path)
ans = np.array([k for k, x in data.items()])
elif klass == "rep_note":
path_alias = get_path("mikro2")
path = os.path.join('representations', path_alias, 'rep_note.json')
data = load_json(path)
ans = np.array([k for k, x in data.items()])
elif klass == "rep_distance":
path_alias = get_path("mikro2")
path = os.path.join('representations', path_alias, 'rep_distance.json')
data = load_json(path)
ans = np.array([k for k, x in data.items()])
return ans
if __name__ == '__main__':
# rep_raw("version_1.0")
rep_velocity("mikro2")
# rep_distances("version_1.0")
# load_rep("version_1.0", rep_velocity)
# load_rep("version_1.0", rep_velocity)
# visualize_note_representation("mikro1")
# visualize_note_representation("mikro1")
# rep_finger_nakamura("nak")
# rep_prob("nak")
rep_notes("mikro2")
rep_finger("mikro2")
# visualize_note_representation("mikro2")
# rep_d_nakamura("nak")
# visualize_prob_representation("nak")
# visualize_finger_representation_nakamura()
| 35.708738 | 120 | 0.575403 | 3,033 | 22,068 | 3.985823 | 0.055061 | 0.063281 | 0.030772 | 0.057904 | 0.880966 | 0.856729 | 0.849367 | 0.837952 | 0.816941 | 0.801472 | 0 | 0.020401 | 0.275875 | 22,068 | 617 | 121 | 35.766613 | 0.736108 | 0.039106 | 0 | 0.738046 | 0 | 0 | 0.09142 | 0.012148 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058212 | false | 0 | 0.010395 | 0 | 0.091476 | 0.039501 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d966248c16f1336f9763236d0958e0aee054dc55 | 210 | py | Python | tests/db_sync_streaming_test.py | lmmx/impscan | fd809f9f32302e3b67bc921164f34fdd837cf92b | [
"MIT"
] | null | null | null | tests/db_sync_streaming_test.py | lmmx/impscan | fd809f9f32302e3b67bc921164f34fdd837cf92b | [
"MIT"
] | 15 | 2021-06-24T15:30:57.000Z | 2021-07-30T14:04:38.000Z | tests/db_sync_streaming_test.py | lmmx/impscan | fd809f9f32302e3b67bc921164f34fdd837cf92b | [
"MIT"
] | null | null | null | from pytest import fixture, mark, raises
from impscan.db.generate_db_sync_streaming import populate_conda_package_db
def test_db_population():
# populate_conda_package_db()
pass # lol don't do this
| 23.333333 | 75 | 0.790476 | 32 | 210 | 4.84375 | 0.71875 | 0.167742 | 0.258065 | 0.283871 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152381 | 210 | 8 | 76 | 26.25 | 0.870787 | 0.214286 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0.25 | 0.5 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 9 |
7944cb0140d50899a466521d50acbb389adeba1d | 12,995 | py | Python | components/cuda.py | RobInLabUJI/ROSLab | 3a5047a204989dea108cb163fd1ca7516ec2f5c9 | [
"MIT"
] | 10 | 2019-09-18T18:51:06.000Z | 2022-01-25T21:46:05.000Z | components/cuda.py | RobInLabUJI/ROSLab | 3a5047a204989dea108cb163fd1ca7516ec2f5c9 | [
"MIT"
] | 2 | 2019-09-11T13:02:35.000Z | 2019-10-11T12:44:13.000Z | components/cuda.py | RobInLabUJI/ROSLab | 3a5047a204989dea108cb163fd1ca7516ec2f5c9 | [
"MIT"
] | 2 | 2019-10-31T06:29:05.000Z | 2020-01-08T03:18:53.000Z | import os, sys
versions = ['8.0-runtime', '8.0-devel', '9.0-runtime', '9.0-devel',
'9.2-runtime', '9.2-devel', '10.0-runtime', '10.0-devel']
DOCKER_CUDA_HEADER = """
###################################### CUDA ####################################
"""
DOCKER_RUNTIME_CONTENTS = {}
DOCKER_RUNTIME_CONTENTS['18.04'] = {}
DOCKER_RUNTIME_CONTENTS['16.04'] = {}
DOCKER_RUNTIME_CONTENTS['18.04']['10.0'] = """
RUN apt-get update && apt-get install -y --no-install-recommends gnupg2 curl ca-certificates && \\
curl -fsSL https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | apt-key add - && \\
echo "deb https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 /" > /etc/apt/sources.list.d/cuda.list && \\
echo "deb https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 /" > /etc/apt/sources.list.d/nvidia-ml.list && \\
apt-get purge --autoremove -y curl && \\
rm -rf /var/lib/apt/lists/*
ENV CUDA_VERSION 10.0.130
ENV CUDA_PKG_VERSION 10-0=$CUDA_VERSION-1
# For libraries in the cuda-compat-* package: https://docs.nvidia.com/cuda/eula/index.html#attachment-a
RUN apt-get update && apt-get install -y --no-install-recommends \\
cuda-cudart-$CUDA_PKG_VERSION \\
cuda-compat-10-0=410.48-1 && \\
ln -s cuda-10.0 /usr/local/cuda && \\
rm -rf /var/lib/apt/lists/*
ENV PATH /usr/local/cuda/bin:${PATH}
# nvidia-container-runtime
ENV NVIDIA_VISIBLE_DEVICES all
ENV NVIDIA_DRIVER_CAPABILITIES compute,utility
ENV NVIDIA_REQUIRE_CUDA "cuda>=10.0 brand=tesla,driver>=384,driver<385"
ENV NCCL_VERSION 2.4.2
RUN apt-get update && apt-get install -y --no-install-recommends \\
cuda-libraries-$CUDA_PKG_VERSION \\
cuda-nvtx-$CUDA_PKG_VERSION \\
libnccl2=$NCCL_VERSION-1+cuda10.0 && \\
apt-mark hold libnccl2 && \\
rm -rf /var/lib/apt/lists/*
"""
DOCKER_RUNTIME_CONTENTS['18.04']['9.2'] = """
# CUDA 9.2 is not officially supported on ubuntu 18.04 yet, we use the ubuntu 17.10 repository for CUDA instead.
RUN apt-get update && apt-get install -y --no-install-recommends gnupg2 curl ca-certificates && \\
curl -fsSL https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1710/x86_64/7fa2af80.pub | apt-key add - && \\
echo "deb https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1710/x86_64 /" > /etc/apt/sources.list.d/cuda.list && \\
echo "deb https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 /" > /etc/apt/sources.list.d/nvidia-ml.list && \\
apt-get purge --autoremove -y curl && \\
rm -rf /var/lib/apt/lists/*
ENV CUDA_VERSION 9.2.148
ENV CUDA_PKG_VERSION 9-2=$CUDA_VERSION-1
RUN apt-get update && apt-get install -y --no-install-recommends \\
cuda-cudart-$CUDA_PKG_VERSION && \\
ln -s cuda-9.2 /usr/local/cuda && \\
rm -rf /var/lib/apt/lists/*
# nvidia-docker 1.0
LABEL com.nvidia.volumes.needed="nvidia_driver"
LABEL com.nvidia.cuda.version="${CUDA_VERSION}"
RUN echo "/usr/local/nvidia/lib" >> /etc/ld.so.conf.d/nvidia.conf && \\
echo "/usr/local/nvidia/lib64" >> /etc/ld.so.conf.d/nvidia.conf
ENV PATH /usr/local/nvidia/bin:/usr/local/cuda/bin:${PATH}
ENV LD_LIBRARY_PATH /usr/local/nvidia/lib:/usr/local/nvidia/lib64
# nvidia-container-runtime
ENV NVIDIA_VISIBLE_DEVICES all
ENV NVIDIA_DRIVER_CAPABILITIES compute,utility
ENV NVIDIA_REQUIRE_CUDA "cuda>=9.2"
ENV NCCL_VERSION 2.3.7
RUN apt-get update && apt-get install -y --no-install-recommends \\
cuda-libraries-$CUDA_PKG_VERSION \\
cuda-nvtx-$CUDA_PKG_VERSION \\
libnccl2=$NCCL_VERSION-1+cuda9.2 && \\
apt-mark hold libnccl2 && \\
rm -rf /var/lib/apt/lists/*
"""
DOCKER_RUNTIME_CONTENTS['16.04']['10.0'] = """
RUN apt-get update && apt-get install -y --no-install-recommends ca-certificates apt-transport-https gnupg-curl && \\
rm -rf /var/lib/apt/lists/* && \\
NVIDIA_GPGKEY_SUM=d1be581509378368edeec8c1eb2958702feedf3bc3d17011adbf24efacce4ab5 && \\
NVIDIA_GPGKEY_FPR=ae09fe4bbd223a84b2ccfce3f60f4b3d7fa2af80 && \\
apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub && \\
apt-key adv --export --no-emit-version -a $NVIDIA_GPGKEY_FPR | tail -n +5 > cudasign.pub && \\
echo "$NVIDIA_GPGKEY_SUM cudasign.pub" | sha256sum -c --strict - && rm cudasign.pub && \\
echo "deb https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 /" > /etc/apt/sources.list.d/cuda.list && \\
echo "deb https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 /" > /etc/apt/sources.list.d/nvidia-ml.list
ENV CUDA_VERSION 10.0.130
ENV CUDA_PKG_VERSION 10-0=$CUDA_VERSION-1
# For libraries in the cuda-compat-* package: https://docs.nvidia.com/cuda/eula/index.html#attachment-a
RUN apt-get update && apt-get install -y --no-install-recommends \\
cuda-cudart-$CUDA_PKG_VERSION \\
cuda-compat-10-0=410.48-1 && \\
ln -s cuda-10.0 /usr/local/cuda && \\
rm -rf /var/lib/apt/lists/*
ENV PATH /usr/local/cuda/bin:${PATH}
# nvidia-container-runtime
ENV NVIDIA_VISIBLE_DEVICES all
ENV NVIDIA_DRIVER_CAPABILITIES compute,utility
ENV NVIDIA_REQUIRE_CUDA "cuda>=10.0 brand=tesla,driver>=384,driver<385"
ENV NCCL_VERSION 2.4.2
RUN apt-get update && apt-get install -y --no-install-recommends \\
cuda-libraries-$CUDA_PKG_VERSION \\
cuda-nvtx-$CUDA_PKG_VERSION \\
libnccl2=$NCCL_VERSION-1+cuda10.0 && \\
apt-mark hold libnccl2 && \\
rm -rf /var/lib/apt/lists/*
"""
DOCKER_RUNTIME_CONTENTS['16.04']['9.0'] = """
RUN apt-get update && apt-get install -y --no-install-recommends ca-certificates apt-transport-https gnupg-curl && \\
rm -rf /var/lib/apt/lists/* && \\
NVIDIA_GPGKEY_SUM=d1be581509378368edeec8c1eb2958702feedf3bc3d17011adbf24efacce4ab5 && \\
NVIDIA_GPGKEY_FPR=ae09fe4bbd223a84b2ccfce3f60f4b3d7fa2af80 && \\
apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub && \\
apt-key adv --export --no-emit-version -a $NVIDIA_GPGKEY_FPR | tail -n +5 > cudasign.pub && \\
echo "$NVIDIA_GPGKEY_SUM cudasign.pub" | sha256sum -c --strict - && rm cudasign.pub && \\
echo "deb https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 /" > /etc/apt/sources.list.d/cuda.list && \\
echo "deb https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 /" > /etc/apt/sources.list.d/nvidia-ml.list
ENV CUDA_VERSION 9.0.176
ENV CUDA_PKG_VERSION 9-0=$CUDA_VERSION-1
RUN apt-get update && apt-get install -y --no-install-recommends \\
cuda-cudart-$CUDA_PKG_VERSION && \\
ln -s cuda-9.0 /usr/local/cuda && \\
rm -rf /var/lib/apt/lists/*
# nvidia-docker 1.0
LABEL com.nvidia.volumes.needed="nvidia_driver"
LABEL com.nvidia.cuda.version="${CUDA_VERSION}"
RUN echo "/usr/local/nvidia/lib" >> /etc/ld.so.conf.d/nvidia.conf && \\
echo "/usr/local/nvidia/lib64" >> /etc/ld.so.conf.d/nvidia.conf
ENV PATH /usr/local/nvidia/bin:/usr/local/cuda/bin:${PATH}
ENV LD_LIBRARY_PATH /usr/local/nvidia/lib:/usr/local/nvidia/lib64
# nvidia-container-runtime
ENV NVIDIA_VISIBLE_DEVICES all
ENV NVIDIA_DRIVER_CAPABILITIES compute,utility
ENV NVIDIA_REQUIRE_CUDA "cuda>=9.0"
ENV NCCL_VERSION 2.4.2
RUN apt-get update && apt-get install -y --no-install-recommends \\
cuda-libraries-$CUDA_PKG_VERSION \\
cuda-cublas-9-0=9.0.176.4-1 \\
libnccl2=$NCCL_VERSION-1+cuda9.0 && \\
apt-mark hold libnccl2 && \\
rm -rf /var/lib/apt/lists/*
"""
DOCKER_RUNTIME_CONTENTS['16.04']['8.0'] = """
RUN apt-get update && apt-get install -y --no-install-recommends ca-certificates apt-transport-https gnupg-curl && \\
rm -rf /var/lib/apt/lists/* && \\
NVIDIA_GPGKEY_SUM=d1be581509378368edeec8c1eb2958702feedf3bc3d17011adbf24efacce4ab5 && \\
NVIDIA_GPGKEY_FPR=ae09fe4bbd223a84b2ccfce3f60f4b3d7fa2af80 && \\
apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub && \\
apt-key adv --export --no-emit-version -a $NVIDIA_GPGKEY_FPR | tail -n +5 > cudasign.pub && \\
echo "$NVIDIA_GPGKEY_SUM cudasign.pub" | sha256sum -c --strict - && rm cudasign.pub && \\
echo "deb https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 /" > /etc/apt/sources.list.d/cuda.list
ENV CUDA_VERSION 8.0.61
ENV CUDA_PKG_VERSION 8-0=$CUDA_VERSION-1
RUN apt-get update && apt-get install -y --no-install-recommends \\
cuda-nvrtc-$CUDA_PKG_VERSION \\
cuda-nvgraph-$CUDA_PKG_VERSION \\
cuda-cusolver-$CUDA_PKG_VERSION \\
cuda-cublas-8-0=8.0.61.2-1 \\
cuda-cufft-$CUDA_PKG_VERSION \\
cuda-curand-$CUDA_PKG_VERSION \\
cuda-cusparse-$CUDA_PKG_VERSION \\
cuda-npp-$CUDA_PKG_VERSION \\
cuda-cudart-$CUDA_PKG_VERSION && \\
ln -s cuda-8.0 /usr/local/cuda && \\
rm -rf /var/lib/apt/lists/*
# nvidia-docker 1.0
LABEL com.nvidia.volumes.needed="nvidia_driver"
LABEL com.nvidia.cuda.version="${CUDA_VERSION}"
RUN echo "/usr/local/nvidia/lib" >> /etc/ld.so.conf.d/nvidia.conf && \\
echo "/usr/local/nvidia/lib64" >> /etc/ld.so.conf.d/nvidia.conf
ENV PATH /usr/local/nvidia/bin:/usr/local/cuda/bin:${PATH}
ENV LD_LIBRARY_PATH /usr/local/nvidia/lib:/usr/local/nvidia/lib64
# nvidia-container-runtime
ENV NVIDIA_VISIBLE_DEVICES all
ENV NVIDIA_DRIVER_CAPABILITIES compute,utility
ENV NVIDIA_REQUIRE_CUDA "cuda>=8.0"
"""
DOCKER_DEVEL_CONTENTS = {}
DOCKER_DEVEL_CONTENTS['18.04'] = {}
DOCKER_DEVEL_CONTENTS['16.04'] = {}
DOCKER_DEVEL_CONTENTS['18.04']['10.0'] = """
RUN apt-get update && apt-get install -y --no-install-recommends \\
cuda-libraries-dev-$CUDA_PKG_VERSION \\
cuda-nvml-dev-$CUDA_PKG_VERSION \\
cuda-minimal-build-$CUDA_PKG_VERSION \\
cuda-command-line-tools-$CUDA_PKG_VERSION \\
libnccl-dev=$NCCL_VERSION-1+cuda10.0 && \\
rm -rf /var/lib/apt/lists/*
ENV LIBRARY_PATH /usr/local/cuda/lib64/stubs
"""
DOCKER_DEVEL_CONTENTS['18.04']['9.2'] = """
RUN apt-get update && apt-get install -y --no-install-recommends \\
cuda-libraries-dev-$CUDA_PKG_VERSION \\
cuda-nvml-dev-$CUDA_PKG_VERSION \\
cuda-minimal-build-$CUDA_PKG_VERSION \\
cuda-command-line-tools-$CUDA_PKG_VERSION \\
libnccl-dev=$NCCL_VERSION-1+cuda9.2 && \\
rm -rf /var/lib/apt/lists/*
ENV LIBRARY_PATH /usr/local/cuda/lib64/stubs
"""
DOCKER_DEVEL_CONTENTS['16.04']['10.0'] = """
RUN apt-get update && apt-get install -y --no-install-recommends \\
cuda-libraries-dev-$CUDA_PKG_VERSION \\
cuda-nvml-dev-$CUDA_PKG_VERSION \\
cuda-minimal-build-$CUDA_PKG_VERSION \\
cuda-command-line-tools-$CUDA_PKG_VERSION \\
libnccl-dev=$NCCL_VERSION-1+cuda10.0 && \\
rm -rf /var/lib/apt/lists/*
ENV LIBRARY_PATH /usr/local/cuda/lib64/stubs
"""
DOCKER_DEVEL_CONTENTS['16.04']['9.0'] = """
RUN apt-get update && apt-get install -y --no-install-recommends \\
cuda-libraries-dev-$CUDA_PKG_VERSION \\
cuda-nvml-dev-$CUDA_PKG_VERSION \\
cuda-minimal-build-$CUDA_PKG_VERSION \\
cuda-command-line-tools-$CUDA_PKG_VERSION \\
cuda-core-9-0=9.0.176.3-1 \\
cuda-cublas-dev-9-0=9.0.176.4-1 \\
libnccl-dev=$NCCL_VERSION-1+cuda9.0 && \\
rm -rf /var/lib/apt/lists/*
ENV LIBRARY_PATH /usr/local/cuda/lib64/stubs
"""
DOCKER_DEVEL_CONTENTS['16.04']['8.0'] = """
RUN apt-get update && apt-get install -y --no-install-recommends \\
cuda-core-$CUDA_PKG_VERSION \\
cuda-misc-headers-$CUDA_PKG_VERSION \\
cuda-command-line-tools-$CUDA_PKG_VERSION \\
cuda-nvrtc-dev-$CUDA_PKG_VERSION \\
cuda-nvml-dev-$CUDA_PKG_VERSION \\
cuda-nvgraph-dev-$CUDA_PKG_VERSION \\
cuda-cusolver-dev-$CUDA_PKG_VERSION \\
cuda-cublas-dev-8-0=8.0.61.2-1 \\
cuda-cufft-dev-$CUDA_PKG_VERSION \\
cuda-curand-dev-$CUDA_PKG_VERSION \\
cuda-cusparse-dev-$CUDA_PKG_VERSION \\
cuda-npp-dev-$CUDA_PKG_VERSION \\
cuda-cudart-dev-$CUDA_PKG_VERSION \\
cuda-driver-dev-$CUDA_PKG_VERSION && \\
rm -rf /var/lib/apt/lists/*
ENV LIBRARY_PATH /usr/local/cuda/lib64/stubs
"""
def write(DOCKER_FILE, version, ubuntu):
if version in versions:
with open(DOCKER_FILE, "a") as dockerfile:
dockerfile.write(DOCKER_CUDA_HEADER)
cuda = version.split('-')[0]
try:
dockerfile.write(DOCKER_RUNTIME_CONTENTS[ubuntu][cuda])
if 'devel' in version:
dockerfile.write(DOCKER_DEVEL_CONTENTS[ubuntu][cuda])
except KeyError as e:
print("CUDA version %s not supported in Ubuntu %s" % (cuda, ubuntu) )
sys.exit(1)
return
else:
print("cuda: version %s not supported. Options: %s" % (version, versions))
sys.exit(1)
| 42.32899 | 150 | 0.679646 | 1,922 | 12,995 | 4.462539 | 0.099376 | 0.043255 | 0.08651 | 0.079748 | 0.916871 | 0.85123 | 0.840737 | 0.838638 | 0.835024 | 0.829894 | 0 | 0.056945 | 0.154059 | 12,995 | 306 | 151 | 42.46732 | 0.723278 | 0 | 0 | 0.669323 | 0 | 0.14741 | 0.900339 | 0.36463 | 0 | 0 | 0 | 0 | 0 | 1 | 0.003984 | false | 0 | 0.003984 | 0 | 0.011952 | 0.007968 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
798352c8ccdd415dfe945daf369e1c61f1238875 | 3,341 | py | Python | iocage/tests/unit_tests/1001_lib_start_test.py | project-fifo/iocage | 1b8669bc2119718dbea8f2707a4eb4c92197c0f0 | [
"BSD-2-Clause"
] | null | null | null | iocage/tests/unit_tests/1001_lib_start_test.py | project-fifo/iocage | 1b8669bc2119718dbea8f2707a4eb4c92197c0f0 | [
"BSD-2-Clause"
] | null | null | null | iocage/tests/unit_tests/1001_lib_start_test.py | project-fifo/iocage | 1b8669bc2119718dbea8f2707a4eb4c92197c0f0 | [
"BSD-2-Clause"
] | 1 | 2022-03-06T10:09:18.000Z | 2022-03-06T10:09:18.000Z | import mock
import pytest
from iocage.lib.ioc_start import find_bridge_mtu
@mock.patch('iocage.lib.ioc_start.checkoutput')
def test_should_return_mtu_of_first_member(mock_checkoutput):
mock_checkoutput.side_effect = [bridge_if_config, member_if_config]
mtu = find_bridge_mtu('bridge0')
assert mtu == '1500'
mock_checkoutput.assert_has_calls([mock.call(["ifconfig", "bridge0"]),
mock.call(["ifconfig", "bge0"])])
@mock.patch('iocage.lib.ioc_start.checkoutput')
def test_should_return_mtu_of_first_member_with_description(mock_checkoutput):
mock_checkoutput.side_effect = [bridge_with_description_if_config,
member_if_config]
mtu = find_bridge_mtu('bridge0')
assert mtu == '1500'
mock_checkoutput.assert_has_calls([mock.call(["ifconfig", "bridge0"]),
mock.call(["ifconfig", "bge0"])])
@mock.patch('iocage.lib.ioc_start.checkoutput')
def test_should_return_default_mtu_if_no_members(mock_checkoutput):
mock_checkoutput.side_effect = [bridge_with_no_members_if_config,
member_if_config]
mtu = find_bridge_mtu('bridge0')
assert mtu == '1500'
mock_checkoutput.called_with(["ifconfig", "bridge0"])
bridge_if_config = """bridge0: flags=8843<UP,BROADCAST,RUNNING,SIMPLEX,MULTICAST> metric 0 mtu 1500
ether 00:00:00:00:00:00
nd6 options=1<PERFORMNUD>
groups: bridge
id 00:00:00:00:00:00 priority 32768 hellotime 2 fwddelay 15
maxage 20 holdcnt 6 proto rstp maxaddr 2000 timeout 1200
root id 00:00:00:00:00:00 priority 32768 ifcost 0 port 0
member: bge0 flags=143<LEARNING,DISCOVER,AUTOEDGE,AUTOPTP>
ifmaxaddr 0 port 1 priority 128 path cost 20000
"""
bridge_with_description_if_config = """bridge0: flags=8843<UP,BROADCAST,RUNNING,SIMPLEX,MULTICAST> metric 0 mtu 1500
description: first-bridge
ether 00:00:00:00:00:00
nd6 options=1<PERFORMNUD>
groups: bridge
id 00:00:00:00:00:00 priority 32768 hellotime 2 fwddelay 15
maxage 20 holdcnt 6 proto rstp maxaddr 2000 timeout 1200
root id 00:00:00:00:00:00 priority 32768 ifcost 0 port 0
member: bge0 flags=143<LEARNING,DISCOVER,AUTOEDGE,AUTOPTP>
ifmaxaddr 0 port 1 priority 128 path cost 20000
"""
bridge_with_no_members_if_config = """bridge0: flags=8843<UP,BROADCAST,RUNNING,SIMPLEX,MULTICAST> metric 0 mtu 1500
description: first-bridge
ether 00:00:00:00:00:00
nd6 options=1<PERFORMNUD>
groups: bridge
id 00:00:00:00:00:00 priority 32768 hellotime 2 fwddelay 15
maxage 20 holdcnt 6 proto rstp maxaddr 2000 timeout 1200
root id 00:00:00:00:00:00 priority 32768 ifcost 0 port 0
"""
member_if_config = """bge0: flags=8943<UP,BROADCAST,RUNNING,PROMISC,SIMPLEX,MULTICAST> metric 0 mtu 1500
options=c019b<RXCSUM,TXCSUM,VLAN_MTU,VLAN_HWTAGGING,VLAN_HWCSUM,TSO4,VLAN_HWTSO,LINKSTATE>
ether 00:00:00:00:00:00
inet6 fe80::0000:0000:0000:0000%bge0 prefixlen 64 scopeid 0x1
inet 10.2.3.4 netmask 0xffffff00 broadcast 10.2.3.255
nd6 options=21<PERFORMNUD,AUTO_LINKLOCAL>
media: Ethernet autoselect (1000baseT <full-duplex>)
status: active
"""
| 44.546667 | 116 | 0.690213 | 472 | 3,341 | 4.705508 | 0.254237 | 0.09005 | 0.108059 | 0.108059 | 0.805493 | 0.795588 | 0.773976 | 0.74561 | 0.700585 | 0.700585 | 0 | 0.130435 | 0.215205 | 3,341 | 74 | 117 | 45.148649 | 0.716629 | 0 | 0 | 0.6875 | 0 | 0.015625 | 0.629452 | 0.183777 | 0 | 0 | 0.003891 | 0 | 0.078125 | 1 | 0.046875 | false | 0 | 0.046875 | 0 | 0.09375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
79c248da75bba8b7fcf7c8660e417111cb27df14 | 51 | py | Python | theBroker/venv/Lib/site-packages/ttn/github_com/TheThingsNetwork/api/__init__.py | emirgo/WeatherStation | f0f8c3464470991fc962d83cea20f3bcfd6a04b6 | [
"MIT"
] | 32 | 2017-11-01T16:03:48.000Z | 2021-11-16T12:35:34.000Z | theBroker/venv/Lib/site-packages/ttn/github_com/TheThingsNetwork/api/__init__.py | emirgo/WeatherStation | f0f8c3464470991fc962d83cea20f3bcfd6a04b6 | [
"MIT"
] | 28 | 2017-11-20T09:45:59.000Z | 2021-12-14T09:31:24.000Z | theBroker/venv/Lib/site-packages/ttn/github_com/TheThingsNetwork/api/__init__.py | emirgo/WeatherStation | f0f8c3464470991fc962d83cea20f3bcfd6a04b6 | [
"MIT"
] | 22 | 2017-11-03T10:21:50.000Z | 2021-04-08T05:20:51.000Z | from .api_pb2_grpc import *
from .api_pb2 import *
| 17 | 27 | 0.764706 | 9 | 51 | 4 | 0.555556 | 0.388889 | 0.555556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.046512 | 0.156863 | 51 | 2 | 28 | 25.5 | 0.790698 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.