hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
3165e130f5207cedd4ba7b73c0d8adf2bd6ef36e | 2,508 | py | Python | 2019/day25_moves.py | dimkarakostas/advent-of-code | fb9c12eabc3342c607e24da1edeb7e5643400263 | [
"MIT"
] | 2 | 2018-12-06T09:39:35.000Z | 2020-12-18T19:38:40.000Z | 2019/day25_moves.py | dimkarakostas/advent-of-code | fb9c12eabc3342c607e24da1edeb7e5643400263 | [
"MIT"
] | null | null | null | 2019/day25_moves.py | dimkarakostas/advent-of-code | fb9c12eabc3342c607e24da1edeb7e5643400263 | [
"MIT"
] | null | null | null | logged_moves = [[115, 111, 117, 116, 104, 10], [101, 97, 115, 116, 10], [116, 97, 107, 101, 32, 115, 112, 97, 99, 101, 32, 104, 101, 97, 116, 101, 114, 10], [119, 101, 115, 116, 10], [119, 101, 115, 116, 10], [116, 97, 107, 101, 32, 115, 104, 101, 108, 108, 10], [101, 97, 115, 116, 10], [110, 111, 114, 116, 104, 10], [119, 101, 115, 116, 10], [110, 111, 114, 116, 104, 10], [116, 97, 107, 101, 32, 106, 97, 109, 10], [101, 97, 115, 116, 10], [115, 111, 117, 116, 104, 10], [116, 97, 107, 101, 32, 97, 115, 116, 101, 114, 105, 115, 107, 10], [101, 97, 115, 116, 10], [119, 101, 115, 116, 10], [115, 111, 117, 116, 104, 10], [116, 97, 107, 101, 32, 107, 108, 101, 105, 110, 32, 98, 111, 116, 116, 108, 101, 10], [101, 97, 115, 116, 10], [116, 97, 107, 101, 32, 115, 112, 111, 111, 108, 32, 111, 102, 32, 99, 97, 116, 54, 10], [119, 101, 115, 116, 10], [110, 111, 114, 116, 104, 10], [110, 111, 114, 116, 104, 10], [119, 101, 115, 116, 10], [110, 111, 114, 116, 104, 10], [116, 97, 107, 101, 32, 97, 115, 116, 114, 111, 110, 97, 117, 116, 32, 105, 99, 101, 32, 99, 114, 101, 97, 109, 10], [101, 97, 115, 116, 10], [119, 101, 115, 116, 10], [110, 111, 114, 116, 104, 10], [101, 97, 115, 116, 10], [10], [119, 101, 115, 116, 10], [101, 97, 115, 116, 10], [10], [27, 91, 65, 10], [110, 111, 114, 116, 104, 10], [115, 111, 117, 116, 104, 10], [115, 111, 117, 116, 104, 10], [116, 97, 107, 101, 32, 115, 112, 97, 99, 101, 32, 108, 97, 119, 32, 115, 112, 97, 99, 101, 32, 98, 114, 111, 99, 104, 117, 114, 101, 10], [110, 111, 114, 116, 104, 10], [119, 101, 115, 116, 10], [115, 111, 117, 116, 104, 10], [115, 111, 117, 116, 104, 10], [115, 111, 117, 116, 104, 10], [115, 111, 117, 116, 104, 10], [119, 101, 115, 116, 10], [115, 111, 117, 116, 104, 10], [101, 97, 115, 116, 10], [105, 110, 118, 10], [119, 101, 115, 116, 10], [100, 114, 111, 112, 32, 115, 112, 111, 111, 108, 32, 111, 102, 32, 99, 97, 116, 54, 10], [100, 114, 111, 112, 32, 115, 112, 97, 99, 101, 32, 108, 97, 119, 32, 115, 112, 97, 99, 101, 32, 98, 114, 111, 99, 104, 117, 114, 101, 10], [100, 114, 111, 112, 32, 97, 115, 116, 101, 114, 105, 115, 107, 10], [100, 114, 111, 112, 32, 97, 115, 116, 114, 111, 110, 97, 117, 116, 32, 105, 99, 101, 32, 99, 114, 101, 97, 109, 10], [100, 114, 111, 112, 32, 106, 97, 109, 10], [100, 114, 111, 112, 32, 115, 104, 101, 108, 108, 10], [100, 114, 111, 112, 32, 115, 112, 97, 99, 101, 32, 104, 101, 97, 116, 101, 114, 10], [100, 114, 111, 112, 32, 107, 108, 101, 105, 110, 32, 98, 111, 116, 116, 108, 101, 10]]
| 1,254 | 2,507 | 0.548644 | 508 | 2,508 | 2.706693 | 0.057087 | 0.104727 | 0.116364 | 0.088 | 0.981091 | 0.976727 | 0.941091 | 0.898182 | 0.834182 | 0.794909 | 0 | 0.682 | 0.202552 | 2,508 | 1 | 2,508 | 2,508 | 0.0055 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
318cae4f18300ea3ee3d45653789e68eec573d8f | 6,269 | py | Python | loldib/getratings/models/NA/na_shen/na_shen_sup.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | [
"Apache-2.0"
] | null | null | null | loldib/getratings/models/NA/na_shen/na_shen_sup.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | [
"Apache-2.0"
] | null | null | null | loldib/getratings/models/NA/na_shen/na_shen_sup.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | [
"Apache-2.0"
] | null | null | null | from getratings.models.ratings import Ratings
class NA_Shen_Sup_Aatrox(Ratings):
pass
class NA_Shen_Sup_Ahri(Ratings):
pass
class NA_Shen_Sup_Akali(Ratings):
pass
class NA_Shen_Sup_Alistar(Ratings):
pass
class NA_Shen_Sup_Amumu(Ratings):
pass
class NA_Shen_Sup_Anivia(Ratings):
pass
class NA_Shen_Sup_Annie(Ratings):
pass
class NA_Shen_Sup_Ashe(Ratings):
pass
class NA_Shen_Sup_AurelionSol(Ratings):
pass
class NA_Shen_Sup_Azir(Ratings):
pass
class NA_Shen_Sup_Bard(Ratings):
pass
class NA_Shen_Sup_Blitzcrank(Ratings):
pass
class NA_Shen_Sup_Brand(Ratings):
pass
class NA_Shen_Sup_Braum(Ratings):
pass
class NA_Shen_Sup_Caitlyn(Ratings):
pass
class NA_Shen_Sup_Camille(Ratings):
pass
class NA_Shen_Sup_Cassiopeia(Ratings):
pass
class NA_Shen_Sup_Chogath(Ratings):
pass
class NA_Shen_Sup_Corki(Ratings):
pass
class NA_Shen_Sup_Darius(Ratings):
pass
class NA_Shen_Sup_Diana(Ratings):
pass
class NA_Shen_Sup_Draven(Ratings):
pass
class NA_Shen_Sup_DrMundo(Ratings):
pass
class NA_Shen_Sup_Ekko(Ratings):
pass
class NA_Shen_Sup_Elise(Ratings):
pass
class NA_Shen_Sup_Evelynn(Ratings):
pass
class NA_Shen_Sup_Ezreal(Ratings):
pass
class NA_Shen_Sup_Fiddlesticks(Ratings):
pass
class NA_Shen_Sup_Fiora(Ratings):
pass
class NA_Shen_Sup_Fizz(Ratings):
pass
class NA_Shen_Sup_Galio(Ratings):
pass
class NA_Shen_Sup_Gangplank(Ratings):
pass
class NA_Shen_Sup_Garen(Ratings):
pass
class NA_Shen_Sup_Gnar(Ratings):
pass
class NA_Shen_Sup_Gragas(Ratings):
pass
class NA_Shen_Sup_Graves(Ratings):
pass
class NA_Shen_Sup_Hecarim(Ratings):
pass
class NA_Shen_Sup_Heimerdinger(Ratings):
pass
class NA_Shen_Sup_Illaoi(Ratings):
pass
class NA_Shen_Sup_Irelia(Ratings):
pass
class NA_Shen_Sup_Ivern(Ratings):
pass
class NA_Shen_Sup_Janna(Ratings):
pass
class NA_Shen_Sup_JarvanIV(Ratings):
pass
class NA_Shen_Sup_Jax(Ratings):
pass
class NA_Shen_Sup_Jayce(Ratings):
pass
class NA_Shen_Sup_Jhin(Ratings):
pass
class NA_Shen_Sup_Jinx(Ratings):
pass
class NA_Shen_Sup_Kalista(Ratings):
pass
class NA_Shen_Sup_Karma(Ratings):
pass
class NA_Shen_Sup_Karthus(Ratings):
pass
class NA_Shen_Sup_Kassadin(Ratings):
pass
class NA_Shen_Sup_Katarina(Ratings):
pass
class NA_Shen_Sup_Kayle(Ratings):
pass
class NA_Shen_Sup_Kayn(Ratings):
pass
class NA_Shen_Sup_Kennen(Ratings):
pass
class NA_Shen_Sup_Khazix(Ratings):
pass
class NA_Shen_Sup_Kindred(Ratings):
pass
class NA_Shen_Sup_Kled(Ratings):
pass
class NA_Shen_Sup_KogMaw(Ratings):
pass
class NA_Shen_Sup_Leblanc(Ratings):
pass
class NA_Shen_Sup_LeeSin(Ratings):
pass
class NA_Shen_Sup_Leona(Ratings):
pass
class NA_Shen_Sup_Lissandra(Ratings):
pass
class NA_Shen_Sup_Lucian(Ratings):
pass
class NA_Shen_Sup_Lulu(Ratings):
pass
class NA_Shen_Sup_Lux(Ratings):
pass
class NA_Shen_Sup_Malphite(Ratings):
pass
class NA_Shen_Sup_Malzahar(Ratings):
pass
class NA_Shen_Sup_Maokai(Ratings):
pass
class NA_Shen_Sup_MasterYi(Ratings):
pass
class NA_Shen_Sup_MissFortune(Ratings):
pass
class NA_Shen_Sup_MonkeyKing(Ratings):
pass
class NA_Shen_Sup_Mordekaiser(Ratings):
pass
class NA_Shen_Sup_Morgana(Ratings):
pass
class NA_Shen_Sup_Nami(Ratings):
pass
class NA_Shen_Sup_Nasus(Ratings):
pass
class NA_Shen_Sup_Nautilus(Ratings):
pass
class NA_Shen_Sup_Nidalee(Ratings):
pass
class NA_Shen_Sup_Nocturne(Ratings):
pass
class NA_Shen_Sup_Nunu(Ratings):
pass
class NA_Shen_Sup_Olaf(Ratings):
pass
class NA_Shen_Sup_Orianna(Ratings):
pass
class NA_Shen_Sup_Ornn(Ratings):
pass
class NA_Shen_Sup_Pantheon(Ratings):
pass
class NA_Shen_Sup_Poppy(Ratings):
pass
class NA_Shen_Sup_Quinn(Ratings):
pass
class NA_Shen_Sup_Rakan(Ratings):
pass
class NA_Shen_Sup_Rammus(Ratings):
pass
class NA_Shen_Sup_RekSai(Ratings):
pass
class NA_Shen_Sup_Renekton(Ratings):
pass
class NA_Shen_Sup_Rengar(Ratings):
pass
class NA_Shen_Sup_Riven(Ratings):
pass
class NA_Shen_Sup_Rumble(Ratings):
pass
class NA_Shen_Sup_Ryze(Ratings):
pass
class NA_Shen_Sup_Sejuani(Ratings):
pass
class NA_Shen_Sup_Shaco(Ratings):
pass
class NA_Shen_Sup_Shen(Ratings):
pass
class NA_Shen_Sup_Shyvana(Ratings):
pass
class NA_Shen_Sup_Singed(Ratings):
pass
class NA_Shen_Sup_Sion(Ratings):
pass
class NA_Shen_Sup_Sivir(Ratings):
pass
class NA_Shen_Sup_Skarner(Ratings):
pass
class NA_Shen_Sup_Sona(Ratings):
pass
class NA_Shen_Sup_Soraka(Ratings):
pass
class NA_Shen_Sup_Swain(Ratings):
pass
class NA_Shen_Sup_Syndra(Ratings):
pass
class NA_Shen_Sup_TahmKench(Ratings):
pass
class NA_Shen_Sup_Taliyah(Ratings):
pass
class NA_Shen_Sup_Talon(Ratings):
pass
class NA_Shen_Sup_Taric(Ratings):
pass
class NA_Shen_Sup_Teemo(Ratings):
pass
class NA_Shen_Sup_Thresh(Ratings):
pass
class NA_Shen_Sup_Tristana(Ratings):
pass
class NA_Shen_Sup_Trundle(Ratings):
pass
class NA_Shen_Sup_Tryndamere(Ratings):
pass
class NA_Shen_Sup_TwistedFate(Ratings):
pass
class NA_Shen_Sup_Twitch(Ratings):
pass
class NA_Shen_Sup_Udyr(Ratings):
pass
class NA_Shen_Sup_Urgot(Ratings):
pass
class NA_Shen_Sup_Varus(Ratings):
pass
class NA_Shen_Sup_Vayne(Ratings):
pass
class NA_Shen_Sup_Veigar(Ratings):
pass
class NA_Shen_Sup_Velkoz(Ratings):
pass
class NA_Shen_Sup_Vi(Ratings):
pass
class NA_Shen_Sup_Viktor(Ratings):
pass
class NA_Shen_Sup_Vladimir(Ratings):
pass
class NA_Shen_Sup_Volibear(Ratings):
pass
class NA_Shen_Sup_Warwick(Ratings):
pass
class NA_Shen_Sup_Xayah(Ratings):
pass
class NA_Shen_Sup_Xerath(Ratings):
pass
class NA_Shen_Sup_XinZhao(Ratings):
pass
class NA_Shen_Sup_Yasuo(Ratings):
pass
class NA_Shen_Sup_Yorick(Ratings):
pass
class NA_Shen_Sup_Zac(Ratings):
pass
class NA_Shen_Sup_Zed(Ratings):
pass
class NA_Shen_Sup_Ziggs(Ratings):
pass
class NA_Shen_Sup_Zilean(Ratings):
pass
class NA_Shen_Sup_Zyra(Ratings):
pass
| 15.033573 | 46 | 0.75642 | 972 | 6,269 | 4.452675 | 0.151235 | 0.223198 | 0.350739 | 0.446396 | 0.791359 | 0.791359 | 0 | 0 | 0 | 0 | 0 | 0 | 0.177221 | 6,269 | 416 | 47 | 15.069712 | 0.839085 | 0 | 0 | 0.498195 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.498195 | 0.00361 | 0 | 0.501805 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 7 |
31d27afea5793a54dcf957bcdd7464dc8e5efb59 | 11,486 | py | Python | riptable/tests/test_categorical_ordered.py | 972d5defe3218bd62b741e6a2f11f5b3/riptable | bb928c11752e831ec701f91964979b31db53826a | [
"BSD-2-Clause-Patent"
] | 307 | 2020-08-27T20:25:11.000Z | 2022-03-08T15:51:19.000Z | riptable/tests/test_categorical_ordered.py | 972d5defe3218bd62b741e6a2f11f5b3/riptable | bb928c11752e831ec701f91964979b31db53826a | [
"BSD-2-Clause-Patent"
] | 206 | 2020-08-17T19:07:15.000Z | 2022-03-18T11:53:55.000Z | riptable/tests/test_categorical_ordered.py | 972d5defe3218bd62b741e6a2f11f5b3/riptable | bb928c11752e831ec701f91964979b31db53826a | [
"BSD-2-Clause-Patent"
] | 10 | 2020-08-28T00:22:05.000Z | 2021-04-30T20:22:28.000Z | from riptable import *
from riptable.rt_enum import GROUPBY_KEY_PREFIX
str_list = ['b', 'b', 'a', 'c', 'b']
str_sorted = ['a', 'b', 'c']
str_unsorted = ['b', 'a', 'c']
int_list = [20, 20, 10, 30, 20]
int_sorted = [10, 20, 30]
int_unsorted = [20, 10, 30]
flt_list = [20.0, 20.0, 10.0, 30.0, 20.0]
flt_sorted = [10.0, 20.0, 30.0]
flt_unsorted = [20.0, 10.0, 30.0]
data = arange(5)
datasum_sorted = [2, 5, 3]
datasum_unsorted = [5, 2, 3]
def arr_equal(a, b):
return bool(np.all(a == b))
class TestCategoricalOrdered:
def test_single_values(self):
# -------------SINGLE STRINGS----------------------------
c = Categorical(str_list)
ds = c.sum(data)
assert arr_equal(c.category_array, str_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], str_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(str_list, ordered=True)
ds = c.sum(data)
assert arr_equal(c.category_array, str_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], str_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(str_list, ordered=False)
ds = c.sum(data)
assert arr_equal(c.category_array, str_unsorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], str_unsorted)
assert arr_equal(ds.col_0, datasum_unsorted)
c = Categorical(str_list, sort_gb=True)
ds = c.sum(data)
assert arr_equal(c.category_array, str_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], str_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(str_list, ordered=True, sort_gb=True)
ds = c.sum(data)
assert arr_equal(c.category_array, str_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], str_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(str_list, ordered=False, sort_gb=True)
ds = c.sum(data)
assert arr_equal(c.category_array, str_unsorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], str_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(str_list, sort_gb=False)
ds = c.sum(data)
assert arr_equal(c.category_array, str_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], str_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(str_list, ordered=True, sort_gb=False)
ds = c.sum(data)
assert arr_equal(c.category_array, str_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], str_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(str_list, ordered=False, sort_gb=False)
ds = c.sum(data)
assert arr_equal(c.category_array, str_unsorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], str_unsorted)
assert arr_equal(ds.col_0, datasum_unsorted)
# -------------SINGLE INTEGERS----------------------------
c = Categorical(int_list)
ds = c.sum(data)
assert arr_equal(c.category_array, int_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], int_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(int_list, ordered=True)
ds = c.sum(data)
assert arr_equal(c.category_array, int_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], int_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(int_list, ordered=False)
ds = c.sum(data)
assert arr_equal(c.category_array, int_unsorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], int_unsorted)
assert arr_equal(ds.col_0, datasum_unsorted)
c = Categorical(int_list, sort_gb=True)
ds = c.sum(data)
assert arr_equal(c.category_array, int_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], int_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(int_list, ordered=True, sort_gb=True)
ds = c.sum(data)
assert arr_equal(c.category_array, int_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], int_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(int_list, ordered=False, sort_gb=True)
ds = c.sum(data)
assert arr_equal(c.category_array, int_unsorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], int_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(int_list, sort_gb=False)
ds = c.sum(data)
assert arr_equal(c.category_array, int_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], int_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(int_list, ordered=True, sort_gb=False)
ds = c.sum(data)
assert arr_equal(c.category_array, int_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], int_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(int_list, ordered=False, sort_gb=False)
ds = c.sum(data)
assert arr_equal(c.category_array, int_unsorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], int_unsorted)
assert arr_equal(ds.col_0, datasum_unsorted)
# -------------SINGLE FLOATS----------------------------
c = Categorical(flt_list)
ds = c.sum(data)
assert arr_equal(c.category_array, flt_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], flt_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(flt_list, ordered=True)
ds = c.sum(data)
assert arr_equal(c.category_array, flt_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], flt_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(flt_list, ordered=False)
ds = c.sum(data)
assert arr_equal(c.category_array, flt_unsorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], flt_unsorted)
assert arr_equal(ds.col_0, datasum_unsorted)
c = Categorical(flt_list, sort_gb=True)
ds = c.sum(data)
assert arr_equal(c.category_array, flt_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], flt_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(flt_list, ordered=True, sort_gb=True)
ds = c.sum(data)
assert arr_equal(c.category_array, flt_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], flt_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(flt_list, ordered=False, sort_gb=True)
ds = c.sum(data)
assert arr_equal(c.category_array, flt_unsorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], flt_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(flt_list, sort_gb=False)
ds = c.sum(data)
assert arr_equal(c.category_array, flt_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], flt_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(flt_list, ordered=True, sort_gb=False)
ds = c.sum(data)
assert arr_equal(c.category_array, flt_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], flt_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical(flt_list, ordered=False, sort_gb=False)
ds = c.sum(data)
assert arr_equal(c.category_array, flt_unsorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], flt_unsorted)
assert arr_equal(ds.col_0, datasum_unsorted)
def test_multikey(self):
c = Categorical([FA(str_list), FA(int_list)])
ds = c.sum(data)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], str_unsorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_1'], int_unsorted)
assert arr_equal(ds.col_0, datasum_unsorted)
# 5/9/2019 - multikey will now hold uniques in sorted order if requested, behaves like single key
# unlike single key, still defaults to holding unsorted (searchsorted doesn't apply to keys after the first one)
c = Categorical([FA(str_list), FA(int_list)], ordered=True)
ds = c.sum(data)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], str_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_1'], int_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical([FA(str_list), FA(int_list)], ordered=False)
ds = c.sum(data)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], str_unsorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_1'], int_unsorted)
assert arr_equal(ds.col_0, datasum_unsorted)
c = Categorical([FA(str_list), FA(int_list)], sort_gb=True)
ds = c.sum(data)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], str_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_1'], int_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical([FA(str_list), FA(int_list)], ordered=True, sort_gb=True)
ds = c.sum(data)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], str_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_1'], int_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical([FA(str_list), FA(int_list)], ordered=False, sort_gb=True)
ds = c.sum(data)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], str_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_1'], int_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical([FA(str_list), FA(int_list)], sort_gb=False)
ds = c.sum(data)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], str_unsorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_1'], int_unsorted)
assert arr_equal(ds.col_0, datasum_unsorted)
c = Categorical([FA(str_list), FA(int_list)], ordered=True, sort_gb=False)
ds = c.sum(data)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], str_sorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_1'], int_sorted)
assert arr_equal(ds.col_0, datasum_sorted)
c = Categorical([FA(str_list), FA(int_list)], ordered=False, sort_gb=False)
ds = c.sum(data)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_0'], str_unsorted)
assert arr_equal(ds[GROUPBY_KEY_PREFIX + '_1'], int_unsorted)
assert arr_equal(ds.col_0, datasum_unsorted)
def test_values_cats(self):
c = Categorical(str_list, str_unsorted)
assert arr_equal(c.category_array, str_unsorted)
c = Categorical(str_list, str_unsorted, ordered=True)
assert arr_equal(c.category_array, str_unsorted)
c = Categorical(str_list, str_unsorted, ordered=False)
assert arr_equal(c.category_array, str_unsorted)
c = Categorical(flt_list, flt_unsorted)
assert arr_equal(c.category_array, flt_unsorted)
c = Categorical(flt_list, flt_unsorted, ordered=True)
assert arr_equal(c.category_array, flt_unsorted)
c = Categorical(flt_list, flt_unsorted, ordered=False)
assert arr_equal(c.category_array, flt_unsorted)
| 43.180451 | 121 | 0.642695 | 1,644 | 11,486 | 4.166058 | 0.051703 | 0.134326 | 0.233027 | 0.189225 | 0.916338 | 0.916338 | 0.909622 | 0.905826 | 0.900861 | 0.892685 | 0 | 0.017093 | 0.241076 | 11,486 | 265 | 122 | 43.343396 | 0.768613 | 0.032561 | 0 | 0.707547 | 0 | 0 | 0.009316 | 0 | 0 | 0 | 0 | 0 | 0.537736 | 1 | 0.018868 | false | 0 | 0.009434 | 0.004717 | 0.037736 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
9ee1b3db7fb4c7b66ab6687602b618943f9cb9d4 | 6,142 | py | Python | rosWorkspace/ObstacleCourseTask/src/ObstacleCourseTask/srv/_Toggle.py | chris-blay/guillemot-core | 5e20bf46c10da2e6b57c3293a9d9aa402c864288 | [
"Apache-2.0"
] | null | null | null | rosWorkspace/ObstacleCourseTask/src/ObstacleCourseTask/srv/_Toggle.py | chris-blay/guillemot-core | 5e20bf46c10da2e6b57c3293a9d9aa402c864288 | [
"Apache-2.0"
] | null | null | null | rosWorkspace/ObstacleCourseTask/src/ObstacleCourseTask/srv/_Toggle.py | chris-blay/guillemot-core | 5e20bf46c10da2e6b57c3293a9d9aa402c864288 | [
"Apache-2.0"
] | null | null | null | """autogenerated by genmsg_py from ToggleRequest.msg. Do not edit."""
import roslib.message
import struct
class ToggleRequest(roslib.message.Message):
_md5sum = "a6443b0eeced033f2bdf37f5297439af"
_type = "ObstacleCourseTask/ToggleRequest"
_has_header = False #flag to mark the presence of a Header object
_full_text = """int8 enabled
"""
__slots__ = ['enabled']
_slot_types = ['int8']
def __init__(self, *args, **kwds):
"""
Constructor. Any message fields that are implicitly/explicitly
set to None will be assigned a default value. The recommend
use is keyword arguments as this is more robust to future message
changes. You cannot mix in-order arguments and keyword arguments.
The available fields are:
enabled
@param args: complete set of field values, in .msg order
@param kwds: use keyword arguments corresponding to message field names
to set specific fields.
"""
if args or kwds:
super(ToggleRequest, self).__init__(*args, **kwds)
#message fields cannot be None, assign default values for those that are
if self.enabled is None:
self.enabled = 0
else:
self.enabled = 0
def _get_types(self):
"""
internal API method
"""
return self._slot_types
def serialize(self, buff):
"""
serialize message into buffer
@param buff: buffer
@type buff: StringIO
"""
try:
buff.write(_struct_b.pack(self.enabled))
except struct.error, se: self._check_types(se)
except TypeError, te: self._check_types(te)
def deserialize(self, str):
"""
unpack serialized message in str into this message instance
@param str: byte array of serialized message
@type str: str
"""
try:
end = 0
start = end
end += 1
(self.enabled,) = _struct_b.unpack(str[start:end])
return self
except struct.error, e:
raise roslib.message.DeserializationError(e) #most likely buffer underfill
def serialize_numpy(self, buff, numpy):
"""
serialize message with numpy array types into buffer
@param buff: buffer
@type buff: StringIO
@param numpy: numpy python module
@type numpy module
"""
try:
buff.write(_struct_b.pack(self.enabled))
except struct.error, se: self._check_types(se)
except TypeError, te: self._check_types(te)
def deserialize_numpy(self, str, numpy):
"""
unpack serialized message in str into this message instance using numpy for array types
@param str: byte array of serialized message
@type str: str
@param numpy: numpy python module
@type numpy: module
"""
try:
end = 0
start = end
end += 1
(self.enabled,) = _struct_b.unpack(str[start:end])
return self
except struct.error, e:
raise roslib.message.DeserializationError(e) #most likely buffer underfill
_struct_I = roslib.message.struct_I
_struct_b = struct.Struct("<b")
"""autogenerated by genmsg_py from ToggleResponse.msg. Do not edit."""
import roslib.message
import struct
class ToggleResponse(roslib.message.Message):
_md5sum = "4414c67819626a1b8e0f043a9a0d6c9a"
_type = "ObstacleCourseTask/ToggleResponse"
_has_header = False #flag to mark the presence of a Header object
_full_text = """int8 result
"""
__slots__ = ['result']
_slot_types = ['int8']
def __init__(self, *args, **kwds):
"""
Constructor. Any message fields that are implicitly/explicitly
set to None will be assigned a default value. The recommend
use is keyword arguments as this is more robust to future message
changes. You cannot mix in-order arguments and keyword arguments.
The available fields are:
result
@param args: complete set of field values, in .msg order
@param kwds: use keyword arguments corresponding to message field names
to set specific fields.
"""
if args or kwds:
super(ToggleResponse, self).__init__(*args, **kwds)
#message fields cannot be None, assign default values for those that are
if self.result is None:
self.result = 0
else:
self.result = 0
def _get_types(self):
"""
internal API method
"""
return self._slot_types
def serialize(self, buff):
"""
serialize message into buffer
@param buff: buffer
@type buff: StringIO
"""
try:
buff.write(_struct_b.pack(self.result))
except struct.error, se: self._check_types(se)
except TypeError, te: self._check_types(te)
def deserialize(self, str):
"""
unpack serialized message in str into this message instance
@param str: byte array of serialized message
@type str: str
"""
try:
end = 0
start = end
end += 1
(self.result,) = _struct_b.unpack(str[start:end])
return self
except struct.error, e:
raise roslib.message.DeserializationError(e) #most likely buffer underfill
def serialize_numpy(self, buff, numpy):
"""
serialize message with numpy array types into buffer
@param buff: buffer
@type buff: StringIO
@param numpy: numpy python module
@type numpy module
"""
try:
buff.write(_struct_b.pack(self.result))
except struct.error, se: self._check_types(se)
except TypeError, te: self._check_types(te)
def deserialize_numpy(self, str, numpy):
"""
unpack serialized message in str into this message instance using numpy for array types
@param str: byte array of serialized message
@type str: str
@param numpy: numpy python module
@type numpy: module
"""
try:
end = 0
start = end
end += 1
(self.result,) = _struct_b.unpack(str[start:end])
return self
except struct.error, e:
raise roslib.message.DeserializationError(e) #most likely buffer underfill
_struct_I = roslib.message.struct_I
_struct_b = struct.Struct("<b")
class Toggle(roslib.message.ServiceDefinition):
_type = 'ObstacleCourseTask/Toggle'
_md5sum = 'a2f3d572baaef05608a5c9b396bf270d'
_request_class = ToggleRequest
_response_class = ToggleResponse
| 29.109005 | 91 | 0.676164 | 788 | 6,142 | 5.140863 | 0.176396 | 0.020736 | 0.033572 | 0.018761 | 0.863491 | 0.85016 | 0.85016 | 0.85016 | 0.85016 | 0.85016 | 0 | 0.016428 | 0.236894 | 6,142 | 210 | 92 | 29.247619 | 0.847877 | 0.055682 | 0 | 0.815534 | 1 | 0 | 0.073133 | 0.056916 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.038835 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
731ba660520956ddc9a889fbe582713ae96d84f7 | 3,326 | py | Python | tests/test_easytrader.py | chforest/easytrader | 7825efa90aa6af6a5f181a0736dc8c3e8ed852e5 | [
"MIT"
] | 1 | 2019-11-02T14:42:56.000Z | 2019-11-02T14:42:56.000Z | tests/test_easytrader.py | chforest/easytrader | 7825efa90aa6af6a5f181a0736dc8c3e8ed852e5 | [
"MIT"
] | null | null | null | tests/test_easytrader.py | chforest/easytrader | 7825efa90aa6af6a5f181a0736dc8c3e8ed852e5 | [
"MIT"
] | 1 | 2021-09-18T09:26:47.000Z | 2021-09-18T09:26:47.000Z | # coding: utf-8
import os
import sys
import time
import unittest
sys.path.append(".")
TEST_CLIENTS = os.environ.get("EZ_TEST_CLIENTS", "")
IS_WIN_PLATFORM = sys.platform != "darwin"
@unittest.skipUnless("yh" in TEST_CLIENTS and IS_WIN_PLATFORM, "skip yh test")
class TestYhClientTrader(unittest.TestCase):
@classmethod
def setUpClass(cls):
import easytrader
if "yh" not in TEST_CLIENTS:
return
# input your test account and password
cls._ACCOUNT = os.environ.get("EZ_TEST_YH_ACCOUNT") or "your account"
cls._PASSWORD = (
os.environ.get("EZ_TEST_YH_PASSWORD") or "your password"
)
cls._user = easytrader.use("yh_client")
cls._user.prepare(user=cls._ACCOUNT, password=cls._PASSWORD)
def test_balance(self):
time.sleep(3)
result = self._user.balance
def test_today_entrusts(self):
result = self._user.today_entrusts
def test_today_trades(self):
result = self._user.today_trades
def test_cancel_entrusts(self):
result = self._user.cancel_entrusts
def test_cancel_entrust(self):
result = self._user.cancel_entrust("123456789")
def test_invalid_buy(self):
import easytrader
with self.assertRaises(easytrader.exceptions.TradeError):
result = self._user.buy("511990", 1, 1e10)
def test_invalid_sell(self):
import easytrader
with self.assertRaises(easytrader.exceptions.TradeError):
result = self._user.sell("162411", 200, 1e10)
def test_auto_ipo(self):
self._user.auto_ipo()
@unittest.skipUnless("ht" in TEST_CLIENTS and IS_WIN_PLATFORM, "skip ht test")
class TestHTClientTrader(unittest.TestCase):
@classmethod
def setUpClass(cls):
import easytrader
if "ht" not in TEST_CLIENTS:
return
# input your test account and password
cls._ACCOUNT = os.environ.get("EZ_TEST_HT_ACCOUNT") or "your account"
cls._PASSWORD = (
os.environ.get("EZ_TEST_HT_PASSWORD") or "your password"
)
cls._COMM_PASSWORD = (
os.environ.get("EZ_TEST_HT_COMM_PASSWORD") or "your comm password"
)
cls._user = easytrader.use("ht_client")
cls._user.prepare(
user=cls._ACCOUNT,
password=cls._PASSWORD,
comm_password=cls._COMM_PASSWORD,
)
def test_balance(self):
time.sleep(3)
result = self._user.balance
def test_today_entrusts(self):
result = self._user.today_entrusts
def test_today_trades(self):
result = self._user.today_trades
def test_cancel_entrusts(self):
result = self._user.cancel_entrusts
def test_cancel_entrust(self):
result = self._user.cancel_entrust("123456789")
def test_invalid_buy(self):
import easytrader
with self.assertRaises(easytrader.exceptions.TradeError):
result = self._user.buy("511990", 1, 1e10)
def test_invalid_sell(self):
import easytrader
with self.assertRaises(easytrader.exceptions.TradeError):
result = self._user.sell("162411", 200, 1e10)
def test_auto_ipo(self):
self._user.auto_ipo()
if __name__ == "__main__":
unittest.main(verbosity=2)
| 27.04065 | 78 | 0.653037 | 407 | 3,326 | 5.061425 | 0.186732 | 0.054369 | 0.095146 | 0.069903 | 0.856796 | 0.801942 | 0.799029 | 0.784466 | 0.752427 | 0.693204 | 0 | 0.026389 | 0.248046 | 3,326 | 122 | 79 | 27.262295 | 0.797281 | 0.026158 | 0 | 0.619048 | 0 | 0 | 0.089026 | 0.007419 | 0 | 0 | 0 | 0 | 0.047619 | 1 | 0.214286 | false | 0.107143 | 0.119048 | 0 | 0.380952 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
7334bdbe2a3eb475476dc878bfd298d6e72e84e6 | 136 | py | Python | history_actions/settings.py | marcosschroh/django-history-actions | fc29eee29ed4f6ba71a366783fefdbe223cbed21 | [
"MIT"
] | 1 | 2018-09-11T18:35:42.000Z | 2018-09-11T18:35:42.000Z | history_actions/settings.py | marcosschroh/django-history-actions | fc29eee29ed4f6ba71a366783fefdbe223cbed21 | [
"MIT"
] | null | null | null | history_actions/settings.py | marcosschroh/django-history-actions | fc29eee29ed4f6ba71a366783fefdbe223cbed21 | [
"MIT"
] | null | null | null | from django.conf import settings
HISTORY_ACTIONS_GET_USER_FROM_MODEL = getattr(settings, 'HISTORY_ACTIONS_GET_USER_FROM_MODEL', False)
| 34 | 101 | 0.867647 | 20 | 136 | 5.4 | 0.6 | 0.277778 | 0.407407 | 0.462963 | 0.703704 | 0.703704 | 0.703704 | 0 | 0 | 0 | 0 | 0 | 0.073529 | 136 | 3 | 102 | 45.333333 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0.257353 | 0.257353 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 8 |
734962d9b11404cc647a4dfb191a97f71c17dff6 | 3,285 | py | Python | Client_Functions.py | CSharpTeoMan911/Client_App | ff3c4aeb2747299d5c775152e1f1c76d4e66a059 | [
"CC0-1.0"
] | null | null | null | Client_Functions.py | CSharpTeoMan911/Client_App | ff3c4aeb2747299d5c775152e1f1c76d4e66a059 | [
"CC0-1.0"
] | null | null | null | Client_Functions.py | CSharpTeoMan911/Client_App | ff3c4aeb2747299d5c775152e1f1c76d4e66a059 | [
"CC0-1.0"
] | null | null | null | import sys
import main
import Server_Connection
class Credential_Functions:
def __init__(self, credential_function, user_Id, user_password):
match credential_function:
case "R":
self.__Register(user_Id, user_password)
case "L":
self.__Log_In(user_Id, user_password)
case "_L":
self.__Log_Out(user_Id, user_password)
def __Log_In(self, user_Id, user_password):
try:
connection = Server_Connection.Functional_Server_Connection(user_Id, user_password)
connection.Log_In_Server_Connection()
except KeyboardInterrupt:
main.SYSTEM_EXIT = True
sys.exit(0)
def __Log_Out(self, user_Id, user_password):
try:
connection = Server_Connection.Functional_Server_Connection(user_Id, user_password)
connection.Log_Out_Server_Connection()
except KeyboardInterrupt:
main.SYSTEM_EXIT = True
sys.exit(0)
def __Register(self, user_Id, user_password):
try:
connection = Server_Connection.Functional_Server_Connection(user_Id, user_password)
connection.Registration_Server_Connection()
except KeyboardInterrupt:
main.SYSTEM_EXIT = True
sys.exit(0)
class Profile_Functions:
id = ""
password = ""
def __init__(self, user_id, user_password):
self.id = user_id
self.password = user_password
def Load_Profile_Picture(self):
try:
connection = Server_Connection.Functional_Server_Connection(self.id, self.password)
connection.Load_Profile_Picture()
except KeyboardInterrupt:
main.SYSTEM_EXIT = True
sys.exit(0)
class Contacts_Functions:
id = ""
password = ""
def __init__(self, user_id, user_password):
self.id = user_id
self.password = user_password
def Load_Contacts(self):
try:
connection = Server_Connection.Functional_Server_Connection(self.id, self.password)
connection.Load_Contacts()
except KeyboardInterrupt:
main.SYSTEM_EXIT = True
sys.exit(0)
class Grades_Function:
id = ""
password = ""
def __init__(self, user_id, user_password):
self.id = user_id
self.password = user_password
def Load_Grades(self):
try:
connection = Server_Connection.Functional_Server_Connection(self.id, self.password)
connection.Load_Grades()
except KeyboardInterrupt:
main.SYSTEM_EXIT = True
sys.exit(0)
class Material_Function:
id = ""
password = ""
subject = 0
def __init__(self, user_id, user_password):
self.id = user_id
self.password = user_password
def Load_Materials(self):
try:
connection = Server_Connection.Functional_Server_Connection(self.id, self.password)
connection.Load_Materials_Info()
except KeyboardInterrupt:
main.SYSTEM_EXIT = True
sys.exit(0)
| 24.886364 | 96 | 0.603044 | 338 | 3,285 | 5.482249 | 0.121302 | 0.155424 | 0.075553 | 0.135996 | 0.808419 | 0.808419 | 0.808419 | 0.808419 | 0.776039 | 0.749595 | 0 | 0.003612 | 0.325723 | 3,285 | 131 | 97 | 25.076336 | 0.832957 | 0 | 0 | 0.639535 | 0 | 0 | 0.001271 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.139535 | false | 0.302326 | 0.034884 | 0 | 0.337209 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
b4056c0816cb3fdc44d2cb050e6598f998ec29a8 | 45 | py | Python | app_utils/validators/length/__init__.py | kskarbinski/threads-api | c144c1cb51422095922310d278f80e4996c10ea0 | [
"MIT"
] | null | null | null | app_utils/validators/length/__init__.py | kskarbinski/threads-api | c144c1cb51422095922310d278f80e4996c10ea0 | [
"MIT"
] | null | null | null | app_utils/validators/length/__init__.py | kskarbinski/threads-api | c144c1cb51422095922310d278f80e4996c10ea0 | [
"MIT"
] | null | null | null | from .validate_length import validate_length
| 22.5 | 44 | 0.888889 | 6 | 45 | 6.333333 | 0.666667 | 0.736842 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088889 | 45 | 1 | 45 | 45 | 0.926829 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
b4371cfcc27b572f7e70f3dfce7f2a7901ef4295 | 12,656 | py | Python | beacon/endpoints/html/forms.py | CINECA-project/beacon-2.x | 986214fc910491206cb3b17ad4d14f00890e888d | [
"Apache-2.0"
] | 6 | 2020-01-30T17:29:40.000Z | 2022-03-18T05:27:50.000Z | beacon/endpoints/html/forms.py | CINECA-project/beacon-2.x | 986214fc910491206cb3b17ad4d14f00890e888d | [
"Apache-2.0"
] | 43 | 2019-12-05T14:28:04.000Z | 2022-03-11T12:10:35.000Z | beacon/endpoints/html/forms.py | CINECA-project/beacon-2.x | 986214fc910491206cb3b17ad4d14f00890e888d | [
"Apache-2.0"
] | 14 | 2020-01-14T09:51:48.000Z | 2022-02-17T13:53:46.000Z | import logging
from urllib.parse import urlencode
import re
from django import forms
from django.core.exceptions import ValidationError
from django.utils.translation import gettext_lazy as _
from django.utils.safestring import mark_safe
from django.conf import settings
from . import conf
LOG = logging.getLogger(__name__)
###########################################################################
### For the regular queries
###########################################################################
variantTypes = ('DEL:ME','INS:ME','DUP:TANDEM','DUP','DEL','INS','INV','CNV','SNP','MNP')
regex = re.compile(r'^(X|Y|MT|[1-9]|1[0-9]|2[0-2])\s*\:\s*(\d+)\s+([ATCGN]+)\s*\>\s*(DEL:ME|INS:ME|DUP:TANDEM|DUP|DEL|INS|INV|CNV|SNP|MNP|[ATCGN]+)$', re.I)
# class IncludeDatasetResponsesWidget(forms.RadioSelect):
# template_name='forms/include_dataset_responses.html'
class QueryForm(forms.Form):
assemblyId = forms.ChoiceField(required=True,
choices=( (i,i) for i in conf.BEACON_ASSEMBLYIDS ),
error_messages = { 'invalid_choice': ('<p>Select a valid choice.</p>'
'<p>%(value)s is not one of the available choices.</p>'),
'required': '<p>is required</p>' },
label='Assembly Id')
query = forms.CharField(
strip=True,
required=True,
label=mark_safe('Chromosome : Position ReferenceBase > (AlternateBase|VariantType)'),
label_suffix = '',
error_messages = { 'required': "<p>Eh? ... What was the query again?</p>"},
widget=forms.TextInput(attrs={'data-lpignore':'true', # data-lpignore=true to ignore LastPass injected code
'placeholder': 'For example 10 : 12345 A > T'}),
)
includeDatasetResponses = forms.ChoiceField(required=True,
choices=( (i.upper(),i) for i in ('All','Hit','Miss','None') ),
label='Included Dataset Responses',
widget=forms.Select, # instead of IncludeDatasetResponsesWidget
initial='ALL')
print(includeDatasetResponses)
def is_valid(self):
self.full_clean() # Populate fields (or read self.errors)
# Short circuit already
if not super().is_valid():
return False
query = self.cleaned_data.get('query')
LOG.debug('Query: %s', query)
# So far so good
self.query_deconstructed_data = None
# Testing the regular Query
m = regex.match(query)
if m:
d = { 'referenceName': m.group(1),
'start': m.group(2),
'referenceBases': m.group(3),
'includeDatasetResponses': self.cleaned_data.get('includeDatasetResponses'),
'assemblyId': self.cleaned_data.get('assemblyId')
}
v = m.group(4)
k = 'variantType' if v in variantTypes else 'alternateBases'
d[k] = v
self.query_deconstructed_data = d
return True
# Invalid query
self.add_error('query', ValidationError(_('<p><span class="bold">Oops! </span>Query <code>%(value)s</code> must be of the form:</p>'
'<p><span class="query-form">Regular Query</span>Chromosome : Position ReferenceBase > (AlternateBase|VariantType)</p>'
'<div class="small">'
'<p>where</p>'
'<ul>'
'<li>- Chromosome: 1-22, X, Y, or MT</li>'
'<li>- Position: a positive integer</li>'
'<li>- VariantType: either DEL:ME, INS:ME, DUP:TANDEM, DUP, DEL, INS, INV, CNV, SNP, or MNP</li>'
'<li>- ReferenceBase or AlternateBase: a combination of one or more A, T, C, G, or N</li>'
'</ul>'
'</div>'),
params={'value':query}))
return False
###########################################################################
### For the region queries
###########################################################################
region_regex = re.compile(r'^(X|Y|MT|[1-9]|1[0-9]|2[0-2])\s*\:\s*(\d+)\s*-\s*(\d+)$', re.I)
class QueryRegionForm(forms.Form):
assemblyId = forms.ChoiceField(required=True,
choices=( (i,i) for i in conf.BEACON_ASSEMBLYIDS ),
error_messages = { 'invalid_choice': ('<p>Select a valid choice.</p>'
'<p>%(value)s is not one of the available choices.</p>'),
'required': '<p>is required</p>' },
label='Assembly Id')
query = forms.CharField(
strip=True,
required=True,
label=mark_safe('Chromosome : Start-End'),
label_suffix = '',
error_messages = { 'required': "<p>Eh? ... What was the query again?</p>"},
widget=forms.TextInput(attrs={'data-lpignore':'true', # data-lpignore=true to ignore LastPass injected code
'placeholder': 'For example 10 : 1234 - 5678'}),
)
includeDatasetResponses = forms.ChoiceField(required=True,
choices=( (i.upper(),i) for i in ('All','Hit','Miss','None') ),
label='Included Dataset Responses',
widget=forms.Select, # instead of IncludeDatasetResponsesWidget
initial='ALL')
def is_valid(self):
self.full_clean() # Populate fields (or read self.errors)
# Short circuit already
if not super().is_valid():
return False
query = self.cleaned_data.get('query')
LOG.debug('Query: %s', query)
# So far so good
self.query_deconstructed_data = None
# Testing for Region Query
m = region_regex.match(query)
if m: # Correct Region Query
self.query_deconstructed_data = { 'referenceName': m.group(1),
'start': m.group(2),
'end': m.group(3),
'includeDatasetResponses': self.cleaned_data.get('includeDatasetResponses'),
'assemblyId': self.cleaned_data.get('assemblyId')
}
return True
# Invalid query
self.add_error('query', ValidationError(_('<p><span class="bold">Oops! </span>Query <code>%(value)s</code> must be of the form:</p>'
'<p><span class="query-form">Region Query</span>Chromosome : Start-End</p>'
'<div class="small">'
'<p>where</p>'
'<ul>'
'<li>- Chromosome is either 1-22, X, Y, or MT</li>'
'<li>- Start, End are positive integers</li>'
'</ul>'
'</div>'),
params={'value':query}))
return False
###########################################################################
### For the samples queries
###########################################################################
variantTypes = ('DEL:ME','INS:ME','DUP:TANDEM','DUP','DEL','INS','INV','CNV','SNP','MNP')
regex = re.compile(r'^(X|Y|MT|[1-9]|1[0-9]|2[0-2])\s*\:\s*(\d+)\s+([ATCGN]+)\s*\>\s*(DEL:ME|INS:ME|DUP:TANDEM|DUP|DEL|INS|INV|CNV|SNP|MNP|[ATCGN]+)$', re.I)
class QuerySamplesForm(forms.Form):
assemblyId = forms.ChoiceField(required=False,
choices=( (i,i) for i in conf.BEACON_ASSEMBLYIDS ),
error_messages = { 'invalid_choice': ('<p>Select a valid choice.</p>'
'<p>%(value)s is not one of the available choices.</p>'),
'required': '<p>is required</p>' },
label='Assembly Id')
query = forms.CharField(
strip=True,
required=False,
label=mark_safe('Chromosome : Position ReferenceBase > (AlternateBase|VariantType)'),
label_suffix = '',
error_messages = { 'required': "<p>Eh? ... What was the query again?</p>"},
widget=forms.TextInput(attrs={'data-lpignore':'true', # data-lpignore=true to ignore LastPass injected code
'placeholder': 'For example 10 : 12345 A > T'}),
)
includeDatasetResponses = forms.ChoiceField(required=True,
choices=( (i.upper(),i) for i in ('All','Hit','Miss','None') ),
label='Included Dataset Responses',
widget=forms.Select, # instead of IncludeDatasetResponsesWidget
initial='ALL')
print(includeDatasetResponses)
def is_valid(self):
self.full_clean() # Populate fields (or read self.errors)
# Short circuit already
if not super().is_valid():
return False
query = self.cleaned_data.get('query')
LOG.debug('Query: %s', query)
# Since for this endpoint the query is not requiered
if query:
# So far so good
self.query_deconstructed_data = None
# Testing the regular Query
m = regex.match(query)
if m:
d = { 'referenceName': m.group(1),
'start': m.group(2),
'referenceBases': m.group(3),
'includeDatasetResponses': self.cleaned_data.get('includeDatasetResponses'),
'assemblyId': self.cleaned_data.get('assemblyId')
}
v = m.group(4)
k = 'variantType' if v in variantTypes else 'alternateBases'
d[k] = v
self.query_deconstructed_data = d
return True
# Invalid query
self.add_error('query', ValidationError(_('<p><span class="bold">Oops! </span>Query <code>%(value)s</code> must be of the form:</p>'
'<p><span class="query-form">Regular Query</span>Chromosome : Position ReferenceBase > (AlternateBase|VariantType)</p>'
'<div class="small">'
'<p>where</p>'
'<ul>'
'<li>- Chromosome: 1-22, X, Y, or MT</li>'
'<li>- Position: a positive integer</li>'
'<li>- VariantType: either DEL:ME, INS:ME, DUP:TANDEM, DUP, DEL, INS, INV, CNV, SNP, or MNP</li>'
'<li>- ReferenceBase or AlternateBase: a combination of one or more A, T, C, G, or N</li>'
'</ul>'
'</div>'),
params={'value':query}))
return False
self.query_deconstructed_data = { 'includeDatasetResponses': self.cleaned_data.get('includeDatasetResponses'),
'assemblyId': self.cleaned_data.get('assemblyId')
}
return True
| 50.222222 | 174 | 0.440424 | 1,165 | 12,656 | 4.724464 | 0.169957 | 0.021984 | 0.029978 | 0.035974 | 0.876272 | 0.873001 | 0.865189 | 0.865189 | 0.857013 | 0.857013 | 0 | 0.009037 | 0.40542 | 12,656 | 251 | 175 | 50.422311 | 0.722392 | 0.069611 | 0 | 0.814607 | 0 | 0.073034 | 0.28326 | 0.072454 | 0 | 0 | 0 | 0 | 0 | 1 | 0.016854 | false | 0 | 0.050562 | 0 | 0.191011 | 0.011236 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c356a23b56ab9c7833d97fb4e2df96d518a6e4be | 5,401 | py | Python | lib/framereader.py | fulinjie/vaapi-fits | e8574fbbc2454c518770b90ff578732bdb6d898c | [
"BSD-3-Clause"
] | null | null | null | lib/framereader.py | fulinjie/vaapi-fits | e8574fbbc2454c518770b90ff578732bdb6d898c | [
"BSD-3-Clause"
] | null | null | null | lib/framereader.py | fulinjie/vaapi-fits | e8574fbbc2454c518770b90ff578732bdb6d898c | [
"BSD-3-Clause"
] | null | null | null | ###
### Copyright (C) 2018-2019 Intel Corporation
###
### SPDX-License-Identifier: BSD-3-Clause
###
import numpy
def read_frame_422H(fd, width, height):
width2 = (width + 1) / 2
size = width * height
size2 = width2 * height
y = numpy.fromfile(fd, dtype=numpy.uint8, count=size).reshape((height,width))
u = numpy.fromfile(fd, dtype=numpy.uint8, count=size2).reshape((height,width2))
v = numpy.fromfile(fd, dtype=numpy.uint8, count=size2).reshape((height,width2))
return y, u, v
def read_frame_422V(fd, width, height):
height2 = (height + 1) / 2
size = width * height
size2 = width * height2
y = numpy.fromfile(fd, dtype=numpy.uint8, count=size).reshape((height,width))
u = numpy.fromfile(fd, dtype=numpy.uint8, count=size2).reshape((height2,width))
v = numpy.fromfile(fd, dtype=numpy.uint8, count=size2).reshape((height2,width))
return y, u, v
def read_frame_444P(fd, width, height):
size = width * height
y = numpy.fromfile(fd, dtype=numpy.uint8, count=size).reshape((height,width))
u = numpy.fromfile(fd, dtype=numpy.uint8, count=size).reshape((height,width))
v = numpy.fromfile(fd, dtype=numpy.uint8, count=size).reshape((height,width))
return y, u, v
def read_frame_I420(fd, width, height):
width2 = (width + 1) / 2
height2 = (height + 1) / 2
size = width * height
size2 = width2 * height2
y = numpy.fromfile(fd, dtype=numpy.uint8, count=size).reshape((height, width))
u = numpy.fromfile(fd, dtype=numpy.uint8, count=size2).reshape((height2, width2))
v = numpy.fromfile(fd, dtype=numpy.uint8, count=size2).reshape((height2, width2))
return y, u, v
def read_frame_Y800(fd, width, height):
size = width * height
y = numpy.fromfile(fd, dtype=numpy.uint8, count=size).reshape((height, width))
return y, None, None
def read_frame_YV12(fd, width, height):
width2 = (width + 1) / 2
height2 = (height + 1) / 2
size = width * height
size2 = width2 * height2
y = numpy.fromfile(fd, dtype=numpy.uint8, count=size).reshape((height, width))
v = numpy.fromfile(fd, dtype=numpy.uint8, count=size2).reshape((height2, width2))
u = numpy.fromfile(fd, dtype=numpy.uint8, count=size2).reshape((height2, width2))
return y, u, v
def read_frame_NV12(fd, width, height):
width2 = (width + 1) / 2
height2 = (height + 1) / 2
size = width * height
size2 = width2 * height2 * 2
y = numpy.fromfile(fd, dtype=numpy.uint8, count=size).reshape((height, width))
uv = numpy.fromfile(fd, dtype=numpy.uint8, count=size2)
return y, uv[0::2].reshape((height2, width2)), uv[1::2].reshape((height2, width2))
def read_frame_P010(fd, width, height):
width2 = (width + 1) / 2
height2 = (height + 1) / 2
size = width * height
size2 = width2 * height2 * 2
y = numpy.fromfile(fd, dtype=numpy.uint16, count=size).reshape((height, width))
uv = numpy.fromfile(fd, dtype=numpy.uint16, count=size2)
return y, uv[0::2].reshape((height2, width2)), uv[1::2].reshape((height2, width2))
def read_frame_AYUV(fd, width, height):
size = width * height * 4
ayuv = numpy.fromfile(fd, dtype=numpy.uint8, count=size)
a = ayuv[0::4].reshape((height, width))
y = ayuv[1::4].reshape((height, width))
u = ayuv[2::4].reshape((height, width))
v = ayuv[3::4].reshape((height, width))
return y, u, v
def read_frame_YUY2(fd, width, height):
size = width * height * 2
yuv = numpy.fromfile(fd, dtype=numpy.uint8, count=size)
# frames with odd width and height produce an odd number of bytes
# in uv components and therefore cannot be effectively reshaped
return yuv[0::2].reshape((height, width)), yuv[1::4], yuv[3::4]
def read_frame_ARGB(fd, width, height):
size = width * height * 4
argb = numpy.fromfile(fd, dtype=numpy.uint8, count=size)
a = argb[0::4].reshape((height, width))
r = argb[1::4].reshape((height, width))
g = argb[2::4].reshape((height, width))
b = argb[3::4].reshape((height, width))
return r, g, b
def read_frame_BGRA(fd, width, height):
size = width * height * 4
argb = numpy.fromfile(fd, dtype=numpy.uint8, count=size)
a = argb[3::4].reshape((height, width))
r = argb[2::4].reshape((height, width))
g = argb[1::4].reshape((height, width))
b = argb[0::4].reshape((height, width))
return r, g, b
def read_frame_P210(fd, width, height):
width2 = (width + 1) / 2
size = width * height
size2 = width2 * height
y = numpy.fromfile(fd, dtype=numpy.uint16, count=size).reshape((height,width))
u = numpy.fromfile(fd, dtype=numpy.uint16, count=size2).reshape((height,width2))
v = numpy.fromfile(fd, dtype=numpy.uint16, count=size2).reshape((height,width2))
return y, u, v
def read_frame_P410(fd, width, height):
size = width * height
y = numpy.fromfile(fd, dtype=numpy.uint16, count=size).reshape((height,width))
u = numpy.fromfile(fd, dtype=numpy.uint16, count=size).reshape((height,width))
v = numpy.fromfile(fd, dtype=numpy.uint16, count=size).reshape((height,width))
return y, u, v
FrameReaders = {
"I420" : read_frame_I420,
"422H" : read_frame_422H,
"422V" : read_frame_422V,
"444P" : read_frame_444P,
"NV12" : read_frame_NV12,
"YV12" : read_frame_YV12,
"P010" : read_frame_P010,
"Y800" : read_frame_Y800,
"YUY2" : read_frame_YUY2,
"AYUV" : read_frame_AYUV,
"ARGB" : read_frame_ARGB,
"P210" : read_frame_P210,
"P410" : read_frame_P410,
"BGRA" : read_frame_BGRA,
}
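# A minimal usage sketch (not part of the original module): look up a reader by
# fourcc string and pull a single frame from an already-open raw file. The file
# name, resolution and format below are hypothetical example values.
def read_one_frame(path="clip.yuv", fourcc="NV12", width=320, height=240):
    reader = FrameReaders[fourcc]
    with open(path, "rb") as fd:
        y, u, v = reader(fd, width, height)
    return y, u, v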
| 31.584795 | 84 | 0.669876 | 814 | 5,401 | 4.375921 | 0.093366 | 0.113139 | 0.126334 | 0.168445 | 0.827344 | 0.8105 | 0.759405 | 0.751263 | 0.721505 | 0.707748 | 0 | 0.058354 | 0.165525 | 5,401 | 170 | 85 | 31.770588 | 0.731972 | 0.037956 | 0 | 0.487395 | 0 | 0 | 0.010815 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0 | 0.008403 | 0 | 0.243697 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c3728d3dc4517ef9e9d5f143d9408eb05caced6b | 6,538 | py | Python | pyteal/compiler/flatten_test.py | sepandhaghighi/pyteal | a2ab5f31d82a9279f892e6edbddf21f81062aec1 | [
"MIT"
] | 1 | 2021-07-24T21:28:59.000Z | 2021-07-24T21:28:59.000Z | pyteal/compiler/flatten_test.py | sepandhaghighi/pyteal | a2ab5f31d82a9279f892e6edbddf21f81062aec1 | [
"MIT"
] | null | null | null | pyteal/compiler/flatten_test.py | sepandhaghighi/pyteal | a2ab5f31d82a9279f892e6edbddf21f81062aec1 | [
"MIT"
] | 1 | 2021-05-26T02:41:37.000Z | 2021-05-26T02:41:37.000Z | from .. import *
from .flatten import flattenBlocks
def test_flatten_none():
blocks = []
expected = []
actual = flattenBlocks(blocks)
assert actual == expected
def test_flatten_single_empty():
blocks = [
TealSimpleBlock([])
]
expected = []
actual = flattenBlocks(blocks)
assert actual == expected
def test_flatten_single_one():
blocks = [
TealSimpleBlock([TealOp(None, Op.int, 1)])
]
expected = [TealOp(None, Op.int, 1)]
actual = flattenBlocks(blocks)
assert actual == expected
def test_flatten_single_many():
blocks = [
TealSimpleBlock([
TealOp(None, Op.int, 1),
TealOp(None, Op.int, 2),
TealOp(None, Op.int, 3),
TealOp(None, Op.add),
TealOp(None, Op.add)
])
]
expected = [
TealOp(None, Op.int, 1),
TealOp(None, Op.int, 2),
TealOp(None, Op.int, 3),
TealOp(None, Op.add),
TealOp(None, Op.add)
]
actual = flattenBlocks(blocks)
assert actual == expected
def test_flatten_sequence():
block5 = TealSimpleBlock([TealOp(None, Op.int, 5)])
block4 = TealSimpleBlock([TealOp(None, Op.int, 4)])
block4.setNextBlock(block5)
block3 = TealSimpleBlock([TealOp(None, Op.int, 3)])
block3.setNextBlock(block4)
block2 = TealSimpleBlock([TealOp(None, Op.int, 2)])
block2.setNextBlock(block3)
block1 = TealSimpleBlock([TealOp(None, Op.int, 1)])
block1.setNextBlock(block2)
block1.addIncoming()
block1.validateTree()
blocks = [block1, block2, block3, block4, block5]
expected = [
TealOp(None, Op.int, 1),
TealOp(None, Op.int, 2),
TealOp(None, Op.int, 3),
TealOp(None, Op.int, 4),
TealOp(None, Op.int, 5)
]
actual = flattenBlocks(blocks)
assert actual == expected
def test_flatten_branch():
blockTrue = TealSimpleBlock([TealOp(None, Op.byte, "\"true\""), TealOp(None, Op.return_)])
blockFalse = TealSimpleBlock([TealOp(None, Op.byte, "\"false\""), TealOp(None, Op.return_)])
block = TealConditionalBlock([TealOp(None, Op.int, 1)])
block.setTrueBlock(blockTrue)
block.setFalseBlock(blockFalse)
block.addIncoming()
block.validateTree()
blocks = [block, blockFalse, blockTrue]
expected = [
TealOp(None, Op.int, 1),
TealOp(None, Op.bnz, "l2"),
TealOp(None, Op.byte, "\"false\""),
TealOp(None, Op.return_),
TealLabel(None, "l2"),
TealOp(None, Op.byte, "\"true\""),
TealOp(None, Op.return_)
]
actual = flattenBlocks(blocks)
assert actual == expected
def test_flatten_branch_converge():
blockEnd = TealSimpleBlock([TealOp(None, Op.return_)])
blockTrue = TealSimpleBlock([TealOp(None, Op.byte, "\"true\"")])
blockTrue.setNextBlock(blockEnd)
blockFalse = TealSimpleBlock([TealOp(None, Op.byte, "\"false\"")])
blockFalse.setNextBlock(blockEnd)
block = TealConditionalBlock([TealOp(None, Op.int, 1)])
block.setTrueBlock(blockTrue)
block.setFalseBlock(blockFalse)
block.addIncoming()
block.validateTree()
blocks = [block, blockFalse, blockTrue, blockEnd]
expected = [
TealOp(None, Op.int, 1),
TealOp(None, Op.bnz, "l2"),
TealOp(None, Op.byte, "\"false\""),
TealOp(None, Op.b, "l3"),
TealLabel(None, "l2"),
TealOp(None, Op.byte, "\"true\""),
TealLabel(None, "l3"),
TealOp(None, Op.return_)
]
actual = flattenBlocks(blocks)
assert actual == expected
def test_flatten_multiple_branch():
blockTrueTrue = TealSimpleBlock([TealOp(None, Op.byte, "\"true true\""), TealOp(None, Op.return_)])
blockTrueFalse = TealSimpleBlock([TealOp(None, Op.byte, "\"true false\""), TealOp(None, Op.err)])
blockTrueBranch = TealConditionalBlock([])
blockTrueBranch.setTrueBlock(blockTrueTrue)
blockTrueBranch.setFalseBlock(blockTrueFalse)
blockTrue = TealSimpleBlock([TealOp(None, Op.byte, "\"true\"")])
blockTrue.setNextBlock(blockTrueBranch)
blockFalse = TealSimpleBlock([TealOp(None, Op.byte, "\"false\""), TealOp(None, Op.return_)])
block = TealConditionalBlock([TealOp(None, Op.int, 1)])
block.setTrueBlock(blockTrue)
block.setFalseBlock(blockFalse)
block.addIncoming()
block.validateTree()
blocks = [block, blockFalse, blockTrue, blockTrueBranch, blockTrueFalse, blockTrueTrue]
expected = [
TealOp(None, Op.int, 1),
TealOp(None, Op.bnz, "l2"),
TealOp(None, Op.byte, "\"false\""),
TealOp(None, Op.return_),
TealLabel(None, "l2"),
TealOp(None, Op.byte, "\"true\""),
TealOp(None, Op.bnz, "l5"),
TealOp(None, Op.byte, "\"true false\""),
TealOp(None, Op.err),
TealLabel(None, "l5"),
TealOp(None, Op.byte, "\"true true\""),
TealOp(None, Op.return_)
]
actual = flattenBlocks(blocks)
assert actual == expected
def test_flatten_multiple_branch_converge():
blockEnd = TealSimpleBlock([TealOp(None, Op.return_)])
blockTrueTrue = TealSimpleBlock([TealOp(None, Op.byte, "\"true true\"")])
blockTrueTrue.setNextBlock(blockEnd)
blockTrueFalse = TealSimpleBlock([TealOp(None, Op.byte, "\"true false\""), TealOp(None, Op.err)])
blockTrueBranch = TealConditionalBlock([])
blockTrueBranch.setTrueBlock(blockTrueTrue)
blockTrueBranch.setFalseBlock(blockTrueFalse)
blockTrue = TealSimpleBlock([TealOp(None, Op.byte, "\"true\"")])
blockTrue.setNextBlock(blockTrueBranch)
blockFalse = TealSimpleBlock([TealOp(None, Op.byte, "\"false\"")])
blockFalse.setNextBlock(blockEnd)
block = TealConditionalBlock([TealOp(None, Op.int, 1)])
block.setTrueBlock(blockTrue)
block.setFalseBlock(blockFalse)
block.addIncoming()
block.validateTree()
blocks = [block, blockFalse, blockTrue, blockTrueBranch, blockTrueFalse, blockTrueTrue, blockEnd]
expected = [
TealOp(None, Op.int, 1),
TealOp(None, Op.bnz, "l2"),
TealOp(None, Op.byte, "\"false\""),
TealOp(None, Op.b, "l6"),
TealLabel(None, "l2"),
TealOp(None, Op.byte, "\"true\""),
TealOp(None, Op.bnz, "l5"),
TealOp(None, Op.byte, "\"true false\""),
TealOp(None, Op.err),
TealLabel(None, "l5"),
TealOp(None, Op.byte, "\"true true\""),
TealLabel(None, "l6"),
TealOp(None, Op.return_)
]
actual = flattenBlocks(blocks)
assert actual == expected
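# A small exploratory sketch (not one of the original tests): build a trivial
# two-block sequence and print the flattened ops instead of asserting on them,
# which can help when inspecting label placement by hand.
def demo_flatten_two_blocks():
    second = TealSimpleBlock([TealOp(None, Op.return_)])
    first = TealSimpleBlock([TealOp(None, Op.int, 7)])
    first.setNextBlock(second)
    first.addIncoming()
    first.validateTree()
    for op in flattenBlocks([first, second]):
        print(op)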
| 32.527363 | 103 | 0.62695 | 687 | 6,538 | 5.912664 | 0.085881 | 0.192024 | 0.230428 | 0.096012 | 0.907189 | 0.871738 | 0.858198 | 0.838011 | 0.785574 | 0.745692 | 0 | 0.012143 | 0.219027 | 6,538 | 200 | 104 | 32.69 | 0.783392 | 0 | 0 | 0.69186 | 0 | 0 | 0.008565 | 0 | 0 | 0 | 0 | 0 | 0.052326 | 1 | 0.052326 | false | 0 | 0.011628 | 0 | 0.063953 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c37c230dfb54a6f376d29847430cc2fbb7b00fda | 1,457 | py | Python | aumhaa/v2/control_surface/__init__.py | thomasf/LiveRemoteScripts | 866330653e1561a140e076c9a7ae64dd486e5692 | [
"MIT"
] | 25 | 2015-02-02T21:41:51.000Z | 2022-02-19T13:08:53.000Z | aumhaa/v2/control_surface/__init__.py | thomasf/LiveRemoteScripts | 866330653e1561a140e076c9a7ae64dd486e5692 | [
"MIT"
] | null | null | null | aumhaa/v2/control_surface/__init__.py | thomasf/LiveRemoteScripts | 866330653e1561a140e076c9a7ae64dd486e5692 | [
"MIT"
] | 13 | 2015-10-25T04:44:09.000Z | 2020-03-01T18:02:27.000Z |
from __future__ import absolute_import, print_function
from .mod import CS_LIST_KEY, hascontrol, unpack_values, unpack_items, enumerate_track_device, get_monomodular, get_control_surfaces, SpecialInputSignal, ElementTranslation, StoredElement, Grid, ButtonGrid, Array, RadioArray, RingedStoredElement, RingedGrid, ModHandler, NavigationBox, ModClient, ModRouter
from .mono_modes import SendSysexMode, DisplayMessageMode, SendLividSysexMode, MomentaryBehaviour, ExcludingBehaviourMixin, ExcludingMomentaryBehaviour, DelayedExcludingMomentaryBehaviour, ShiftedBehaviour, CancellableBehaviour, CancellableBehaviourWithRelease, LatchingShiftedBehaviour, FlashingBehaviour, ColoredCancellableBehaviourWithRelease, BicoloredMomentaryBehaviour, DefaultedBehaviour
__all__ = (CS_LIST_KEY,
hascontrol,
unpack_values,
unpack_items,
enumerate_track_device,
get_monomodular,
get_control_surfaces,
SpecialInputSignal,
ElementTranslation,
StoredElement,
Grid,
ButtonGrid,
Array,
RadioArray,
RingedStoredElement,
RingedGrid,
ModHandler,
NavigationBox,
ModClient,
ModRouter,
SendSysexMode,
DisplayMessageMode,
SendLividSysexMode,
MomentaryBehaviour,
ExcludingBehaviourMixin,
ExcludingMomentaryBehaviour,
DelayedExcludingMomentaryBehaviour,
ShiftedBehaviour,
CancellableBehaviour,
CancellableBehaviourWithRelease,
LatchingShiftedBehaviour,
FlashingBehaviour,
ColoredCancellableBehaviourWithRelease,
BicoloredMomentaryBehaviour,
DefaultedBehaviour)
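# A minimal usage sketch (an assumption, not part of the package): consumers are
# expected to import the re-exported names from this package rather than from the
# individual submodules, e.g.
#
#     from aumhaa.v2.control_surface import ModHandler, ModRouter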
| 34.690476 | 394 | 0.875086 | 103 | 1,457 | 12.097087 | 0.504854 | 0.009631 | 0.014446 | 0.030498 | 0.9374 | 0.9374 | 0.9374 | 0.9374 | 0.9374 | 0.9374 | 0 | 0 | 0.074811 | 1,457 | 41 | 395 | 35.536585 | 0.924332 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.078947 | 0 | 0.078947 | 0.026316 | 0 | 0 | 1 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6f06346bd5223bbbd9196d4c010dd87a4891e428 | 18,340 | py | Python | fonts/DejaVuSans_Bold_20.py | ironss/micropython-lib | 61719636dad9aaa581c8e39e71ccc515e75c2d43 | [
"MIT"
] | null | null | null | fonts/DejaVuSans_Bold_20.py | ironss/micropython-lib | 61719636dad9aaa581c8e39e71ccc515e75c2d43 | [
"MIT"
] | null | null | null | fonts/DejaVuSans_Bold_20.py | ironss/micropython-lib | 61719636dad9aaa581c8e39e71ccc515e75c2d43 | [
"MIT"
] | 2 | 2019-09-24T13:36:55.000Z | 2020-04-18T02:05:38.000Z | # Code generated by font-to-py.py.
# Font: DejaVuSans-Bold.ttf
version = '0.26'
def height():
return 19
def max_width():
return 21
def hmap():
return False
def reverse():
return False
def monospaced():
return False
def min_ch():
return 32
def max_ch():
return 126
_font =\
b'\x0b\x00\x0c\x00\x00\x06\x00\x00\x06\x77\x00\x86\x77\x00\xc6\x77'\
b'\x00\xfe\x01\x00\xfc\x00\x00\x78\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x07\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x09\x00\xfe\x77\x00\xfe'\
b'\x77\x00\xfe\x77\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x0a\x00\x3e\x00\x00\x3e\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x3e\x00\x00\x3e\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x10\x00\x00\x06\x00\x30\x06\x00\x30'\
b'\x7e\x00\xf0\x7f\x00\xf8\x07\x00\x7e\x06\x00\x36\x66\x00\x30\x7f'\
b'\x00\xf0\x1f\x00\xfe\x06\x00\x3e\x06\x00\x30\x06\x00\x30\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0d\x00\xe0\x31\x00\xf0\x61'\
b'\x00\xf8\x63\x00\x98\x63\x00\xfe\xff\x03\xfe\xff\x03\x18\x67\x00'\
b'\x18\x67\x00\x18\x7f\x00\x30\x3e\x00\x00\x1c\x00\x00\x00\x00\x00'\
b'\x00\x00\x13\x00\x78\x00\x00\xfc\x00\x00\x86\x01\x00\x86\x41\x00'\
b'\x86\x61\x00\xfc\x78\x00\x78\x3c\x00\x00\x0f\x00\x80\x07\x00\xe0'\
b'\x01\x00\xf0\x00\x00\x3c\x1e\x00\x1e\x3f\x00\x86\x61\x00\x82\x61'\
b'\x00\x80\x61\x00\x00\x3f\x00\x00\x1e\x00\x00\x00\x00\x11\x00\x00'\
b'\x1e\x00\x80\x3f\x00\xb8\x3f\x00\xfc\x71\x00\xfe\x61\x00\xe6\x63'\
b'\x00\x86\x67\x00\x06\x6f\x00\x06\x3e\x00\x0c\x3c\x00\x00\x7e\x00'\
b'\x80\x7f\x00\x80\x67\x00\x80\x41\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x06\x00\x3e\x00\x00\x3e\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x09\x00\xc0\x1f\x00\xf8\xff\x00\xfe\xff'\
b'\x03\x1e\xc0\x03\x02\x00\x02\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x09\x00\x02\x00\x02\x1e\xc0\x03\xfe\xff\x03\xf8\xff'\
b'\x00\xc0\x1f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x0a\x00\x98\x01\x00\x98\x01\x00\xf0\x00\x00\xf0\x00\x00\xfe\x03'\
b'\x00\xfe\x03\x00\xf0\x00\x00\xf0\x00\x00\x98\x01\x00\x98\x01\x00'\
b'\x10\x00\x00\x03\x00\x00\x03\x00\x00\x03\x00\x00\x03\x00\x00\x03'\
b'\x00\xf8\x7f\x00\xf8\x7f\x00\x00\x03\x00\x00\x03\x00\x00\x03\x00'\
b'\x00\x03\x00\x00\x03\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x07\x00\x00\x80\x01\x00\xf8\x01\x00\xf8\x00\x00\x78\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x08\x00\x00\x07\x00\x00\x07'\
b'\x00\x00\x07\x00\x00\x07\x00\x00\x07\x00\x00\x07\x00\x00\x00\x00'\
b'\x00\x00\x00\x07\x00\x00\x78\x00\x00\x78\x00\x00\x78\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x07\x00\x00\x80\x01\x00'\
b'\xf0\x01\x00\xfe\x00\xc0\x0f\x00\xfc\x01\x00\x3e\x00\x00\x06\x00'\
b'\x00\x0d\x00\xe0\x07\x00\xf8\x1f\x00\xfc\x3f\x00\x0e\x70\x00\x06'\
b'\x60\x00\x06\x60\x00\x06\x60\x00\x0e\x70\x00\xfc\x3f\x00\xf8\x1f'\
b'\x00\xe0\x07\x00\x00\x00\x00\x00\x00\x00\x0d\x00\x0c\x60\x00\x0e'\
b'\x60\x00\x06\x60\x00\xfe\x7f\x00\xfe\x7f\x00\xfe\x7f\x00\x00\x60'\
b'\x00\x00\x60\x00\x00\x60\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x0d\x00\x1c\x60\x00\x0e\x70\x00\x06\x78\x00\x06\x7c'\
b'\x00\x06\x7e\x00\x06\x6f\x00\x8e\x67\x00\xfe\x67\x00\xfc\x63\x00'\
b'\xf8\x60\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0d\x00\x0c\x30'\
b'\x00\x06\x60\x00\xc6\x60\x00\xc6\x60\x00\xc6\x60\x00\xc6\x60\x00'\
b'\xee\x71\x00\xfe\x3f\x00\xbc\x3f\x00\x38\x1f\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x0d\x00\x00\x0f\x00\x80\x0f\x00\xc0\x0d\x00'\
b'\xf0\x0c\x00\x38\x0c\x00\x1e\x0c\x00\x06\x0c\x00\xfe\x7f\x00\xfe'\
b'\x7f\x00\xfe\x7f\x00\x00\x0c\x00\x00\x0c\x00\x00\x00\x00\x0d\x00'\
b'\x00\x30\x00\xfe\x61\x00\xfe\x60\x00\xfe\x60\x00\xc6\x60\x00\xc6'\
b'\x60\x00\xc6\x71\x00\xc6\x3f\x00\x86\x3f\x00\x00\x1f\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x0d\x00\xe0\x07\x00\xf8\x1f\x00\xfc'\
b'\x3f\x00\x9c\x71\x00\xce\x60\x00\xc6\x60\x00\xc6\x60\x00\xc6\x71'\
b'\x00\xc6\x3f\x00\x8c\x3f\x00\x00\x1f\x00\x00\x00\x00\x00\x00\x00'\
b'\x0d\x00\x06\x00\x00\x06\x00\x00\x06\x60\x00\x06\x78\x00\x06\x7f'\
b'\x00\xc6\x1f\x00\xf6\x07\x00\xfe\x01\x00\x7e\x00\x00\x0e\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0d\x00\x38\x1f\x00\xfc\x3f'\
b'\x00\xfe\x7f\x00\xc6\x71\x00\xc6\x60\x00\xc6\x60\x00\xc6\x60\x00'\
b'\xc6\x71\x00\xfe\x7f\x00\xfc\x3f\x00\x38\x1f\x00\x00\x00\x00\x00'\
b'\x00\x00\x0d\x00\xf8\x00\x00\xfc\x31\x00\xfc\x63\x00\x8e\x63\x00'\
b'\x06\x63\x00\x06\x63\x00\x06\x73\x00\x8e\x39\x00\xfc\x3f\x00\xf8'\
b'\x1f\x00\xe0\x07\x00\x00\x00\x00\x00\x00\x00\x08\x00\xe0\x79\x00'\
b'\xe0\x79\x00\xe0\x79\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x08\x00\x00\x80\x01\xe0\xf9\x01\xe0\xf9\x00'\
b'\xe0\x79\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x10'\
b'\x00\x00\x07\x00\x00\x07\x00\x80\x0f\x00\x80\x0d\x00\x80\x0d\x00'\
b'\xc0\x1d\x00\xc0\x18\x00\xc0\x18\x00\xe0\x38\x00\x60\x30\x00\x60'\
b'\x30\x00\x70\x70\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x10\x00\xc0\x0c\x00\xc0\x0c\x00\xc0\x0c\x00\xc0\x0c\x00\xc0'\
b'\x0c\x00\xc0\x0c\x00\xc0\x0c\x00\xc0\x0c\x00\xc0\x0c\x00\xc0\x0c'\
b'\x00\xc0\x0c\x00\xc0\x0c\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x10\x00\x70\x70\x00\x60\x30\x00\x60\x30\x00\xe0\x38'\
b'\x00\xc0\x18\x00\xc0\x18\x00\xc0\x1d\x00\x80\x0d\x00\x80\x0d\x00'\
b'\x80\x0f\x00\x00\x07\x00\x00\x07\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x0b\x00\x0c\x00\x00\x06\x00\x00\x06\x77\x00'\
b'\x86\x77\x00\xc6\x77\x00\xfe\x01\x00\xfc\x00\x00\x78\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x13\x00\x80\x0f\x00\xe0\x3f\x00'\
b'\x78\xf0\x00\x1c\xc0\x00\x8c\x8f\x01\xce\x9f\x01\xe6\x38\x03\x66'\
b'\x30\x03\x66\x30\x03\xc6\x18\x03\xe6\x3f\x03\xec\xbf\x01\x1c\xf0'\
b'\x01\x38\x98\x00\xf0\x1f\x00\xc0\x07\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x0f\x00\x00\x40\x00\x00\x78\x00\x00\x7f\x00\xc0\x3f'\
b'\x00\xf8\x0f\x00\xfe\x0d\x00\x3e\x0c\x00\x0e\x0c\x00\x3e\x0c\x00'\
b'\xfe\x0d\x00\xf8\x0f\x00\xc0\x3f\x00\x00\x7f\x00\x00\x78\x00\x00'\
b'\x40\x00\x0e\x00\xfe\x7f\x00\xfe\x7f\x00\xfe\x7f\x00\x86\x61\x00'\
b'\x86\x61\x00\x86\x61\x00\x86\x61\x00\xfe\x73\x00\xfc\x7f\x00\x78'\
b'\x3f\x00\x00\x1e\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0e\x00'\
b'\xe0\x07\x00\xf8\x1f\x00\xf8\x1f\x00\x1c\x38\x00\x0e\x70\x00\x06'\
b'\x60\x00\x06\x60\x00\x06\x60\x00\x06\x60\x00\x06\x60\x00\x06\x60'\
b'\x00\x0c\x30\x00\x00\x00\x00\x00\x00\x00\x10\x00\xfe\x7f\x00\xfe'\
b'\x7f\x00\xfe\x7f\x00\x06\x60\x00\x06\x60\x00\x06\x60\x00\x06\x60'\
b'\x00\x0e\x70\x00\x0c\x30\x00\x1c\x3c\x00\xf8\x1f\x00\xf0\x0f\x00'\
b'\xe0\x07\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0d\x00\xfe\x7f'\
b'\x00\xfe\x7f\x00\xfe\x7f\x00\x86\x61\x00\x86\x61\x00\x86\x61\x00'\
b'\x86\x61\x00\x86\x61\x00\x86\x61\x00\x06\x60\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x0d\x00\xfe\x7f\x00\xfe\x7f\x00\xfe\x7f\x00'\
b'\x86\x01\x00\x86\x01\x00\x86\x01\x00\x86\x01\x00\x86\x01\x00\x86'\
b'\x01\x00\x86\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x10\x00'\
b'\xe0\x07\x00\xf8\x1f\x00\xfc\x3f\x00\x1c\x38\x00\x0e\x70\x00\x06'\
b'\x60\x00\x06\x60\x00\x06\x60\x00\x86\x61\x00\x86\x61\x00\x86\x7f'\
b'\x00\x8c\x3f\x00\x80\x3f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x10\x00\xfe\x7f\x00\xfe\x7f\x00\xfe\x7f\x00\x80\x01\x00\x80\x01'\
b'\x00\x80\x01\x00\x80\x01\x00\x80\x01\x00\x80\x01\x00\xfe\x7f\x00'\
b'\xfe\x7f\x00\xfe\x7f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x07\x00\xfe\x7f\x00\xfe\x7f\x00\xfe\x7f\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x07\x00\x00\x00\x06\x00\x00'\
b'\x06\x00\x00\x07\xfe\xff\x07\xfe\xff\x03\xfe\xff\x01\x00\x00\x00'\
b'\x0f\x00\xfe\x7f\x00\xfe\x7f\x00\xfe\x7f\x00\xc0\x01\x00\xe0\x03'\
b'\x00\xf0\x07\x00\x78\x0f\x00\x3c\x1e\x00\x1e\x3c\x00\x0e\x78\x00'\
b'\x06\x70\x00\x02\x60\x00\x00\x40\x00\x00\x00\x00\x00\x00\x00\x0c'\
b'\x00\xfe\x7f\x00\xfe\x7f\x00\xfe\x7f\x00\x00\x60\x00\x00\x60\x00'\
b'\x00\x60\x00\x00\x60\x00\x00\x60\x00\x00\x60\x00\x00\x60\x00\x00'\
b'\x00\x00\x00\x00\x00\x13\x00\xfe\x7f\x00\xfe\x7f\x00\xfe\x7f\x00'\
b'\x3e\x00\x00\xf8\x00\x00\xe0\x03\x00\x80\x0f\x00\x00\x0e\x00\x80'\
b'\x0f\x00\xe0\x03\x00\xf8\x00\x00\x3e\x00\x00\xfe\x7f\x00\xfe\x7f'\
b'\x00\xfe\x7f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x10\x00\xfe\x7f\x00\xfe\x7f\x00\xfe\x7f\x00\x1e\x00\x00\x78\x00'\
b'\x00\xe0\x01\x00\x80\x07\x00\x00\x1e\x00\x00\x78\x00\xfe\x7f\x00'\
b'\xfe\x7f\x00\xfe\x7f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x10\x00\xe0\x07\x00\xf8\x1f\x00\xfc\x3f\x00\x1c\x38\x00'\
b'\x0e\x70\x00\x06\x60\x00\x06\x60\x00\x06\x60\x00\x06\x60\x00\x0e'\
b'\x70\x00\x1c\x38\x00\xfc\x3f\x00\xf8\x1f\x00\xe0\x07\x00\x00\x00'\
b'\x00\x00\x00\x00\x0e\x00\xfe\x7f\x00\xfe\x7f\x00\xfe\x7f\x00\x06'\
b'\x03\x00\x06\x03\x00\x06\x03\x00\x06\x03\x00\x8e\x03\x00\xfe\x03'\
b'\x00\xfc\x01\x00\xf8\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x10\x00\xe0\x07\x00\xf8\x1f\x00\xfc\x3f\x00\x1c\x38\x00\x0e\x70'\
b'\x00\x06\x60\x00\x06\x60\x00\x06\x60\x00\x06\xe0\x00\x0e\xf0\x03'\
b'\x1c\xf8\x03\xfc\x3f\x03\xf8\x1f\x02\xe0\x07\x00\x00\x00\x00\x00'\
b'\x00\x00\x0f\x00\xfe\x7f\x00\xfe\x7f\x00\xfe\x7f\x00\x06\x03\x00'\
b'\x06\x03\x00\x06\x03\x00\x06\x03\x00\x8e\x07\x00\xfe\x1f\x00\xfc'\
b'\x7f\x00\x78\x78\x00\x00\x70\x00\x00\x40\x00\x00\x00\x00\x00\x00'\
b'\x00\x0e\x00\x78\x30\x00\xfc\x70\x00\xfe\x61\x00\xc6\x61\x00\xc6'\
b'\x61\x00\xc6\x63\x00\x86\x63\x00\x86\x63\x00\x86\x7f\x00\x0e\x3f'\
b'\x00\x00\x1e\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0d\x00\x06'\
b'\x00\x00\x06\x00\x00\x06\x00\x00\x06\x00\x00\x06\x00\x00\xfe\x7f'\
b'\x00\xfe\x7f\x00\xfe\x7f\x00\x06\x00\x00\x06\x00\x00\x06\x00\x00'\
b'\x06\x00\x00\x06\x00\x00\x0f\x00\xfe\x0f\x00\xfe\x3f\x00\xfe\x3f'\
b'\x00\x00\x70\x00\x00\x60\x00\x00\x60\x00\x00\x60\x00\x00\x60\x00'\
b'\x00\x70\x00\xfe\x3f\x00\xfe\x3f\x00\xfe\x0f\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x0f\x00\x02\x00\x00\x1e\x00\x00\xfe\x00\x00'\
b'\xfc\x03\x00\xf0\x1f\x00\x80\x7f\x00\x00\x7c\x00\x00\x70\x00\x00'\
b'\x7c\x00\x80\x7f\x00\xf0\x1f\x00\xfc\x03\x00\xfe\x00\x00\x1e\x00'\
b'\x00\x02\x00\x00\x15\x00\x06\x00\x00\x7e\x00\x00\xfe\x0f\x00\xf8'\
b'\x7f\x00\x80\x7f\x00\x00\x78\x00\x00\x7e\x00\xf0\x1f\x00\xfe\x01'\
b'\x00\x1e\x00\x00\x0e\x00\x00\xfe\x01\x00\xf0\x1f\x00\x00\x7f\x00'\
b'\x00\x78\x00\x80\x7f\x00\xf8\x7f\x00\xfe\x0f\x00\x7e\x00\x00\x06'\
b'\x00\x00\x00\x00\x00\x0f\x00\x02\x40\x00\x06\x60\x00\x1e\x78\x00'\
b'\x3e\x7c\x00\x7c\x3f\x00\xf0\x0f\x00\xe0\x07\x00\xe0\x07\x00\xf0'\
b'\x0f\x00\x7c\x3f\x00\x3e\x7c\x00\x1e\x78\x00\x06\x60\x00\x02\x40'\
b'\x00\x00\x00\x00\x0e\x00\x02\x00\x00\x06\x00\x00\x1e\x00\x00\x7e'\
b'\x00\x00\xf8\x00\x00\xf0\x7f\x00\xc0\x7f\x00\xf0\x7f\x00\xf8\x00'\
b'\x00\x7e\x00\x00\x1e\x00\x00\x06\x00\x00\x02\x00\x00\x00\x00\x00'\
b'\x0e\x00\x06\x70\x00\x06\x78\x00\x06\x7c\x00\x06\x7e\x00\x06\x6f'\
b'\x00\xc6\x67\x00\xe6\x63\x00\xf6\x61\x00\x7e\x60\x00\x3e\x60\x00'\
b'\x1e\x60\x00\x0e\x60\x00\x00\x00\x00\x00\x00\x00\x09\x00\xfe\xff'\
b'\x03\xfe\xff\x03\xfe\xff\x03\x06\x00\x03\x06\x00\x03\x06\x00\x03'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x07\x00\x06\x00\x00\x3e\x00'\
b'\x00\xfc\x01\x00\xc0\x0f\x00\x00\xfe\x00\x00\xf0\x01\x00\x80\x01'\
b'\x09\x00\x06\x00\x03\x06\x00\x03\x06\x00\x03\xfe\xff\x03\xfe\xff'\
b'\x03\xfe\xff\x03\x00\x00\x00\x00\x00\x00\x00\x00\x00\x10\x00\x20'\
b'\x00\x00\x30\x00\x00\x38\x00\x00\x1c\x00\x00\x0e\x00\x00\x0e\x00'\
b'\x00\x0e\x00\x00\x0e\x00\x00\x1c\x00\x00\x38\x00\x00\x30\x00\x00'\
b'\x20\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0a'\
b'\x00\x00\x80\x01\x00\x80\x01\x00\x80\x01\x00\x80\x01\x00\x80\x01'\
b'\x00\x80\x01\x00\x80\x01\x00\x80\x01\x00\x80\x01\x00\x80\x01\x0a'\
b'\x00\x01\x00\x00\x03\x00\x00\x07\x00\x00\x0c\x00\x00\x08\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0d'\
b'\x00\x00\x3c\x00\xc0\x7c\x00\x60\x7e\x00\x60\x66\x00\x60\x66\x00'\
b'\x60\x66\x00\x60\x36\x00\xe0\x7f\x00\xc0\x7f\x00\x80\x7f\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x0e\x00\xfe\x7f\x00\xfe\x7f\x00'\
b'\xfe\x7f\x00\xc0\x30\x00\x40\x20\x00\x60\x60\x00\x60\x60\x00\xe0'\
b'\x70\x00\xe0\x7f\x00\xc0\x3f\x00\x00\x0f\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x0b\x00\x00\x0f\x00\xc0\x3f\x00\xc0\x3f\x00\xe0'\
b'\x70\x00\x60\x60\x00\x60\x60\x00\x60\x60\x00\x60\x60\x00\xc0\x30'\
b'\x00\x00\x00\x00\x00\x00\x00\x0e\x00\x00\x0f\x00\xc0\x3f\x00\xe0'\
b'\x7f\x00\xe0\x70\x00\x60\x60\x00\x60\x60\x00\x40\x20\x00\xc0\x30'\
b'\x00\xfe\x7f\x00\xfe\x7f\x00\xfe\x7f\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x0d\x00\x00\x0f\x00\xc0\x3f\x00\xc0\x3f\x00\xe0\x76'\
b'\x00\x60\x66\x00\x60\x66\x00\x60\x66\x00\xe0\x66\x00\xc0\x67\x00'\
b'\xc0\x67\x00\x00\x37\x00\x00\x00\x00\x00\x00\x00\x08\x00\x60\x00'\
b'\x00\x60\x00\x00\xfc\x7f\x00\xfe\x7f\x00\xfe\x7f\x00\x66\x00\x00'\
b'\x66\x00\x00\x66\x00\x00\x0e\x00\x00\x0f\x00\xc0\x3f\x03\xe0\x7f'\
b'\x06\xe0\x70\x06\x60\x60\x06\x60\x60\x06\x40\x20\x06\xc0\x30\x07'\
b'\xe0\xff\x03\xe0\xff\x03\xe0\xff\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x0e\x00\xfe\x7f\x00\xfe\x7f\x00\xfe\x7f\x00\xc0\x00\x00'\
b'\x40\x00\x00\x60\x00\x00\x60\x00\x00\xe0\x7f\x00\xe0\x7f\x00\x80'\
b'\x7f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x07\x00'\
b'\xee\x7f\x00\xee\x7f\x00\xee\x7f\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x07\x00\x00\x00\x06\x00\x00\x06\xee\xff\x07'\
b'\xee\xff\x03\xee\xff\x01\x00\x00\x00\x00\x00\x00\x0d\x00\xfe\x7f'\
b'\x00\xfe\x7f\x00\xfe\x7f\x00\x00\x07\x00\x80\x0f\x00\xc0\x1f\x00'\
b'\xe0\x3d\x00\xe0\x78\x00\x60\x70\x00\x20\x60\x00\x00\x40\x00\x00'\
b'\x00\x00\x00\x00\x00\x07\x00\xfe\x7f\x00\xfe\x7f\x00\xfe\x7f\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x15\x00\xe0\x7f'\
b'\x00\xe0\x7f\x00\xe0\x7f\x00\xc0\x00\x00\x60\x00\x00\x60\x00\x00'\
b'\x60\x00\x00\xe0\x7f\x00\xc0\x7f\x00\x80\x7f\x00\xc0\x00\x00\x60'\
b'\x00\x00\x60\x00\x00\x60\x00\x00\xe0\x7f\x00\xc0\x7f\x00\x80\x7f'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0e\x00\xe0'\
b'\x7f\x00\xe0\x7f\x00\xe0\x7f\x00\xc0\x00\x00\x40\x00\x00\x60\x00'\
b'\x00\x60\x00\x00\xe0\x7f\x00\xe0\x7f\x00\x80\x7f\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0d\x00\x00\x0f\x00\xc0\x3f'\
b'\x00\xc0\x3f\x00\xe0\x70\x00\x60\x60\x00\x60\x60\x00\x60\x60\x00'\
b'\xe0\x70\x00\xc0\x3f\x00\xc0\x3f\x00\x00\x0f\x00\x00\x00\x00\x00'\
b'\x00\x00\x0e\x00\xe0\xff\x07\xe0\xff\x07\xe0\xff\x07\xc0\x30\x00'\
b'\x40\x20\x00\x60\x60\x00\x60\x60\x00\xe0\x70\x00\xe0\x7f\x00\xc0'\
b'\x3f\x00\x00\x0f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0e\x00'\
b'\x00\x0f\x00\xc0\x3f\x00\xe0\x7f\x00\xe0\x70\x00\x60\x60\x00\x60'\
b'\x60\x00\x40\x20\x00\xc0\x30\x00\xe0\xff\x07\xe0\xff\x07\xe0\xff'\
b'\x07\x00\x00\x00\x00\x00\x00\x00\x00\x00\x09\x00\xe0\x7f\x00\xe0'\
b'\x7f\x00\xe0\x7f\x00\xc0\x00\x00\x40\x00\x00\x60\x00\x00\x60\x00'\
b'\x00\x60\x00\x00\x00\x00\x00\x0b\x00\xc0\x33\x00\xc0\x63\x00\xe0'\
b'\x67\x00\x60\x66\x00\x60\x66\x00\x60\x66\x00\x60\x7e\x00\x60\x3c'\
b'\x00\xc0\x3c\x00\x00\x00\x00\x00\x00\x00\x09\x00\x60\x00\x00\x60'\
b'\x00\x00\xfc\x3f\x00\xfc\x7f\x00\xfc\x7f\x00\x60\x60\x00\x60\x60'\
b'\x00\x60\x60\x00\x00\x00\x00\x0e\x00\xe0\x1f\x00\xe0\x3f\x00\xe0'\
b'\x7f\x00\x00\x60\x00\x00\x60\x00\x00\x20\x00\x00\x30\x00\xe0\x7f'\
b'\x00\xe0\x7f\x00\xe0\x7f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x0c\x00\x20\x00\x00\xe0\x01\x00\xe0\x07\x00\xc0\x1f'\
b'\x00\x00\x7e\x00\x00\x78\x00\x00\x78\x00\x00\x7e\x00\xc0\x1f\x00'\
b'\xe0\x07\x00\xe0\x01\x00\x20\x00\x00\x12\x00\x60\x00\x00\xe0\x03'\
b'\x00\xe0\x1f\x00\x80\x7f\x00\x00\x78\x00\x00\x7e\x00\xc0\x1f\x00'\
b'\xe0\x03\x00\xe0\x01\x00\xc0\x1f\x00\x00\x7e\x00\x00\x78\x00\x80'\
b'\x7f\x00\xe0\x1f\x00\xe0\x03\x00\x60\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x0c\x00\x20\x40\x00\x60\x60\x00\xe0\x70\x00\xe0\x79\x00\xc0'\
b'\x3f\x00\x00\x0f\x00\x00\x0f\x00\xc0\x3f\x00\xe0\x79\x00\xe0\x70'\
b'\x00\x60\x60\x00\x20\x40\x00\x0c\x00\x20\x00\x00\xe0\x00\x00\xe0'\
b'\x03\x06\xc0\x0f\x06\x00\x3f\x07\x00\xfc\x07\x00\xf0\x03\x00\xfe'\
b'\x00\x80\x3f\x00\xe0\x0f\x00\xe0\x01\x00\x60\x00\x00\x0b\x00\x60'\
b'\x70\x00\x60\x78\x00\x60\x7c\x00\x60\x7e\x00\x60\x6f\x00\xe0\x67'\
b'\x00\xe0\x63\x00\xe0\x61\x00\xe0\x60\x00\x00\x00\x00\x00\x00\x00'\
b'\x0e\x00\x00\x06\x00\x00\x06\x00\x00\x07\x00\xfc\xff\x03\xfe\xff'\
b'\x07\xfe\xf9\x07\x06\x00\x06\x06\x00\x06\x06\x00\x06\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x07\x00\xff\xff'\
b'\x07\xff\xff\x07\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x0e\x00\x06\x00\x06\x06\x00\x06\x06\x00\x06\xfe\xf9'\
b'\x07\xfe\xff\x07\xfc\xff\x03\x00\x07\x00\x00\x06\x00\x00\x06\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x10'\
b'\x00\x00\x03\x00\x80\x01\x00\x80\x01\x00\x80\x01\x00\x80\x01\x00'\
b'\x80\x01\x00\x00\x03\x00\x00\x03\x00\x00\x03\x00\x00\x03\x00\x00'\
b'\x03\x00\x80\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00'
_index =\
b'\x00\x00\x23\x00\x3a\x00\x57\x00\x77\x00\xa9\x00\xd2\x00\x0d\x01'\
b'\x42\x01\x56\x01\x73\x01\x90\x01\xb0\x01\xe2\x01\xf9\x01\x13\x02'\
b'\x2a\x02\x41\x02\x6a\x02\x93\x02\xbc\x02\xe5\x02\x0e\x03\x37\x03'\
b'\x60\x03\x89\x03\xb2\x03\xdb\x03\xf5\x03\x0f\x04\x41\x04\x73\x04'\
b'\xa5\x04\xc8\x04\x03\x05\x32\x05\x5e\x05\x8a\x05\xbc\x05\xe5\x05'\
b'\x0e\x06\x40\x06\x72\x06\x89\x06\xa0\x06\xcf\x06\xf5\x06\x30\x07'\
b'\x62\x07\x94\x07\xc0\x07\xf2\x07\x21\x08\x4d\x08\x76\x08\xa5\x08'\
b'\xd4\x08\x15\x09\x44\x09\x70\x09\x9c\x09\xb9\x09\xd0\x09\xed\x09'\
b'\x1f\x0a\x3f\x0a\x5f\x0a\x88\x0a\xb4\x0a\xd7\x0a\x03\x0b\x2c\x0b'\
b'\x46\x0b\x72\x0b\x9e\x0b\xb5\x0b\xcc\x0b\xf5\x0b\x0c\x0c\x4d\x0c'\
b'\x79\x0c\xa2\x0c\xce\x0c\xfa\x0c\x17\x0d\x3a\x0d\x57\x0d\x83\x0d'\
b'\xa9\x0d\xe1\x0d\x07\x0e\x2d\x0e\x50\x0e\x7c\x0e\x93\x0e\xbf\x0e'\
b'\xf1\x0e'
_mvfont = memoryview(_font)
def _chr_addr(ordch):
offset = 2 * (ordch - 32)
return int.from_bytes(_index[offset:offset + 2], 'little')
def get_width(s):
width = 0
for ch in s:
ordch = ord(ch)
ordch = ordch + 1 if ordch >= 32 and ordch <= 126 else 32
offset = _chr_addr(ordch)
width += int.from_bytes(_font[offset:offset + 2], 'little')
return width
def get_ch(ch):
ordch = ord(ch)
ordch = ordch + 1 if ordch >= 32 and ordch <= 126 else 32
offset = _chr_addr(ordch)
width = int.from_bytes(_font[offset:offset + 2], 'little')
next_offs = _chr_addr(ordch +1)
return _mvfont[offset + 2:next_offs], width
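# A minimal usage sketch (not part of the generated font file): fetch one glyph and
# walk its column data. Since hmap() is False the glyph is vertically mapped, so
# each of the `width` columns occupies ceil(height / 8) bytes; 'A' is an arbitrary
# example character.
def demo_glyph(ch='A'):
    glyph, width = get_ch(ch)
    bytes_per_col = (height() + 7) // 8
    for col in range(width):
        column = bytes(glyph[col * bytes_per_col:(col + 1) * bytes_per_col])
        print(ch, 'column', col, column)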
| 59.934641 | 68 | 0.705234 | 4,422 | 18,340 | 2.919041 | 0.042062 | 0.443911 | 0.42462 | 0.45832 | 0.72428 | 0.638209 | 0.592191 | 0.549892 | 0.50612 | 0.456151 | 0 | 0.382269 | 0.027644 | 18,340 | 305 | 69 | 60.131148 | 0.341558 | 0.003162 | 0 | 0.038062 | 1 | 0.868512 | 0.880731 | 0.878871 | 0 | 1 | 0 | 0 | 0 | 1 | 0.034602 | false | 0 | 0 | 0.024221 | 0.069204 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
6f5eb252a5edb15edb3faa3f588713edbf77ac74 | 76,130 | py | Python | python/examples/kaitai/elf.py | carsonharmon/binaryninja-api | f7ad332ad69d370aa29cd54f4c7307da4d9173e2 | [
"MIT"
] | 1 | 2021-04-05T15:01:23.000Z | 2021-04-05T15:01:23.000Z | python/examples/kaitai/elf.py | carsonharmon/binaryninja-api | f7ad332ad69d370aa29cd54f4c7307da4d9173e2 | [
"MIT"
] | null | null | null | python/examples/kaitai/elf.py | carsonharmon/binaryninja-api | f7ad332ad69d370aa29cd54f4c7307da4d9173e2 | [
"MIT"
] | 1 | 2021-06-10T04:27:19.000Z | 2021-06-10T04:27:19.000Z | # This is a generated file! Please edit source .ksy file and use kaitai-struct-compiler to rebuild
from pkg_resources import parse_version
from .kaitaistruct import __version__ as ks_version, KaitaiStruct, KaitaiStream, BytesIO
from enum import Enum
import collections
if parse_version(ks_version) < parse_version('0.7'):
raise Exception("Incompatible Kaitai Struct Python API: 0.7 or later is required, but you have %s" % (ks_version))
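# A minimal usage sketch (an assumption, not part of the generated parser): the
# Kaitai Struct Python runtime provides a from_file() helper, and because this
# module is generated in debug mode _read() has to be called explicitly;
# "some_binary" is a hypothetical path.
#
#     elf = Elf.from_file("some_binary")
#     elf._read()
#     print(elf.bits, elf.header.machine, hex(elf.header.entry_point))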
class Elf(KaitaiStruct):
"""
.. seealso::
Source - https://sourceware.org/git/?p=glibc.git;a=blob;f=elf/elf.h;hb=HEAD
"""
class Endian(Enum):
le = 1
be = 2
class ShType(Enum):
null_type = 0
progbits = 1
symtab = 2
strtab = 3
rela = 4
hash = 5
dynamic = 6
note = 7
nobits = 8
rel = 9
shlib = 10
dynsym = 11
init_array = 14
fini_array = 15
preinit_array = 16
group = 17
symtab_shndx = 18
sunw_capchain = 1879048175
sunw_capinfo = 1879048176
sunw_symsort = 1879048177
sunw_tlssort = 1879048178
sunw_ldynsym = 1879048179
sunw_dof = 1879048180
sunw_cap = 1879048181
sunw_signature = 1879048182
sunw_annotate = 1879048183
sunw_debugstr = 1879048184
sunw_debug = 1879048185
sunw_move = 1879048186
sunw_comdat = 1879048187
sunw_syminfo = 1879048188
sunw_verdef = 1879048189
sunw_verneed = 1879048190
sunw_versym = 1879048191
sparc_gotdata = 1879048192
arm_exidx = 1879048193
arm_preemptmap = 1879048194
arm_attributes = 1879048195
class OsAbi(Enum):
system_v = 0
hp_ux = 1
netbsd = 2
gnu = 3
solaris = 6
aix = 7
irix = 8
freebsd = 9
tru64 = 10
modesto = 11
openbsd = 12
openvms = 13
nsk = 14
aros = 15
fenixos = 16
cloudabi = 17
openvos = 18
class Machine(Enum):
not_set = 0
sparc = 2
x86 = 3
mips = 8
powerpc = 20
arm = 40
superh = 42
ia_64 = 50
x86_64 = 62
aarch64 = 183
riscv = 243
bpf = 247
class DynamicArrayTags(Enum):
null = 0
needed = 1
pltrelsz = 2
pltgot = 3
hash = 4
strtab = 5
symtab = 6
rela = 7
relasz = 8
relaent = 9
strsz = 10
syment = 11
init = 12
fini = 13
soname = 14
rpath = 15
symbolic = 16
rel = 17
relsz = 18
relent = 19
pltrel = 20
debug = 21
textrel = 22
jmprel = 23
bind_now = 24
init_array = 25
fini_array = 26
init_arraysz = 27
fini_arraysz = 28
runpath = 29
flags = 30
preinit_array = 32
preinit_arraysz = 33
maxpostags = 34
sunw_auxiliary = 1610612749
sunw_filter = 1610612750
sunw_cap = 1610612752
sunw_symtab = 1610612753
sunw_symsz = 1610612754
sunw_sortent = 1610612755
sunw_symsort = 1610612756
sunw_symsortsz = 1610612757
sunw_tlssort = 1610612758
sunw_tlssortsz = 1610612759
sunw_capinfo = 1610612760
sunw_strpad = 1610612761
sunw_capchain = 1610612762
sunw_ldmach = 1610612763
sunw_capchainent = 1610612765
sunw_capchainsz = 1610612767
hios = 1879044096
valrnglo = 1879047424
gnu_prelinked = 1879047669
gnu_conflictsz = 1879047670
gnu_liblistsz = 1879047671
checksum = 1879047672
pltpadsz = 1879047673
moveent = 1879047674
movesz = 1879047675
feature_1 = 1879047676
posflag_1 = 1879047677
syminsz = 1879047678
valrnghi = 1879047679
addrrnglo = 1879047680
gnu_hash = 1879047925
tlsdesc_plt = 1879047926
tlsdesc_got = 1879047927
gnu_conflict = 1879047928
gnu_liblist = 1879047929
config = 1879047930
depaudit = 1879047931
audit = 1879047932
pltpad = 1879047933
movetab = 1879047934
addrrnghi = 1879047935
versym = 1879048176
relacount = 1879048185
relcount = 1879048186
flags_1 = 1879048187
verdef = 1879048188
verdefnum = 1879048189
verneed = 1879048190
verneednum = 1879048191
loproc = 1879048192
sparc_register = 1879048193
auxiliary = 2147483645
used = 2147483646
hiproc = 2147483647
class Bits(Enum):
b32 = 1
b64 = 2
class PhType(Enum):
null_type = 0
load = 1
dynamic = 2
interp = 3
note = 4
shlib = 5
phdr = 6
tls = 7
gnu_eh_frame = 1685382480
gnu_stack = 1685382481
gnu_relro = 1685382482
pax_flags = 1694766464
hios = 1879048191
arm_exidx = 1879048193
class ObjType(Enum):
relocatable = 1
executable = 2
shared = 3
core = 4
SEQ_FIELDS = ["magic", "bits", "endian", "ei_version", "abi", "abi_version", "pad", "header"]
def __init__(self, _io, _parent=None, _root=None):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self._debug = collections.defaultdict(dict)
def _read(self):
self._debug['magic']['start'] = self._io.pos()
self.magic = self._io.ensure_fixed_contents(b"\x7F\x45\x4C\x46")
self._debug['magic']['end'] = self._io.pos()
self._debug['bits']['start'] = self._io.pos()
self.bits = KaitaiStream.resolve_enum(self._root.Bits, self._io.read_u1())
self._debug['bits']['end'] = self._io.pos()
self._debug['endian']['start'] = self._io.pos()
self.endian = KaitaiStream.resolve_enum(self._root.Endian, self._io.read_u1())
self._debug['endian']['end'] = self._io.pos()
self._debug['ei_version']['start'] = self._io.pos()
self.ei_version = self._io.read_u1()
self._debug['ei_version']['end'] = self._io.pos()
self._debug['abi']['start'] = self._io.pos()
self.abi = KaitaiStream.resolve_enum(self._root.OsAbi, self._io.read_u1())
self._debug['abi']['end'] = self._io.pos()
self._debug['abi_version']['start'] = self._io.pos()
self.abi_version = self._io.read_u1()
self._debug['abi_version']['end'] = self._io.pos()
self._debug['pad']['start'] = self._io.pos()
self.pad = self._io.read_bytes(7)
self._debug['pad']['end'] = self._io.pos()
self._debug['header']['start'] = self._io.pos()
self.header = self._root.EndianElf(self._io, self, self._root)
self.header._read()
self._debug['header']['end'] = self._io.pos()
class PhdrTypeFlags(KaitaiStruct):
SEQ_FIELDS = []
def __init__(self, value, _io, _parent=None, _root=None):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self.value = value
self._debug = collections.defaultdict(dict)
def _read(self):
pass
@property
def read(self):
if hasattr(self, '_m_read'):
return self._m_read if hasattr(self, '_m_read') else None
self._m_read = (self.value & 4) != 0
return self._m_read if hasattr(self, '_m_read') else None
@property
def write(self):
if hasattr(self, '_m_write'):
return self._m_write if hasattr(self, '_m_write') else None
self._m_write = (self.value & 2) != 0
return self._m_write if hasattr(self, '_m_write') else None
@property
def execute(self):
if hasattr(self, '_m_execute'):
return self._m_execute if hasattr(self, '_m_execute') else None
self._m_execute = (self.value & 1) != 0
return self._m_execute if hasattr(self, '_m_execute') else None
@property
def mask_proc(self):
if hasattr(self, '_m_mask_proc'):
return self._m_mask_proc if hasattr(self, '_m_mask_proc') else None
self._m_mask_proc = (self.value & 4026531840) != 0
return self._m_mask_proc if hasattr(self, '_m_mask_proc') else None
class SectionHeaderFlags(KaitaiStruct):
SEQ_FIELDS = []
def __init__(self, value, _io, _parent=None, _root=None):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self.value = value
self._debug = collections.defaultdict(dict)
def _read(self):
pass
@property
def merge(self):
"""might be merged."""
if hasattr(self, '_m_merge'):
return self._m_merge if hasattr(self, '_m_merge') else None
self._m_merge = (self.value & 16) != 0
return self._m_merge if hasattr(self, '_m_merge') else None
@property
def mask_os(self):
"""OS-specific."""
if hasattr(self, '_m_mask_os'):
return self._m_mask_os if hasattr(self, '_m_mask_os') else None
self._m_mask_os = (self.value & 267386880) != 0
return self._m_mask_os if hasattr(self, '_m_mask_os') else None
@property
def exclude(self):
"""section is excluded unless referenced or allocated (Solaris)."""
if hasattr(self, '_m_exclude'):
return self._m_exclude if hasattr(self, '_m_exclude') else None
self._m_exclude = (self.value & 134217728) != 0
return self._m_exclude if hasattr(self, '_m_exclude') else None
@property
def mask_proc(self):
"""Processor-specific."""
if hasattr(self, '_m_mask_proc'):
return self._m_mask_proc if hasattr(self, '_m_mask_proc') else None
self._m_mask_proc = (self.value & 4026531840) != 0
return self._m_mask_proc if hasattr(self, '_m_mask_proc') else None
@property
def strings(self):
"""contains nul-terminated strings."""
if hasattr(self, '_m_strings'):
return self._m_strings if hasattr(self, '_m_strings') else None
self._m_strings = (self.value & 32) != 0
return self._m_strings if hasattr(self, '_m_strings') else None
@property
def os_non_conforming(self):
"""non-standard OS specific handling required."""
if hasattr(self, '_m_os_non_conforming'):
return self._m_os_non_conforming if hasattr(self, '_m_os_non_conforming') else None
self._m_os_non_conforming = (self.value & 256) != 0
return self._m_os_non_conforming if hasattr(self, '_m_os_non_conforming') else None
@property
def alloc(self):
"""occupies memory during execution."""
if hasattr(self, '_m_alloc'):
return self._m_alloc if hasattr(self, '_m_alloc') else None
self._m_alloc = (self.value & 2) != 0
return self._m_alloc if hasattr(self, '_m_alloc') else None
@property
def exec_instr(self):
"""executable."""
if hasattr(self, '_m_exec_instr'):
return self._m_exec_instr if hasattr(self, '_m_exec_instr') else None
self._m_exec_instr = (self.value & 4) != 0
return self._m_exec_instr if hasattr(self, '_m_exec_instr') else None
@property
def info_link(self):
"""'sh_info' contains SHT index."""
if hasattr(self, '_m_info_link'):
return self._m_info_link if hasattr(self, '_m_info_link') else None
self._m_info_link = (self.value & 64) != 0
return self._m_info_link if hasattr(self, '_m_info_link') else None
@property
def write(self):
"""writable."""
if hasattr(self, '_m_write'):
return self._m_write if hasattr(self, '_m_write') else None
self._m_write = (self.value & 1) != 0
return self._m_write if hasattr(self, '_m_write') else None
@property
def link_order(self):
"""preserve order after combining."""
if hasattr(self, '_m_link_order'):
return self._m_link_order if hasattr(self, '_m_link_order') else None
self._m_link_order = (self.value & 128) != 0
return self._m_link_order if hasattr(self, '_m_link_order') else None
@property
def ordered(self):
"""special ordering requirement (Solaris)."""
if hasattr(self, '_m_ordered'):
return self._m_ordered if hasattr(self, '_m_ordered') else None
self._m_ordered = (self.value & 67108864) != 0
return self._m_ordered if hasattr(self, '_m_ordered') else None
@property
def tls(self):
"""section hold thread-local data."""
if hasattr(self, '_m_tls'):
return self._m_tls if hasattr(self, '_m_tls') else None
self._m_tls = (self.value & 1024) != 0
return self._m_tls if hasattr(self, '_m_tls') else None
@property
def group(self):
"""section is member of a group."""
if hasattr(self, '_m_group'):
return self._m_group if hasattr(self, '_m_group') else None
self._m_group = (self.value & 512) != 0
return self._m_group if hasattr(self, '_m_group') else None
class DtFlag1Values(KaitaiStruct):
SEQ_FIELDS = []
def __init__(self, value, _io, _parent=None, _root=None):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self.value = value
self._debug = collections.defaultdict(dict)
def _read(self):
pass
@property
def singleton(self):
"""Singleton symbols are used."""
if hasattr(self, '_m_singleton'):
return self._m_singleton if hasattr(self, '_m_singleton') else None
self._m_singleton = (self.value & 33554432) != 0
return self._m_singleton if hasattr(self, '_m_singleton') else None
@property
def ignmuldef(self):
if hasattr(self, '_m_ignmuldef'):
return self._m_ignmuldef if hasattr(self, '_m_ignmuldef') else None
self._m_ignmuldef = (self.value & 262144) != 0
return self._m_ignmuldef if hasattr(self, '_m_ignmuldef') else None
@property
def loadfltr(self):
"""Trigger filtee loading at runtime."""
if hasattr(self, '_m_loadfltr'):
return self._m_loadfltr if hasattr(self, '_m_loadfltr') else None
self._m_loadfltr = (self.value & 16) != 0
return self._m_loadfltr if hasattr(self, '_m_loadfltr') else None
@property
def initfirst(self):
"""Set RTLD_INITFIRST for this object."""
if hasattr(self, '_m_initfirst'):
return self._m_initfirst if hasattr(self, '_m_initfirst') else None
self._m_initfirst = (self.value & 32) != 0
return self._m_initfirst if hasattr(self, '_m_initfirst') else None
@property
def symintpose(self):
"""Object has individual interposers."""
if hasattr(self, '_m_symintpose'):
return self._m_symintpose if hasattr(self, '_m_symintpose') else None
self._m_symintpose = (self.value & 8388608) != 0
return self._m_symintpose if hasattr(self, '_m_symintpose') else None
@property
def noreloc(self):
if hasattr(self, '_m_noreloc'):
return self._m_noreloc if hasattr(self, '_m_noreloc') else None
self._m_noreloc = (self.value & 4194304) != 0
return self._m_noreloc if hasattr(self, '_m_noreloc') else None
@property
def confalt(self):
"""Configuration alternative created."""
if hasattr(self, '_m_confalt'):
return self._m_confalt if hasattr(self, '_m_confalt') else None
self._m_confalt = (self.value & 8192) != 0
return self._m_confalt if hasattr(self, '_m_confalt') else None
@property
def dispreldne(self):
"""Disp reloc applied at build time."""
if hasattr(self, '_m_dispreldne'):
return self._m_dispreldne if hasattr(self, '_m_dispreldne') else None
self._m_dispreldne = (self.value & 32768) != 0
return self._m_dispreldne if hasattr(self, '_m_dispreldne') else None
@property
def rtld_global(self):
"""Set RTLD_GLOBAL for this object."""
if hasattr(self, '_m_rtld_global'):
return self._m_rtld_global if hasattr(self, '_m_rtld_global') else None
self._m_rtld_global = (self.value & 2) != 0
return self._m_rtld_global if hasattr(self, '_m_rtld_global') else None
@property
def nodelete(self):
"""Set RTLD_NODELETE for this object."""
if hasattr(self, '_m_nodelete'):
return self._m_nodelete if hasattr(self, '_m_nodelete') else None
self._m_nodelete = (self.value & 8) != 0
return self._m_nodelete if hasattr(self, '_m_nodelete') else None
@property
def trans(self):
if hasattr(self, '_m_trans'):
return self._m_trans if hasattr(self, '_m_trans') else None
self._m_trans = (self.value & 512) != 0
return self._m_trans if hasattr(self, '_m_trans') else None
@property
def origin(self):
"""$ORIGIN must be handled."""
if hasattr(self, '_m_origin'):
return self._m_origin if hasattr(self, '_m_origin') else None
self._m_origin = (self.value & 128) != 0
return self._m_origin if hasattr(self, '_m_origin') else None
@property
def now(self):
"""Set RTLD_NOW for this object."""
if hasattr(self, '_m_now'):
return self._m_now if hasattr(self, '_m_now') else None
self._m_now = (self.value & 1) != 0
return self._m_now if hasattr(self, '_m_now') else None
@property
def nohdr(self):
if hasattr(self, '_m_nohdr'):
return self._m_nohdr if hasattr(self, '_m_nohdr') else None
self._m_nohdr = (self.value & 1048576) != 0
return self._m_nohdr if hasattr(self, '_m_nohdr') else None
@property
def endfiltee(self):
"""Filtee terminates filters search."""
if hasattr(self, '_m_endfiltee'):
return self._m_endfiltee if hasattr(self, '_m_endfiltee') else None
self._m_endfiltee = (self.value & 16384) != 0
return self._m_endfiltee if hasattr(self, '_m_endfiltee') else None
@property
def nodirect(self):
"""Object has no-direct binding."""
if hasattr(self, '_m_nodirect'):
return self._m_nodirect if hasattr(self, '_m_nodirect') else None
self._m_nodirect = (self.value & 131072) != 0
return self._m_nodirect if hasattr(self, '_m_nodirect') else None
@property
def globaudit(self):
"""Global auditing required."""
if hasattr(self, '_m_globaudit'):
return self._m_globaudit if hasattr(self, '_m_globaudit') else None
self._m_globaudit = (self.value & 16777216) != 0
return self._m_globaudit if hasattr(self, '_m_globaudit') else None
@property
def noksyms(self):
if hasattr(self, '_m_noksyms'):
return self._m_noksyms if hasattr(self, '_m_noksyms') else None
self._m_noksyms = (self.value & 524288) != 0
return self._m_noksyms if hasattr(self, '_m_noksyms') else None
@property
def interpose(self):
"""Object is used to interpose."""
if hasattr(self, '_m_interpose'):
return self._m_interpose if hasattr(self, '_m_interpose') else None
self._m_interpose = (self.value & 1024) != 0
return self._m_interpose if hasattr(self, '_m_interpose') else None
@property
def nodump(self):
"""Object can't be dldump'ed."""
if hasattr(self, '_m_nodump'):
return self._m_nodump if hasattr(self, '_m_nodump') else None
self._m_nodump = (self.value & 4096) != 0
return self._m_nodump if hasattr(self, '_m_nodump') else None
@property
def disprelpnd(self):
"""Disp reloc applied at run-time."""
if hasattr(self, '_m_disprelpnd'):
return self._m_disprelpnd if hasattr(self, '_m_disprelpnd') else None
self._m_disprelpnd = (self.value & 65536) != 0
return self._m_disprelpnd if hasattr(self, '_m_disprelpnd') else None
@property
def noopen(self):
"""Set RTLD_NOOPEN for this object."""
if hasattr(self, '_m_noopen'):
return self._m_noopen if hasattr(self, '_m_noopen') else None
self._m_noopen = (self.value & 64) != 0
return self._m_noopen if hasattr(self, '_m_noopen') else None
@property
def stub(self):
if hasattr(self, '_m_stub'):
return self._m_stub if hasattr(self, '_m_stub') else None
self._m_stub = (self.value & 67108864) != 0
return self._m_stub if hasattr(self, '_m_stub') else None
@property
def direct(self):
"""Direct binding enabled."""
if hasattr(self, '_m_direct'):
return self._m_direct if hasattr(self, '_m_direct') else None
self._m_direct = (self.value & 256) != 0
return self._m_direct if hasattr(self, '_m_direct') else None
@property
def edited(self):
"""Object is modified after built."""
if hasattr(self, '_m_edited'):
return self._m_edited if hasattr(self, '_m_edited') else None
self._m_edited = (self.value & 2097152) != 0
return self._m_edited if hasattr(self, '_m_edited') else None
@property
def group(self):
"""Set RTLD_GROUP for this object."""
if hasattr(self, '_m_group'):
return self._m_group if hasattr(self, '_m_group') else None
self._m_group = (self.value & 4) != 0
return self._m_group if hasattr(self, '_m_group') else None
@property
def pie(self):
if hasattr(self, '_m_pie'):
return self._m_pie if hasattr(self, '_m_pie') else None
self._m_pie = (self.value & 134217728) != 0
return self._m_pie if hasattr(self, '_m_pie') else None
@property
def nodeflib(self):
"""Ignore default lib search path."""
if hasattr(self, '_m_nodeflib'):
return self._m_nodeflib if hasattr(self, '_m_nodeflib') else None
self._m_nodeflib = (self.value & 2048) != 0
return self._m_nodeflib if hasattr(self, '_m_nodeflib') else None
class EndianElf(KaitaiStruct):
SEQ_FIELDS = ["e_type", "machine", "e_version", "entry_point", "program_header_offset", "section_header_offset", "flags", "e_ehsize", "program_header_entry_size", "qty_program_header", "section_header_entry_size", "qty_section_header", "section_names_idx"]
def __init__(self, _io, _parent=None, _root=None):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self._debug = collections.defaultdict(dict)
def _read(self):
_on = self._root.endian
if _on == self._root.Endian.le:
self._is_le = True
elif _on == self._root.Endian.be:
self._is_le = False
if self._is_le == True:
self._read_le()
elif self._is_le == False:
self._read_be()
else:
raise Exception("Unable to decide endianness")
def _read_le(self):
self._debug['e_type']['start'] = self._io.pos()
self.e_type = KaitaiStream.resolve_enum(self._root.ObjType, self._io.read_u2le())
self._debug['e_type']['end'] = self._io.pos()
self._debug['machine']['start'] = self._io.pos()
self.machine = KaitaiStream.resolve_enum(self._root.Machine, self._io.read_u2le())
self._debug['machine']['end'] = self._io.pos()
self._debug['e_version']['start'] = self._io.pos()
self.e_version = self._io.read_u4le()
self._debug['e_version']['end'] = self._io.pos()
self._debug['entry_point']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.entry_point = self._io.read_u4le()
elif _on == self._root.Bits.b64:
self.entry_point = self._io.read_u8le()
self._debug['entry_point']['end'] = self._io.pos()
self._debug['program_header_offset']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.program_header_offset = self._io.read_u4le()
elif _on == self._root.Bits.b64:
self.program_header_offset = self._io.read_u8le()
self._debug['program_header_offset']['end'] = self._io.pos()
self._debug['section_header_offset']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.section_header_offset = self._io.read_u4le()
elif _on == self._root.Bits.b64:
self.section_header_offset = self._io.read_u8le()
self._debug['section_header_offset']['end'] = self._io.pos()
self._debug['flags']['start'] = self._io.pos()
self.flags = self._io.read_bytes(4)
self._debug['flags']['end'] = self._io.pos()
self._debug['e_ehsize']['start'] = self._io.pos()
self.e_ehsize = self._io.read_u2le()
self._debug['e_ehsize']['end'] = self._io.pos()
self._debug['program_header_entry_size']['start'] = self._io.pos()
self.program_header_entry_size = self._io.read_u2le()
self._debug['program_header_entry_size']['end'] = self._io.pos()
self._debug['qty_program_header']['start'] = self._io.pos()
self.qty_program_header = self._io.read_u2le()
self._debug['qty_program_header']['end'] = self._io.pos()
self._debug['section_header_entry_size']['start'] = self._io.pos()
self.section_header_entry_size = self._io.read_u2le()
self._debug['section_header_entry_size']['end'] = self._io.pos()
self._debug['qty_section_header']['start'] = self._io.pos()
self.qty_section_header = self._io.read_u2le()
self._debug['qty_section_header']['end'] = self._io.pos()
self._debug['section_names_idx']['start'] = self._io.pos()
self.section_names_idx = self._io.read_u2le()
self._debug['section_names_idx']['end'] = self._io.pos()
def _read_be(self):
self._debug['e_type']['start'] = self._io.pos()
self.e_type = KaitaiStream.resolve_enum(self._root.ObjType, self._io.read_u2be())
self._debug['e_type']['end'] = self._io.pos()
self._debug['machine']['start'] = self._io.pos()
self.machine = KaitaiStream.resolve_enum(self._root.Machine, self._io.read_u2be())
self._debug['machine']['end'] = self._io.pos()
self._debug['e_version']['start'] = self._io.pos()
self.e_version = self._io.read_u4be()
self._debug['e_version']['end'] = self._io.pos()
self._debug['entry_point']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.entry_point = self._io.read_u4be()
elif _on == self._root.Bits.b64:
self.entry_point = self._io.read_u8be()
self._debug['entry_point']['end'] = self._io.pos()
self._debug['program_header_offset']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.program_header_offset = self._io.read_u4be()
elif _on == self._root.Bits.b64:
self.program_header_offset = self._io.read_u8be()
self._debug['program_header_offset']['end'] = self._io.pos()
self._debug['section_header_offset']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.section_header_offset = self._io.read_u4be()
elif _on == self._root.Bits.b64:
self.section_header_offset = self._io.read_u8be()
self._debug['section_header_offset']['end'] = self._io.pos()
self._debug['flags']['start'] = self._io.pos()
self.flags = self._io.read_bytes(4)
self._debug['flags']['end'] = self._io.pos()
self._debug['e_ehsize']['start'] = self._io.pos()
self.e_ehsize = self._io.read_u2be()
self._debug['e_ehsize']['end'] = self._io.pos()
self._debug['program_header_entry_size']['start'] = self._io.pos()
self.program_header_entry_size = self._io.read_u2be()
self._debug['program_header_entry_size']['end'] = self._io.pos()
self._debug['qty_program_header']['start'] = self._io.pos()
self.qty_program_header = self._io.read_u2be()
self._debug['qty_program_header']['end'] = self._io.pos()
self._debug['section_header_entry_size']['start'] = self._io.pos()
self.section_header_entry_size = self._io.read_u2be()
self._debug['section_header_entry_size']['end'] = self._io.pos()
self._debug['qty_section_header']['start'] = self._io.pos()
self.qty_section_header = self._io.read_u2be()
self._debug['qty_section_header']['end'] = self._io.pos()
self._debug['section_names_idx']['start'] = self._io.pos()
self.section_names_idx = self._io.read_u2be()
self._debug['section_names_idx']['end'] = self._io.pos()
class DynsymSectionEntry64(KaitaiStruct):
SEQ_FIELDS = ["name_offset", "info", "other", "shndx", "value", "size"]
def __init__(self, _io, _parent=None, _root=None, _is_le=None):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self._is_le = _is_le
self._debug = collections.defaultdict(dict)
def _read(self):
if self._is_le == True:
self._read_le()
elif self._is_le == False:
self._read_be()
else:
raise Exception("Unable to decide endianness")
def _read_le(self):
self._debug['name_offset']['start'] = self._io.pos()
self.name_offset = self._io.read_u4le()
self._debug['name_offset']['end'] = self._io.pos()
self._debug['info']['start'] = self._io.pos()
self.info = self._io.read_u1()
self._debug['info']['end'] = self._io.pos()
self._debug['other']['start'] = self._io.pos()
self.other = self._io.read_u1()
self._debug['other']['end'] = self._io.pos()
self._debug['shndx']['start'] = self._io.pos()
self.shndx = self._io.read_u2le()
self._debug['shndx']['end'] = self._io.pos()
self._debug['value']['start'] = self._io.pos()
self.value = self._io.read_u8le()
self._debug['value']['end'] = self._io.pos()
self._debug['size']['start'] = self._io.pos()
self.size = self._io.read_u8le()
self._debug['size']['end'] = self._io.pos()
def _read_be(self):
self._debug['name_offset']['start'] = self._io.pos()
self.name_offset = self._io.read_u4be()
self._debug['name_offset']['end'] = self._io.pos()
self._debug['info']['start'] = self._io.pos()
self.info = self._io.read_u1()
self._debug['info']['end'] = self._io.pos()
self._debug['other']['start'] = self._io.pos()
self.other = self._io.read_u1()
self._debug['other']['end'] = self._io.pos()
self._debug['shndx']['start'] = self._io.pos()
self.shndx = self._io.read_u2be()
self._debug['shndx']['end'] = self._io.pos()
self._debug['value']['start'] = self._io.pos()
self.value = self._io.read_u8be()
self._debug['value']['end'] = self._io.pos()
self._debug['size']['start'] = self._io.pos()
self.size = self._io.read_u8be()
self._debug['size']['end'] = self._io.pos()
class ProgramHeader(KaitaiStruct):
SEQ_FIELDS = ["type", "flags64", "offset", "vaddr", "paddr", "filesz", "memsz", "flags32", "align"]
def __init__(self, _io, _parent=None, _root=None, _is_le=None):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self._is_le = _is_le
self._debug = collections.defaultdict(dict)
def _read(self):
if self._is_le == True:
self._read_le()
elif self._is_le == False:
self._read_be()
else:
raise Exception("Unable to decide endianness")
def _read_le(self):
self._debug['type']['start'] = self._io.pos()
self.type = KaitaiStream.resolve_enum(self._root.PhType, self._io.read_u4le())
self._debug['type']['end'] = self._io.pos()
if self._root.bits == self._root.Bits.b64:
self._debug['flags64']['start'] = self._io.pos()
self.flags64 = self._io.read_u4le()
self._debug['flags64']['end'] = self._io.pos()
self._debug['offset']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.offset = self._io.read_u4le()
elif _on == self._root.Bits.b64:
self.offset = self._io.read_u8le()
self._debug['offset']['end'] = self._io.pos()
self._debug['vaddr']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.vaddr = self._io.read_u4le()
elif _on == self._root.Bits.b64:
self.vaddr = self._io.read_u8le()
self._debug['vaddr']['end'] = self._io.pos()
self._debug['paddr']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.paddr = self._io.read_u4le()
elif _on == self._root.Bits.b64:
self.paddr = self._io.read_u8le()
self._debug['paddr']['end'] = self._io.pos()
self._debug['filesz']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.filesz = self._io.read_u4le()
elif _on == self._root.Bits.b64:
self.filesz = self._io.read_u8le()
self._debug['filesz']['end'] = self._io.pos()
self._debug['memsz']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.memsz = self._io.read_u4le()
elif _on == self._root.Bits.b64:
self.memsz = self._io.read_u8le()
self._debug['memsz']['end'] = self._io.pos()
if self._root.bits == self._root.Bits.b32:
self._debug['flags32']['start'] = self._io.pos()
self.flags32 = self._io.read_u4le()
self._debug['flags32']['end'] = self._io.pos()
self._debug['align']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.align = self._io.read_u4le()
elif _on == self._root.Bits.b64:
self.align = self._io.read_u8le()
self._debug['align']['end'] = self._io.pos()
def _read_be(self):
self._debug['type']['start'] = self._io.pos()
self.type = KaitaiStream.resolve_enum(self._root.PhType, self._io.read_u4be())
self._debug['type']['end'] = self._io.pos()
if self._root.bits == self._root.Bits.b64:
self._debug['flags64']['start'] = self._io.pos()
self.flags64 = self._io.read_u4be()
self._debug['flags64']['end'] = self._io.pos()
self._debug['offset']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.offset = self._io.read_u4be()
elif _on == self._root.Bits.b64:
self.offset = self._io.read_u8be()
self._debug['offset']['end'] = self._io.pos()
self._debug['vaddr']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.vaddr = self._io.read_u4be()
elif _on == self._root.Bits.b64:
self.vaddr = self._io.read_u8be()
self._debug['vaddr']['end'] = self._io.pos()
self._debug['paddr']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.paddr = self._io.read_u4be()
elif _on == self._root.Bits.b64:
self.paddr = self._io.read_u8be()
self._debug['paddr']['end'] = self._io.pos()
self._debug['filesz']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.filesz = self._io.read_u4be()
elif _on == self._root.Bits.b64:
self.filesz = self._io.read_u8be()
self._debug['filesz']['end'] = self._io.pos()
self._debug['memsz']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.memsz = self._io.read_u4be()
elif _on == self._root.Bits.b64:
self.memsz = self._io.read_u8be()
self._debug['memsz']['end'] = self._io.pos()
if self._root.bits == self._root.Bits.b32:
self._debug['flags32']['start'] = self._io.pos()
self.flags32 = self._io.read_u4be()
self._debug['flags32']['end'] = self._io.pos()
self._debug['align']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.align = self._io.read_u4be()
elif _on == self._root.Bits.b64:
self.align = self._io.read_u8be()
self._debug['align']['end'] = self._io.pos()
@property
def dynamic(self):
if hasattr(self, '_m_dynamic'):
return self._m_dynamic if hasattr(self, '_m_dynamic') else None
if self.type == self._root.PhType.dynamic:
io = self._root._io
_pos = io.pos()
io.seek(self.offset)
self._debug['_m_dynamic']['start'] = io.pos()
if self._is_le:
self._raw__m_dynamic = io.read_bytes(self.filesz)
io = KaitaiStream(BytesIO(self._raw__m_dynamic))
self._m_dynamic = self._root.EndianElf.DynamicSection(io, self, self._root, self._is_le)
self._m_dynamic._read()
else:
self._raw__m_dynamic = io.read_bytes(self.filesz)
io = KaitaiStream(BytesIO(self._raw__m_dynamic))
self._m_dynamic = self._root.EndianElf.DynamicSection(io, self, self._root, self._is_le)
self._m_dynamic._read()
self._debug['_m_dynamic']['end'] = io.pos()
io.seek(_pos)
return self._m_dynamic if hasattr(self, '_m_dynamic') else None
@property
def flags_obj(self):
if hasattr(self, '_m_flags_obj'):
return self._m_flags_obj if hasattr(self, '_m_flags_obj') else None
self._debug['_m_flags_obj']['start'] = self._io.pos()
if self._is_le:
self._m_flags_obj = self._root.PhdrTypeFlags((self.flags64 | self.flags32), self._io, self, self._root)
self._m_flags_obj._read()
else:
self._m_flags_obj = self._root.PhdrTypeFlags((self.flags64 | self.flags32), self._io, self, self._root)
self._m_flags_obj._read()
self._debug['_m_flags_obj']['end'] = self._io.pos()
return self._m_flags_obj if hasattr(self, '_m_flags_obj') else None
class DynamicSectionEntry(KaitaiStruct):
SEQ_FIELDS = ["tag", "value_or_ptr"]
def __init__(self, _io, _parent=None, _root=None, _is_le=None):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self._is_le = _is_le
self._debug = collections.defaultdict(dict)
def _read(self):
if self._is_le == True:
self._read_le()
elif self._is_le == False:
self._read_be()
else:
raise Exception("Unable to decide endianness")
def _read_le(self):
self._debug['tag']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.tag = self._io.read_u4le()
elif _on == self._root.Bits.b64:
self.tag = self._io.read_u8le()
self._debug['tag']['end'] = self._io.pos()
self._debug['value_or_ptr']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.value_or_ptr = self._io.read_u4le()
elif _on == self._root.Bits.b64:
self.value_or_ptr = self._io.read_u8le()
self._debug['value_or_ptr']['end'] = self._io.pos()
def _read_be(self):
self._debug['tag']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.tag = self._io.read_u4be()
elif _on == self._root.Bits.b64:
self.tag = self._io.read_u8be()
self._debug['tag']['end'] = self._io.pos()
self._debug['value_or_ptr']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.value_or_ptr = self._io.read_u4be()
elif _on == self._root.Bits.b64:
self.value_or_ptr = self._io.read_u8be()
self._debug['value_or_ptr']['end'] = self._io.pos()
@property
def tag_enum(self):
if hasattr(self, '_m_tag_enum'):
return self._m_tag_enum if hasattr(self, '_m_tag_enum') else None
self._m_tag_enum = KaitaiStream.resolve_enum(self._root.DynamicArrayTags, self.tag)
return self._m_tag_enum if hasattr(self, '_m_tag_enum') else None
@property
def flag_1_values(self):
if hasattr(self, '_m_flag_1_values'):
return self._m_flag_1_values if hasattr(self, '_m_flag_1_values') else None
if self.tag_enum == self._root.DynamicArrayTags.flags_1:
self._debug['_m_flag_1_values']['start'] = self._io.pos()
if self._is_le:
self._m_flag_1_values = self._root.DtFlag1Values(self.value_or_ptr, self._io, self, self._root)
self._m_flag_1_values._read()
else:
self._m_flag_1_values = self._root.DtFlag1Values(self.value_or_ptr, self._io, self, self._root)
self._m_flag_1_values._read()
self._debug['_m_flag_1_values']['end'] = self._io.pos()
return self._m_flag_1_values if hasattr(self, '_m_flag_1_values') else None
class SectionHeader(KaitaiStruct):
SEQ_FIELDS = ["ofs_name", "type", "flags", "addr", "ofs_body", "len_body", "linked_section_idx", "info", "align", "entry_size"]
def __init__(self, _io, _parent=None, _root=None, _is_le=None):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self._is_le = _is_le
self._debug = collections.defaultdict(dict)
def _read(self):
if self._is_le == True:
self._read_le()
elif self._is_le == False:
self._read_be()
else:
raise Exception("Unable to decide endianness")
def _read_le(self):
self._debug['ofs_name']['start'] = self._io.pos()
self.ofs_name = self._io.read_u4le()
self._debug['ofs_name']['end'] = self._io.pos()
self._debug['type']['start'] = self._io.pos()
self.type = KaitaiStream.resolve_enum(self._root.ShType, self._io.read_u4le())
self._debug['type']['end'] = self._io.pos()
self._debug['flags']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.flags = self._io.read_u4le()
elif _on == self._root.Bits.b64:
self.flags = self._io.read_u8le()
self._debug['flags']['end'] = self._io.pos()
self._debug['addr']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.addr = self._io.read_u4le()
elif _on == self._root.Bits.b64:
self.addr = self._io.read_u8le()
self._debug['addr']['end'] = self._io.pos()
self._debug['ofs_body']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.ofs_body = self._io.read_u4le()
elif _on == self._root.Bits.b64:
self.ofs_body = self._io.read_u8le()
self._debug['ofs_body']['end'] = self._io.pos()
self._debug['len_body']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.len_body = self._io.read_u4le()
elif _on == self._root.Bits.b64:
self.len_body = self._io.read_u8le()
self._debug['len_body']['end'] = self._io.pos()
self._debug['linked_section_idx']['start'] = self._io.pos()
self.linked_section_idx = self._io.read_u4le()
self._debug['linked_section_idx']['end'] = self._io.pos()
self._debug['info']['start'] = self._io.pos()
self.info = self._io.read_bytes(4)
self._debug['info']['end'] = self._io.pos()
self._debug['align']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.align = self._io.read_u4le()
elif _on == self._root.Bits.b64:
self.align = self._io.read_u8le()
self._debug['align']['end'] = self._io.pos()
self._debug['entry_size']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.entry_size = self._io.read_u4le()
elif _on == self._root.Bits.b64:
self.entry_size = self._io.read_u8le()
self._debug['entry_size']['end'] = self._io.pos()
def _read_be(self):
self._debug['ofs_name']['start'] = self._io.pos()
self.ofs_name = self._io.read_u4be()
self._debug['ofs_name']['end'] = self._io.pos()
self._debug['type']['start'] = self._io.pos()
self.type = KaitaiStream.resolve_enum(self._root.ShType, self._io.read_u4be())
self._debug['type']['end'] = self._io.pos()
self._debug['flags']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.flags = self._io.read_u4be()
elif _on == self._root.Bits.b64:
self.flags = self._io.read_u8be()
self._debug['flags']['end'] = self._io.pos()
self._debug['addr']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.addr = self._io.read_u4be()
elif _on == self._root.Bits.b64:
self.addr = self._io.read_u8be()
self._debug['addr']['end'] = self._io.pos()
self._debug['ofs_body']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.ofs_body = self._io.read_u4be()
elif _on == self._root.Bits.b64:
self.ofs_body = self._io.read_u8be()
self._debug['ofs_body']['end'] = self._io.pos()
self._debug['len_body']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.len_body = self._io.read_u4be()
elif _on == self._root.Bits.b64:
self.len_body = self._io.read_u8be()
self._debug['len_body']['end'] = self._io.pos()
self._debug['linked_section_idx']['start'] = self._io.pos()
self.linked_section_idx = self._io.read_u4be()
self._debug['linked_section_idx']['end'] = self._io.pos()
self._debug['info']['start'] = self._io.pos()
self.info = self._io.read_bytes(4)
self._debug['info']['end'] = self._io.pos()
self._debug['align']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.align = self._io.read_u4be()
elif _on == self._root.Bits.b64:
self.align = self._io.read_u8be()
self._debug['align']['end'] = self._io.pos()
self._debug['entry_size']['start'] = self._io.pos()
_on = self._root.bits
if _on == self._root.Bits.b32:
self.entry_size = self._io.read_u4be()
elif _on == self._root.Bits.b64:
self.entry_size = self._io.read_u8be()
self._debug['entry_size']['end'] = self._io.pos()
@property
def body(self):
if hasattr(self, '_m_body'):
return self._m_body if hasattr(self, '_m_body') else None
io = self._root._io
_pos = io.pos()
io.seek(self.ofs_body)
self._debug['_m_body']['start'] = io.pos()
if self._is_le:
_on = self.type
if _on == self._root.ShType.strtab:
self._raw__m_body = io.read_bytes(self.len_body)
io = KaitaiStream(BytesIO(self._raw__m_body))
self._m_body = self._root.EndianElf.StringsStruct(io, self, self._root, self._is_le)
self._m_body._read()
elif _on == self._root.ShType.dynamic:
self._raw__m_body = io.read_bytes(self.len_body)
io = KaitaiStream(BytesIO(self._raw__m_body))
self._m_body = self._root.EndianElf.DynamicSection(io, self, self._root, self._is_le)
self._m_body._read()
elif _on == self._root.ShType.dynsym:
self._raw__m_body = io.read_bytes(self.len_body)
io = KaitaiStream(BytesIO(self._raw__m_body))
self._m_body = self._root.EndianElf.DynsymSection(io, self, self._root, self._is_le)
self._m_body._read()
elif _on == self._root.ShType.dynstr:
self._raw__m_body = io.read_bytes(self.len_body)
io = KaitaiStream(BytesIO(self._raw__m_body))
self._m_body = self._root.EndianElf.StringsStruct(io, self, self._root, self._is_le)
self._m_body._read()
else:
self._m_body = io.read_bytes(self.len_body)
else:
_on = self.type
if _on == self._root.ShType.strtab:
self._raw__m_body = io.read_bytes(self.len_body)
io = KaitaiStream(BytesIO(self._raw__m_body))
self._m_body = self._root.EndianElf.StringsStruct(io, self, self._root, self._is_le)
self._m_body._read()
elif _on == self._root.ShType.dynamic:
self._raw__m_body = io.read_bytes(self.len_body)
io = KaitaiStream(BytesIO(self._raw__m_body))
self._m_body = self._root.EndianElf.DynamicSection(io, self, self._root, self._is_le)
self._m_body._read()
elif _on == self._root.ShType.dynsym:
self._raw__m_body = io.read_bytes(self.len_body)
io = KaitaiStream(BytesIO(self._raw__m_body))
self._m_body = self._root.EndianElf.DynsymSection(io, self, self._root, self._is_le)
self._m_body._read()
elif _on == self._root.ShType.dynstr:
self._raw__m_body = io.read_bytes(self.len_body)
io = KaitaiStream(BytesIO(self._raw__m_body))
self._m_body = self._root.EndianElf.StringsStruct(io, self, self._root, self._is_le)
self._m_body._read()
else:
self._m_body = io.read_bytes(self.len_body)
self._debug['_m_body']['end'] = io.pos()
io.seek(_pos)
return self._m_body if hasattr(self, '_m_body') else None
@property
def name(self):
if hasattr(self, '_m_name'):
return self._m_name if hasattr(self, '_m_name') else None
io = self._root.header.strings._io
_pos = io.pos()
io.seek(self.ofs_name)
self._debug['_m_name']['start'] = io.pos()
if self._is_le:
self._m_name = (io.read_bytes_term(0, False, True, True)).decode(u"ASCII")
else:
self._m_name = (io.read_bytes_term(0, False, True, True)).decode(u"ASCII")
self._debug['_m_name']['end'] = io.pos()
io.seek(_pos)
return self._m_name if hasattr(self, '_m_name') else None
@property
def flags_obj(self):
if hasattr(self, '_m_flags_obj'):
return self._m_flags_obj if hasattr(self, '_m_flags_obj') else None
self._debug['_m_flags_obj']['start'] = self._io.pos()
if self._is_le:
self._m_flags_obj = self._root.SectionHeaderFlags(self.flags, self._io, self, self._root)
self._m_flags_obj._read()
else:
self._m_flags_obj = self._root.SectionHeaderFlags(self.flags, self._io, self, self._root)
self._m_flags_obj._read()
self._debug['_m_flags_obj']['end'] = self._io.pos()
return self._m_flags_obj if hasattr(self, '_m_flags_obj') else None
class DynamicSection(KaitaiStruct):
SEQ_FIELDS = ["entries"]
def __init__(self, _io, _parent=None, _root=None, _is_le=None):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self._is_le = _is_le
self._debug = collections.defaultdict(dict)
def _read(self):
if self._is_le == True:
self._read_le()
elif self._is_le == False:
self._read_be()
else:
raise Exception("Unable to decide endianness")
def _read_le(self):
self._debug['entries']['start'] = self._io.pos()
self.entries = []
i = 0
while not self._io.is_eof():
if not 'arr' in self._debug['entries']:
self._debug['entries']['arr'] = []
self._debug['entries']['arr'].append({'start': self._io.pos()})
_t_entries = self._root.EndianElf.DynamicSectionEntry(self._io, self, self._root, self._is_le)
_t_entries._read()
self.entries.append(_t_entries)
self._debug['entries']['arr'][len(self.entries) - 1]['end'] = self._io.pos()
i += 1
self._debug['entries']['end'] = self._io.pos()
def _read_be(self):
self._debug['entries']['start'] = self._io.pos()
self.entries = []
i = 0
while not self._io.is_eof():
if not 'arr' in self._debug['entries']:
self._debug['entries']['arr'] = []
self._debug['entries']['arr'].append({'start': self._io.pos()})
_t_entries = self._root.EndianElf.DynamicSectionEntry(self._io, self, self._root, self._is_le)
_t_entries._read()
self.entries.append(_t_entries)
self._debug['entries']['arr'][len(self.entries) - 1]['end'] = self._io.pos()
i += 1
self._debug['entries']['end'] = self._io.pos()
class DynsymSection(KaitaiStruct):
SEQ_FIELDS = ["entries"]
def __init__(self, _io, _parent=None, _root=None, _is_le=None):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self._is_le = _is_le
self._debug = collections.defaultdict(dict)
def _read(self):
if self._is_le == True:
self._read_le()
elif self._is_le == False:
self._read_be()
else:
raise Exception("Unable to decide endianness")
def _read_le(self):
self._debug['entries']['start'] = self._io.pos()
self.entries = []
i = 0
while not self._io.is_eof():
if not 'arr' in self._debug['entries']:
self._debug['entries']['arr'] = []
self._debug['entries']['arr'].append({'start': self._io.pos()})
_on = self._root.bits
if _on == self._root.Bits.b32:
if not 'arr' in self._debug['entries']:
self._debug['entries']['arr'] = []
self._debug['entries']['arr'].append({'start': self._io.pos()})
_t_entries = self._root.EndianElf.DynsymSectionEntry32(self._io, self, self._root, self._is_le)
_t_entries._read()
self.entries.append(_t_entries)
self._debug['entries']['arr'][len(self.entries) - 1]['end'] = self._io.pos()
elif _on == self._root.Bits.b64:
if not 'arr' in self._debug['entries']:
self._debug['entries']['arr'] = []
self._debug['entries']['arr'].append({'start': self._io.pos()})
_t_entries = self._root.EndianElf.DynsymSectionEntry64(self._io, self, self._root, self._is_le)
_t_entries._read()
self.entries.append(_t_entries)
self._debug['entries']['arr'][len(self.entries) - 1]['end'] = self._io.pos()
self._debug['entries']['arr'][len(self.entries) - 1]['end'] = self._io.pos()
i += 1
self._debug['entries']['end'] = self._io.pos()
def _read_be(self):
self._debug['entries']['start'] = self._io.pos()
self.entries = []
i = 0
while not self._io.is_eof():
if not 'arr' in self._debug['entries']:
self._debug['entries']['arr'] = []
self._debug['entries']['arr'].append({'start': self._io.pos()})
_on = self._root.bits
if _on == self._root.Bits.b32:
if not 'arr' in self._debug['entries']:
self._debug['entries']['arr'] = []
self._debug['entries']['arr'].append({'start': self._io.pos()})
_t_entries = self._root.EndianElf.DynsymSectionEntry32(self._io, self, self._root, self._is_le)
_t_entries._read()
self.entries.append(_t_entries)
self._debug['entries']['arr'][len(self.entries) - 1]['end'] = self._io.pos()
elif _on == self._root.Bits.b64:
if not 'arr' in self._debug['entries']:
self._debug['entries']['arr'] = []
self._debug['entries']['arr'].append({'start': self._io.pos()})
_t_entries = self._root.EndianElf.DynsymSectionEntry64(self._io, self, self._root, self._is_le)
_t_entries._read()
self.entries.append(_t_entries)
self._debug['entries']['arr'][len(self.entries) - 1]['end'] = self._io.pos()
self._debug['entries']['arr'][len(self.entries) - 1]['end'] = self._io.pos()
i += 1
self._debug['entries']['end'] = self._io.pos()
class DynsymSectionEntry32(KaitaiStruct):
SEQ_FIELDS = ["name_offset", "value", "size", "info", "other", "shndx"]
def __init__(self, _io, _parent=None, _root=None, _is_le=None):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self._is_le = _is_le
self._debug = collections.defaultdict(dict)
def _read(self):
if self._is_le == True:
self._read_le()
elif self._is_le == False:
self._read_be()
else:
raise Exception("Unable to decide endianness")
def _read_le(self):
self._debug['name_offset']['start'] = self._io.pos()
self.name_offset = self._io.read_u4le()
self._debug['name_offset']['end'] = self._io.pos()
self._debug['value']['start'] = self._io.pos()
self.value = self._io.read_u4le()
self._debug['value']['end'] = self._io.pos()
self._debug['size']['start'] = self._io.pos()
self.size = self._io.read_u4le()
self._debug['size']['end'] = self._io.pos()
self._debug['info']['start'] = self._io.pos()
self.info = self._io.read_u1()
self._debug['info']['end'] = self._io.pos()
self._debug['other']['start'] = self._io.pos()
self.other = self._io.read_u1()
self._debug['other']['end'] = self._io.pos()
self._debug['shndx']['start'] = self._io.pos()
self.shndx = self._io.read_u2le()
self._debug['shndx']['end'] = self._io.pos()
def _read_be(self):
self._debug['name_offset']['start'] = self._io.pos()
self.name_offset = self._io.read_u4be()
self._debug['name_offset']['end'] = self._io.pos()
self._debug['value']['start'] = self._io.pos()
self.value = self._io.read_u4be()
self._debug['value']['end'] = self._io.pos()
self._debug['size']['start'] = self._io.pos()
self.size = self._io.read_u4be()
self._debug['size']['end'] = self._io.pos()
self._debug['info']['start'] = self._io.pos()
self.info = self._io.read_u1()
self._debug['info']['end'] = self._io.pos()
self._debug['other']['start'] = self._io.pos()
self.other = self._io.read_u1()
self._debug['other']['end'] = self._io.pos()
self._debug['shndx']['start'] = self._io.pos()
self.shndx = self._io.read_u2be()
self._debug['shndx']['end'] = self._io.pos()
class StringsStruct(KaitaiStruct):
SEQ_FIELDS = ["entries"]
def __init__(self, _io, _parent=None, _root=None, _is_le=None):
self._io = _io
self._parent = _parent
self._root = _root if _root else self
self._is_le = _is_le
self._debug = collections.defaultdict(dict)
def _read(self):
if self._is_le == True:
self._read_le()
elif self._is_le == False:
self._read_be()
else:
raise Exception("Unable to decide endianness")
def _read_le(self):
self._debug['entries']['start'] = self._io.pos()
self.entries = []
i = 0
while not self._io.is_eof():
if not 'arr' in self._debug['entries']:
self._debug['entries']['arr'] = []
self._debug['entries']['arr'].append({'start': self._io.pos()})
self.entries.append((self._io.read_bytes_term(0, False, True, True)).decode(u"ASCII"))
self._debug['entries']['arr'][len(self.entries) - 1]['end'] = self._io.pos()
i += 1
self._debug['entries']['end'] = self._io.pos()
def _read_be(self):
self._debug['entries']['start'] = self._io.pos()
self.entries = []
i = 0
while not self._io.is_eof():
if not 'arr' in self._debug['entries']:
self._debug['entries']['arr'] = []
self._debug['entries']['arr'].append({'start': self._io.pos()})
self.entries.append((self._io.read_bytes_term(0, False, True, True)).decode(u"ASCII"))
self._debug['entries']['arr'][len(self.entries) - 1]['end'] = self._io.pos()
i += 1
self._debug['entries']['end'] = self._io.pos()
@property
def program_headers(self):
if hasattr(self, '_m_program_headers'):
return self._m_program_headers if hasattr(self, '_m_program_headers') else None
_pos = self._io.pos()
self._io.seek(self.program_header_offset)
self._debug['_m_program_headers']['start'] = self._io.pos()
if self._is_le:
self._raw__m_program_headers = [None] * (self.qty_program_header)
self._m_program_headers = [None] * (self.qty_program_header)
for i in range(self.qty_program_header):
if not 'arr' in self._debug['_m_program_headers']:
self._debug['_m_program_headers']['arr'] = []
self._debug['_m_program_headers']['arr'].append({'start': self._io.pos()})
self._raw__m_program_headers[i] = self._io.read_bytes(self.program_header_entry_size)
io = KaitaiStream(BytesIO(self._raw__m_program_headers[i]))
_t__m_program_headers = self._root.EndianElf.ProgramHeader(io, self, self._root, self._is_le)
_t__m_program_headers._read()
self._m_program_headers[i] = _t__m_program_headers
self._debug['_m_program_headers']['arr'][i]['end'] = self._io.pos()
else:
self._raw__m_program_headers = [None] * (self.qty_program_header)
self._m_program_headers = [None] * (self.qty_program_header)
for i in range(self.qty_program_header):
if not 'arr' in self._debug['_m_program_headers']:
self._debug['_m_program_headers']['arr'] = []
self._debug['_m_program_headers']['arr'].append({'start': self._io.pos()})
self._raw__m_program_headers[i] = self._io.read_bytes(self.program_header_entry_size)
io = KaitaiStream(BytesIO(self._raw__m_program_headers[i]))
_t__m_program_headers = self._root.EndianElf.ProgramHeader(io, self, self._root, self._is_le)
_t__m_program_headers._read()
self._m_program_headers[i] = _t__m_program_headers
self._debug['_m_program_headers']['arr'][i]['end'] = self._io.pos()
self._debug['_m_program_headers']['end'] = self._io.pos()
self._io.seek(_pos)
return self._m_program_headers if hasattr(self, '_m_program_headers') else None
@property
def section_headers(self):
if hasattr(self, '_m_section_headers'):
return self._m_section_headers if hasattr(self, '_m_section_headers') else None
_pos = self._io.pos()
self._io.seek(self.section_header_offset)
self._debug['_m_section_headers']['start'] = self._io.pos()
if self._is_le:
self._raw__m_section_headers = [None] * (self.qty_section_header)
self._m_section_headers = [None] * (self.qty_section_header)
for i in range(self.qty_section_header):
if not 'arr' in self._debug['_m_section_headers']:
self._debug['_m_section_headers']['arr'] = []
self._debug['_m_section_headers']['arr'].append({'start': self._io.pos()})
self._raw__m_section_headers[i] = self._io.read_bytes(self.section_header_entry_size)
io = KaitaiStream(BytesIO(self._raw__m_section_headers[i]))
_t__m_section_headers = self._root.EndianElf.SectionHeader(io, self, self._root, self._is_le)
_t__m_section_headers._read()
self._m_section_headers[i] = _t__m_section_headers
self._debug['_m_section_headers']['arr'][i]['end'] = self._io.pos()
else:
self._raw__m_section_headers = [None] * (self.qty_section_header)
self._m_section_headers = [None] * (self.qty_section_header)
for i in range(self.qty_section_header):
if not 'arr' in self._debug['_m_section_headers']:
self._debug['_m_section_headers']['arr'] = []
self._debug['_m_section_headers']['arr'].append({'start': self._io.pos()})
self._raw__m_section_headers[i] = self._io.read_bytes(self.section_header_entry_size)
io = KaitaiStream(BytesIO(self._raw__m_section_headers[i]))
_t__m_section_headers = self._root.EndianElf.SectionHeader(io, self, self._root, self._is_le)
_t__m_section_headers._read()
self._m_section_headers[i] = _t__m_section_headers
self._debug['_m_section_headers']['arr'][i]['end'] = self._io.pos()
self._debug['_m_section_headers']['end'] = self._io.pos()
self._io.seek(_pos)
return self._m_section_headers if hasattr(self, '_m_section_headers') else None
@property
def strings(self):
if hasattr(self, '_m_strings'):
return self._m_strings if hasattr(self, '_m_strings') else None
_pos = self._io.pos()
self._io.seek(self.section_headers[self.section_names_idx].ofs_body)
self._debug['_m_strings']['start'] = self._io.pos()
if self._is_le:
self._raw__m_strings = self._io.read_bytes(self.section_headers[self.section_names_idx].len_body)
io = KaitaiStream(BytesIO(self._raw__m_strings))
self._m_strings = self._root.EndianElf.StringsStruct(io, self, self._root, self._is_le)
self._m_strings._read()
else:
self._raw__m_strings = self._io.read_bytes(self.section_headers[self.section_names_idx].len_body)
io = KaitaiStream(BytesIO(self._raw__m_strings))
self._m_strings = self._root.EndianElf.StringsStruct(io, self, self._root, self._is_le)
self._m_strings._read()
self._debug['_m_strings']['end'] = self._io.pos()
self._io.seek(_pos)
return self._m_strings if hasattr(self, '_m_strings') else None
| 45.154211 | 264 | 0.531039 | 8,957 | 76,130 | 4.142682 | 0.064307 | 0.071794 | 0.06185 | 0.059909 | 0.831698 | 0.796637 | 0.774187 | 0.749798 | 0.746375 | 0.741201 | 0 | 0.032089 | 0.342585 | 76,130 | 1,685 | 265 | 45.181009 | 0.709305 | 0.016485 | 0 | 0.656011 | 1 | 0 | 0.085274 | 0.00616 | 0 | 0 | 0 | 0 | 0 | 1 | 0.069493 | false | 0.002085 | 0.00278 | 0 | 0.165393 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
48cc4f3a5d1dbf3dafde9e1db891e8d5bdcea827 | 24,044 | py | Python | sdk/python/pulumi_gcp/compute/network_endpoint.py | sisisin/pulumi-gcp | af6681d70ea457843409110c1324817fe55f68ad | [
"ECL-2.0",
"Apache-2.0"
] | 121 | 2018-06-18T19:16:42.000Z | 2022-03-31T06:06:48.000Z | sdk/python/pulumi_gcp/compute/network_endpoint.py | sisisin/pulumi-gcp | af6681d70ea457843409110c1324817fe55f68ad | [
"ECL-2.0",
"Apache-2.0"
] | 492 | 2018-06-22T19:41:03.000Z | 2022-03-31T15:33:53.000Z | sdk/python/pulumi_gcp/compute/network_endpoint.py | sisisin/pulumi-gcp | af6681d70ea457843409110c1324817fe55f68ad | [
"ECL-2.0",
"Apache-2.0"
] | 43 | 2018-06-19T01:43:13.000Z | 2022-03-23T22:43:37.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['NetworkEndpointArgs', 'NetworkEndpoint']
@pulumi.input_type
class NetworkEndpointArgs:
def __init__(__self__, *,
instance: pulumi.Input[str],
ip_address: pulumi.Input[str],
network_endpoint_group: pulumi.Input[str],
port: pulumi.Input[int],
project: Optional[pulumi.Input[str]] = None,
zone: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a NetworkEndpoint resource.
:param pulumi.Input[str] instance: The name for a specific VM instance that the IP address belongs to.
This is required for network endpoints of type GCE_VM_IP_PORT.
The instance must be in the same zone as the network endpoint group.
:param pulumi.Input[str] ip_address: IPv4 address of network endpoint. The IP address must belong
to a VM in GCE (either the primary IP or as part of an aliased IP
range).
:param pulumi.Input[str] network_endpoint_group: The network endpoint group this endpoint is part of.
:param pulumi.Input[int] port: Port number of network endpoint.
:param pulumi.Input[str] project: The ID of the project in which the resource belongs.
If it is not provided, the provider project is used.
:param pulumi.Input[str] zone: Zone where the containing network endpoint group is located.
"""
pulumi.set(__self__, "instance", instance)
pulumi.set(__self__, "ip_address", ip_address)
pulumi.set(__self__, "network_endpoint_group", network_endpoint_group)
pulumi.set(__self__, "port", port)
if project is not None:
pulumi.set(__self__, "project", project)
if zone is not None:
pulumi.set(__self__, "zone", zone)
@property
@pulumi.getter
def instance(self) -> pulumi.Input[str]:
"""
The name for a specific VM instance that the IP address belongs to.
This is required for network endpoints of type GCE_VM_IP_PORT.
The instance must be in the same zone as the network endpoint group.
"""
return pulumi.get(self, "instance")
@instance.setter
def instance(self, value: pulumi.Input[str]):
pulumi.set(self, "instance", value)
@property
@pulumi.getter(name="ipAddress")
def ip_address(self) -> pulumi.Input[str]:
"""
IPv4 address of network endpoint. The IP address must belong
to a VM in GCE (either the primary IP or as part of an aliased IP
range).
"""
return pulumi.get(self, "ip_address")
@ip_address.setter
def ip_address(self, value: pulumi.Input[str]):
pulumi.set(self, "ip_address", value)
@property
@pulumi.getter(name="networkEndpointGroup")
def network_endpoint_group(self) -> pulumi.Input[str]:
"""
The network endpoint group this endpoint is part of.
"""
return pulumi.get(self, "network_endpoint_group")
@network_endpoint_group.setter
def network_endpoint_group(self, value: pulumi.Input[str]):
pulumi.set(self, "network_endpoint_group", value)
@property
@pulumi.getter
def port(self) -> pulumi.Input[int]:
"""
Port number of network endpoint.
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: pulumi.Input[int]):
pulumi.set(self, "port", value)
@property
@pulumi.getter
def project(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the project in which the resource belongs.
If it is not provided, the provider project is used.
"""
return pulumi.get(self, "project")
@project.setter
def project(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "project", value)
@property
@pulumi.getter
def zone(self) -> Optional[pulumi.Input[str]]:
"""
Zone where the containing network endpoint group is located.
"""
return pulumi.get(self, "zone")
@zone.setter
def zone(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "zone", value)
@pulumi.input_type
class _NetworkEndpointState:
def __init__(__self__, *,
instance: Optional[pulumi.Input[str]] = None,
ip_address: Optional[pulumi.Input[str]] = None,
network_endpoint_group: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
project: Optional[pulumi.Input[str]] = None,
zone: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering NetworkEndpoint resources.
:param pulumi.Input[str] instance: The name for a specific VM instance that the IP address belongs to.
This is required for network endpoints of type GCE_VM_IP_PORT.
The instance must be in the same zone as the network endpoint group.
:param pulumi.Input[str] ip_address: IPv4 address of network endpoint. The IP address must belong
to a VM in GCE (either the primary IP or as part of an aliased IP
range).
:param pulumi.Input[str] network_endpoint_group: The network endpoint group this endpoint is part of.
:param pulumi.Input[int] port: Port number of network endpoint.
:param pulumi.Input[str] project: The ID of the project in which the resource belongs.
If it is not provided, the provider project is used.
:param pulumi.Input[str] zone: Zone where the containing network endpoint group is located.
"""
if instance is not None:
pulumi.set(__self__, "instance", instance)
if ip_address is not None:
pulumi.set(__self__, "ip_address", ip_address)
if network_endpoint_group is not None:
pulumi.set(__self__, "network_endpoint_group", network_endpoint_group)
if port is not None:
pulumi.set(__self__, "port", port)
if project is not None:
pulumi.set(__self__, "project", project)
if zone is not None:
pulumi.set(__self__, "zone", zone)
@property
@pulumi.getter
def instance(self) -> Optional[pulumi.Input[str]]:
"""
The name for a specific VM instance that the IP address belongs to.
This is required for network endpoints of type GCE_VM_IP_PORT.
The instance must be in the same zone as the network endpoint group.
"""
return pulumi.get(self, "instance")
@instance.setter
def instance(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "instance", value)
@property
@pulumi.getter(name="ipAddress")
def ip_address(self) -> Optional[pulumi.Input[str]]:
"""
IPv4 address of network endpoint. The IP address must belong
to a VM in GCE (either the primary IP or as part of an aliased IP
range).
"""
return pulumi.get(self, "ip_address")
@ip_address.setter
def ip_address(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ip_address", value)
@property
@pulumi.getter(name="networkEndpointGroup")
def network_endpoint_group(self) -> Optional[pulumi.Input[str]]:
"""
The network endpoint group this endpoint is part of.
"""
return pulumi.get(self, "network_endpoint_group")
@network_endpoint_group.setter
def network_endpoint_group(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "network_endpoint_group", value)
@property
@pulumi.getter
def port(self) -> Optional[pulumi.Input[int]]:
"""
Port number of network endpoint.
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "port", value)
@property
@pulumi.getter
def project(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the project in which the resource belongs.
If it is not provided, the provider project is used.
"""
return pulumi.get(self, "project")
@project.setter
def project(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "project", value)
@property
@pulumi.getter
def zone(self) -> Optional[pulumi.Input[str]]:
"""
Zone where the containing network endpoint group is located.
"""
return pulumi.get(self, "zone")
@zone.setter
def zone(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "zone", value)
class NetworkEndpoint(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
instance: Optional[pulumi.Input[str]] = None,
ip_address: Optional[pulumi.Input[str]] = None,
network_endpoint_group: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
project: Optional[pulumi.Input[str]] = None,
zone: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
A Network endpoint represents an IP address and port combination that is
part of a specific network endpoint group (NEG). NEGs are zonal
collections of these endpoints for GCP resources within a
single subnet. **NOTE**: Network endpoints cannot be created outside of a
network endpoint group.
To get more information about NetworkEndpoint, see:
* [API documentation](https://cloud.google.com/compute/docs/reference/rest/beta/networkEndpointGroups)
* How-to Guides
* [Official Documentation](https://cloud.google.com/load-balancing/docs/negs/)
## Example Usage
### Network Endpoint
```python
import pulumi
import pulumi_gcp as gcp
my_image = gcp.compute.get_image(family="debian-9",
project="debian-cloud")
default_network = gcp.compute.Network("defaultNetwork", auto_create_subnetworks=False)
default_subnetwork = gcp.compute.Subnetwork("defaultSubnetwork",
ip_cidr_range="10.0.0.1/16",
region="us-central1",
network=default_network.id)
endpoint_instance = gcp.compute.Instance("endpoint-instance",
machine_type="e2-medium",
boot_disk=gcp.compute.InstanceBootDiskArgs(
initialize_params=gcp.compute.InstanceBootDiskInitializeParamsArgs(
image=my_image.self_link,
),
),
network_interfaces=[gcp.compute.InstanceNetworkInterfaceArgs(
subnetwork=default_subnetwork.id,
access_configs=[gcp.compute.InstanceNetworkInterfaceAccessConfigArgs()],
)])
default_endpoint = gcp.compute.NetworkEndpoint("default-endpoint",
network_endpoint_group=google_compute_network_endpoint_group["neg"]["name"],
instance=endpoint_instance.name,
port=google_compute_network_endpoint_group["neg"]["default_port"],
ip_address=endpoint_instance.network_interfaces[0].network_ip)
group = gcp.compute.NetworkEndpointGroup("group",
network=default_network.id,
subnetwork=default_subnetwork.id,
default_port=90,
zone="us-central1-a")
```
## Import
NetworkEndpoint can be imported using any of these accepted formats
```sh
$ pulumi import gcp:compute/networkEndpoint:NetworkEndpoint default projects/{{project}}/zones/{{zone}}/networkEndpointGroups/{{network_endpoint_group}}/{{instance}}/{{ip_address}}/{{port}}
```
```sh
$ pulumi import gcp:compute/networkEndpoint:NetworkEndpoint default {{project}}/{{zone}}/{{network_endpoint_group}}/{{instance}}/{{ip_address}}/{{port}}
```
```sh
$ pulumi import gcp:compute/networkEndpoint:NetworkEndpoint default {{zone}}/{{network_endpoint_group}}/{{instance}}/{{ip_address}}/{{port}}
```
```sh
$ pulumi import gcp:compute/networkEndpoint:NetworkEndpoint default {{network_endpoint_group}}/{{instance}}/{{ip_address}}/{{port}}
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] instance: The name for a specific VM instance that the IP address belongs to.
This is required for network endpoints of type GCE_VM_IP_PORT.
The instance must be in the same zone as the network endpoint group.
:param pulumi.Input[str] ip_address: IPv4 address of network endpoint. The IP address must belong
to a VM in GCE (either the primary IP or as part of an aliased IP
range).
:param pulumi.Input[str] network_endpoint_group: The network endpoint group this endpoint is part of.
:param pulumi.Input[int] port: Port number of network endpoint.
:param pulumi.Input[str] project: The ID of the project in which the resource belongs.
If it is not provided, the provider project is used.
:param pulumi.Input[str] zone: Zone where the containing network endpoint group is located.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: NetworkEndpointArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
A Network endpoint represents an IP address and port combination that is
part of a specific network endpoint group (NEG). NEGs are zonal
collections of these endpoints for GCP resources within a
single subnet. **NOTE**: Network endpoints cannot be created outside of a
network endpoint group.
To get more information about NetworkEndpoint, see:
* [API documentation](https://cloud.google.com/compute/docs/reference/rest/beta/networkEndpointGroups)
* How-to Guides
* [Official Documentation](https://cloud.google.com/load-balancing/docs/negs/)
## Example Usage
### Network Endpoint
```python
import pulumi
import pulumi_gcp as gcp
my_image = gcp.compute.get_image(family="debian-9",
project="debian-cloud")
default_network = gcp.compute.Network("defaultNetwork", auto_create_subnetworks=False)
default_subnetwork = gcp.compute.Subnetwork("defaultSubnetwork",
ip_cidr_range="10.0.0.1/16",
region="us-central1",
network=default_network.id)
endpoint_instance = gcp.compute.Instance("endpoint-instance",
machine_type="e2-medium",
boot_disk=gcp.compute.InstanceBootDiskArgs(
initialize_params=gcp.compute.InstanceBootDiskInitializeParamsArgs(
image=my_image.self_link,
),
),
network_interfaces=[gcp.compute.InstanceNetworkInterfaceArgs(
subnetwork=default_subnetwork.id,
access_configs=[gcp.compute.InstanceNetworkInterfaceAccessConfigArgs()],
)])
default_endpoint = gcp.compute.NetworkEndpoint("default-endpoint",
network_endpoint_group=google_compute_network_endpoint_group["neg"]["name"],
instance=endpoint_instance.name,
port=google_compute_network_endpoint_group["neg"]["default_port"],
ip_address=endpoint_instance.network_interfaces[0].network_ip)
group = gcp.compute.NetworkEndpointGroup("group",
network=default_network.id,
subnetwork=default_subnetwork.id,
default_port=90,
zone="us-central1-a")
```
## Import
NetworkEndpoint can be imported using any of these accepted formats
```sh
$ pulumi import gcp:compute/networkEndpoint:NetworkEndpoint default projects/{{project}}/zones/{{zone}}/networkEndpointGroups/{{network_endpoint_group}}/{{instance}}/{{ip_address}}/{{port}}
```
```sh
$ pulumi import gcp:compute/networkEndpoint:NetworkEndpoint default {{project}}/{{zone}}/{{network_endpoint_group}}/{{instance}}/{{ip_address}}/{{port}}
```
```sh
$ pulumi import gcp:compute/networkEndpoint:NetworkEndpoint default {{zone}}/{{network_endpoint_group}}/{{instance}}/{{ip_address}}/{{port}}
```
```sh
$ pulumi import gcp:compute/networkEndpoint:NetworkEndpoint default {{network_endpoint_group}}/{{instance}}/{{ip_address}}/{{port}}
```
:param str resource_name: The name of the resource.
:param NetworkEndpointArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(NetworkEndpointArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
instance: Optional[pulumi.Input[str]] = None,
ip_address: Optional[pulumi.Input[str]] = None,
network_endpoint_group: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
project: Optional[pulumi.Input[str]] = None,
zone: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = NetworkEndpointArgs.__new__(NetworkEndpointArgs)
if instance is None and not opts.urn:
raise TypeError("Missing required property 'instance'")
__props__.__dict__["instance"] = instance
if ip_address is None and not opts.urn:
raise TypeError("Missing required property 'ip_address'")
__props__.__dict__["ip_address"] = ip_address
if network_endpoint_group is None and not opts.urn:
raise TypeError("Missing required property 'network_endpoint_group'")
__props__.__dict__["network_endpoint_group"] = network_endpoint_group
if port is None and not opts.urn:
raise TypeError("Missing required property 'port'")
__props__.__dict__["port"] = port
__props__.__dict__["project"] = project
__props__.__dict__["zone"] = zone
super(NetworkEndpoint, __self__).__init__(
'gcp:compute/networkEndpoint:NetworkEndpoint',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
instance: Optional[pulumi.Input[str]] = None,
ip_address: Optional[pulumi.Input[str]] = None,
network_endpoint_group: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
project: Optional[pulumi.Input[str]] = None,
zone: Optional[pulumi.Input[str]] = None) -> 'NetworkEndpoint':
"""
Get an existing NetworkEndpoint resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] instance: The name for a specific VM instance that the IP address belongs to.
This is required for network endpoints of type GCE_VM_IP_PORT.
The instance must be in the same zone as the network endpoint group.
:param pulumi.Input[str] ip_address: IPv4 address of network endpoint. The IP address must belong
to a VM in GCE (either the primary IP or as part of an aliased IP
range).
:param pulumi.Input[str] network_endpoint_group: The network endpoint group this endpoint is part of.
:param pulumi.Input[int] port: Port number of network endpoint.
:param pulumi.Input[str] project: The ID of the project in which the resource belongs.
If it is not provided, the provider project is used.
:param pulumi.Input[str] zone: Zone where the containing network endpoint group is located.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _NetworkEndpointState.__new__(_NetworkEndpointState)
__props__.__dict__["instance"] = instance
__props__.__dict__["ip_address"] = ip_address
__props__.__dict__["network_endpoint_group"] = network_endpoint_group
__props__.__dict__["port"] = port
__props__.__dict__["project"] = project
__props__.__dict__["zone"] = zone
return NetworkEndpoint(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def instance(self) -> pulumi.Output[str]:
"""
The name for a specific VM instance that the IP address belongs to.
This is required for network endpoints of type GCE_VM_IP_PORT.
The instance must be in the same zone as the network endpoint group.
"""
return pulumi.get(self, "instance")
@property
@pulumi.getter(name="ipAddress")
def ip_address(self) -> pulumi.Output[str]:
"""
IPv4 address of network endpoint. The IP address must belong
to a VM in GCE (either the primary IP or as part of an aliased IP
range).
"""
return pulumi.get(self, "ip_address")
@property
@pulumi.getter(name="networkEndpointGroup")
def network_endpoint_group(self) -> pulumi.Output[str]:
"""
The network endpoint group this endpoint is part of.
"""
return pulumi.get(self, "network_endpoint_group")
@property
@pulumi.getter
def port(self) -> pulumi.Output[int]:
"""
Port number of network endpoint.
"""
return pulumi.get(self, "port")
@property
@pulumi.getter
def project(self) -> pulumi.Output[str]:
"""
The ID of the project in which the resource belongs.
If it is not provided, the provider project is used.
"""
return pulumi.get(self, "project")
@property
@pulumi.getter
def zone(self) -> pulumi.Output[str]:
"""
Zone where the containing network endpoint group is located.
"""
return pulumi.get(self, "zone")
| 43.244604 | 198 | 0.637831 | 2,805 | 24,044 | 5.286275 | 0.085562 | 0.090032 | 0.095765 | 0.053412 | 0.879485 | 0.862557 | 0.845697 | 0.832142 | 0.824791 | 0.806852 | 0 | 0.002039 | 0.265638 | 24,044 | 555 | 199 | 43.322523 | 0.837741 | 0.475004 | 0 | 0.668085 | 1 | 0 | 0.092676 | 0.024883 | 0 | 0 | 0 | 0 | 0 | 1 | 0.157447 | false | 0.004255 | 0.021277 | 0 | 0.27234 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5b0b5050095301eccc34895aa1180add59ee06af | 164 | py | Python | package/tests/helpers/test_helper.py | tim-spiglanin/Azure-Shell | 58c52994f0d6cfd798c5dca33737419ec18363d4 | [
"Apache-2.0"
] | 5 | 2016-09-08T08:33:47.000Z | 2020-02-10T12:31:15.000Z | package/tests/helpers/test_helper.py | tim-spiglanin/Azure-Shell | 58c52994f0d6cfd798c5dca33737419ec18363d4 | [
"Apache-2.0"
] | 505 | 2016-08-09T07:41:03.000Z | 2021-02-08T20:26:46.000Z | package/tests/helpers/test_helper.py | tim-spiglanin/Azure-Shell | 58c52994f0d6cfd798c5dca33737419ec18363d4 | [
"Apache-2.0"
] | 5 | 2016-12-21T12:52:55.000Z | 2021-07-08T09:50:42.000Z | class TestHelper(object):
@staticmethod
def CheckMethodCalledXTimes(method, call_count=1):
return method.called and method.call_count == call_count
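# A minimal usage sketch (illustrative, not part of the original tests); it assumes
# a mock object such as mock.MagicMock() standing in for the method under test:
#
#     method = mock.MagicMock()
#     method()
#     assert TestHelper.CheckMethodCalledXTimes(method, call_count=1)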
| 32.8 | 64 | 0.75 | 19 | 164 | 6.315789 | 0.684211 | 0.225 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007353 | 0.170732 | 164 | 4 | 65 | 41 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0.25 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
d288aeb592800dcc356ac9a72226016dfae480e5 | 822,092 | py | Python | epic/antenna_array.py | nithyanandan/MOFF | c2bd68b792a5269cfffe1d93b4710eae9ba8ca55 | [
"MIT"
] | 3 | 2019-12-11T07:14:10.000Z | 2020-11-07T19:25:32.000Z | epic/antenna_array.py | nithyanandan/MOFF | c2bd68b792a5269cfffe1d93b4710eae9ba8ca55 | [
"MIT"
] | 8 | 2015-08-20T19:46:29.000Z | 2015-09-19T01:31:43.000Z | epic/antenna_array.py | epic-astronomy/EPIC | c2bd68b792a5269cfffe1d93b4710eae9ba8ca55 | [
"MIT"
] | 1 | 2019-09-24T19:05:34.000Z | 2019-09-24T19:05:34.000Z | import numpy as NP
import numpy.ma as MA
import multiprocessing as MP
import itertools as IT
import copy
import h5py
import scipy.constants as FCNST
import scipy.sparse as SpM
from astropy.io import fits
import matplotlib.pyplot as PLT
import progressbar as PGB
from astroutils import DSP_modules as DSP
from astroutils import geometry as GEOM
from astroutils import gridding_modules as GRD
from astroutils import mathops as OPS
from astroutils import lookup_operations as LKP
import aperture as APR
################### Routines essential for parallel processing ################
def unwrap_antenna_FT(args, **kwargs):
return Antenna.FT_pp(*args, **kwargs)
def unwrap_interferometer_FX(args, **kwargs):
return Interferometer.FX_pp(*args, **kwargs)
def unwrap_interferometer_stack(args, **kwargs):
return Interferometer.stack_pp(*args, **kwargs)
def unwrap_antenna_update(args, **kwargs):
return Antenna.update_pp(*args, **kwargs)
def unwrap_interferometer_update(args, **kwargs):
return Interferometer.update_pp(*args, **kwargs)
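# The unwrap_* helpers above appear to exist so that multiprocessing pools, which
# require picklable callables and arguments, can invoke instance methods: each
# worker receives a module-level function plus an argument tuple whose first
# element is the instance itself. A minimal sketch of the assumed usage (variable
# names are placeholders, and Antenna.FT_pp() is assumed to need no arguments
# beyond the instance):
#
#     pool = MP.Pool(processes=nproc)
#     results = pool.map(unwrap_antenna_FT, [(ant,) for ant in antenna_list])
#     pool.close()
#     pool.join()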
def antenna_grid_mapping(gridind_raveled, values, bins=None):
if bins is None:
raise ValueError('Input parameter bins must be specified')
if NP.iscomplexobj(values):
retval = OPS.binned_statistic(gridind_raveled, values.real, statistic='sum', bins=bins)[0]
retval = retval.astype(NP.complex64)
retval += 1j * OPS.binned_statistic(gridind_raveled, values.imag, statistic='sum', bins=bins)[0]
else:
retval = OPS.binned_statistic(gridind_raveled, values, statistic='sum', bins=bins)[0]
# print MP.current_process().name
return retval
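# antenna_grid_mapping() and its baseline counterpart below accumulate complex
# values onto grid cells by summing within bins of the flattened grid index.
# Since the binned-statistic routine operates on real-valued data, the real and
# imaginary parts are binned separately and recombined. The same idea expressed
# with scipy directly (a sketch for illustration; the inputs are placeholders):
#
#     from scipy.stats import binned_statistic
#     re_sum, _, _ = binned_statistic(gridind_raveled, values.real, statistic='sum', bins=bins)
#     im_sum, _, _ = binned_statistic(gridind_raveled, values.imag, statistic='sum', bins=bins)
#     accumulated = re_sum + 1j * im_sum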
def antenna_grid_mapping_arg_splitter(args, **kwargs):
return antenna_grid_mapping(*args, **kwargs)
def antenna_grid_mapper(gridind_raveled, values, bins, label, outq):
if NP.iscomplexobj(values):
retval = OPS.binned_statistic(gridind_raveled, values.real, statistic='sum', bins=bins)[0]
retval = retval.astype(NP.complex64)
retval += 1j * OPS.binned_statistic(gridind_raveled, values.imag, statistic='sum', bins=bins)[0]
else:
retval = OPS.binned_statistic(gridind_raveled, values, statistic='sum', bins=bins)[0]
outdict = {}
outdict[label] = retval
# print MP.current_process().name
outq.put(outdict)
def baseline_grid_mapping(gridind_raveled, values, bins=None):
if bins is None:
raise ValueError('Input parameter bins must be specified')
if NP.iscomplexobj(values):
retval = OPS.binned_statistic(gridind_raveled, values.real, statistic='sum', bins=bins)[0]
retval = retval.astype(NP.complex64)
retval += 1j * OPS.binned_statistic(gridind_raveled, values.imag, statistic='sum', bins=bins)[0]
else:
retval = OPS.binned_statistic(gridind_raveled, values, statistic='sum', bins=bins)[0]
# print MP.current_process().name
return retval
def baseline_grid_mapping_arg_splitter(args, **kwargs):
return baseline_grid_mapping(*args, **kwargs)
def baseline_grid_mapper(gridind_raveled, values, bins, label, outq):
if NP.iscomplexobj(values):
retval = OPS.binned_statistic(gridind_raveled, values.real, statistic='sum', bins=bins)[0]
retval = retval.astype(NP.complex64)
retval += 1j * OPS.binned_statistic(gridind_raveled, values.imag, statistic='sum', bins=bins)[0]
else:
retval = OPS.binned_statistic(gridind_raveled, values, statistic='sum', bins=bins)[0]
outdict = {}
outdict[label] = retval
# print MP.current_process().name
outq.put(outdict)
def find_1NN_arg_splitter(args, **kwargs):
return LKP.find_1NN(*args, **kwargs)
def genMatrixMapper_arg_splitter(args, **kwargs):
return genMatrixMapper(*args, **kwargs)
def genMatrixMapper(val, ind, shape):
if not isinstance(val, NP.ndarray):
raise TypeError('Input parameter val must be a numpy array')
if not isinstance(ind, (list,tuple)):
raise TypeError('Input parameter ind must be a list or tuple containing numpy arrays')
if val.size != ind[0].size:
raise ValueError('Input parameters val and ind must have the same size')
if not isinstance(shape, (tuple,list)):
raise TypeError('Input parameter shape must be a tuple or list')
if len(ind) != len(shape):
raise ValueError('Number of index groups in input parameter must match the number of dimensions specified in input parameter shape')
if len(ind) > 1:
for i in range(len(ind)-1):
if ind[i+1].size != ind[i].size:
raise ValueError('All index groups must have same size')
return SpM.csr_matrix((val, ind), shape=shape)
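# genMatrixMapper() packs per-element values and their index arrays into a scipy
# CSR sparse matrix. A toy example of the same call (the values, indices and shape
# below are made up purely for illustration):
#
#     vals = NP.asarray([1.0, 2.0, 3.0])
#     rows = NP.asarray([0, 1, 2])
#     cols = NP.asarray([2, 0, 1])
#     spmat = genMatrixMapper(vals, (rows, cols), (3, 3))   # 3 x 3 CSR matrix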
def unwrap_multidim_product(args, **kwargs):
return multidim_product(*args, **kwargs)
def multidim_product(spmat, dnsmat1, dnsmat2, spmatshape):
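# Note: the shapes are inferred from the expression below rather than documented
# in the original. spmat is a sparse matrix whose dense form reshapes to the 3D
# spmatshape, while dnsmat1 and dnsmat2 are dense arrays whose elementwise product
# is broadcast against the grid axes, giving an output of shape
# (dnsmat1.shape[0],) + spmatshape.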
dnsmat = dnsmat1 * dnsmat2
return spmat.toarray().reshape(spmatshape)[NP.newaxis,:,:,:] * dnsmat[:,NP.newaxis,NP.newaxis,:]
################################################################################
def evalApertureResponse(wts_grid, ulocs, vlocs, pad=0, skypos=None):
"""
--------------------------------------------------------------------------
Evaluate response on sky from aperture weights on the UV-plane. It applies
to both single antennas and antenna pairs.
Inputs:
wts_grid [numpy array or scipy sparse matrix] Complex weights on
the aperture-plane and along frequency axis. It can be a numpy
array of size nv x nu x nchan or a scipy sparse matrix of
size (nv x nu) x nchan.
ulocs [numpy array] u-locations on grid. It is of size nu and must
match the dimension in wts_grid
vlocs [numpy array] v-locations on grid. It is of size nv and must
match the dimension in wts_grid
pad [integer] indicates the amount of padding before estimating
power pattern. Applicable only when skypos is set to None.
The output power pattern will be of size 2**pad times the
size of the UV-grid along l- and m-axes. Value must
not be negative. Default=0 (implies no padding). pad=1
implies padding by factor 2 along u- and v-axes
skypos [numpy array] Positions on sky at which power pattern is
to be estimated. It is a 2- or 3-column numpy array in
direction cosine coordinates. It must be of size nsrc x 2
or nsrc x 3. If set to None (default), the power pattern is
estimated over a grid on the sky. If a numpy array is
specified, then power pattern at the given locations is
estimated.
Outputs:
pbinfo is a dictionary with the following keys and values:
'pb' [numpy array] If skypos was set to None, the numpy array is
3D masked array of size nm x nl x nchan. The mask is based on
which parts of the grid are valid direction cosine coordinates
on the sky. If skypos was a numpy array denoting specific sky
locations, the value in this key is a 2D numpy array of size
nsrc x nchan
'llocs' [None or numpy array] If the power pattern estimated is a grid
(if input skypos was set to None), it contains the l-locations
of the grid on the sky. If input skypos was not set to None,
the value under this key is set to None
'mlocs' [None or numpy array] If the power pattern estimated is a grid
(if input skypos was set to None), it contains the m-locations
of the grid on the sky. If input skypos was not set to None,
the value under this key is set to None
------------------------------------------------------------------------
"""
try:
wts_grid, ulocs, vlocs
except NameError:
raise NameError('Inputs wts_grid, ulocs and vlocs must be specified')
if skypos is not None:
if not isinstance(skypos, NP.ndarray):
raise TypeError('Input skypos must be a numpy array')
if skypos.ndim != 2:
raise ValueError('Input skypos must be a 2D numpy array')
if (skypos.shape[1] < 2) or (skypos.shape[1] > 3):
raise ValueError('Input skypos must be a 2- or 3-column array')
skypos = skypos[:,:2]
if NP.any(NP.sum(skypos**2, axis=1) > 1.0):
raise ValueError('Magnitude of skypos direction cosine must not exceed unity')
if not isinstance(ulocs, NP.ndarray):
raise TypeError('Input ulocs must be a numpy array')
if not isinstance(vlocs, NP.ndarray):
raise TypeError('Input vlocs must be a numpy array')
if not isinstance(pad, int):
        raise TypeError('Input pad must be an integer')
if pad < 0:
raise ValueError('Input pad must be non-negative')
ulocs = ulocs.ravel()
vlocs = vlocs.ravel()
    wts_shape = wts_grid.shape
    if SpM.issparse(wts_grid):
        if wts_shape[0] != ulocs.size * vlocs.size:
            raise ValueError('Shape of input wts_grid incompatible with that of ulocs and vlocs')
    elif (wts_shape[0] != vlocs.size) or (wts_shape[1] != ulocs.size):
        raise ValueError('Shape of input wts_grid incompatible with that of ulocs and vlocs')
if SpM.issparse(wts_grid):
sum_wts = wts_grid.sum(axis=0).A # 1 x nchan
sum_wts = sum_wts[NP.newaxis,:,:] # 1 x 1 x nchan
else:
sum_wts = NP.sum(wts_grid, axis=(0,1), keepdims=True) # 1 x 1 x nchan
llocs = None
mlocs = None
if skypos is None:
if SpM.issparse(wts_grid):
shape_tuple = (vlocs.size, ulocs.size) + (wts_grid.shape[1],)
wts_grid = wts_grid.toarray().reshape(shape_tuple)
padded_wts_grid = NP.pad(wts_grid, (((2**pad-1)*vlocs.size/2,(2**pad-1)*vlocs.size/2),((2**pad-1)*ulocs.size/2,(2**pad-1)*ulocs.size/2),(0,0)), mode='constant', constant_values=0)
padded_wts_grid = NP.fft.ifftshift(padded_wts_grid, axes=(0,1))
wts_lmf = NP.fft.fft2(padded_wts_grid, axes=(0,1)) / sum_wts
pb = NP.fft.fftshift(wts_lmf, axes=(0,1))
llocs = NP.fft.fftshift(NP.fft.fftfreq(2**pad * ulocs.size, ulocs[1]-ulocs[0]))
mlocs = NP.fft.fftshift(NP.fft.fftfreq(2**pad * vlocs.size, vlocs[1]-vlocs[0]))
lmgrid_invalid = llocs.reshape(1,-1)**2 + mlocs.reshape(-1,1)**2 > 1.0
lmgrid_invalid = lmgrid_invalid[:,:,NP.newaxis] * NP.ones(pb.shape[2], dtype=NP.bool).reshape(1,1,-1)
pb = MA.array(pb, mask=lmgrid_invalid)
else:
gridu, gridv = NP.meshgrid(ulocs, vlocs)
griduv = NP.hstack((gridu.reshape(-1,1),gridv.reshape(-1,1)))
if SpM.issparse(wts_grid):
uvind = SpM.find(wts_grid)[0]
else:
eps = 1e-10
wts_grid = wts_grid.reshape(griduv.shape[0],-1)
uvind, freqind = NP.where(NP.abs(wts_grid) > eps)
wts_grid = SpM.csr_matrix((wts_grid[(uvind, freqind)], (uvind, freqind)), shape=(gridu.size,wts_grid.shape[1]), dtype=NP.complex64)
uniq_uvind = NP.unique(uvind)
matFT = NP.exp(-1j*2*NP.pi*NP.dot(skypos, griduv[uniq_uvind,:].T))
uvmeshind, srcmeshind = NP.meshgrid(uniq_uvind, NP.arange(skypos.shape[0]))
uvmeshind = uvmeshind.ravel()
srcmeshind = srcmeshind.ravel()
spFTmat = SpM.csr_matrix((matFT.ravel(), (srcmeshind, uvmeshind)), shape=(skypos.shape[0],griduv.shape[0]), dtype=NP.complex64)
sum_wts = wts_grid.sum(axis=0).A
pb = spFTmat.dot(wts_grid) / sum_wts
pb = pb.A
pb = pb.real
pbinfo = {'pb': pb, 'llocs': llocs, 'mlocs': mlocs}
return pbinfo
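################################################################################
# A minimal, self-contained usage sketch for evalApertureResponse(). The grid
# size, spacing, aperture radius and number of channels below are illustrative
# assumptions only and are not used elsewhere in this module.

def example_aperture_response_sketch():

    """
    ----------------------------------------------------------------------------
    Usage sketch (assumed values): evaluate the power pattern of a uniformly
    illuminated circular aperture specified as sparse weights on a UV-grid.
    ----------------------------------------------------------------------------
    """

    nu, nv, nchan = 64, 64, 2
    du = 0.5    # grid spacing in wavelengths (assumed)
    ulocs = du * (NP.arange(nu) - nu/2)
    vlocs = du * (NP.arange(nv) - nv/2)
    gridu, gridv = NP.meshgrid(ulocs, vlocs)
    aperture = NP.sqrt(gridu**2 + gridv**2) <= 3.0    # uniformly illuminated disc
    wts = NP.where(aperture, 1.0+0j, 0.0+0j).reshape(-1,1) * NP.ones((1,nchan), dtype=NP.complex64)
    wts_grid = SpM.csr_matrix(wts)    # (nv x nu) x nchan sparse aperture weights
    return evalApertureResponse(wts_grid, ulocs, vlocs, pad=1)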
################################################################################
class CrossPolInfo(object):
"""
----------------------------------------------------------------------------
Class to manage cross polarization products of an interferometer.
Attributes:
Vt [dictionary] holds cross-correlation time series under 4 cross-
polarizations which are stored under keys 'P11', 'P12', 'P21', and
'P22'
Vf [dictionary] holds cross-correlation spectra under 4 cross-
polarizations which are stored under keys 'P11', 'P12', 'P21', and
'P22'
flag [dictionary] holds boolean flags for each of the 4 cross-
polarizations which are stored under keys 'P11', 'P12', 'P21', and
'P22'. Default=True means it is flagged.
Member functions:
__init__() Initializes an instance of class CrossPolInfo
__str__() Prints a summary of current attributes.
update_flags() Updates the flags based on current inputs and verifies and
                   updates flags based on current values of the visibilities.
update() Updates the visibility time series and spectra for different
cross-polarizations
Read the member function docstrings for details.
----------------------------------------------------------------------------
"""
def __init__(self, nsamples=1):
"""
------------------------------------------------------------------------
Initialize the CrossPolInfo Class which manages polarization information
of an interferometer.
Class attributes initialized are:
        Vt, Vf, flag, and internal attributes _init_flags_on and _init_data_on
        Read docstring of class CrossPolInfo for details on these attributes.
------------------------------------------------------------------------
"""
self.Vt = {}
self.Vf = {}
self.flag = {}
self._init_flags_on = True
self._init_data_on = True
if not isinstance(nsamples, int):
raise TypeError('nsamples must be an integer')
elif nsamples <= 0:
nsamples = 1
for pol in ['P11', 'P12', 'P21', 'P22']:
self.Vt[pol] = NP.empty(nsamples, dtype=NP.complex64)
self.Vf[pol] = NP.empty(nsamples, dtype=NP.complex64)
self.Vt[pol].fill(NP.nan)
self.Vf[pol].fill(NP.nan)
self.flag[pol] = True
############################################################################
def __str__(self):
return ' Instance of class "{0}" in module "{1}" \n flag (P11): {2} \n flag (P12): {3} \n flag (P21): {4} \n flag (P22): {5} '.format(self.__class__.__name__, self.__module__, self.flag['P11'], self.flag['P12'], self.flag['P21'], self.flag['P22'])
############################################################################
def update_flags(self, flags=None, verify=True):
"""
------------------------------------------------------------------------
Updates the flags based on current inputs and verifies and updates flags
based on current values of the visibilities.
Inputs:
flags [dictionary] holds boolean flags for each of the 4 cross-
polarizations which are stored under keys 'P11', 'P12', 'P21',
and 'P22'. Default=None means no new flagging to be applied. If
the value under the cross-polarization key is True, it is to be
flagged and if False, it is to be unflagged.
verify [boolean] If True, verify and update the flags, if necessary.
Visibilities are checked for NaN values and if found, the
flag in the corresponding polarization is set to True.
Default=True.
Flag verification and re-updating happens if flags is set to None or if
verify is set to True.
------------------------------------------------------------------------
"""
if not isinstance(verify, bool):
raise TypeError('Input keyword verify must be of boolean type')
if flags is not None:
if not isinstance(flags, dict):
raise TypeError('Input parameter flags must be a dictionary')
for pol in ['P11', 'P12', 'P21', 'P22']:
if pol in flags:
if isinstance(flags[pol], bool):
self.flag[pol] = flags[pol]
else:
raise TypeError('flag values must be boolean')
self._init_flags_on = False
# self.flags = {pol: flags[pol] for pol in ['P11', 'P12', 'P21', 'P22'] if pol in flags}
# self._init_flags_on = False
# Perform flag verification and re-update current flags
if verify or (flags is None):
if not self._init_data_on:
for pol in ['P11', 'P12', 'P21', 'P22']:
if NP.any(NP.isnan(self.Vt[pol])):
self.flag[pol] = True
self._init_flags_on = False
############################################################################
def update(self, Vt=None, Vf=None, flags=None, verify=False):
"""
------------------------------------------------------------------------
Updates the visibility time series and spectra for different
cross-polarizations
Inputs:
Vt [dictionary] holds cross-correlation time series under 4 cross-
polarizations which are stored under keys 'P11', 'P12', 'P21',
and 'P22'. Default=None implies no updates for Vt.
Vf [dictionary] holds cross-correlation spectra under 4 cross-
polarizations which are stored under keys 'P11', 'P12', 'P21',
                   and 'P22'. Default=None implies no updates for Vf.
        flags      [dictionary] holds boolean flags for each of the 4 cross-
polarizations which are stored under keys 'P11', 'P12', 'P21',
and 'P22'. Default=None means no updates for flags.
verify [boolean] If True, verify and update the flags, if necessary.
Visibilities are checked for NaN values and if found, the
flag in the corresponding polarization is set to True.
Default=False.
------------------------------------------------------------------------
"""
current_flags = copy.deepcopy(self.flag)
if flags is None:
flags = copy.deepcopy(current_flags)
# if flags is not None:
# self.update_flags(flags)
if Vt is not None:
if isinstance(Vt, dict):
for pol in ['P11', 'P12', 'P21', 'P22']:
if pol in Vt:
self.Vt[pol] = Vt[pol]
if NP.any(NP.isnan(Vt[pol])):
# self.Vt[pol] = NP.nan
flags[pol] = True
# self.flag[pol] = True
self._init_data_on = False
else:
raise TypeError('Input parameter Vt must be a dictionary')
if Vf is not None:
if isinstance(Vf, dict):
for pol in ['P11', 'P12', 'P21', 'P22']:
if pol in Vf:
self.Vf[pol] = Vf[pol]
if NP.any(NP.isnan(Vf[pol])):
# self.Vf[pol] = NP.nan
flags[pol] = True
# self.flag[pol] = True
self._init_data_on = False
else:
raise TypeError('Input parameter Vf must be a dictionary')
# Update flags
self.update_flags(flags=flags, verify=verify)
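    ############################################################################
    # Usage sketch (hypothetical values): update the time series of one
    # cross-polarization, unflag it explicitly, and let verification re-flag
    # any polarization whose data contains NaN:
    #     cpol = CrossPolInfo(nsamples=16)
    #     vis = NP.random.randn(16) + 1j*NP.random.randn(16)
    #     cpol.update(Vt={'P11': vis}, flags={'P11': False}, verify=True)
    #     cpol.flag['P11']    # False, since the new time series contains no NaN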
################################################################################
class Interferometer(object):
"""
----------------------------------------------------------------------------
Class to manage individual 2-element interferometer information.
Attributes:
A1 [instance of class Antenna] First antenna
A2 [instance of class Antenna] Second antenna
corr_type [string] Correlator type. Accepted values are 'FX' (default) and
'XF'
    label:       [tuple] A unique identifier for the interferometer, formed
                 as a tuple of the labels of the two component antennas
    latitude:    [Scalar] Latitude of the interferometer, taken as the mean
                 of the latitudes of the two component antennas
    location:    [Instance of GEOM.Point class] Baseline vector of the
                 interferometer in the local East, North, Up coordinate
                 system, obtained as A1.location - A2.location
timestamp: [Scalar] String or float representing the timestamp for the
current attributes
timestamps [list] consists of a list of timestamps common to both of the
individual antennas in the antenna pair
t: [vector] The time axis for the time series of electric fields
f: [vector] Frequency axis obtained by a Fourier Transform of
the electric field time series. Same length as attribute t
f0: [Scalar] Center frequency in Hz.
crosspol: [Instance of class CrossPolInfo] polarization information for
the interferometer. Read docstring of class CrossPolInfo for
details
aperture [Instance of class APR.Aperture] aperture information
for the interferometer. Read docstring of class Aperture for
details
Vt_stack [dictionary] holds a stack of complex visibility time series
measured at various time stamps under 4 polarizations which are
stored under keys 'P11', 'P12', 'P21', and 'P22'. Each value
under the polarization key is stored as numpy array with rows
equal to the number of timestamps and columns equal to the
number of samples in a timeseries
Vf_stack [dictionary] holds a stack of complex visibility spectra
measured at various time stamps under 4 polarizations which are
stored under keys 'P11', 'P12', 'P21' and 'P22'. Each value
under the polarization key is stored as numpy array with rows
equal to the number of timestamps and columns equal to the
number of spectral channels
flag_stack [dictionary] holds a stack of flags appropriate for different
time stamps as a numpy array under 4 polarizations which are
stored under keys 'P11', 'P12', 'P21' and 'P22'. Each value
under the polarization key is stored as numpy array with
number of elements equal to the number of timestamps
Vf_avg [dictionary] holds in keys 'P11', 'P12', 'P21', 'P22' for each
polarization the stacked and averaged complex visibility spectra
as a numpy array where the number of rows is the number of time
bins after averaging visibilities in those time bins and the
number of columns is equal to the number of spectral channels
(same as in Vf_stack)
twts [dictionary] holds in keys 'P11', 'P12', 'P21', 'P22' for each
polarization the number of unflagged timestamps in each time
bin that contributed to the averaging of visibilities stored in
Vf_avg. Each array size equal to the number of rows in Vf_avg
under the corresponding polarization.
tbinsize [scalar or dictionary] Contains bin size of timestamps while
stacking. Default = None means all visibility spectra over all
timestamps are averaged. If scalar, the same (positive) value
applies to all polarizations. If dictionary, timestamp bin size
(positive) is provided under each key 'P11', 'P12', 'P21',
'P22'. If any of the keys is missing the visibilities for that
polarization are averaged over all timestamps.
wts: [dictionary] The gridding weights for interferometer. Different
cross-polarizations 'P11', 'P12', 'P21' and 'P22' form the keys
of this dictionary. These values are in general complex. Under
each key, the values are maintained as a list of numpy vectors,
where each vector corresponds to a frequency channel. See
wtspos_scale for more requirements.
wtspos [dictionary] two-dimensional locations of the gridding weights
in wts for each cross-polarization under keys 'P11', 'P12',
'P21', and 'P22'. The locations are in ENU coordinate system
as a list of 2-column numpy arrays. Each 2-column array in the
list is the position of the gridding weights for a corresponding
frequency channel. The size of the list must be the same as wts
and the number of channels. Units are in number of wavelengths.
See wtspos_scale for more requirements.
wtspos_scale [dictionary] The scaling of weights is specified for each
cross-polarization under one of the keys 'P11', 'P12', 'P21'
or 'P22'. The values under these keys can be either None
(default) or 'scale'. If None, numpy vectors in wts and
wtspos under corresponding keys are provided for each
frequency channel. If set to 'scale' wts and wtspos contain a
list of only one numpy array corresponding to a reference
frequency. This is scaled internally to correspond to the
first channel. The gridding positions are correspondingly
scaled to all the frequency channels.
blc [2-element numpy array] Bottom Left corner where the
interferometer contributes non-zero weight to the grid. Same
for all cross-polarizations
trc [2-element numpy array] Top right corner where the
interferometer contributes non-zero weight to the grid. Same
for all cross-polarizations
Member Functions:
__init__(): Initializes an instance of class Interferometer
__str__(): Prints a summary of current attributes
channels(): Computes the frequency channels from a temporal Fourier
Transform
FX() Computes the visibility spectrum using an FX operation,
i.e., Fourier transform (F) followed by multiplication (X)
using antenna information in attributes A1 and A2. All four
cross polarizations are computed.
FX_pp() Computes the visibility spectrum using an FX operation,
i.e., Fourier transform (F) followed by multiplication (X).
All four cross polarizations are computed. To be used
internally for parallel processing and not by the user directly
XF() Computes the visibility spectrum using an XF operation,
                 i.e., cross-correlation (X) followed by Fourier transform
(F) using antenna information in attributes A1 and A2. All
four cross polarizations are computed.
f2t() Computes the visibility time-series from the spectra for each
cross-polarization
t2f() Computes the visibility spectra from the time-series for each
cross-polarization
FX_on_stack()
Computes the visibility spectrum using an FX operation on the
time-stacked electric fields in the individual antennas in the
pair, i.e., Fourier transform (F) followed by multiplication
(X). All four cross-polarizations are computed.
flags_on_stack()
Computes the visibility flags from the time-stacked electric
fields for the common timestamps between the pair of antennas.
All four cross-polarizations are computed.
XF_on_stack()
Computes the visibility lags using an XF operation on the
time-stacked electric fields time-series in the individual
antennas in the pair, i.e., Cross-correlation (X) followed by
Fourier transform (F). All four cross-polarizations are
computed.
f2t_on_stack()
Computes the visibility lags from the spectra for each
cross-polarization from time-stacked visibilities
t2f_on_stack()
Computes the visibility spectra from the time-series for each
cross-polarization from time-stacked visibility lags
flip_antenna_pair()
Flip the antenna pair in the interferometer. This inverts the
baseline vector and conjugates the visibility spectra
refresh_antenna_pairs()
Update the individual antenna instances of the antenna pair
forming the interferometer with provided values
get_visibilities()
Returns the visibilities based on selection criteria on
timestamp flags, timestamps and frequency channel indices
and the type of data (most recent, stack or averaged
visibilities)
update_flags()
Updates flags for cross-polarizations from component antenna
polarization flags and also overrides with flags if provided
as input parameters
update(): Updates the interferometer instance with newer attribute values
Updates the visibility spectrum and timeseries and applies FX
or XF operation.
update_pp() Updates the interferometer instance with newer attribute
values. Updates the visibility spectrum and timeseries and
applies FX or XF operation. Used internally when parallel
processing is used. Not to be used by the user directly.
stack() Stacks and computes visibilities and flags from the individual
antennas in the pair.
accumulate() Accumulate and average visibility spectra across timestamps
under different polarizations depending on the time bin size
for the corresponding polarization.
save(): Saves the interferometer information to disk. Needs serious
development.
Read the member function docstrings for details.
----------------------------------------------------------------------------
"""
def __init__(self, antenna1, antenna2, corr_type=None, aperture=None):
"""
------------------------------------------------------------------------
Initialize the Interferometer Class which manages an interferometer's
information
Class attributes initialized are:
label, latitude, location, pol, t, timestamp, f0, f, wts, wtspos,
wtspos_scale, gridinfo, blc, trc, timestamps, Vt_stack, Vf_stack,
flag_stack, Vf_avg, twts, tbinsize, aperture
        Read docstring of class Interferometer for details on these attributes.
------------------------------------------------------------------------
"""
try:
antenna1, antenna2
except NameError:
raise NameError('Two individual antenna instances must be provided.')
if not isinstance(antenna1, Antenna):
raise TypeError('antenna1 not an instance of class Antenna')
if not isinstance(antenna2, Antenna):
raise TypeError('antenna2 not an instance of class Antenna')
self.A1 = antenna1
self.A2 = antenna2
if (corr_type is None) or (corr_type == 'FX'):
self.corr_type = 'FX'
elif corr_type == 'XF':
self.corr_type = corr_type
else:
raise ValueError('Invalid correlator type')
self.latitude = 0.5 * (self.A1.latitude + self.A2.latitude) # mean latitude of two antennas
self.location = self.A1.location - self.A2.location # Baseline vector
if self.A1.f0 != self.A2.f0:
raise ValueError('The center frequencies of the two antennas must be identical')
self.f0 = self.A1.f0
self.f = self.A1.f
self.label = (self.A1.label, self.A2.label)
self.t = 0.0
self.timestamp = 0.0
self.timestamps = []
if aperture is not None:
if isinstance(aperture, APR.Aperture):
if len(aperture.pol) != 4:
raise ValueError('Interferometer aperture must contain four cross-polarization types')
self.aperture = aperture
else:
raise TypeError('aperture must be an instance of class Aperture found in module {0}'.format(APR.__name__))
else:
self.aperture = APR.Aperture(pol_type='cross')
self.crosspol = CrossPolInfo(self.f.size)
self.Vt_stack = {}
self.Vf_stack = {}
self.flag_stack = {}
self.Vf_avg = {}
self.twts = {}
self.tbinsize = None
self.wtspos = {}
self.wts = {}
self.wtspos_scale = {}
self._gridinfo = {}
for pol in ['P11', 'P12', 'P21', 'P22']:
self.Vt_stack[pol] = None
self.Vf_stack[pol] = None
self.flag_stack[pol] = NP.asarray([])
self.Vf_avg[pol] = None
self.twts[pol] = None
self.wtspos[pol] = []
self.wts[pol] = []
self.wtspos_scale[pol] = None
self._gridinfo[pol] = {}
self.blc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1)
self.trc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1)
############################################################################
def __str__(self):
return ' Instance of class "{0}" in module "{1}" \n label: ({2[0]}, {2[1]}) \n location: {3}'.format(self.__class__.__name__, self.__module__, self.label, self.location.__str__())
############################################################################
def channels(self):
"""
------------------------------------------------------------------------
Computes the frequency channels from a temporal Fourier Transform
Output(s):
Frequencies corresponding to channels obtained by a Fourier Transform
of the time series.
------------------------------------------------------------------------
"""
return DSP.spectax(self.A1.t.size + self.A2.t.size, resolution=self.A1.t[1]-self.A1.t[0], shift=True)
############################################################################
def FX(self):
"""
------------------------------------------------------------------------
Computes the visibility spectrum using an FX operation, i.e., Fourier
transform (F) followed by multiplication (X). All four cross
polarizations are computed.
------------------------------------------------------------------------
"""
self.t = NP.hstack((self.A1.t.ravel(), self.A1.t.max()+self.A2.t.ravel()))
self.f = self.f0 + self.channels()
self.crosspol.Vf['P11'] = self.A1.antpol.Ef['P1'] * self.A2.antpol.Ef['P1'].conjugate()
self.crosspol.Vf['P12'] = self.A1.antpol.Ef['P1'] * self.A2.antpol.Ef['P2'].conjugate()
self.crosspol.Vf['P21'] = self.A1.antpol.Ef['P2'] * self.A2.antpol.Ef['P1'].conjugate()
self.crosspol.Vf['P22'] = self.A1.antpol.Ef['P2'] * self.A2.antpol.Ef['P2'].conjugate()
self.f2t()
self.crosspol._init_data_on = False
self.update_flags(flags=None, stack=False, verify=True)
############################################################################
def FX_pp(self):
"""
------------------------------------------------------------------------
Computes the visibility spectrum using an FX operation, i.e., Fourier
transform (F) followed by multiplication (X). All four cross
polarizations are computed. To be used internally for parallel
processing and not by the user directly
------------------------------------------------------------------------
"""
self.t = NP.hstack((self.A1.t.ravel(), self.A1.t.max()+self.A2.t.ravel()))
self.f = self.f0 + self.channels()
self.crosspol.Vf['P11'] = self.A1.antpol.Ef['P1'] * self.A2.antpol.Ef['P1'].conjugate()
self.crosspol.Vf['P12'] = self.A1.antpol.Ef['P1'] * self.A2.antpol.Ef['P2'].conjugate()
self.crosspol.Vf['P21'] = self.A1.antpol.Ef['P2'] * self.A2.antpol.Ef['P1'].conjugate()
self.crosspol.Vf['P22'] = self.A1.antpol.Ef['P2'] * self.A2.antpol.Ef['P2'].conjugate()
self.f2t()
self.crosspol._init_data_on = False
self.update_flags(flags=None, stack=False, verify=True)
return self
############################################################################
def XF(self):
"""
------------------------------------------------------------------------
        Computes the visibility spectrum using an XF operation, i.e., cross-
        correlation (X) followed by Fourier transform (F). All four cross
        polarizations are computed.
------------------------------------------------------------------------
"""
self.t = NP.hstack((self.A1.t.ravel(), self.A1.t.max()+self.A2.t.ravel()))
self.f = self.f0 + self.channels()
self.crosspol.Vt['P11'] = DSP.XC(self.A1.antpol.Et['P1'], self.A2.antpol.Et['P1'], shift=False)
self.crosspol.Vt['P12'] = DSP.XC(self.A1.antpol.Et['P1'], self.A2.antpol.Et['P2'], shift=False)
self.crosspol.Vt['P21'] = DSP.XC(self.A1.antpol.Et['P2'], self.A2.antpol.Et['P1'], shift=False)
self.crosspol.Vt['P22'] = DSP.XC(self.A1.antpol.Et['P2'], self.A2.antpol.Et['P2'], shift=False)
self.t2f()
self.crosspol._init_data_on = False
self.update_flags(flags=None, stack=False, verify=True)
############################################################################
def f2t(self):
"""
------------------------------------------------------------------------
Computes the visibility time-series from the spectra for each cross-
polarization
------------------------------------------------------------------------
"""
for pol in ['P11', 'P12', 'P21', 'P22']:
self.crosspol.Vt[pol] = DSP.FT1D(NP.fft.fftshift(self.crosspol.Vf[pol]), inverse=True, shift=True, verbose=False)
############################################################################
def t2f(self):
"""
------------------------------------------------------------------------
Computes the visibility spectra from the time-series for each cross-
polarization
------------------------------------------------------------------------
"""
for pol in ['P11', 'P12', 'P21', 'P22']:
self.crosspol.Vf[pol] = DSP.FT1D(NP.fft.ifftshift(self.crosspol.Vt[pol]), shift=True, verbose=False)
############################################################################
def FX_on_stack(self):
"""
------------------------------------------------------------------------
Computes the visibility spectrum using an FX operation on the
time-stacked electric fields in the individual antennas in the pair,
i.e., Fourier transform (F) followed by multiplication (X). All four
cross-polarizations are computed.
------------------------------------------------------------------------
"""
self.t = NP.hstack((self.A1.t.ravel(), self.A1.t.max()+self.A2.t.ravel()))
self.f = self.f0 + self.channels()
ts1 = NP.asarray(self.A1.timestamps)
ts2 = NP.asarray(self.A2.timestamps)
common_ts = NP.intersect1d(ts1, ts2, assume_unique=True)
ind1 = NP.in1d(ts1, common_ts, assume_unique=True)
ind2 = NP.in1d(ts2, common_ts, assume_unique=True)
self.Vf_stack['P11'] = self.A1.Ef_stack['P1'][ind1,:] * self.A2.Ef_stack['P1'][ind2,:].conjugate()
self.Vf_stack['P12'] = self.A1.Ef_stack['P1'][ind1,:] * self.A2.Ef_stack['P2'][ind2,:].conjugate()
self.Vf_stack['P21'] = self.A1.Ef_stack['P2'][ind1,:] * self.A2.Ef_stack['P1'][ind2,:].conjugate()
self.Vf_stack['P22'] = self.A1.Ef_stack['P2'][ind1,:] * self.A2.Ef_stack['P2'][ind2,:].conjugate()
self.f2t_on_stack()
############################################################################
def flags_on_stack(self):
"""
------------------------------------------------------------------------
Computes the visibility flags from the time-stacked electric fields for
the common timestamps between the pair of antennas. All four
cross-polarizations are computed.
------------------------------------------------------------------------
"""
ts1 = NP.asarray(self.A1.timestamps)
ts2 = NP.asarray(self.A2.timestamps)
common_ts = NP.intersect1d(ts1, ts2, assume_unique=True)
ind1 = NP.in1d(ts1, common_ts, assume_unique=True)
ind2 = NP.in1d(ts2, common_ts, assume_unique=True)
self.flag_stack['P11'] = NP.logical_or(self.A1.flag_stack['P1'][ind1],
self.A2.flag_stack['P1'][ind2])
self.flag_stack['P12'] = NP.logical_or(self.A1.flag_stack['P1'][ind1],
self.A2.flag_stack['P2'][ind2])
self.flag_stack['P21'] = NP.logical_or(self.A1.flag_stack['P2'][ind1],
self.A2.flag_stack['P1'][ind2])
self.flag_stack['P22'] = NP.logical_or(self.A1.flag_stack['P2'][ind1],
self.A2.flag_stack['P2'][ind2])
############################################################################
def XF_on_stack(self):
"""
------------------------------------------------------------------------
Computes the visibility lags using an XF operation on the time-stacked
electric fields time-series in the individual antennas in the pair,
i.e., Cross-correlation (X) followed by Fourier transform (F). All four
cross-polarizations are computed.
THIS WILL NOT WORK IN ITS CURRENT FORM BECAUSE THE ENGINE OF THIS IS
THE CORRELATE FUNCTION OF NUMPY WRAPPED INSIDE XC() IN MY_DSP_MODULE
AND CURRENTLY IT CAN HANDLE ONLY 1D ARRAYS. NEEDS SERIOUS DEVELOPMENT!
------------------------------------------------------------------------
"""
self.t = NP.hstack((self.A1.t.ravel(), self.A1.t.max()+self.A2.t.ravel()))
self.f = self.f0 + self.channels()
ts1 = NP.asarray(self.A1.timestamps)
ts2 = NP.asarray(self.A2.timestamps)
common_ts = NP.intersect1d(ts1, ts2, assume_unique=True)
ind1 = NP.in1d(ts1, common_ts, assume_unique=True)
ind2 = NP.in1d(ts2, common_ts, assume_unique=True)
self.Vt_stack['P11'] = DSP.XC(self.A1.Et_stack['P1'], self.A2.Et_stack['P1'], shift=False)
self.Vt_stack['P12'] = DSP.XC(self.A1.Et_stack['P1'], self.A2.Et_stack['P2'], shift=False)
self.Vt_stack['P21'] = DSP.XC(self.A1.Et_stack['P2'], self.A2.Et_stack['P1'], shift=False)
self.Vt_stack['P22'] = DSP.XC(self.A1.Et_stack['P2'], self.A2.Et_stack['P2'], shift=False)
self.t2f_on_stack()
############################################################################
def f2t_on_stack(self):
"""
------------------------------------------------------------------------
Computes the visibility lags from the spectra for each cross-
polarization from time-stacked visibilities
------------------------------------------------------------------------
"""
for pol in ['P11', 'P12', 'P21', 'P22']:
self.Vt_stack[pol] = DSP.FT1D(NP.fft.fftshift(self.Vf_stack[pol]),
ax=1, inverse=True, shift=True,
verbose=False)
############################################################################
def t2f_on_stack(self):
"""
------------------------------------------------------------------------
Computes the visibility spectra from the time-series for each cross-
polarization from time-stacked visibility lags
------------------------------------------------------------------------
"""
for pol in ['P11', 'P12', 'P21', 'P22']:
self.Vf_stack[pol] = DSP.FT1D(NP.fft.ifftshift(self.Vt_stack[pol]),
ax=1, shift=True, verbose=False)
############################################################################
def flip_antenna_pair(self):
"""
------------------------------------------------------------------------
Flip the antenna pair in the interferometer. This inverts the baseline
vector and conjugates the visibility spectra
------------------------------------------------------------------------
"""
self.A1, self.A2 = self.A2, self.A1 # Flip antenna instances
self.location = -1 * self.location # Multiply baseline vector by -1
self.blc *= -1
self.trc *= -1
self.crosspol.flag['P12'], self.crosspol.flag['P21'] = self.crosspol.flag['P21'], self.crosspol.flag['P12']
self.crosspol.Vf['P11'] = self.crosspol.Vf['P11'].conjugate()
self.crosspol.Vf['P22'] = self.crosspol.Vf['P22'].conjugate()
self.crosspol.Vf['P12'], self.crosspol.Vf['P21'] = self.crosspol.Vf['P21'].conjugate(), self.crosspol.Vf['P12'].conjugate()
self.f2t()
############################################################################
def refresh_antenna_pairs(self, A1=None, A2=None):
"""
------------------------------------------------------------------------
Update the individual antenna instances of the antenna pair forming
the interferometer with provided values
Inputs:
A1 [instance of class Antenna] first antenna instance in the
antenna pair corresponding to attribute A1. Default=None (no
update for attribute A1)
        A2         [instance of class Antenna] second antenna instance in the
antenna pair corresponding to attribute A2. Default=None (no
update for attribute A2)
------------------------------------------------------------------------
"""
        if A1 is not None:
            if isinstance(A1, Antenna):
                self.A1 = A1
            else:
                raise TypeError('Input A1 must be an instance of class Antenna')
        if A2 is not None:
            if isinstance(A2, Antenna):
                self.A2 = A2
            else:
                raise TypeError('Input A2 must be an instance of class Antenna')
############################################################################
def get_visibilities(self, pol, flag=None, tselect=None, fselect=None,
datapool=None):
"""
------------------------------------------------------------------------
Returns the visibilities based on selection criteria on timestamp
flags, timestamps and frequency channel indices and the type of data
(most recent, stack or averaged visibilities)
Inputs:
pol [string] select baselines of this polarization that are either
flagged or unflagged as specified by input parameter flag.
Allowed values are 'P11', 'P12', 'P21', and 'P22'. Only one of
these values must be specified.
flag [boolean] If False, return visibilities of unflagged
timestamps, otherwise return flagged ones. Default=None means
all visibilities independent of flagging are returned. This
flagging refers to that along the timestamp axis under each
polarization
tselect [scalar, list, numpy array] timestamp index for visibilities
selection. For most recent visibility, it must be set to -1.
For all other selections, indices in tselect must be in the
valid range of indices along time axis for stacked and
averaged visibilities. Default=None means most recent data is
selected.
fselect [scalar, list, numpy array] frequency channel index for
visibilities selection. Indices must be in the valid range of
indices along the frequency axis for visibilities.
Default=None selects all frequency channels
datapool [string] denotes the data pool from which visibilities are to
be selected. Accepted values are 'current', 'stack', 'avg' and
None (default, same as 'current'). If set to None or
'current', the value in tselect is ignored and only
visibilities of the most recent timestamp are selected. If set
to None or 'current' the attribute Vf_stack is checked first
and if unavailable, attribute crosspol.Vf is used. For 'stack'
and 'avg', attributes Vf_stack and Vf_avg are used
respectively
Output:
outdict [dictionary] consists of visibilities information under the
following keys:
'label' [tuple] interferometer label as a tuple of
individual antenna labels
'pol' [string] polarization string, one of 'P11',
'P12', 'P21', or 'P22'
'visibilities' [numpy array] selected visibilities spectra
with dimensions n_ts x nchan which
are in time-frequency order. If no
visibilities are found satisfying the selection
criteria, the value under this key is set to
None.
'twts' [numpy array] weights corresponding to the time
axis in the selected visibilities. These
weights are determined by flagging of
timestamps. A zero weight indicates unflagged
visibilities were not found for that timestamp.
A non-zero weight indicates how many unflagged
visibilities were found for that time bin (in
case of averaged visibilities) or timestamp.
If no visibilities are found satisfying the
selection criteria, the value under this key
is set to None.
------------------------------------------------------------------------
"""
try:
pol
except NameError:
raise NameError('Input parameter pol must be specified.')
if not isinstance(pol, str):
raise TypeError('Input parameter must be a string')
if not pol in ['P11', 'P12', 'P21', 'P22']:
raise ValueError('Invalid specification for input parameter pol')
if datapool is None:
n_timestamps = 1
datapool = 'current'
elif datapool == 'stack':
n_timestamps = len(self.timestamps)
elif datapool == 'avg':
n_timestamps = self.Vf_avg[pol].shape[0]
elif datapool == 'current':
n_timestamps = 1
else:
raise ValueError('Invalid datapool specified')
if tselect is None:
tsind = NP.asarray(-1).reshape(-1) # Selects most recent data
elif isinstance(tselect, (int, float, list, NP.ndarray)):
tsind = NP.asarray(tselect).ravel()
tsind = tsind.astype(NP.int)
if tsind.size == 1:
if (tsind < -1) or (tsind >= n_timestamps):
tsind = NP.asarray(-1).reshape(-1)
else:
if NP.any(tsind < 0) or NP.any(tsind >= n_timestamps):
raise IndexError('Timestamp indices outside available range for the specified datapool')
else:
raise TypeError('tselect must be None, integer, float, list or numpy array for visibilities selection')
if fselect is None:
chans = NP.arange(self.f.size) # Selects all channels
elif isinstance(fselect, (int, float, list, NP.ndarray)):
chans = NP.asarray(fselect).ravel()
chans = chans.astype(NP.int)
if NP.any(chans < 0) or NP.any(chans >= self.f.size):
raise IndexError('Channel indices outside available range')
else:
raise TypeError('fselect must be None, integer, float, list or numpy array for visibilities selection')
select_ind = NP.ix_(tsind, chans)
outdict = {}
outdict['pol'] = pol
outdict['twts'] = None
outdict['label'] = self.label
outdict['visibilities'] = None
if datapool == 'current':
if self.Vf_stack[pol] is not None:
outdict['visibilities'] = self.Vf_stack[pol][-1,chans].reshape(1,chans.size)
outdict['twts'] = NP.logical_not(NP.asarray(self.flag_stack[pol][-1]).astype(NP.bool).reshape(-1)).astype(NP.float)
else:
outdict['visibilities'] = self.crosspol.Vf[pol][chans].reshape(1,chans.size)
outdict['twts'] = NP.logical_not(NP.asarray(self.crosspol.flag[pol]).astype(NP.bool).reshape(-1)).astype(NP.float)
elif datapool == 'stack':
if self.Vf_stack[pol] is not None:
outdict['visibilities'] = self.Vf_stack[pol][select_ind].reshape(tsind.size,chans.size)
outdict['twts'] = NP.logical_not(NP.asarray(self.flag_stack[pol][tsind]).astype(NP.bool).reshape(-1)).astype(NP.float)
else:
raise ValueError('Attribute Vf_stack has not been initialized to obtain visibilities from. Consider running method stack()')
else:
if self.Vf_avg[pol] is not None:
outdict['visibilities'] = self.Vf_avg[pol][select_ind].reshape(tsind.size,chans.size)
outdict['twts'] = NP.asarray(self.twts[pol][tsind]).reshape(-1)
else:
raise ValueError('Attribute Vf_avg has not been initialized to obtain visibilities from. Consider running methods stack() and accumulate()')
if flag is not None:
if not isinstance(flag, bool):
raise TypeError('flag keyword has to be a Boolean value.')
if flag:
if NP.sum(outdict['twts'] == 0) == 0:
outdict['twts'] = None
outdict['visibilities'] = None
else:
outdict['visibilities'] = outdict['visibilities'][outdict['twts']==0,:].reshape(-1,chans.size)
outdict['twts'] = outdict['twts'][outdict['twts']==0].reshape(-1,1)
else:
if NP.sum(outdict['twts'] > 0) == 0:
outdict['twts'] = None
outdict['visibilities'] = None
else:
outdict['visibilities'] = outdict['visibilities'][outdict['twts']>0,:].reshape(-1,chans.size)
outdict['twts'] = outdict['twts'][outdict['twts']>0].reshape(-1,1)
return outdict
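    ############################################################################
    # Usage sketch (hypothetical instance named intrfrmtr, after stack() and
    # accumulate() have been run): select unflagged, time-averaged
    # visibilities of cross-polarization 'P11' over all channels:
    #     vis = intrfrmtr.get_visibilities('P11', flag=False, datapool='avg')
    #     spectra = vis['visibilities']   # (n_tbins x nchan) array or None
    #     weights = vis['twts']           # per-bin counts of unflagged data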
############################################################################
def update_flags(self, flags=None, stack=False, verify=True):
"""
------------------------------------------------------------------------
Updates flags for cross-polarizations from component antenna
polarization flags and also overrides with flags if provided as input
parameters
Inputs:
flags [dictionary] boolean flags for each of the 4 cross-polarizations
of the interferometer which are stored under keys 'P11', 'P12',
'P21', and 'P22'. Default=None means no updates for flags.
stack [boolean] If True, appends the updated flag to the
end of the stack of flags as a function of timestamp. If False,
updates the last flag in the stack with the updated flag and
does not append. Default=False
verify [boolean] If True, verify and update the flags, if necessary.
Visibilities are checked for NaN values and if found, the
flag in the corresponding polarization is set to True. Flags of
individual antennas forming a pair are checked and transferred
to the visibility flags. Default=True
------------------------------------------------------------------------
"""
# By default carry over the flags from previous timestamp
# unless updated in this timestamp as below
# Flags determined from interferometer level
if flags is None:
if self.crosspol._init_flags_on: # begin with all flags set to False for first time update of flags
flags = {pol: False for pol in ['P11', 'P12', 'P21', 'P22']}
else: # for non-first time updates carry over flags from last timestamp and process
flags = copy.deepcopy(self.crosspol.flag)
# now update flags based on current antenna flags
if self.A1.antpol.flag['P1'] or self.A2.antpol.flag['P1']:
flags['P11'] = True
if self.A1.antpol.flag['P2'] or self.A2.antpol.flag['P1']:
flags['P21'] = True
if self.A1.antpol.flag['P1'] or self.A2.antpol.flag['P2']:
flags['P12'] = True
if self.A1.antpol.flag['P2'] or self.A2.antpol.flag['P2']:
flags['P22'] = True
if verify: # Verify provided flags or default flags created above
if self.A1.antpol.flag['P1'] or self.A2.antpol.flag['P1']:
flags['P11'] = True
if self.A1.antpol.flag['P2'] or self.A2.antpol.flag['P1']:
flags['P21'] = True
if self.A1.antpol.flag['P1'] or self.A2.antpol.flag['P2']:
flags['P12'] = True
if self.A1.antpol.flag['P2'] or self.A2.antpol.flag['P2']:
flags['P22'] = True
self.crosspol.update_flags(flags=flags, verify=verify)
# Stack on to last value or update last value in stack
for pol in ['P11', 'P12', 'P21', 'P22']:
if stack is True:
self.flag_stack[pol] = NP.append(self.flag_stack[pol], self.crosspol.flag[pol])
else:
if self.flag_stack[pol].size > 0:
self.flag_stack[pol][-1] = self.crosspol.flag[pol]
# else:
# self.flag_stack[pol] = NP.asarray(self.crosspol.flag[pol]).reshape(-1)
self.flag_stack[pol] = self.flag_stack[pol].astype(NP.bool)
############################################################################
def update_old(self, label=None, Vt=None, t=None, timestamp=None,
location=None, wtsinfo=None, flags=None, gridfunc_freq=None,
ref_freq=None, do_correlate=None, stack=False,
verify_flags=True, verbose=False):
"""
------------------------------------------------------------------------
Updates the interferometer instance with newer attribute values. Updates
the visibility spectrum and timeseries and applies FX or XF operation.
Inputs:
label [Scalar] A unique identifier (preferably a string) for the
                   interferometer. Default=None means no update to apply
latitude [Scalar] Latitude of the antenna's location. Default=None
means no update to apply
location [Instance of GEOM.Point class] The location of the antenna in
local East, North, Up (ENU) coordinate system. Default=None
means no update to apply
timestamp [Scalar] String or float representing the timestamp for the
current attributes. Default=None means no update to apply
t [vector] The time axis for the visibility time series.
Default=None means no update to apply
flags [dictionary] holds boolean flags for each of the 4 cross-
polarizations which are stored under keys 'P11', 'P12',
'P21', and 'P22'. Default=None means no updates for flags.
Vt [dictionary] holds cross-correlation time series under 4
cross-polarizations which are stored under keys 'P11', 'P12',
'P21', and 'P22'. Default=None implies no updates for Vt.
wtsinfo [dictionary] consists of weights information for each of the
four cross-polarizations under keys 'P11', 'P12', 'P21', and
'P22'. Each of the values under the keys is a list of
dictionaries. Length of list is equal to the number
of frequency channels or one (equivalent to setting
wtspos_scale to 'scale'.). The list is indexed by
the frequency channel number. Each element in the list
consists of a dictionary corresponding to that frequency
channel. Each dictionary consists of these items with the
following keys:
wtspos [2-column Numpy array, optional] u- and v-
positions for the gridding weights. Units
are in number of wavelengths.
wts [Numpy array] Complex gridding weights. Size is
equal to the number of rows in wtspos above
orientation [scalar] Orientation (in radians) of the wtspos
coordinate system relative to the local ENU
coordinate system. It is measured North of East.
lookup [string] If set, refers to a file location
containing the wtspos and wts information above
as columns (x-loc [float], y-loc [float], wts
[real], wts[imag if any]). If set, wtspos and wts
information are obtained from this lookup table
and the wtspos and wts keywords in the dictionary
are ignored. Note that wtspos values are obtained
after dividing x- and y-loc lookup values by the
wavelength
gridfunc_freq
[String scalar] If set to None (not provided) or to 'scale'
assumes that wtspos in wtsinfo are given for a
reference frequency which need to be scaled for the frequency
channels. Will be ignored if the list of dictionaries under
the cross-polarization keys in wtsinfo have number of
elements equal to the number of frequency channels.
ref_freq [Scalar] Positive value (in Hz) of reference frequency (used
if gridfunc_freq is set to None or 'scale') at which
wtspos is provided. If set to None, ref_freq is assumed to be
equal to the center frequency in the class Interferometer's
attribute.
do_correlate
[string] Indicates whether correlation operation is to be
performed after updates. Accepted values are 'FX' (for FX
operation) and 'XF' (for XF operation). Default=None means
no correlating operation is to be performed after updates.
        stack      [boolean] If True, appends the updated flag and data to
                   the end of the stack as a function of timestamp. If False
                   (default), updates the last flag and data in the stack and
                   does not append
verify_flags
[boolean] If True, verify and update the flags, if necessary.
Visibilities are checked for NaN values and if found, the
flag in the corresponding polarization is set to True. Flags
of individual antennas forming a pair are checked and
transferred to the visibility flags. Default=True
verbose [boolean] If True, prints diagnostic and progress messages.
If False (default), suppress printing such messages.
------------------------------------------------------------------------
"""
if label is not None: self.label = label
if location is not None: self.location = location
if timestamp is not None: self.timestamp = timestamp
# if latitude is not None: self.latitude = latitude
# Proceed with interferometer updates only if timestamps align
if (self.timestamp != self.A1.timestamp) or (self.timestamp != self.A2.timestamp):
if verbose:
print 'Interferometer timestamp does not match with the component antenna timestamp(s). Update for interferometer {0} will be skipped.'.format(self.label)
else:
self.timestamps += [copy.deepcopy(self.timestamp)]
if t is not None:
self.t = t
self.f = self.f0 + self.channels()
if (Vt is not None) or (flags is not None):
self.crosspol.update(Vt=Vt, flags=flags, verify=verify_flags)
if do_correlate is not None:
if do_correlate == 'FX':
self.FX()
elif do_correlate == 'XF':
self.XF()
else:
raise ValueError('Invalid specification for input parameter do_correlate.')
self.update_flags(flags=None, stack=stack, verify=True) # Re-check flags and stack
for pol in ['P11', 'P12', 'P21', 'P22']:
if self.Vt_stack[pol] is None:
self.Vt_stack[pol] = copy.deepcopy(self.crosspol.Vt[pol].reshape(1,-1))
self.Vf_stack[pol] = copy.deepcopy(self.crosspol.Vf[pol].reshape(1,-1))
else:
if stack:
self.Vt_stack[pol] = NP.vstack((self.Vt_stack[pol], self.crosspol.Vt[pol].reshape(1,-1)))
self.Vf_stack[pol] = NP.vstack((self.Vf_stack[pol], self.crosspol.Vf[pol].reshape(1,-1)))
else:
self.Vt_stack[pol][-1,:] = copy.deepcopy(self.crosspol.Vt[pol].reshape(1,-1))
self.Vf_stack[pol][-1,:] = copy.deepcopy(self.crosspol.Vf[pol].reshape(1,-1))
blc_orig = NP.copy(self.blc)
trc_orig = NP.copy(self.trc)
eps = 1e-6
if wtsinfo is not None:
if not isinstance(wtsinfo, dict):
raise TypeError('Input parameter wtsinfo must be a dictionary.')
self.wtspos = {}
self.wts = {}
self.wtspos_scale = {}
angles = []
max_wtspos = []
for pol in ['P11', 'P12', 'P21', 'P22']:
self.wts[pol] = []
self.wtspos[pol] = []
self.wtspos_scale[pol] = None
if pol in wtsinfo:
if len(wtsinfo[pol]) == len(self.f):
angles += [elem['orientation'] for elem in wtsinfo[pol]]
for i in xrange(len(self.f)):
rotation_matrix = NP.asarray([[NP.cos(-angles[i]), NP.sin(-angles[i])],
[-NP.sin(-angles[i]), NP.cos(-angles[i])]])
if ('lookup' not in wtsinfo[pol][i]) or (wtsinfo[pol][i]['lookup'] is None):
self.wts[pol] += [wtsinfo[pol][i]['wts']]
wtspos = wtsinfo[pol][i]['wtspos']
else:
lookupdata = LKP.read_lookup(wtsinfo[pol][i]['lookup'])
wtspos = NP.hstack((lookupdata[0].reshape(-1,1),lookupdata[1].reshape(-1,1))) * (self.f[i]/FCNST.c)
self.wts[pol] += [lookupdata[2]]
self.wtspos[pol] += [ NP.dot(NP.asarray(wtspos), rotation_matrix.T) ]
max_wtspos += [NP.amax(NP.abs(self.wtspos[pol][-1]), axis=0)]
elif len(wtsinfo[pol]) == 1:
if (gridfunc_freq is None) or (gridfunc_freq == 'scale'):
self.wtspos_scale[pol] = 'scale'
if ref_freq is None:
ref_freq = self.f0
angles = wtsinfo[pol][0]['orientation']
rotation_matrix = NP.asarray([[NP.cos(-angles), NP.sin(-angles)],
[-NP.sin(-angles), NP.cos(-angles)]])
if ('lookup' not in wtsinfo[pol][0]) or (wtsinfo[pol][0]['lookup'] is None):
self.wts[pol] += [ wtsinfo[pol][0]['wts'] ]
wtspos = wtsinfo[pol][0]['wtspos']
else:
lookupdata = LKP.read_lookup(wtsinfo[pol][0]['lookup'])
wtspos = NP.hstack((lookupdata[0].reshape(-1,1),lookupdata[1].reshape(-1,1))) * (ref_freq/FCNST.c)
self.wts[pol] += [lookupdata[2]]
self.wtspos[pol] += [ (self.f[0]/ref_freq) * NP.dot(NP.asarray(wtspos), rotation_matrix.T) ]
max_wtspos += [NP.amax(NP.abs(self.wtspos[pol][-1]), axis=0)]
else:
raise ValueError('gridfunc_freq must be set to None, "scale" or "noscale".')
self.blc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1) - FCNST.c/self.f.min() * NP.amin(NP.abs(self.wtspos[pol][0]), 0)
self.trc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1) + FCNST.c/self.f.min() * NP.amax(NP.abs(self.wtspos[pol][0]), 0)
else:
raise ValueError('Number of elements in wtsinfo for {0} is incompatible with the number of channels.'.format(pol))
max_wtspos = NP.amax(NP.asarray(max_wtspos).reshape(-1,blc_orig.size), axis=0)
self.blc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1) - FCNST.c/self.f.min() * max_wtspos
self.trc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1) + FCNST.c/self.f.min() * max_wtspos
if (NP.abs(NP.linalg.norm(blc_orig)-NP.linalg.norm(self.blc)) > eps) or (NP.abs(NP.linalg.norm(trc_orig)-NP.linalg.norm(self.trc)) > eps):
if verbose:
print 'Grid corner(s) of interferometer {0} have changed. Should re-grid the interferometer array.'.format(self.label)
############################################################################
def update(self, update_dict=None, verbose=False):
"""
------------------------------------------------------------------------
Updates the interferometer instance with newer attribute values. Updates
the visibility spectrum and timeseries and applies FX or XF operation.
Inputs:
update_dict [dictionary] contains the following keys and values:
label [Scalar] A unique identifier (preferably a string) for
                      the interferometer. Default=None means no update to apply
latitude [Scalar] Latitude of the antenna's location. Default=None
means no update to apply
location [Instance of GEOM.Point class] The location of the
antenna in local East, North, Up (ENU) coordinate system.
Default=None means no update to apply
timestamp [Scalar] String or float representing the timestamp for
the current attributes. Default=None means no update to
apply
t [vector] The time axis for the visibility time series.
Default=None means no update to apply
flags [dictionary] holds boolean flags for each of the 4
cross-polarizations which are stored under keys 'P11',
'P12', 'P21', and 'P22'. Default=None means no updates
for flags.
Vt [dictionary] holds cross-correlation time series under 4
cross-polarizations which are stored under keys 'P11',
'P12', 'P21', and 'P22'. Default=None implies no updates
for Vt.
aperture [instance of class APR.Aperture] aperture information for
the interferometer. Read docstring of class Aperture for
details
wtsinfo [dictionary] consists of weights information for each of
the four cross-polarizations under keys 'P11', 'P12',
'P21', and 'P22'. Each of the values under the keys is a
list of dictionaries. Length of list is equal to the
number of frequency channels or one (equivalent to
setting wtspos_scale to 'scale'.). The list is indexed by
the frequency channel number. Each element in the list
consists of a dictionary corresponding to that frequency
channel. Each dictionary consists of these items with the
following keys:
wtspos [2-column Numpy array, optional] u- and v-
positions for the gridding weights. Units
are in number of wavelengths.
wts [Numpy array] Complex gridding weights. Size
is equal to the number of rows in wtspos
above
orientation [scalar] Orientation (in radians) of the
wtspos coordinate system relative to the
local ENU coordinate system. It is measured
North of East.
lookup [string] If set, refers to a file location
containing the wtspos and wts information
above as columns (x-loc [float], y-loc
[float], wts[real], wts[imag if any]). If
set, wtspos and wts information are obtained
from this lookup table and the wtspos and wts
keywords in the dictionary are ignored. Note
that wtspos values are obtained after
dividing x- and y-loc lookup values by the
wavelength
gridfunc_freq
[String scalar] If set to None (not provided) or to
'scale' assumes that wtspos in wtsinfo are given for a
reference frequency which need to be scaled for the
frequency channels. Will be ignored if the list of
dictionaries under the cross-polarization keys in
wtsinfo have number of elements equal to the number of
frequency channels.
ref_freq [Scalar] Positive value (in Hz) of reference frequency
(used if gridfunc_freq is set to None or 'scale') at
which wtspos is provided. If set to None, ref_freq is
assumed to be equal to the center frequency in the class
Interferometer's attribute.
do_correlate
[string] Indicates whether correlation operation is to be
performed after updates. Accepted values are 'FX' (for FX
operation) and 'XF' (for XF operation). Default=None
means no correlating operation is to be performed after
updates.
            stack     [boolean] If True, appends the updated flag and data to
                      the end of the stack as a function of timestamp. If
                      False (default), updates the last flag and data in the
                      stack and does not append
verify_flags
[boolean] If True, verify and update the flags, if
necessary. Visibilities are checked for NaN values and if
found, the flag in the corresponding polarization is set
to True. Flags of individual antennas forming a pair are
checked and transferred to the visibility flags.
Default=True
verbose [boolean] If True, prints diagnostic and progress messages.
If False (default), suppress printing such messages.
------------------------------------------------------------------------
"""
label = None
location = None
timestamp = None
t = None
flags = None
stack = False
verify_flags = True
Vt = None
do_correlate = None
wtsinfo = None
gridfunc_freq = None
ref_freq = None
aperture = None
if update_dict is not None:
if not isinstance(update_dict, dict):
raise TypeError('Input parameter containing updates must be a dictionary')
if 'label' in update_dict: label = update_dict['label']
if 'location' in update_dict: location = update_dict['location']
if 'timestamp' in update_dict: timestamp = update_dict['timestamp']
if 't' in update_dict: t = update_dict['t']
if 'Vt' in update_dict: Vt = update_dict['Vt']
if 'flags' in update_dict: flags = update_dict['flags']
if 'stack' in update_dict: stack = update_dict['stack']
if 'verify_flags' in update_dict: verify_flags = update_dict['verify_flags']
if 'do_correlate' in update_dict: do_correlate = update_dict['do_correlate']
if 'wtsinfo' in update_dict: wtsinfo = update_dict['wtsinfo']
if 'gridfunc_freq' in update_dict: gridfunc_freq = update_dict['gridfunc_freq']
if 'ref_freq' in update_dict: ref_freq = update_dict['ref_freq']
if 'aperture' in update_dict: aperture = update_dict['aperture']
if label is not None: self.label = label
if location is not None: self.location = location
if timestamp is not None: self.timestamp = timestamp
# if latitude is not None: self.latitude = latitude
# Proceed with interferometer updates only if timestamps align
if (self.timestamp != self.A1.timestamp) or (self.timestamp != self.A2.timestamp):
if verbose:
print 'Interferometer timestamp does not match with the component antenna timestamp(s). Update for interferometer {0} will be skipped.'.format(self.label)
else:
self.timestamps += [copy.deepcopy(self.timestamp)]
if t is not None:
self.t = t
self.f = self.f0 + self.channels()
self.crosspol.update(Vt=Vt, flags=flags, verify=verify_flags)
if do_correlate is not None:
if do_correlate == 'FX':
self.FX()
elif do_correlate == 'XF':
self.XF()
else:
raise ValueError('Invalid specification for input parameter do_correlate.')
self.update_flags(flags=None, stack=stack, verify=False) # Stack flags. Flag verification has already been performed inside FX() or XF()
for pol in ['P11', 'P12', 'P21', 'P22']:
if not self.crosspol._init_data_on:
if self.Vt_stack[pol] is None:
if stack:
self.Vt_stack[pol] = copy.deepcopy(self.crosspol.Vt[pol].reshape(1,-1))
self.Vf_stack[pol] = copy.deepcopy(self.crosspol.Vf[pol].reshape(1,-1))
else:
if stack:
self.Vt_stack[pol] = NP.vstack((self.Vt_stack[pol], self.crosspol.Vt[pol].reshape(1,-1)))
self.Vf_stack[pol] = NP.vstack((self.Vf_stack[pol], self.crosspol.Vf[pol].reshape(1,-1)))
else:
self.Vt_stack[pol][-1,:] = copy.deepcopy(self.crosspol.Vt[pol].reshape(1,-1))
self.Vf_stack[pol][-1,:] = copy.deepcopy(self.crosspol.Vf[pol].reshape(1,-1))
blc_orig = NP.copy(self.blc)
trc_orig = NP.copy(self.trc)
eps = 1e-6
if aperture is not None:
if isinstance(aperture, APR.Aperture):
self.aperture = copy.deepcopy(aperture)
else:
raise TypeError('Update for aperture must be an instance of class Aperture.')
if wtsinfo is not None:
if not isinstance(wtsinfo, dict):
raise TypeError('Input parameter wtsinfo must be a dictionary.')
self.wtspos = {}
self.wts = {}
self.wtspos_scale = {}
angles = []
max_wtspos = []
for pol in ['P11', 'P12', 'P21', 'P22']:
self.wts[pol] = []
self.wtspos[pol] = []
self.wtspos_scale[pol] = None
if pol in wtsinfo:
if len(wtsinfo[pol]) == len(self.f):
pol_angles = [elem['orientation'] for elem in wtsinfo[pol]]
angles += pol_angles
for i in xrange(len(self.f)):
# Use this polarization's own orientation angles rather than indexing into
# the accumulated list, which may also hold other polarizations' entries
rotation_matrix = NP.asarray([[NP.cos(-pol_angles[i]), NP.sin(-pol_angles[i])],
[-NP.sin(-pol_angles[i]), NP.cos(-pol_angles[i])]])
if ('lookup' not in wtsinfo[pol][i]) or (wtsinfo[pol][i]['lookup'] is None):
self.wts[pol] += [wtsinfo[pol][i]['wts']]
wtspos = wtsinfo[pol][i]['wtspos']
else:
lookupdata = LKP.read_lookup(wtsinfo[pol][i]['lookup'])
wtspos = NP.hstack((lookupdata[0].reshape(-1,1),lookupdata[1].reshape(-1,1))) * (self.f[i]/FCNST.c)
self.wts[pol] += [lookupdata[2]]
self.wtspos[pol] += [ NP.dot(NP.asarray(wtspos), rotation_matrix.T) ]
max_wtspos += [NP.amax(NP.abs(self.wtspos[pol][-1]), axis=0)]
elif len(wtsinfo[pol]) == 1:
if (gridfunc_freq is None) or (gridfunc_freq == 'scale'):
self.wtspos_scale[pol] = 'scale'
if ref_freq is None:
ref_freq = self.f0
angles = wtsinfo[pol][0]['orientation']
rotation_matrix = NP.asarray([[NP.cos(-angles), NP.sin(-angles)],
[-NP.sin(-angles), NP.cos(-angles)]])
if ('lookup' not in wtsinfo[pol][0]) or (wtsinfo[pol][0]['lookup'] is None):
self.wts[pol] += [ wtsinfo[pol][0]['wts'] ]
wtspos = wtsinfo[pol][0]['wtspos']
else:
lookupdata = LKP.read_lookup(wtsinfo[pol][0]['lookup'])
wtspos = NP.hstack((lookupdata[0].reshape(-1,1),lookupdata[1].reshape(-1,1))) * (ref_freq/FCNST.c)
self.wts[pol] += [lookupdata[2]]
self.wtspos[pol] += [ (self.f[0]/ref_freq) * NP.dot(NP.asarray(wtspos), rotation_matrix.T) ]
max_wtspos += [NP.amax(NP.abs(self.wtspos[pol][-1]), axis=0)]
else:
raise ValueError('gridfunc_freq must be set to None, "scale" or "noscale".')
self.blc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1) - FCNST.c/self.f.min() * NP.amin(NP.abs(self.wtspos[pol][0]), 0)
self.trc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1) + FCNST.c/self.f.min() * NP.amax(NP.abs(self.wtspos[pol][0]), 0)
else:
raise ValueError('Number of elements in wtsinfo for {0} is incompatible with the number of channels.'.format(pol))
max_wtspos = NP.amax(NP.asarray(max_wtspos).reshape(-1,blc_orig.size), axis=0)
self.blc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1) - FCNST.c/self.f.min() * max_wtspos
self.trc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1) + FCNST.c/self.f.min() * max_wtspos
if (NP.abs(NP.linalg.norm(blc_orig)-NP.linalg.norm(self.blc)) > eps) or (NP.abs(NP.linalg.norm(trc_orig)-NP.linalg.norm(self.trc)) > eps):
if verbose:
print 'Grid corner(s) of interferometer {0} have changed. Should re-grid the interferometer array.'.format(self.label)
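# Illustrative sketch (not part of the original code): one plausible way to
# assemble an update_dict for update() as described in the docstring above.
# The names intrfrmtr (an Interferometer instance) and nts are hypothetical
# placeholders; the keys and accepted values follow the docstring. The lines
# are kept commented out so that importing this module is unaffected.
#
# nts = intrfrmtr.t.size  # number of samples in the visibility time series
# update_info = {'timestamp': intrfrmtr.A1.timestamp,  # must match both component antennas
#                'Vt': {p: NP.random.randn(nts) + 1j*NP.random.randn(nts)
#                       for p in ['P11', 'P12', 'P21', 'P22']},  # cross-correlation time series
#                'do_correlate': 'FX',   # Fourier transform the updated time series
#                'stack': True,          # append results to the timestamp stack
#                'verify_flags': True}   # NaNs in Vt flag the corresponding polarization
# intrfrmtr.update(update_dict=update_info, verbose=True)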
############################################################################
def update_pp_old(self, update_dict=None, verbose=True):
"""
------------------------------------------------------------------------
Updates the interferometer instance with newer attribute values. Updates
the visibility spectrum and timeseries and applies FX or XF operation.
Used internally when parallel processing is used. Not to be used by the
user directly.
Inputs:
update_dict [dictionary] contains the following keys and values:
label [Scalar] A unique identifier (preferably a string) for
the interferometer. Default=None means no update to apply
latitude [Scalar] Latitude of the interferometer's location.
Default=None means no update to apply
location [Instance of GEOM.Point class] The location of the
interferometer in local East, North, Up (ENU) coordinate
system. Default=None means no update to apply
timestamp [Scalar] String or float representing the timestamp for
the current attributes. Default=None means no update to
apply
t [vector] The time axis for the visibility time series.
Default=None means no update to apply
flags [dictionary] holds boolean flags for each of the 4 cross-
polarizations which are stored under keys 'P11', 'P12',
'P21', and 'P22'. Default=None means no updates for
flags.
Vt [dictionary] holds cross-correlation time series under 4
cross-polarizations which are stored under keys 'P11',
'P12', 'P21', and 'P22'. Default=None implies no updates
for Vt.
wtsinfo [dictionary] consists of weights information for each of
the four cross-polarizations under keys 'P11', 'P12',
'P21', and 'P22'. Each of the values under the keys is a
list of dictionaries. Length of list is equal to the
number of frequency channels or one (equivalent to
setting wtspos_scale to 'scale'). The list is indexed by
the frequency channel number. Each element in the list
consists of a dictionary corresponding to that frequency
channel. Each dictionary consists of these items with the
following keys:
wtspos [2-column Numpy array, optional] u- and v-
positions for the gridding weights. Units
are in number of wavelengths.
wts [Numpy array] Complex gridding weights. Size
is equal to the number of rows in wtspos
above
orientation [scalar] Orientation (in radians) of the
wtspos coordinate system relative to the
local ENU coordinate system. It is measured
North of East.
lookup [string] If set, refers to a file location
containing the wtspos and wts information
above as columns (x-loc [float], y-loc
[float], wts[real], wts[imag if any]). If
set, wtspos and wts information are obtained
from this lookup table and the wtspos and wts
keywords in the dictionary are ignored. Note
that wtspos values are obtained after
dividing x- and y-loc lookup values by the
wavelength
gridfunc_freq
[String scalar] If set to None (not provided) or to
'scale', it is assumed that wtspos in wtsinfo are given
at a reference frequency and need to be scaled to the
frequency channels. This is ignored if the list of
dictionaries under the cross-polarization keys in
wtsinfo has as many elements as there are frequency
channels.
ref_freq [Scalar] Positive value (in Hz) of the reference
frequency (used if gridfunc_freq is set to None or
'scale') at which wtspos is provided. If set to None,
ref_freq is assumed to be equal to the center frequency
stored in the Interferometer instance's attribute f0.
do_correlate
[string] Indicates whether correlation operation is to be
performed after updates. Accepted values are 'FX' (for FX
operation) and 'XF' (for XF operation). Default=None
means no correlating operation is to be performed after
updates.
stack [boolean] If True (default), appends the updated flag
and data to the end of the stack as a function of
timestamp. If False, updates the last flag and data in
the stack and does not append
verify_flags
[boolean] If True, verify and update the flags, if necessary.
Visibilities are checked for NaN values and if found, the
flag in the corresponding polarization is set to True. Flags
of individual antennas forming a pair are checked and
transferred to the visibility flags. Default=True
verbose [boolean] If True (default), prints diagnostic and progress
messages. If False, suppresses printing such messages.
------------------------------------------------------------------------
"""
label = None
location = None
timestamp = None
t = None
flags = None
Vt = None
do_correlate = None
wtsinfo = None
gridfunc_freq = None
ref_freq = None
stack = False
verify_flags = True
if update_dict is not None:
if not isinstance(update_dict, dict):
raise TypeError('Input parameter containing updates must be a dictionary')
if 'label' in update_dict: label = update_dict['label']
if 'location' in update_dict: location = update_dict['location']
if 'timestamp' in update_dict: timestamp = update_dict['timestamp']
if 't' in update_dict: t = update_dict['t']
if 'Vt' in update_dict: Vt = update_dict['Vt']
if 'flags' in update_dict: flags = update_dict['flags']
if 'stack' in update_dict: stack = update_dict['stack']
if 'verify_flags' in update_dict: verify_flags = update_dict['verify_flags']
if 'do_correlate' in update_dict: do_correlate = update_dict['do_correlate']
if 'wtsinfo' in update_dict: wtsinfo = update_dict['wtsinfo']
if 'gridfunc_freq' in update_dict: gridfunc_freq = update_dict['gridfunc_freq']
if 'ref_freq' in update_dict: ref_freq = update_dict['ref_freq']
if label is not None: self.label = label
if location is not None: self.location = location
if timestamp is not None: self.timestamp = timestamp
# Proceed with interferometer updates only if timestamps align
if (self.timestamp != self.A1.timestamp) or (self.timestamp != self.A2.timestamp):
if verbose:
print 'Interferometer timestamp does not match with the component antenna timestamp(s). Update for interferometer {0} will be skipped.'.format(self.label)
else:
self.timestamps += [copy.deepcopy(self.timestamp)]
if t is not None:
self.t = t
self.f = self.f0 + self.channels()
if (Vt is not None) or (flags is not None):
self.crosspol.update(Vt=Vt, flags=flags, verify=verify_flags)
if do_correlate is not None:
if do_correlate == 'FX':
self.FX()
elif do_correlate == 'XF':
self.XF()
else:
raise ValueError('Invalid specification for input parameter do_correlate.')
self.update_flags(flags=None, stack=stack, verify=True) # Re-check flags and stack
for pol in ['P11', 'P12', 'P21', 'P22']:
if self.Vt_stack[pol] is None:
self.Vt_stack[pol] = copy.deepcopy(self.crosspol.Vt[pol].reshape(1,-1))
self.Vf_stack[pol] = copy.deepcopy(self.crosspol.Vf[pol].reshape(1,-1))
else:
if stack:
self.Vt_stack[pol] = NP.vstack((self.Vt_stack[pol], self.crosspol.Vt[pol].reshape(1,-1)))
self.Vf_stack[pol] = NP.vstack((self.Vf_stack[pol], self.crosspol.Vf[pol].reshape(1,-1)))
else:
self.Vt_stack[pol][-1,:] = copy.deepcopy(self.crosspol.Vt[pol].reshape(1,-1))
self.Vf_stack[pol][-1,:] = copy.deepcopy(self.crosspol.Vf[pol].reshape(1,-1))
blc_orig = NP.copy(self.blc)
trc_orig = NP.copy(self.trc)
eps = 1e-6
if wtsinfo is not None:
if not isinstance(wtsinfo, dict):
raise TypeError('Input parameter wtsinfo must be a dictionary.')
self.wtspos = {}
self.wts = {}
self.wtspos_scale = {}
angles = []
max_wtspos = []
for pol in ['P11', 'P12', 'P21', 'P22']:
self.wts[pol] = []
self.wtspos[pol] = []
self.wtspos_scale[pol] = None
if pol in wtsinfo:
if len(wtsinfo[pol]) == len(self.f):
pol_angles = [elem['orientation'] for elem in wtsinfo[pol]]
angles += pol_angles
for i in xrange(len(self.f)):
# Use this polarization's own orientation angles rather than indexing into
# the accumulated list, which may also hold other polarizations' entries
rotation_matrix = NP.asarray([[NP.cos(-pol_angles[i]), NP.sin(-pol_angles[i])],
[-NP.sin(-pol_angles[i]), NP.cos(-pol_angles[i])]])
if ('lookup' not in wtsinfo[pol][i]) or (wtsinfo[pol][i]['lookup'] is None):
self.wts[pol] += [wtsinfo[pol][i]['wts']]
wtspos = wtsinfo[pol][i]['wtspos']
else:
lookupdata = LKP.read_lookup(wtsinfo[pol][i]['lookup'])
wtspos = NP.hstack((lookupdata[0].reshape(-1,1),lookupdata[1].reshape(-1,1))) * (self.f[i]/FCNST.c)
self.wts[pol] += [lookupdata[2]]
self.wtspos[pol] += [ NP.dot(NP.asarray(wtspos), rotation_matrix.T) ]
max_wtspos += [NP.amax(NP.abs(self.wtspos[pol][-1]), axis=0)]
elif len(wtsinfo[pol]) == 1:
if (gridfunc_freq is None) or (gridfunc_freq == 'scale'):
self.wtspos_scale[pol] = 'scale'
if ref_freq is None:
ref_freq = self.f0
angles = wtsinfo[pol][0]['orientation']
rotation_matrix = NP.asarray([[NP.cos(-angles), NP.sin(-angles)],
[-NP.sin(-angles), NP.cos(-angles)]])
if ('lookup' not in wtsinfo[pol][0]) or (wtsinfo[pol][0]['lookup'] is None):
self.wts[pol] += [ wtsinfo[pol][0]['wts'] ]
wtspos = wtsinfo[pol][0]['wtspos']
else:
lookupdata = LKP.read_lookup(wtsinfo[pol][0]['lookup'])
wtspos = NP.hstack((lookupdata[0].reshape(-1,1),lookupdata[1].reshape(-1,1))) * (ref_freq/FCNST.c)
self.wts[pol] += [lookupdata[2]]
self.wtspos[pol] += [ (self.f[0]/ref_freq) * NP.dot(NP.asarray(wtspos), rotation_matrix.T) ]
max_wtspos += [NP.amax(NP.abs(self.wtspos[pol][-1]), axis=0)]
else:
raise ValueError('gridfunc_freq must be set to None, "scale" or "noscale".')
self.blc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1) - FCNST.c/self.f.min() * NP.amin(NP.abs(self.wtspos[pol][0]), 0)
self.trc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1) + FCNST.c/self.f.min() * NP.amax(NP.abs(self.wtspos[pol][0]), 0)
else:
raise ValueError('Number of elements in wtsinfo for {0} is incompatible with the number of channels.'.format(pol))
max_wtspos = NP.amax(NP.asarray(max_wtspos).reshape(-1,blc_orig.size), axis=0)
self.blc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1) - FCNST.c/self.f.min() * max_wtspos
self.trc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1) + FCNST.c/self.f.min() * max_wtspos
if (NP.abs(NP.linalg.norm(blc_orig)-NP.linalg.norm(self.blc)) > eps) or (NP.abs(NP.linalg.norm(trc_orig)-NP.linalg.norm(self.trc)) > eps):
if verbose:
print 'Grid corner(s) of interferometer {0} have changed. Should re-grid the interferometer array.'.format(self.label)
return self
############################################################################
def update_pp(self, update_dict=None, verbose=True):
"""
------------------------------------------------------------------------
Updates the interferometer instance with newer attribute values. Updates
the visibility spectrum and timeseries and applies FX or XF operation.
Used internally when parallel processing is used. Not to be used by the
user directly.
See member function update() for details on inputs.
------------------------------------------------------------------------
"""
self.update(update_dict=update_dict, verbose=verbose)
return self
############################################################################
def stack(self, on_flags=True, on_data=True):
"""
------------------------------------------------------------------------
Stacks and computes visibilities and flags from the individual antennas
in the pair.
Inputs:
on_flags [boolean] if set to True (default), combines the time-stacked
electric field flags from individual antennas from the
common timestamps into time-stacked visibility flags
on_data [boolean] if set to True (default), combines the time-stacked
electric fields from individual antennas from the common
timestamps into time-stacked visibilities
------------------------------------------------------------------------
"""
ts1 = NP.asarray(self.A1.timestamps)
ts2 = NP.asarray(self.A2.timestamps)
common_ts = NP.intersect1d(ts1, ts2, assume_unique=True)
ind1 = NP.in1d(ts1, common_ts, assume_unique=True)
ind2 = NP.in1d(ts2, common_ts, assume_unique=True)
self.timestamps = common_ts.tolist()
if on_data:
self.FX_on_stack()
if on_flags:
self.flags_on_stack()
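# Minimal standalone sketch (not part of the original code) of the
# common-timestamp selection used in stack() above, based on numpy's
# intersect1d() and in1d(). The arrays ts1 and ts2 are hypothetical
# timestamp lists; kept commented out so module import is unaffected.
#
# ts1 = NP.asarray(['t0', 't1', 't2', 't3'])
# ts2 = NP.asarray(['t1', 't3', 't4'])
# common_ts = NP.intersect1d(ts1, ts2, assume_unique=True)  # -> ['t1', 't3']
# ind1 = NP.in1d(ts1, common_ts, assume_unique=True)        # boolean mask into ts1
# ind2 = NP.in1d(ts2, common_ts, assume_unique=True)        # boolean mask into ts2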
############################################################################
def stack_pp(self, on_flags=True, on_data=True):
"""
------------------------------------------------------------------------
Stacks and computes visibilities and flags from the individual antennas
in the pair. To be used internally as a wrapper for stack() in case of
parallel processing. Not to be used directly by the user.
Inputs:
on_flags [boolean] if set to True (default), combines the time-stacked
electric field flags from individual antennas from the
common timestamps into time-stacked visibility flags
on_data [boolean] if set to True (default), combines the time-stacked
electric fields from individual antennas from the common
timestamps into time-stacked visibilities
------------------------------------------------------------------------
"""
self.stack(on_flags=on_flags, on_data=on_data)
return self
############################################################################
def accumulate(self, tbinsize=None):
"""
------------------------------------------------------------------------
Accumulate and average visibility spectra across timestamps under
different polarizations depending on the time bin size for the
corresponding polarization.
Inputs:
tbinsize [scalar or dictionary] Contains bin size of timestamps while
stacking. Default = None means all visibility spectra over all
timestamps are averaged. If scalar, the same (positive) value
applies to all polarizations. If dictionary, timestamp bin size
(positive) is provided under each key 'P11', 'P12', 'P21',
'P22'. If any of the keys is missing the visibilities for that
polarization are averaged over all timestamps.
------------------------------------------------------------------------
"""
timestamps = NP.asarray(self.timestamps).astype(NP.float)
Vf_acc = {}
twts = {}
Vf_avg = {}
for pol in ['P11', 'P12', 'P21', 'P22']:
Vf_acc[pol] = None
Vf_avg[pol] = None
twts[pol] = []
if tbinsize is None: # Average visibilities across all timestamps
for pol in ['P11', 'P12', 'P21', 'P22']:
unflagged_ind = NP.logical_not(self.flag_stack[pol])
Vf_acc[pol] = NP.nansum(self.Vf_stack[pol][unflagged_ind,:], axis=0, keepdims=True)
twts[pol] = NP.sum(unflagged_ind).astype(NP.float).reshape(-1,1)
# twts[pol] = NP.asarray(len(self.timestamps) - NP.sum(self.flag_stack[pol])).reshape(-1,1)
self.tbinsize = tbinsize
elif isinstance(tbinsize, (int, float)): # Apply same time bin size to all polarizations
eps = 1e-10
tbins = NP.arange(timestamps.min(), timestamps.max(), tbinsize)
tbins = NP.append(tbins, timestamps.max()+eps)
for pol in ['P11', 'P12', 'P21', 'P22']:
counts, tbin_edges, tbinnum, ri = OPS.binned_statistic(timestamps, statistic='count', bins=tbins)
for binnum in range(counts.size):
ind = ri[ri[binnum]:ri[binnum+1]]
unflagged_ind = NP.logical_not(self.flag_stack[pol][ind])
twts[pol] += [NP.sum(unflagged_ind)]
# twts[pol] += [counts[binnum] - NP.sum(self.flag_stack[pol][ind])]
if Vf_acc[pol] is None:
Vf_acc[pol] = NP.nansum(self.Vf_stack[pol][ind[unflagged_ind],:], axis=0, keepdims=True)
else:
Vf_acc[pol] = NP.vstack((Vf_acc[pol], NP.nansum(self.Vf_stack[pol][ind[unflagged_ind],:], axis=0, keepdims=True)))
twts[pol] = NP.asarray(twts[pol]).astype(NP.float).reshape(-1,1)
self.tbinsize = tbinsize
elif isinstance(tbinsize, dict): # Apply different time binsizes to corresponding polarizations
tbsize = {}
for pol in ['P11', 'P12', 'P21', 'P22']:
if pol not in tbinsize:
unflagged_ind = NP.logical_not(self.flag_stack[pol])
Vf_acc[pol] = NP.nansum(self.Vf_stack[pol][unflagged_ind,:], axis=0, keepdims=True)
twts[pol] = NP.sum(unflagged_ind).astype(NP.float).reshape(-1,1)
# twts[pol] = NP.asarray(len(self.timestamps) - NP.sum(self.flag_stack[pol])).reshape(-1,1)
tbsize[pol] = None
elif isinstance(tbinsize[pol], (int,float)):
eps = 1e-10
tbins = NP.arange(timestamps.min(), timestamps.max(), tbinsize[pol])
tbins = NP.append(tbins, timestamps.max()+eps)
counts, tbin_edges, tbinnum, ri = OPS.binned_statistic(timestamps, statistic='count', bins=tbins)
for binnum in range(counts.size):
ind = ri[ri[binnum]:ri[binnum+1]]
unflagged_ind = NP.logical_not(self.flag_stack[pol][ind])
twts[pol] += [NP.sum(unflagged_ind)]
# twts[pol] += [counts[binnum] - NP.sum(self.flag_stack[pol][ind])]
if Vf_acc[pol] is None:
Vf_acc[pol] = NP.nansum(self.Vf_stack[pol][ind[unflagged_ind],:], axis=0, keepdims=True)
else:
Vf_acc[pol] = NP.vstack((Vf_acc[pol], NP.nansum(self.Vf_stack[pol][ind[unflagged_ind],:], axis=0, keepdims=True)))
twts[pol] = NP.asarray(twts[pol]).astype(NP.float).reshape(-1,1)
tbsize[pol] = tbinsize[pol]
else:
unflagged_ind = NP.logical_not(self.flag_stack[pol])
Vf_acc[pol] = NP.nansum(self.Vf_stack[pol][unflagged_ind,:], axis=0, keepdims=True)
twts[pol] = NP.sum(unflagged_ind).astype(NP.float).reshape(-1,1)
# twts[pol] = NP.asarray(len(self.timestamps) - NP.sum(self.flag_stack[pol])).reshape(-1,1)
tbsize[pol] = None
self.tbinsize = tbsize
# Compute the average from the accumulated visibilities
for pol in ['P11', 'P12', 'P21', 'P22']:
Vf_avg[pol] = Vf_acc[pol] / twts[pol]
self.Vf_avg = Vf_avg
self.twts = twts
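# Illustrative usage sketch (not part of the original code): accumulate() with
# a per-polarization time bin size, per the docstring above. intrfrmtr is a
# hypothetical Interferometer instance that has already stacked several
# timestamps; polarizations missing from the dictionary ('P21', 'P22' here)
# are averaged over all timestamps. Kept commented out so import is unaffected.
#
# intrfrmtr.accumulate(tbinsize={'P11': 10.0, 'P12': 10.0})  # bin size in the same units as the timestamps
# avg_vis_P11 = intrfrmtr.Vf_avg['P11']   # n_tbins x nchan averaged visibility spectra
# weights_P11 = intrfrmtr.twts['P11']     # number of unflagged timestamps per bin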
################################################################################
class InterferometerArray(object):
"""
----------------------------------------------------------------------------
Class to manage interferometer array information.
Attributes:
antenna_array [instance of class AntennaArray] consists of the antenna array
information that determines all the interferometer pairs
interferometers
[dictionary] keys hold instances of class Interferometer. The
keys themselves are identical to the label attributes of the
interferometer instances they hold.
timestamp [Scalar] String or float representing the timestamp for the
current attributes
t [vector] The time axis for the time series of electric fields
f [vector] Frequency axis obtained by a Fourier Transform of
the electric field time series. Same length as attribute t
f0 [Scalar] Center frequency in Hz.
blc [numpy array] 2-element numpy array specifying bottom left
corner of the grid coincident with bottom left interferometer
location in ENU coordinate system
trc [numpy array] 2-element numpy array specifying top right
corner of the grid coincident with top right interferometer
location in ENU coordinate system
grid_blc [numpy array] 2-element numpy array specifying bottom left
corner of the grid in ENU coordinate system including any
padding used
grid_trc [numpy array] 2-element numpy array specifying top right
corner of the grid in ENU coordinate system including any
padding used
gridx [numpy array] two-dimensional numpy meshgrid array specifying
grid x-locations in units of physical distance (in metres) in
the ENU coordinate system whose corners are specified by
attributes grid_blc and grid_trc
gridy [numpy array] two-dimensional numpy meshgrid array specifying
grid y-locations in units of physical distance (in metres) in
the ENU coordinate system whose corners are specified by
attributes grid_blc and grid_trc
grid_ready [boolean] set to True if the gridding has been performed,
False if grid is not available yet. Set to False in case
blc, trc, grid_blc or grid_trc is updated indicating gridding
is to be performed again
grid_illumination
[dictionary] Gridded illumination cube for each
cross-polarization is under one of the four keys 'P11', 'P12',
'P21' or 'P22'. Under each of these keys the grid illumination
is a three-dimensional complex numpy array of shape
n_u x n_v x nchan, where, n_u, n_v and nchan are the grid size
along u-axis, v-axis and frequency axis respectively.
grid_Vf [dictionary] Gridded visibility cube for each
cross-polarization is under one of the four keys 'P11', 'P12',
'P21' or 'P22'. Under each of these keys the grid illumination
is a three-dimensional complex numpy array of shape
n_u x n_v x nchan, where, n_u, n_v and nchan are the grid size
along u-axis, v-axis and frequency axis respectively.
ordered_labels
[list] list of interferometer labels sorted by the first
antenna label
grid_mapper [dictionary] baseline-to-grid mapping information for each of
four cross-polarizations under keys 'P11', 'P12', 'P21', and
'P22'. Under each cross-polarization, it is a dictionary with
values under the following keys:
'refind' [list] each element in the list corresponds to a
sequential frequency channel and is another list
with indices to the lookup locations that map to
the grid locations (indices in 'gridind') for this
frequency channel. These indices index the array
in 'refwts'
'gridind' [list] each element in the list corresponds to a
sequential frequency channel and is another list
with indices to the grid locations that map to
the lookup locations (indices in 'refind') for
this frequency channel.
'refwts' [numpy array] interferometer weights of size
n_bl x n_wts flattened to be a vector. Indices in
'refind' index to this array. Currently only valid
when lookup weights scale with frequency.
'labels' [dictionary] contains mapping information from
interferometer (specified by key which is the
interferometer label). The value under each label
key is another dictionary with the following keys
and information:
'twts' [scalar] number of unflagged
timestamps that went into the
measurement of the complex Vf
made by the interferometer under
the specific polarization and
that were used in stacking and
averaging. If zero, it indicates
that no unflagged timestamp data
was found for the interferometer,
which will then not contribute to
the complex grid illumination and
visibilities
'gridind' [numpy vector] one-dimensional
index into the three-dimensional
grid locations where the
interferometer contributes
illumination and visibilities. The
one-dimensional indices are
obtained using numpy's
ravel_multi_index() using the grid
shape, n_u x n_v x nchan
'illumination' [numpy vector] complex grid
illumination contributed by the
interferometer to different grid
locations in 'gridind'. It is
mapped to the
grid as specified by indices in
key 'gridind'
'Vf' [numpy vector] complex grid
visibilities contributed by the
interferometer. It is mapped to the
grid as specified by indices in
key 'gridind'
'bl' [dictionary] dictionary with information on
contribution of all baseline lookup weights. This
contains another dictionary with the following
keys:
'ind_freq' [list] each element in the list is
for a frequency channel and
consists of a numpy vector which
consists of indices of the
contributing interferometers
'ind_all' [numpy vector] indices of the
contributing interferometers for
all frequencies appended together.
Effectively, this is just the
values in 'ind_freq' for all
frequencies appended together.
'uniq_ind_all' [numpy vector] unique indices of
the contributing baselines across
all frequencies.
'rev_ind_all' [numpy vector] reverse indices of
'ind_all' with reference to bins of
'uniq_ind_all'
'illumination' [numpy vector] complex grid
illumination weights contributed by
each baseline (including associated
kernel weight locations) and has a
size equal to that in 'ind_all'
'grid' [dictionary] contains information about populated
portions of the grid. It consists of values in the
following keys:
'ind_all' [numpy vector] indices of all grid
locations raveled to one dimension
from three dimensions of size
n_u x n_v x nchan
'per_bl2grid'
[list] each element in the list is a dictionary
corresponding to an interferometer with information
on its mapping and contribution to the grid. Each
dictionary has the following keys and values:
'label' [tuple of two strings]
interferometer label
'f_gridind' [numpy array] mapping information
with indices to the frequency axis
of the grid
'u_gridind' [numpy array] mapping information
with indices to the u-axis
of the grid. Must be of same size
as array under 'f_gridind'
'v_gridind' [numpy array] mapping information
with indices to the v-axis
of the grid. Must be of same size
as array under 'f_gridind'
'per_bl_per_freq_norm_wts'
[numpy array] mapping information
on the (complex) normalizing
multiplicative factor required to
make the sum of illumination/weights
per interferometer per frequency on
the grid equal to unity. Must be of
same size as array under 'f_gridind'
'illumination' [numpy array] Complex aperture
illumination/weights contributed
by the interferometer onto the grid.
The grid pixels to which it
contributes is given by 'f_gridind',
'u_gridind', 'v_gridind'. Must be of
same size as array under 'f_gridind'
'Vf' [numpy array] Complex visibilities
contributed by the
interferometer onto the grid. The
grid pixels to which it contributes
is given by 'f_gridind',
'u_gridind', 'v_gridind'. Must be of
same size as array under 'f_gridind'
'all_bl2grid'
[dictionary] contains the combined information of
mapping of all interferometers to the grid. It
consists of the following keys and values:
'blind' [numpy array] all interferometer
indices (to attribute ordered
labels) that map to the uvf-grid
'u_gridind' [numpy array] all indices to the
u-axis of the uvf-grid mapped to by
all interferometers whose indices
are given in key 'blind'. Must be
of same size as the array under key
'blind'
'v_gridind' [numpy array] all indices to the
v-axis of the uvf-grid mapped to by
all interferometers whose indices
are given in key 'blind'. Must be
of same size as the array under key
'blind'
'f_gridind' [numpy array] all indices to the
f-axis of the uvf-grid mapped to by
all interferometers whose indices
are given in key 'blind'. Must be
of same size as the array under key
'blind'
'indNN_list' [list of lists] Each item in the
top level list corresponds to an
interferometer in the same order as
in the attribute ordered_labels.
Each of these items is another list
consisting of the unraveled grid
indices it contributes to. The
unraveled indices are what are used
to obtain the u-, v- and f-indices
in the grid using a conversion
assuming f is the first axis, v is
the second and u is the third
'illumination' [numpy array] complex values of
aperture illumination contributed
by all interferometers to the grid.
The interferometer indices are in
'blind' and the grid indices are
in 'u_gridind', 'v_gridind' and
'f_gridind'. Must be of same size as
these indices
'per_bl_per_freq_norm_wts'
[numpy array] mapping information
on the (complex) normalizing
multiplicative factor required to
make the sum of illumination or
weights per interferometer per
frequency on the grid equal to
unity. This is appended for all
interferometers together. Must be of
same size as array under
'illumination'
'Vf' [numpy array] Complex visibilities
contributed by all
interferometers onto the grid. The
grid pixels to which it contributes
is given by 'f_gridind',
'u_gridind', 'v_gridind'. Must be of
same size as array under 'f_gridind'
and 'illumination'
bl2grid_mapper
[sparse matrix] contains the interferometer array to grid
mapping information in sparse matrix format. When converted
to a dense array, it will have dimensions nrows equal to size
of the 3D cube and ncols equal to number of visibility spectra
of all interferometers over all channels. In other words,
nrows = nu x nv x nchan and ncols = n_bl x nchan. Dot product
of this matrix with flattened visibility spectra or
interferometer weights will give the 3D cubes of gridded
visibilities and interferometer array illumination
respectively
Member Functions:
__init__() Initializes an instance of class InterferometerArray
__str__() Prints summary of an instance of this class
__add__() Operator overloading for adding interferometer(s)
__radd__() Operator overloading for adding interferometer(s)
__sub__() Operator overloading for removing interferometer(s)
add_interferometers()
Routine to add interferometer(s) to the interferometer
array instance. A wrapper for operator overloading
__add__() and __radd__()
remove_interferometers()
Routine to remove interferometer(s) from the interferometer
array instance. A wrapper for operator overloading __sub__()
interferometers_containing_antenna()
Find interferometer pairs which contain the specified
antenna labels
baseline_vectors()
Routine to return the interferometer label and baseline
vectors (sorted by interferometer label if specified)
refresh_antenna_pairs()
Refresh the individual antennas in the interferometer(s)
with the information in the Antenna instances in the
attribute antenna_array which is an instance of class
AntennaArray
FX() Computes the Fourier transform of the cross-correlated time
series of the interferometer pairs in the interferometer
array to compute the visibility spectra
XF() Computes the visibility spectra by cross-multiplying the
electric field spectra for all the interferometer pairs in
the interferometer array
get_visibilities()
Routine to return the interferometer labels, time-based
weights and visibilities (sorted by interferometer label
if specified) based on selection criteria specified by
flags, timestamps, frequency channels, labels and data pool
(most recent, stack, averaged, etc.)
stack() Stacks and computes visibilities and flags for all the
interferometers in the interferometer array from the
individual antennas in the pair.
accumulate() Accumulate and average visibility spectra across timestamps
under different polarizations depending on the time bin
size for the corresponding polarization for all
interferometers in the interferometer array
grid() Routine to produce a grid based on the interferometer array
grid_convolve() Routine to project the complex illumination power pattern
and the visibilities on the grid. It can operate on the
entire interferometer array or incrementally project the
visibilities and complex illumination power patterns from
specific interferometers on to an already existing grid.
(The latter is not implemented yet)
grid_convolve_old()
Routine to project the visibility illumination pattern and
the visibilities on the grid. It can operate on the entire
antenna array or incrementally project the visibilities and
illumination patterns from specific antenna pairs on to an
already existing grid.
grid_convolve_new()
Routine to project the complex illumination power pattern
and the visibilities on the grid from the interferometer
array
make_grid_cube()
Constructs the grid of complex power illumination and
visibilities using the gridding information determined for
every baseline. Flags are taken into account while
constructing this grid.
grid_unconvolve()
[Needs to be re-written] Routine to de-project the
visibility illumination pattern and the visibilities on the
grid. It can operate on the entire interferometer array or
incrementally de-project the visibilities and illumination
patterns of specific antenna pairs from an already existing
grid.
quick_beam_synthesis()
A quick generator of synthesized beam using interferometer
array grid illumination pattern using the center frequency.
Not intended to be used rigorously but rather for comparison
purposes and making quick plots
update_flags() Updates all flags in the interferometer array followed by
any flags that need overriding through inputs of specific
flag information
update() Updates the interferometer array instance with newer
attribute values. Can also be used to add and/or remove
interferometers with/without affecting the existing grid.
----------------------------------------------------------------------------
"""
def __init__(self, antenna_pairs=None, antenna_array=None):
"""
------------------------------------------------------------------------
Initializes an instance of class InterferometerArray
Class attributes initialized are:
antenna_array, interferometers, timestamp, t, f, f0, blc, trc, grid_blc,
grid_trc, gridx, gridy, grid_ready, grid_illumination, grid_Vf,
ordered_labels, grid_mapper
------------------------------------------------------------------------
"""
self.antenna_array = AntennaArray()
self.interferometers = {}
self.blc = NP.zeros(2)
self.trc = NP.zeros(2)
self.grid_blc = NP.zeros(2)
self.grid_trc = NP.zeros(2)
self.gridx, self.gridy = None, None
self.gridu, self.gridv = None, None
self.grid_ready = False
self.grid_illumination = {}
self.grid_Vf = {}
self._bl_contribution = {}
self.ordered_labels = [] # Usually output from member function baseline_vectors() or get_visibilities()
self.grid_mapper = {}
self.bl2grid_mapper = {} # contains the sparse mapping matrix
for pol in ['P11', 'P12', 'P21', 'P22']:
self.grid_mapper[pol] = {}
self.grid_mapper[pol]['labels'] = {}
self.grid_mapper[pol]['refind'] = []
# self.grid_mapper[pol]['bl_ind'] = []
self.grid_mapper[pol]['gridind'] = []
self.grid_mapper[pol]['refwts'] = None
self.grid_mapper[pol]['bl'] = {}
self.grid_mapper[pol]['bl']['ind_freq'] = []
self.grid_mapper[pol]['bl']['ind_all'] = None
self.grid_mapper[pol]['bl']['uniq_ind_all'] = None
self.grid_mapper[pol]['bl']['rev_ind_all'] = None
self.grid_mapper[pol]['bl']['illumination'] = None
self.grid_mapper[pol]['grid'] = {}
self.grid_mapper[pol]['grid']['ind_all'] = None
self.grid_mapper[pol]['per_bl2grid'] = []
self.grid_mapper[pol]['all_bl2grid'] = {}
self.grid_illumination[pol] = None
self.grid_Vf[pol] = None
self._bl_contribution[pol] = {}
self.bl2grid_mapper[pol] = None
if (antenna_array is not None) and (antenna_pairs is not None):
raise ValueError('InterferometerArray instance cannot be initialized with both inputs antenna_array and antenna_pairs.')
if antenna_array is not None:
if isinstance(antenna_array, AntennaArray):
self.antenna_array = antenna_array
else: # if antenna_array is just a list of antennas (Check this piece of code again)
self.antenna_array = self.antenna_array + antenna_array
ant_labels = self.antenna_array.antennas.keys()
for i in xrange(len(ant_labels)-1):
for j in xrange(i+1,len(ant_labels)):
ant_pair = Interferometer(self.antenna_array.antennas[ant_labels[i]], self.antenna_array.antennas[ant_labels[j]])
self.interferometers[ant_pair.label] = ant_pair
if antenna_pairs is not None:
if isinstance(antenna_pairs, Interferometer):
self.interferometers[antenna_pairs.label] = antenna_pairs
elif isinstance(antenna_pairs, dict):
for key,value in antenna_pairs.items():
if isinstance(key, tuple):
if len(key) == 2:
if isinstance(value, Interferometer):
self.interferometers[key] = value
else:
print 'An item found not to be an instance of class Interferometer. Discarding and proceeding ahead.'
else:
print 'Invalid interferometer label found. Discarding and proceeding ahead.'
else:
print 'Invalid interferometer label found. Discarding and proceeding ahead.'
elif isinstance(antenna_pairs, list):
for value in antenna_pairs:
if isinstance(value, Interferometer):
self.interferometers[value.label] = value
else:
print 'An item found not to be an instance of class Interferometer. Discarding and proceeding ahead.'
else:
raise TypeError('Input parameter antenna_pairs is of incompatible type. It must be an instance of class Interferometer, a list of instances of class Interferometer, or a dictionary of interferometers.')
for label, interferometer in self.interferometers.items():
if label[0] not in self.antenna_array.antennas:
self.antenna_array = self.antenna_array + interferometer.A1
# self.antenna_array.add_antennas(interferometer.A1)
if label[1] not in self.antenna_array.antennas:
self.antenna_array = self.antenna_array + interferometer.A2
# self.antenna_array.add_antennas(interferometer.A2)
self.f = self.antenna_array.f
self.f0 = self.antenna_array.f0
self.t = None
self.timestamp = self.antenna_array.timestamp
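# Illustrative construction sketch (not part of the original code): the two
# mutually exclusive ways of initializing an InterferometerArray, per the
# docstring above. aar (an AntennaArray instance) and the Interferometer
# instances intrfrmtr1, intrfrmtr2 are hypothetical placeholders. Kept
# commented out so that importing this module is unaffected.
#
# iar = InterferometerArray(antenna_array=aar)   # forms all antenna pairs from aar
# iar = InterferometerArray(antenna_pairs=[intrfrmtr1, intrfrmtr2])   # from explicit pairs
# # Supplying both antenna_array and antenna_pairs raises ValueError.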
############################################################################
def __str__(self):
printstr = '\n-----------------------------------------------------------------'
printstr += '\n Instance of class "{0}" in module "{1}".\n Holds the following "Interferometer" class instances with labels:\n '.format(self.__class__.__name__, self.__module__)
printstr += str(self.interferometers.keys()).strip('[]')
# printstr += ' '.join(sorted(self.interferometers.keys()))
printstr += '\n Interferometer array bounds: blc = [{0[0]}, {0[1]}],\n\ttrc = [{1[0]}, {1[1]}]'.format(self.blc, self.trc)
printstr += '\n Grid bounds: blc = [{0[0]}, {0[1]}],\n\ttrc = [{1[0]}, {1[1]}]'.format(self.grid_blc, self.grid_trc)
printstr += '\n-----------------------------------------------------------------'
return printstr
############################################################################
def __add__(self, others):
"""
------------------------------------------------------------------------
Operator overloading for adding interferometer(s)
Inputs:
others [Instance of class InterferometerArray, dictionary holding
instance(s) of class Interferometer, list of instances of
class Interferometer, or a single instance of class
Interferometer] If a dictionary is provided, the keys should
be the antenna labels and the values should be instances of
class Interferometer. If a list is provided, it should be a
list of valid instances of class Interferometer. These
instance(s) of class Interferometer will be added to the
existing instance of InterferometerArray class.
------------------------------------------------------------------------
"""
retval = self
if isinstance(others, InterferometerArray):
# for k,v in others.interferometers.items():
for k,v in others.interferometers.iteritems():
if k in retval.interferometers:
print "Interferometer {0} already included in the list of interferometers.".format(k)
print "For updating, use the update() method. Ignoring interferometer {0}".format(k)
else:
retval.interferometers[k] = v
print 'Interferometer "{0}" added to the list of interferometers.'.format(k)
elif isinstance(others, dict):
# for item in others.values():
for item in others.itervalues():
if isinstance(item, Interferometer):
if item.label in retval.interferometers:
print "Interferometer {0} already included in the list of interferometers.".format(item.label)
print "For updating, use the update() method. Ignoring interferometer {0}".format(item.label)
else:
retval.interferometers[item.label] = item
print 'Interferometer "{0}" added to the list of interferometers.'.format(item.label)
elif isinstance(others, list):
for i in range(len(others)):
if isinstance(others[i], Interferometer):
if others[i].label in retval.interferometers:
print "Interferometer {0} already included in the list of interferometers.".format(others[i].label)
print "For updating, use the update() method. Ignoring interferometer {0}".format(others[i].label)
else:
retval.interferometers[others[i].label] = others[i]
print 'Interferometer "{0}" added to the list of interferometers.'.format(others[i].label)
else:
print 'Element # {0} is not an instance of class Interferometer.'.format(i)
elif isinstance(others, Interferometer):
if others.label in retval.interferometers:
print "Interferometer {0} already included in the list of interferometers.".format(others.label)
print "For updating, use the update() method. Ignoring interferometer {0}".format(others[i].label)
else:
retval.interferometers[others.label] = others
print 'Interferometer "{0}" added to the list of interferometers.'.format(others.label)
else:
print 'Input(s) is/are not instance(s) of class Interferometer.'
return retval
############################################################################
def __radd__(self, others):
"""
------------------------------------------------------------------------
Operator overloading for adding interferometer(s)
Inputs:
others [Instance of class InterferometerArray, dictionary holding
instance(s) of class Interferometer, list of instances of
class Interferometer, or a single instance of class
Interferometer] If a dictionary is provided, the keys should
be the interferometer labels and the values should be
instances of class Interferometer. If a list is provided, it
should be a list of valid instances of class Interferometer.
These instance(s) of class Interferometer will be added to
the existing instance of InterferometerArray class.
------------------------------------------------------------------------
"""
return self.__add__(others)
############################################################################
def __sub__(self, others):
"""
------------------------------------------------------------------------
Operator overloading for removing interferometer(s)
Inputs:
others [Instance of class InterferometerArray, dictionary holding
instance(s) of class Interferometer, list of instances of
class Interferometer, list of strings containing
interferometer labels or a single instance of class
Interferometer] If a dictionary is provided, the keys should
be the interferometer labels and the values should be
instances of class Interferometer. If a list is provided, it
should be a list of valid instances of class Interferometer.
These instance(s) of class Interferometer will be removed
from the existing instance of InterferometerArray class.
------------------------------------------------------------------------
"""
retval = self
if isinstance(others, dict):
for item in others.values():
if isinstance(item, Interferometer):
if item.label not in retval.interferometers:
print "Interferometer {0} does not exist in the list of interferometers.".format(item.label)
else:
del retval.interferometers[item.label]
print 'Interferometer "{0}" removed from the list of interferometers.'.format(item.label)
elif isinstance(others, list):
for i in range(0,len(others)):
if isinstance(others[i], str):
if others[i] in retval.interferometers:
del retval.interferometers[others[i]]
print 'Interferometer {0} removed from the list of interferometers.'.format(others[i])
elif isinstance(others[i], Interferometer):
if others[i].label in retval.interferometers:
del retval.interferometers[others[i].label]
print 'Interferometer {0} removed from the list of interferometers.'.format(others[i].label)
else:
print "Interferometer {0} does not exist in the list of interferometers.".format(others[i].label)
else:
print 'Element # {0} has no matches in the list of interferometers.'.format(i)
elif others in retval.interferometers:
del retval.interferometers[others]
print 'Interferometer "{0}" removed from the list of interferometers.'.format(others)
elif isinstance(others, Interferometer):
if others.label in retval.interferometers:
del retval.interferometers[others.label]
print 'Interferometer "{0}" removed from the list of interferometers.'.format(others.label)
else:
print "Interferometer {0} does not exist in the list of interferometers.".format(others.label)
else:
print 'No matches found in existing list of interferometers.'
return retval
############################################################################
def add_interferometers(self, A=None):
"""
------------------------------------------------------------------------
Routine to add interferometer(s) to the interferometer array instance.
A wrapper for operator overloading __add__() and __radd__()
Inputs:
A [Instance of class InterferometerArray, dictionary holding
instance(s) of class Interferometer, list of instances of
class Interferometer, or a single instance of class
Interferometer] If a dictionary is provided, the keys should
be the interferometer labels and the values should be
instances of class Interferometer. If a list is provided, it
should be a list of valid instances of class Interferometer.
These instance(s) of class Interferometer will be added to
the existing instance of InterferometerArray class.
------------------------------------------------------------------------
"""
if A is None:
print 'No interferometer(s) supplied.'
elif isinstance(A, (InterferometerArray, dict, list, Interferometer)):
self = self.__add__(A)
else:
print 'Input(s) is/are not instance(s) of class Interferometer.'
############################################################################
def remove_interferometers(self, A=None):
"""
------------------------------------------------------------------------
Routine to remove interferometer(s) from the interferometer array
instance. A wrapper for operator overloading __sub__()
Inputs:
A [Instance of class InterferometerArray, dictionary holding
instance(s) of class Interferometer, list of instances of
class Interferometer, or a single instance of class
Interferometer] If a dictionary is provided, the keys should
be the interferometer labels and the values should be
instances of class Interferometer. If a list is provided, it
should be a list of valid instances of class Interferometer.
These instance(s) of class Interferometer will be removed
from the existing instance of InterferometerArray class.
------------------------------------------------------------------------
"""
if A is None:
print 'No interferometer specified for removal.'
else:
self = self.__sub__(A)
############################################################################
def interferometers_containing_antenna(self, antenna_label):
"""
------------------------------------------------------------------------
Find interferometer pairs which contain the specified antenna labels
Inputs:
antenna_label [list] List of antenna labels which will be searched for
in the interferometer pairs in the interferometer array.
Outputs:
ant_pair_labels
[list] List of interferometer pair labels containing one
or more of the specified antenna labels
ant_order [list] List of antenna order of antenna labels found in
the interferometer pairs of the interferometer array. If
the antenna label appears as the first antenna in the
antenna pair, ant_order is assigned to 1 and if it is
the second antenna in the pair, it is assigned to 2.
------------------------------------------------------------------------
"""
ant_pair_labels = [ant_pair_label for ant_pair_label in self.interferometers if antenna_label in ant_pair_label]
ant_order = [1 if ant_pair_label[0] == antenna_label else 2 for ant_pair_label in ant_pair_labels]
return (ant_pair_labels, ant_order)
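# Illustrative usage sketch (not part of the original code): find the pairs
# that contain a given antenna label. iar is a hypothetical InterferometerArray
# whose interferometer labels are tuples of antenna labels such as ('A1', 'A2').
# Kept commented out so that importing this module is unaffected.
#
# pair_labels, ant_order = iar.interferometers_containing_antenna('A1')
# # e.g. pair_labels -> [('A1', 'A2'), ('A3', 'A1')] and ant_order -> [1, 2]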
############################################################################
def baseline_vectors(self, pol=None, flag=False, sort=True):
"""
------------------------------------------------------------------------
Routine to return the interferometer label and baseline vectors (sorted
by interferometer label if specified)
Keyword Inputs:
pol [string] select baselines of this polarization that are either
flagged or unflagged as specified by input parameter flag.
Allowed values are 'P11', 'P12', 'P21', and 'P22'.
Default=None. This means all baselines are returned
irrespective of the flags
flag [boolean] If False, return unflagged baselines, otherwise
return flagged ones. Default=None means return all baselines
independent of flagging or polarization
sort [boolean] If True, returned interferometer information is
sorted by interferometer's first antenna label. Default = True.
Output:
outdict [dictionary] Output consists of a dictionary with the following
keys and information:
'labels': list of tuples of strings of interferometer labels
'baselines': baseline vectors of interferometers (3-column
array)
------------------------------------------------------------------------
"""
if not isinstance(sort, bool):
raise TypeError('sort keyword has to be a Boolean value.')
if flag is not None:
if not isinstance(flag, bool):
raise TypeError('flag keyword has to be a Boolean value.')
if pol is None:
if sort: # sort by first antenna label
xyz = NP.asarray([[self.interferometers[label].location.x, self.interferometers[label].location.y, self.interferometers[label].location.z] for label in sorted(self.interferometers.keys(), key=lambda tup: tup[0])])
labels = sorted(self.interferometers.keys(), key=lambda tup: tup[0])
else:
xyz = NP.asarray([[self.interferometers[label].location.x, self.interferometers[label].location.y, self.interferometers[label].location.z] for label in self.interferometers.keys()])
labels = self.interferometers.keys()
else:
if not isinstance(pol, str):
raise TypeError('Input parameter must be a string')
if not pol in ['P11', 'P12', 'P21', 'P22']:
raise ValueError('Invalid specification for input parameter pol')
if sort: # sort by first antenna label
if flag is None: # get all baselines
xyz = NP.asarray([[self.interferometers[label].location.x, self.interferometers[label].location.y, self.interferometers[label].location.z] for label in sorted(self.interferometers.keys(), key=lambda tup: tup[0])])
labels = [label for label in sorted(self.interferometers.keys(), key=lambda tup: tup[0])]
else:
if flag: # get flagged baselines
xyz = NP.asarray([[self.interferometers[label].location.x, self.interferometers[label].location.y, self.interferometers[label].location.z] for label in sorted(self.interferometers.keys(), key=lambda tup: tup[0]) if self.interferometers[label].crosspol.flag[pol]])
labels = [label for label in sorted(self.interferometers.keys(), key=lambda tup: tup[0]) if self.interferometers[label].crosspol.flag[pol]]
else: # get unflagged baselines
xyz = NP.asarray([[self.interferometers[label].location.x, self.interferometers[label].location.y, self.interferometers[label].location.z] for label in sorted(self.interferometers.keys(), key=lambda tup: tup[0]) if not self.interferometers[label].crosspol.flag[pol]])
labels = [label for label in sorted(self.interferometers.keys(), key=lambda tup: tup[0]) if not self.interferometers[label].crosspol.flag[pol]]
else: # no sorting
if flag is None: # get all baselines
xyz = NP.asarray([[self.interferometers[label].location.x, self.interferometers[label].location.y, self.interferometers[label].location.z] for label in self.interferometers.keys()])
labels = [label for label in self.interferometers.keys()]
else:
if flag: # get flagged baselines
xyz = NP.asarray([[self.interferometers[label].location.x, self.interferometers[label].location.y, self.interferometers[label].location.z] for label in self.interferometers.keys() if self.interferometers[label].crosspol.flag[pol]])
labels = [label for label in self.interferometers.keys() if self.interferometers[label].crosspol.flag[pol]]
else: # get unflagged baselines
xyz = NP.asarray([[self.interferometers[label].location.x, self.interferometers[label].location.y, self.interferometers[label].location.z] for label in self.interferometers.keys() if not self.interferometers[label].crosspol.flag[pol]])
labels = [label for label in self.interferometers.keys() if not self.interferometers[label].crosspol.flag[pol]]
outdict = {}
outdict['labels'] = labels
outdict['baselines'] = xyz
return outdict
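# Illustrative usage sketch (not part of the original code): retrieve the
# unflagged baselines of one polarization, sorted by the first antenna label.
# iar is a hypothetical InterferometerArray instance. Kept commented out so
# that importing this module is unaffected.
#
# bl_info = iar.baseline_vectors(pol='P11', flag=False, sort=True)
# labels = bl_info['labels']        # list of (antenna1, antenna2) label tuples
# baselines = bl_info['baselines']  # n_bl x 3 array of baseline vectors (m)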
############################################################################
def refresh_antenna_pairs(self, interferometer_labels=None,
antenna_labels=None):
"""
------------------------------------------------------------------------
Refresh the individual antennas in the interferometer(s) with the
information in the Antenna instances in the attribute antenna_array
which is an instance of class AntennaArray
Inputs:
interferometer_labels
[list] list of interferometer labels each given as a tuple
of antenna labels. The antennas in these pairs are refreshed
using the corresponding antenna instances in the attribute
antenna_array. Default = None.
antenna_labels
[list] list of antenna labels to determine which
interferometers they contribute to. The antenna pairs in
these interferometers are refreshed based on the current
antenna instances in the attribute antenna_array.
Default = None.
If both input keywords interferometer_labels and antenna_labels are
set to None, all the interferometer instances are refreshed.
------------------------------------------------------------------------
"""
ilabels = []
if interferometer_labels is not None:
if not isinstance(interferometer_labels, list):
raise TypeError('Input keyword interferometer_labels must be a list')
ilabels = list(interferometer_labels)
if antenna_labels is not None:
if not isinstance(antenna_labels, list):
raise TypeError('Input keyword antenna_labels must be a list')
ant_pair_labels, ant_order = self.interferometers_containing_antenna(antenna_labels)
ilabels += ant_pair_labels
if len(ilabels) == 0:
ilabels = self.interferometers.keys()
for antpair_label in ilabels:
if antpair_label in self.interferometers:
self.interferometers[antpair_label].refresh_antenna_pairs(A1=self.antenna_array.antennas[antpair_label[0]], A2=self.antenna_array.antennas[antpair_label[1]])
############################################################################
def FX(self, parallel=False, nproc=None):
"""
------------------------------------------------------------------------
Computes the Fourier transform of the cross-correlated time series of
the interferometer pairs in the interferometer array to compute the
visibility spectra
Inputs:
parallel [boolean] specifies if parallelization is to be invoked.
False (default) means only serial processing
nproc [integer] specifies number of independent processes to spawn.
Default = None, means automatically determines the number of
process cores in the system and use one less than that to
avoid locking the system for other processes. Applies only
if input parameter 'parallel' (see above) is set to True.
If nproc is set to a value more than the number of process
cores in the system, it will be reset to number of process
cores in the system minus one to avoid locking the system out
for other processes
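
Example (illustrative sketch only; assumes an InterferometerArray
instance named iar whose constituent antennas hold the current
electric field time series):
    iar.FX()                        # serial processing of all interferometers
    iar.FX(parallel=True, nproc=4)  # parallelize over up to 4 processes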
------------------------------------------------------------------------
"""
if self.t is None:
self.t = self.interferometers.itervalues().next().t
if self.f is None:
self.f = self.interferometers.itervalues().next().f
if self.f0 is None:
self.f0 = self.interferometers.itervalues().next().f0
# for label in self.interferometers: # Start processes in parallel
# self.interferometers[label].start()
if not parallel:
for label in self.interferometers:
self.interferometers[label].FX()
elif parallel or (nproc is not None):
if nproc is None:
nproc = max(MP.cpu_count()-1, 1)
else:
nproc = min(nproc, max(MP.cpu_count()-1, 1))
pool = MP.Pool(processes=nproc)
updated_interferometers = pool.map(unwrap_interferometer_FX, IT.izip(self.interferometers.values()))
pool.close()
pool.join()
for interferometer in updated_interferometers:
self.interferometers[interferometer.label] = interferometer
del updated_interferometers
############################################################################
def XF(self):
"""
------------------------------------------------------------------------
Computes the visibility spectra by cross-multiplying the electric field
spectra for all the interferometer pairs in the interferometer array
------------------------------------------------------------------------
"""
if self.t is None:
self.t = self.interferometers.itervalues().next().t
if self.f is None:
self.f = self.interferometers.itervalues().next().f
if self.f0 is None:
self.f0 = self.interferometers.itervalues().next().f0
for label in self.interferometers:
self.interferometers[label].XF()
############################################################################
def get_visibilities_old(self, pol, flag=None, sort=True):
"""
------------------------------------------------------------------------
Routine to return the interferometer label and visibilities (sorted by
interferometer label if specified)
Keyword Inputs:
pol [string] select baselines of this polarization that are either
flagged or unflagged as specified by input parameter flag.
Allowed values are 'P11', 'P12', 'P21', and 'P22'. Only one of
these values must be specified.
flag [boolean] If False, return visibilities of unflagged baselines,
otherwise return flagged ones. Default=None means all
visibilities independent of flagging are returned.
sort [boolean] If True, returned interferometer information is
sorted by interferometer's first antenna label. Default = True.
Output:
outdict [dictionary] Output consists of a dictionary with the following
keys and information:
'labels': Contains a list of interferometer labels, each a
tuple of antenna labels
'visibilities':
interferometer visibilities (n_bl x nchan array)
------------------------------------------------------------------------
"""
try:
pol
except NameError:
raise NameError('Input parameter pol must be specified.')
if not isinstance(pol, str):
raise TypeError('Input parameter pol must be a string')
if not pol in ['P11', 'P12', 'P21', 'P22']:
raise ValueError('Invalid specification for input parameter pol')
if not isinstance(sort, bool):
raise TypeError('sort keyword has to be a Boolean value.')
if flag is not None:
if not isinstance(flag, bool):
raise TypeError('flag keyword has to be a Boolean value.')
if sort: # sort by first antenna label
if flag is None: # get all baselines
vis = NP.asarray([self.interferometers[label].crosspol.Vf[pol] for label in sorted(self.interferometers.keys(), key=lambda tup: tup[0])])
labels = [label for label in sorted(self.interferometers.keys(), key=lambda tup: tup[0])]
else:
if flag: # get flagged baselines
vis = NP.asarray([self.interferometers[label].crosspol.Vf[pol] for label in sorted(self.interferometers.keys(), key=lambda tup: tup[0]) if self.interferometers[label].crosspol.flag[pol]])
labels = [label for label in sorted(self.interferometers.keys(), key=lambda tup: tup[0]) if self.interferometers[label].crosspol.flag[pol]]
else: # get unflagged baselines
vis = NP.asarray([self.interferometers[label].crosspol.Vf[pol] for label in sorted(self.interferometers.keys(), key=lambda tup: tup[0]) if not self.interferometers[label].crosspol.flag[pol]])
labels = [label for label in sorted(self.interferometers.keys(), key=lambda tup: tup[0]) if not self.interferometers[label].crosspol.flag[pol]]
else: # no sorting
if flag is None:
vis = NP.asarray([self.interferometers[label].crosspol.Vf[pol] for label in self.interferometers.keys()])
labels = [label for label in self.interferometers.keys()]
else:
if flag: # get flagged baselines
vis = NP.asarray([self.interferometers[label].crosspol.Vf[pol] for label in self.interferometers.keys() if self.interferometers[label].crosspol.flag[pol]])
labels = [label for label in self.interferometers.keys() if self.interferometers[label].crosspol.flag[pol]]
else: # get unflagged baselines
vis = NP.asarray([self.interferometers[label].crosspol.Vf[pol] for label in self.interferometers.keys() if not self.interferometers[label].crosspol.flag[pol]])
labels = [label for label in self.interferometers.keys() if not self.interferometers[label].crosspol.flag[pol]]
outdict = {}
outdict['labels'] = labels
outdict['visibilities'] = vis
return outdict
############################################################################
def get_visibilities(self, pol, flag=None, tselect=None, fselect=None,
bselect=None, datapool=None, sort=True):
"""
------------------------------------------------------------------------
Routine to return the interferometer labels, time-based weights and
visibilities (sorted by interferometer label if specified) based on
selection criteria specified by flags, timestamps, frequency channels,
labels and data pool (most recent, stack, averaged, etc.)
Keyword Inputs:
pol [string] select baselines of this polarization that are either
flagged or unflagged as specified by input parameter flag.
Allowed values are 'P11', 'P12', 'P21', and 'P22'. Only one of
these values must be specified.
flag [boolean] If False, return visibilities of unflagged baselines,
otherwise return flagged ones. Default=None means all
visibilities independent of flagging are returned.
tselect [scalar, list, numpy array] timestamp index for visibilities
selection. For most recent visibility, it must be set to -1.
For all other selections, indices in tselect must be in the
valid range of indices along time axis for stacked and
averaged visibilities. Default=None means most recent data is
selected.
fselect [scalar, list, numpy array] frequency channel index for
visibilities selection. Indices must be in the valid range of
indices along the frequency axis for visibilities.
Default=None selects all frequency channels
bselect [list of tuples] labels of interferometers to select. If set
to None (default) all interferometers are selected.
datapool [string] denotes the data pool from which visibilities are to
be selected. Accepted values are 'current', 'stack', 'avg' and
None (default, same as 'current'). If set to None or
'current', the value in tselect is ignored and only
visibilities of the most recent timestamp are selected; in
that case the attribute Vf_stack is checked first and, if
unavailable, attribute crosspol.Vf is used. For 'stack' and
'avg', attributes Vf_stack and Vf_avg are used respectively
sort [boolean] If True, returned interferometer information is
sorted by interferometer's first antenna label. Default=True.
Output:
outdict [dictionary] Output consists of a dictionary with the following
keys and information:
'labels' [list of tuples] Contains a list of
interferometer labels
'visibilities' [list or numpy array] interferometer
visibilities under the specified polarization.
In general, it is a list of
numpy arrays where each array in the list
corresponds to
an individual interferometer and the size of
each numpy array is n_ts x nchan. If input
keyword flag is set to None, the visibilities
are rearranged into a numpy array of size
n_ts x n_bl x nchan.
'twts' [list or numpy array] weights based on flags
along time axis under the specified
polarization. In general it is a list of numpy
arrays where each array in the list corresponds
to an individual interferometer and the size
of each array is n_ts x 1. If input
keyword flag is set to None, the time weights
are rearranged into a numpy array of size
n_ts x n_bl x 1
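
Example (illustrative sketch only; iar denotes a hypothetical
InterferometerArray instance with averaged visibilities available):
    outdict = iar.get_visibilities('P11', flag=None, tselect=-1,
                                   datapool='avg', sort=True)
    labels = outdict['labels']        # list of (antenna1, antenna2) tuples
    vis = outdict['visibilities']     # numpy array of shape n_ts x n_bl x nchan
    twts = outdict['twts']            # numpy array of shape n_ts x n_bl x 1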
------------------------------------------------------------------------
"""
if not isinstance(sort, bool):
raise TypeError('sort keyword has to be a Boolean value.')
if bselect is None:
labels = self.interferometers.keys()
elif isinstance(bselect, list):
labels = [label for label in bselect if label in self.interferometers]
else:
raise TypeError('Input keyword bselect must be a list of interferometer labels or None')
if sort:
labels_orig = copy.deepcopy(labels)
labels = [label for label in sorted(labels_orig, key=lambda tup: tup[0])]
visinfo = [self.interferometers[label].get_visibilities(pol, flag=flag, tselect=tselect, fselect=fselect, datapool=datapool) for label in labels]
outdict = {}
outdict['labels'] = labels
outdict['twts'] = [vinfo['twts'] for vinfo in visinfo]
outdict['visibilities'] = [vinfo['visibilities'] for vinfo in visinfo]
if flag is None:
outdict['visibilities'] = NP.swapaxes(NP.asarray(outdict['visibilities']), 0, 1)
outdict['twts'] = NP.swapaxes(NP.asarray(outdict['twts']), 0, 1)
outdict['twts'] = outdict['twts'][:,:,NP.newaxis]
return outdict
############################################################################
def stack(self, on_flags=True, on_data=True, parallel=False, nproc=None):
"""
------------------------------------------------------------------------
Stacks and computes visibilities and flags for all the interferometers
in the interferometer array from the individual antennas in the pair.
Inputs:
on_flags [boolean] if set to True (default), combines the time-stacked
electric field flags from individual antennas from the
common timestamps into time-stacked visibility flags
on_data [boolean] if set to True (default), combines the time-stacked
electric fields from individual antennas from the common
timestamps into time-stacked visibilities
parallel [boolean] specifies if parallelization is to be invoked.
False (default) means only serial processing
nproc [integer] specifies number of independent processes to spawn.
Default = None, means automatically determines the number of
process cores in the system and use one less than that to
avoid locking the system for other processes. Applies only
if input parameter 'parallel' (see above) is set to True.
If nproc is set to a value more than the number of process
cores in the system, it will be reset to number of process
cores in the system minus one to avoid locking the system out
for other processes
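
Example (illustrative sketch only; iar denotes a hypothetical
InterferometerArray instance updated for the current timestamp):
    iar.stack(on_flags=True, on_data=True)              # serial stacking
    iar.stack(on_flags=True, on_data=True,
              parallel=True, nproc=4)                   # parallel stacking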
------------------------------------------------------------------------
"""
if parallel:
if nproc is None:
nproc = max(MP.cpu_count()-1, 1)
else:
nproc = min(nproc, max(MP.cpu_count()-1, 1))
list_of_perform_flag_stack = [on_flags] * len(self.interferometers)
list_of_perform_data_stack = [on_data] * len(self.interferometers)
pool = MP.Pool(processes=nproc)
updated_interferometers = pool.map(unwrap_interferometer_stack, IT.izip(self.interferometers.values(), list_of_perform_flag_stack, list_of_perform_data_stack))
pool.close()
pool.join()
for interferometer in updated_interferometers:
self.interferometers[interferometer.label] = interferometer
del updated_interferometers
else:
for label in self.interferometers:
self.interferometers[label].stack(on_flags=on_flags, on_data=on_data)
############################################################################
def accumulate(self, tbinsize=None):
"""
------------------------------------------------------------------------
Accumulate and average visibility spectra across timestamps under
different polarizations depending on the time bin size for the
corresponding polarization for all interferometers in the
interferometer array
Inputs:
tbinsize [scalar or dictionary] Contains bin size of timestamps while
stacking. Default = None means all visibility spectra over all
timestamps are averaged. If scalar, the same (positive) value
applies to all polarizations. If dictionary, timestamp bin size
(positive) is provided under each key 'P11', 'P12', 'P21',
'P22'. If any of the keys is missing the visibilities for that
polarization are averaged over all timestamps.
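
Example (illustrative sketch only; the bin sizes below are arbitrary):
    iar.accumulate()                              # average over all timestamps
    iar.accumulate(tbinsize=10.0)                 # same bin size for all polarizations
    iar.accumulate(tbinsize={'P11': 10.0,
                             'P22': 20.0})        # per-polarization bin sizes; 'P12' and
                                                  # 'P21' are averaged over all timestamps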
------------------------------------------------------------------------
"""
for label in self.interferometers:
self.interferometers[label].accumulate(tbinsize=tbinsize)
############################################################################
def grid(self, uvspacing=0.5, uvpad=None, pow2=True):
"""
------------------------------------------------------------------------
Routine to produce a grid based on the interferometer array
Inputs:
uvspacing [Scalar] Positive value indicating the maximum uv-spacing
desirable at the lowest wavelength (max frequency).
Default = 0.5
uvpad [List] Padding to be applied around the interferometer
locations before forming a grid. List elements should be
positive. If it is a one-element list, the element is
applicable to both u and v axes. If the list contains three or
more elements, only the first two elements are considered,
one for each axis. Default = None.
pow2 [Boolean] If set to True, the grid is forced to have a size
equal to the next power of 2 relative to the actual size required. If
False, gridding is done with the appropriate size as
determined by uvspacing. Default = True.
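
Example (illustrative sketch only):
    iar.grid(uvspacing=0.5, uvpad=None, pow2=True)
    print iar.gridu.shape, iar.grid_blc, iar.grid_trc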
------------------------------------------------------------------------
"""
if self.f is None:
self.f = self.interferometers.itervalues().next().f
if self.f0 is None:
self.f0 = self.interferometers.itervalues().next().f0
wavelength = FCNST.c / self.f
min_lambda = NP.abs(wavelength).min()
# Change itervalues() to values() when porting to Python 3.x
# May have to change *blc and *trc with zip(*blc) and zip(*trc) when using Python 3.x
blc = [[self.interferometers[label].blc[0,0], self.interferometers[label].blc[0,1]] for label in self.interferometers]
trc = [[self.interferometers[label].trc[0,0], self.interferometers[label].trc[0,1]] for label in self.interferometers]
self.trc = NP.amax(NP.abs(NP.vstack((NP.asarray(blc), NP.asarray(trc)))), axis=0).ravel() / min_lambda
self.blc = -1 * self.trc
self.gridu, self.gridv = GRD.grid_2d([(self.blc[0], self.trc[0]), (self.blc[1], self.trc[1])], pad=uvpad, spacing=uvspacing, pow2=pow2)
self.grid_blc = NP.asarray([self.gridu.min(), self.gridv.min()])
self.grid_trc = NP.asarray([self.gridu.max(), self.gridv.max()])
self.grid_ready = True
############################################################################
def grid_convolve(self, pol=None, antpairs=None, unconvolve_existing=False,
normalize=False, method='NN', distNN=NP.inf, tol=None,
maxmatch=None, identical_interferometers=True,
gridfunc_freq=None, mapping='weighted', wts_change=False,
parallel=False, nproc=None, pp_method='pool', verbose=True):
"""
------------------------------------------------------------------------
Routine to project the complex illumination power pattern and the
visibilities on the grid. It can operate on the entire interferometer
array or incrementally project the visibilities and complex illumination
power patterns from specific interferometers on to an already existing
grid. (The latter is not implemented yet)
Inputs:
pol [String] The polarization to be gridded. Can be set to
'P11', 'P12', 'P21' or 'P22'. If set to None, gridding for
all the polarizations is performed. Default = None
antpairs [instance of class InterferometerArray, single instance or
list of instances of class Interferometer, or a dictionary
holding instances of class Interferometer] If a dictionary
is provided, the keys should be the interferometer labels
and the values should be instances of class Interferometer.
If a list is provided, it should be a list of valid
instances of class Interferometer. These instance(s) of
class Interferometer will be merged to the existing grid
contained in the instance of InterferometerArray class. If
antpairs is not provided (set to None), the gridding operations
will be performed on the entire set of interferometers
contained in the instance of class InterferometerArray.
Default=None.
unconvolve_existing
[Boolean] Default = False. If set to True, the effects of
gridding convolution contributed by the interferometer(s)
specified will be undone before updating the interferometer
measurements on the grid, if the interferometer(s) is/are
already found to in the set of interferometers held by the
instance of InterferometerArray. If False and if one or more
interferometer instances specified are already found to be
held in the instance of class InterferometerArray, the code
will stop, raising an error indicating the gridding operation
cannot proceed.
normalize [Boolean] Default = False. If set to True, the gridded
weights are divided by the sum of weights so that the
gridded weights add up to unity. (Need to work on
normalization)
method [string] The gridding method to be used in applying the
interferometer weights on to the interferometer array grid.
Accepted values are 'NN' (nearest neighbour - default), 'CS'
(cubic spline), or 'BL' (Bi-linear). In case of applying grid
weights by 'NN' method, an optional distance upper bound for
the nearest neighbour can be provided in the parameter distNN
to prune the search and make it efficient. Currently, only
the nearest neighbour method is operational.
distNN [scalar] A positive value indicating the upper bound on
distance to the nearest neighbour in the gridding process. It
has units of distance, the same units as the interferometer
attribute location and interferometer array attribute gridx
and gridy. Default is NP.inf (infinite distance). It will be
internally converted to have same units as interferometer
attributes wtspos (units in number of wavelengths)
maxmatch [scalar] A positive value indicating maximum number of input
locations in the interferometer grid to be assigned.
Default = None. If set to None, all the interferometer array
grid elements specified are assigned values for each
interferometer. For instance, to have only one interferometer
array grid element to be populated per interferometer, use
maxmatch=1.
tol [scalar] If set, only lookup data with abs(val) > tol will be
considered for nearest neighbour lookup. Default = None
implies all lookup values will be considered for nearest
neighbour determination. tol is to be interpreted as a
minimum value considered as significant in the lookup table.
identical_interferometers
[boolean] indicates if all interferometer elements are to be
treated as identical. If True (default), they are identical
and their gridding kernels are identical. If False, they are
not identical and each one has its own gridding kernel.
gridfunc_freq
[String scalar] If set to None (not provided) or to 'scale',
it is assumed that attribute wtspos is given for a
reference frequency which needs to be scaled for the frequency
channels. Will be ignored if the number of elements of the
list in this attribute under the specific polarization is the
same as the number of frequency channels.
mapping [string] indicates the type of mapping between baseline
locations and the grid locations. Allowed values are
'sampled' and 'weighted' (default). 'sampled' means only the
baseline measurement closest to a grid location contributes
to that grid location, whereas, 'weighted' means that all the
baselines contribute in a weighted fashion to their nearest
grid location. The former is faster but possibly discards
baseline data whereas the latter is slower but includes all
data along with their weights.
wts_change [boolean] indicates if weights and/or their locations have
changed from the previous integration or snapshot.
Default=False means they have not changed. In such a case the
baseline-to-grid mapping and grid illumination pattern do not
have to be determined, and mapping and values from the
previous snapshot can be used. If True, a new mapping has to
be determined.
parallel [boolean] specifies if parallelization is to be invoked.
False (default) means only serial processing
nproc [integer] specifies number of independent processes to spawn.
Default = None, means automatically determines the number of
process cores in the system and use one less than that to
avoid locking the system for other processes. Applies only
if input parameter 'parallel' (see above) is set to True.
If nproc is set to a value more than the number of process
cores in the system, it will be reset to number of process
cores in the system minus one to avoid locking the system out
for other processes
pp_method [string] specifies if the parallelization method is handled
automatically using a multiprocessing pool or managed manually
by individual processes and collecting results in a queue.
The former is specified by 'pool' (default) and the latter
by 'queue'. These are the two allowed values. The pool method
has easier bookkeeping and can be fast if the computations are
not expected to be memory bound. The queue method is more
suited for memory bound processes but can be slower or
inefficient in terms of CPU management.
verbose [boolean] If True, prints diagnostic and progress messages.
If False (default), suppress printing such messages.
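
Example (illustrative sketch only; the numerical values are arbitrary
and iar denotes a hypothetical InterferometerArray instance whose
averaged visibilities have already been computed):
    iar.grid()   # set up the uv-grid (also done automatically if not ready)
    iar.grid_convolve(pol='P11', method='NN', distNN=6.0,
                      identical_interferometers=True,
                      gridfunc_freq='scale', mapping='weighted',
                      wts_change=False, parallel=False, verbose=True)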
------------------------------------------------------------------------
"""
eps = 1.0e-10
if pol is None:
pol = ['P11', 'P12', 'P21', 'P22']
elif not isinstance(pol, list):
pol = [pol]
if not self.grid_ready:
self.grid()
crosspol = ['P11', 'P12', 'P21', 'P22']
for cpol in crosspol:
if cpol in pol:
if antpairs is not None:
if isinstance(antpairs, Interferometer):
antpairs = [antpairs]
if isinstance(antpairs, (dict, InterferometerArray)):
# Check if these interferometers are new or old and compatible
for key in antpairs:
if isinstance(antpairs[key], Interferometer): # required if antpairs is a dictionary and not instance of InterferometerArray
if key in self.interferometers:
if unconvolve_existing: # Effects on the grid of interferometers already existing must be removed
if self.interferometers[key]._gridinfo[cpol]: # if gridding info is not empty
for i in range(len(self.f)):
self.grid_unconvolve(antpairs[key].label)
else:
raise KeyError('Interferometer {0} already found to exist in the dictionary of interferometers but cannot proceed grid_convolve() without unconvolving first.'.format(antpairs[key].label))
else:
del antpairs[key] # remove the dictionary element since it is not an Interferometer instance
if identical_interferometers and (gridfunc_freq == 'scale'):
bl_dict = self.baseline_vectors(pol=cpol, flag=False, sort=True)
bl_xy = bl_dict['baselines'][:,:2]
self.ordered_labels = bl_dict['labels']
n_bl = bl_xy.shape[0]
vis_dict = self.get_visibilities_old(cpol, flag=False, sort=True)
vis = vis_dict['visibilities'].astype(NP.complex64)
# Since antenna pairs are identical, read wtspos from the first antenna pair; since wtspos scale with frequency, read from the first frequency channel
wtspos_xy = antpairs[0].wtspos[cpol][0] * FCNST.c/self.f[0]
wts = antpairs[0].wts[cpol][0]
n_wts = wts.size
reflocs_xy = bl_xy[:,NP.newaxis,:] + wtspos_xy[NP.newaxis,:,:]
refwts_xy = wts.reshape(1,-1) * NP.ones((n_bl,1))
reflocs_xy = reflocs_xy.reshape(-1,bl_xy.shape[1])
refwts_xy = refwts_xy.reshape(-1,1).astype(NP.complex64)
reflocs_uv = reflocs_xy[:,NP.newaxis,:] * self.f.reshape(1,-1,1) / FCNST.c
refwts_uv = refwts_xy * NP.ones((1,self.f.size))
reflocs_uv = reflocs_uv.reshape(-1,bl_xy.shape[1])
refwts_uv = refwts_uv.reshape(-1,1).ravel()
inplocs = NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
ibind, nnval = LKP.lookup_1NN(reflocs_uv, refwts_uv, inplocs,
distance_ULIM=distNN*self.f.max()/FCNST.c,
remove_oob=True, tol=tol, maxmatch=maxmatch)[:2]
else:
bl_dict = self.baseline_vectors(pol=cpol, flag=None, sort=True)
self.ordered_labels = bl_dict['labels']
bl_xy = bl_dict['baselines'][:,:2] # n_bl x 2
n_bl = bl_xy.shape[0]
# Vf_dict = self.get_visibilities_old(cpol, flag=None, sort=True)
# Vf = Vf_dict['visibilities'].astype(NP.complex64) # n_bl x nchan
Vf_dict = self.get_visibilities(cpol, flag=None, tselect=-1, fselect=None, bselect=None, datapool='avg', sort=True)
Vf = Vf_dict['visibilities'].astype(NP.complex64) # (n_ts=1) x n_bl x nchan
Vf = NP.squeeze(Vf, axis=0) # n_bl x nchan
if Vf.shape[0] != n_bl:
raise ValueError('Encountered unexpected behavior. Need to debug.')
bl_labels = Vf_dict['labels']
twts = Vf_dict['twts'] # (n_ts=1) x n_bl x (nchan=1)
twts = NP.squeeze(twts, axis=(0,2)) # n_bl
if verbose:
print 'Gathered baseline data for gridding convolution for timestamp {0}'.format(self.timestamp)
if wts_change or (not self.grid_mapper[cpol]['labels']):
if gridfunc_freq == 'scale':
if identical_interferometers:
wts_tol = 1e-6
# Since antenna pairs are identical, read wtspos from the first antenna pair; since wtspos scale with frequency, read from the first frequency channel
wtspos_xy = self.interferometers.itervalues().next().wtspos[cpol][0] * FCNST.c/self.f[0]
wts = self.interferometers.itervalues().next().wts[cpol][0].astype(NP.complex64)
wtspos_xy = wtspos_xy[NP.abs(wts) >= wts_tol, :]
wts = wts[NP.abs(wts) >= wts_tol]
n_wts = wts.size
reflocs_xy = bl_xy[:,NP.newaxis,:] + wtspos_xy[NP.newaxis,:,:] # n_bl x n_wts x 2
refwts = wts.reshape(1,-1) * NP.ones((n_bl,1)) # n_bl x n_wts
else:
for i,label in enumerate(self.ordered_labels):
bl_wtspos = self.interferometers[label].wtspos[cpol][0]
bl_wts = self.interferometers[label].wts[cpol][0].astype(NP.complex64)
if i == 0:
wtspos = bl_wtspos[NP.newaxis,:,:] # 1 x n_wts x 2
refwts = bl_wts.reshape(1,-1) # 1 x n_wts
else:
wtspos = NP.vstack((wtspos, bl_wtspos[NP.newaxis,:,:])) # n_bl x n_wts x 2
refwts = NP.vstack((refwts, bl_wts.reshape(1,-1))) # n_bl x n_wts
reflocs_xy = bl_xy[:,NP.newaxis,:] + wtspos * FCNST.c/self.f[0] # n_bl x n_wts x 2
reflocs_xy = reflocs_xy.reshape(-1,bl_xy.shape[1]) # (n_bl x n_wts) x 2
refwts = refwts.ravel()
self.grid_mapper[cpol]['refwts'] = NP.copy(refwts.ravel()) # (n_bl x n_wts)
else: # Weights do not scale with frequency (needs serious development)
pass
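# Note: for the gridfunc_freq == 'scale' case handled above, reflocs_xy holds,
# for every baseline, the (x,y) locations of its gridding-kernel samples and
# refwts the corresponding weights. Below, these reference locations (scaled
# per frequency channel to wavelength units) are matched to grid (u,v) pixels
# via a nearest-neighbour lookup, either in parallel over channels or serially.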
gridlocs = NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
contributed_bl_grid_Vf = None
if parallel: # Use parallelization over frequency to determine gridding convolution
if nproc is None:
nproc = max(MP.cpu_count()-1, 1)
else:
nproc = min(nproc, max(MP.cpu_count()-1, 1))
if pp_method == 'queue': ## Use MP.Queue(): useful for memory intensive parallelizing but can be slow
job_chunk_begin = range(0,self.f.size,nproc)
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} job chunks '.format(len(job_chunk_begin)), PGB.ETA()], maxval=len(job_chunk_begin)).start()
for ijob, job_start in enumerate(job_chunk_begin):
pjobs = []
out_q = MP.Queue()
for job_ind in xrange(job_start, min(job_start+nproc, self.f.size)): # Start the processes and store outputs in the queue
if mapping == 'weighted':
pjob = MP.Process(target=LKP.find_1NN_pp, args=(gridlocs, reflocs_xy * self.f[job_ind]/FCNST.c, job_ind, out_q, distNN*self.f.max()/FCNST.c, True), name='process-{0:0d}-channel-{1:0d}'.format(job_ind-job_start, job_ind))
else:
pjob = MP.Process(target=LKP.find_1NN_pp, args=(reflocs_xy * self.f[job_ind]/FCNST.c, gridlocs, job_ind, out_q, distNN*self.f.max()/FCNST.c, True), name='process-{0:0d}-channel-{1:0d}'.format(job_ind-job_start, job_ind))
pjob.start()
pjobs.append(pjob)
for p in xrange(len(pjobs)): # Unpack the queue output
outdict = out_q.get()
chan = outdict.keys()[0]
if mapping == 'weighted':
refind, gridind = outdict[chan]['inpind'], outdict[chan]['refind']
else:
gridind, refind = outdict[chan]['inpind'], outdict[chan]['refind']
self.grid_mapper[cpol]['refind'] += [refind]
self.grid_mapper[cpol]['gridind'] += [gridind]
bl_ind, lkp_ind = NP.unravel_index(refind, (n_bl, n_wts))
self.grid_mapper[cpol]['bl']['ind_freq'] += [bl_ind]
gridind_unraveled = NP.unravel_index(gridind, self.gridu.shape) + (chan+NP.zeros(gridind.size,dtype=int),)
gridind_raveled = NP.ravel_multi_index(gridind_unraveled, self.gridu.shape+(self.f.size,))
if self.grid_mapper[cpol]['bl']['ind_all'] is None:
self.grid_mapper[cpol]['bl']['ind_all'] = NP.copy(bl_ind)
self.grid_mapper[cpol]['bl']['illumination'] = refwts[refind]
contributed_bl_grid_Vf = refwts[refind] * Vf[bl_ind,chan]
self.grid_mapper[cpol]['grid']['ind_all'] = NP.copy(gridind_raveled)
else:
self.grid_mapper[cpol]['bl']['ind_all'] = NP.append(self.grid_mapper[cpol]['bl']['ind_all'], bl_ind)
self.grid_mapper[cpol]['bl']['illumination'] = NP.append(self.grid_mapper[cpol]['bl']['illumination'], refwts[refind])
contributed_bl_grid_Vf = NP.append(contributed_bl_grid_Vf, refwts[refind] * Vf[bl_ind,chan])
self.grid_mapper[cpol]['grid']['ind_all'] = NP.append(self.grid_mapper[cpol]['grid']['ind_all'], gridind_raveled)
for pjob in pjobs:
pjob.join()
del out_q
if verbose:
progress.update(ijob+1)
if verbose:
progress.finish()
elif pp_method == 'pool': ## Using MP.Pool.map(): Can be faster if parallelizing is not memory intensive
list_of_gridlocs = [gridlocs] * self.f.size
list_of_reflocs = [reflocs_xy * f/FCNST.c for f in self.f]
list_of_dist_NN = [distNN*self.f.max()/FCNST.c] * self.f.size
list_of_remove_oob = [True] * self.f.size
pool = MP.Pool(processes=nproc)
if mapping == 'weighted':
list_of_NNout = pool.map(find_1NN_arg_splitter, IT.izip(list_of_gridlocs, list_of_reflocs, list_of_dist_NN, list_of_remove_oob))
else:
list_of_NNout = pool.map(find_1NN_arg_splitter, IT.izip(list_of_reflocs, list_of_gridlocs, list_of_dist_NN, list_of_remove_oob))
pool.close()
pool.join()
for chan, NNout in enumerate(list_of_NNout): # Unpack the pool output
if mapping == 'weighted':
refind, gridind = NNout[0], NNout[1]
else:
gridind, refind = NNout[0], NNout[1]
self.grid_mapper[cpol]['refind'] += [refind]
self.grid_mapper[cpol]['gridind'] += [gridind]
bl_ind, lkp_ind = NP.unravel_index(refind, (n_bl, n_wts))
self.grid_mapper[cpol]['bl']['ind_freq'] += [bl_ind]
gridind_unraveled = NP.unravel_index(gridind, self.gridu.shape) + (chan+NP.zeros(gridind.size,dtype=int),)
gridind_raveled = NP.ravel_multi_index(gridind_unraveled, self.gridu.shape+(self.f.size,))
if chan == 0:
self.grid_mapper[cpol]['bl']['ind_all'] = NP.copy(bl_ind)
self.grid_mapper[cpol]['bl']['illumination'] = refwts[refind]
contributed_bl_grid_Vf = refwts[refind] * Vf[bl_ind,chan]
self.grid_mapper[cpol]['grid']['ind_all'] = NP.copy(gridind_raveled)
else:
self.grid_mapper[cpol]['bl']['ind_all'] = NP.append(self.grid_mapper[cpol]['bl']['ind_all'], bl_ind)
self.grid_mapper[cpol]['bl']['illumination'] = NP.append(self.grid_mapper[cpol]['bl']['illumination'], refwts[refind])
contributed_bl_grid_Vf = NP.append(contributed_bl_grid_Vf, refwts[refind] * Vf[bl_ind,chan])
self.grid_mapper[cpol]['grid']['ind_all'] = NP.append(self.grid_mapper[cpol]['grid']['ind_all'], gridind_raveled)
else:
raise ValueError('Parallel processing method specified by input parameter pp_method has to be "pool" or "queue"')
else: # Use serial processing over frequency to determine gridding convolution
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} Frequency channels '.format(self.f.size), PGB.ETA()], maxval=self.f.size).start()
for i in xrange(self.f.size):
if mapping == 'weighted':
refind, gridind = LKP.find_1NN(gridlocs, reflocs_xy * self.f[i]/FCNST.c,
distance_ULIM=distNN*self.f.max()/FCNST.c,
remove_oob=True)[:2]
else:
gridind, refind = LKP.find_1NN(reflocs_xy * self.f[i]/FCNST.c, gridlocs,
distance_ULIM=distNN*self.f.max()/FCNST.c,
remove_oob=True)[:2]
self.grid_mapper[cpol]['refind'] += [refind]
self.grid_mapper[cpol]['gridind'] += [gridind]
bl_ind, lkp_ind = NP.unravel_index(refind, (n_bl, n_wts))
self.grid_mapper[cpol]['bl']['ind_freq'] += [bl_ind]
gridind_unraveled = NP.unravel_index(gridind, self.gridu.shape) + (i+NP.zeros(gridind.size,dtype=int),)
gridind_raveled = NP.ravel_multi_index(gridind_unraveled, self.gridu.shape+(self.f.size,))
if i == 0:
self.grid_mapper[cpol]['bl']['ind_all'] = NP.copy(bl_ind)
self.grid_mapper[cpol]['bl']['illumination'] = refwts[refind]
contributed_bl_grid_Vf = refwts[refind] * Vf[bl_ind,i]
self.grid_mapper[cpol]['grid']['ind_all'] = NP.copy(gridind_raveled)
else:
self.grid_mapper[cpol]['bl']['ind_all'] = NP.append(self.grid_mapper[cpol]['bl']['ind_all'], bl_ind)
self.grid_mapper[cpol]['bl']['illumination'] = NP.append(self.grid_mapper[cpol]['bl']['illumination'], refwts[refind])
contributed_bl_grid_Vf = NP.append(contributed_bl_grid_Vf, refwts[refind] * Vf[bl_ind,i])
self.grid_mapper[cpol]['grid']['ind_all'] = NP.append(self.grid_mapper[cpol]['grid']['ind_all'], gridind_raveled)
if verbose:
progress.update(i+1)
if verbose:
progress.finish()
self.grid_mapper[cpol]['bl']['uniq_ind_all'] = NP.unique(self.grid_mapper[cpol]['bl']['ind_all'])
self.grid_mapper[cpol]['bl']['rev_ind_all'] = OPS.binned_statistic(self.grid_mapper[cpol]['bl']['ind_all'], statistic='count', bins=NP.append(self.grid_mapper[cpol]['bl']['uniq_ind_all'], self.grid_mapper[cpol]['bl']['uniq_ind_all'].max()+1))[3]
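# uniq_ind_all lists the distinct baselines that contributed to the grid, and
# rev_ind_all holds the reverse indices from the binned statistic: for the j-th
# unique baseline, rev_ind_all[rev_ind_all[j]:rev_ind_all[j+1]] selects all of
# its (grid pixel, channel) contributions, which are aggregated per baseline
# below, either in parallel or serially.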
if parallel and (mapping == 'weighted'): # Use parallel processing over baselines to determine baseline-grid mapping of gridded aperture illumination and visibilities
if nproc is None:
nproc = max(MP.cpu_count()-1, 1)
else:
nproc = min(nproc, max(MP.cpu_count()-1, 1))
if pp_method == 'queue': ## Use MP.Queue(): useful for memory intensive parallelizing but can be slow
num_bl = self.grid_mapper[cpol]['bl']['uniq_ind_all'].size
job_chunk_begin = range(0,num_bl,nproc)
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} job chunks '.format(len(job_chunk_begin)), PGB.ETA()], maxval=len(job_chunk_begin)).start()
for ijob, job_start in enumerate(job_chunk_begin):
pjobs1 = []
pjobs2 = []
out_q1 = MP.Queue()
out_q2 = MP.Queue()
for job_ind in xrange(job_start, min(job_start+nproc, num_bl)): # Start the parallel processes and store the output in the queue
label = self.ordered_labels[self.grid_mapper[cpol]['bl']['uniq_ind_all'][job_ind]]
if self.grid_mapper[cpol]['bl']['rev_ind_all'][job_ind] < self.grid_mapper[cpol]['bl']['rev_ind_all'][job_ind+1]:
self.grid_mapper[cpol]['labels'][label] = {}
self.grid_mapper[cpol]['labels'][label]['twts'] = twts[bl_labels.index(label)]
# self.grid_mapper[cpol]['labels'][label]['flag'] = self.interferometers[label].crosspol.flag[cpol]
select_bl_ind = self.grid_mapper[cpol]['bl']['rev_ind_all'][self.grid_mapper[cpol]['bl']['rev_ind_all'][job_ind]:self.grid_mapper[cpol]['bl']['rev_ind_all'][job_ind+1]]
gridind_raveled_around_bl = self.grid_mapper[cpol]['grid']['ind_all'][select_bl_ind]
uniq_gridind_raveled_around_bl = NP.unique(gridind_raveled_around_bl)
self.grid_mapper[cpol]['labels'][label]['gridind'] = uniq_gridind_raveled_around_bl
pjob1 = MP.Process(target=baseline_grid_mapper, args=(gridind_raveled_around_bl, contributed_bl_grid_Vf[select_bl_ind], NP.append(uniq_gridind_raveled_around_bl, uniq_gridind_raveled_around_bl.max()+1), label, out_q1), name='process-{0:0d}-{1}-visibility'.format(job_ind, label))
pjob2 = MP.Process(target=baseline_grid_mapper, args=(gridind_raveled_around_bl, self.grid_mapper[cpol]['bl']['illumination'][select_bl_ind], NP.append(uniq_gridind_raveled_around_bl, uniq_gridind_raveled_around_bl.max()+1), label, out_q2), name='process-{0:0d}-{1}-illumination'.format(job_ind, label))
pjob1.start()
pjob2.start()
pjobs1.append(pjob1)
pjobs2.append(pjob2)
for p in xrange(len(pjobs1)): # Unpack the gridded visibility and aperture illumination information from the pool output
outdict = out_q1.get()
label = outdict.keys()[0]
self.grid_mapper[cpol]['labels'][label]['Vf'] = outdict[label]
outdict = out_q2.get()
label = outdict.keys()[0]
self.grid_mapper[cpol]['labels'][label]['illumination'] = outdict[label]
for pjob in pjobs1:
pjob.join()
for pjob in pjobs2:
pjob.join()
del out_q1, out_q2
if verbose:
progress.update(ijob+1)
if verbose:
progress.finish()
elif pp_method == 'pool': ## Using MP.Pool.map(): Can be faster if parallelizing is not memory intensive
list_of_gridind_raveled_around_bl = []
list_of_bl_grid_values = []
list_of_bl_Vf_contribution = []
list_of_bl_illumination = []
list_of_uniq_gridind_raveled_around_bl = []
list_of_bl_labels = []
for j in xrange(self.grid_mapper[cpol]['bl']['uniq_ind_all'].size): # re-determine gridded visibilities due to each baseline
label = self.ordered_labels[self.grid_mapper[cpol]['bl']['uniq_ind_all'][j]]
if self.grid_mapper[cpol]['bl']['rev_ind_all'][j] < self.grid_mapper[cpol]['bl']['rev_ind_all'][j+1]:
self.grid_mapper[cpol]['labels'][label] = {}
self.grid_mapper[cpol]['labels'][label]['twts'] = twts[bl_labels.index(label)]
# self.grid_mapper[cpol]['labels'][label]['flag'] = self.interferometers[label].crosspol.flag[cpol]
select_bl_ind = self.grid_mapper[cpol]['bl']['rev_ind_all'][self.grid_mapper[cpol]['bl']['rev_ind_all'][j]:self.grid_mapper[cpol]['bl']['rev_ind_all'][j+1]]
gridind_raveled_around_bl = self.grid_mapper[cpol]['grid']['ind_all'][select_bl_ind]
uniq_gridind_raveled_around_bl = NP.unique(gridind_raveled_around_bl)
self.grid_mapper[cpol]['labels'][label]['gridind'] = uniq_gridind_raveled_around_bl
list_of_bl_labels += [label]
list_of_gridind_raveled_around_bl += [gridind_raveled_around_bl]
list_of_uniq_gridind_raveled_around_bl += [NP.append(uniq_gridind_raveled_around_bl, uniq_gridind_raveled_around_bl.max()+1)]
list_of_bl_Vf_contribution += [contributed_bl_grid_Vf[select_bl_ind]]
list_of_bl_illumination += [self.grid_mapper[cpol]['bl']['illumination'][select_bl_ind]]
pool = MP.Pool(processes=nproc)
list_of_bl_grid_values = pool.map(baseline_grid_mapping_arg_splitter, IT.izip(list_of_gridind_raveled_around_bl, list_of_bl_Vf_contribution, list_of_uniq_gridind_raveled_around_bl))
pool.close()
pool.join()
for label,grid_values in IT.izip(list_of_bl_labels, list_of_bl_grid_values): # Unpack the gridded visibility information from the pool output
self.grid_mapper[cpol]['labels'][label]['Vf'] = grid_values
if nproc is not None:
pool = MP.Pool(processes=nproc)
else:
pool = MP.Pool()
list_of_bl_grid_values = pool.map(baseline_grid_mapping_arg_splitter, IT.izip(list_of_gridind_raveled_around_bl, list_of_bl_illumination, list_of_uniq_gridind_raveled_around_bl))
pool.close()
pool.join()
for label,grid_values in IT.izip(list_of_bl_labels, list_of_bl_grid_values): # Unpack the gridded visibility and aperture illumination information from the pool output
self.grid_mapper[cpol]['labels'][label]['illumination'] = grid_values
del list_of_bl_grid_values, list_of_gridind_raveled_around_bl, list_of_bl_Vf_contribution, list_of_bl_illumination, list_of_uniq_gridind_raveled_around_bl, list_of_bl_labels
else:
raise ValueError('Parallel processing method specified by input parameter pp_method has to be "pool" or "queue"')
else: # Use serial processing over baselines to determine baseline-grid mapping of gridded aperture illumination and visibilities
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} Baselines '.format(self.grid_mapper[cpol]['bl']['uniq_ind_all'].size), PGB.ETA()], maxval=self.grid_mapper[cpol]['bl']['uniq_ind_all'].size).start()
for j in xrange(self.grid_mapper[cpol]['bl']['uniq_ind_all'].size):
label = self.ordered_labels[self.grid_mapper[cpol]['bl']['uniq_ind_all'][j]]
if self.grid_mapper[cpol]['bl']['rev_ind_all'][j] < self.grid_mapper[cpol]['bl']['rev_ind_all'][j+1]:
select_bl_ind = self.grid_mapper[cpol]['bl']['rev_ind_all'][self.grid_mapper[cpol]['bl']['rev_ind_all'][j]:self.grid_mapper[cpol]['bl']['rev_ind_all'][j+1]]
self.grid_mapper[cpol]['labels'][label] = {}
self.grid_mapper[cpol]['labels'][label]['twts'] = twts[bl_labels.index(label)]
# self.grid_mapper[cpol]['labels'][label]['flag'] = self.interferometers[label].crosspol.flag[cpol]
if mapping == 'weighted':
gridind_raveled_around_bl = self.grid_mapper[cpol]['grid']['ind_all'][select_bl_ind]
uniq_gridind_raveled_around_bl = NP.unique(gridind_raveled_around_bl)
self.grid_mapper[cpol]['labels'][label]['gridind'] = uniq_gridind_raveled_around_bl
self.grid_mapper[cpol]['labels'][label]['Vf'] = OPS.binned_statistic(gridind_raveled_around_bl, contributed_bl_grid_Vf[select_bl_ind].real, statistic='sum', bins=NP.append(uniq_gridind_raveled_around_bl, uniq_gridind_raveled_around_bl.max()+1))[0]
self.grid_mapper[cpol]['labels'][label]['Vf'] = self.grid_mapper[cpol]['labels'][label]['Vf'].astype(NP.complex64)
self.grid_mapper[cpol]['labels'][label]['Vf'] += 1j * OPS.binned_statistic(gridind_raveled_around_bl, contributed_bl_grid_Vf[select_bl_ind].imag, statistic='sum', bins=NP.append(uniq_gridind_raveled_around_bl, uniq_gridind_raveled_around_bl.max()+1))[0]
self.grid_mapper[cpol]['labels'][label]['illumination'] = OPS.binned_statistic(gridind_raveled_around_bl, self.grid_mapper[cpol]['bl']['illumination'][select_bl_ind].real, statistic='sum', bins=NP.append(uniq_gridind_raveled_around_bl, uniq_gridind_raveled_around_bl.max()+1))[0]
self.grid_mapper[cpol]['labels'][label]['illumination'] = self.grid_mapper[cpol]['labels'][label]['illumination'].astype(NP.complex64)
self.grid_mapper[cpol]['labels'][label]['illumination'] += 1j * OPS.binned_statistic(gridind_raveled_around_bl, self.grid_mapper[cpol]['bl']['illumination'][select_bl_ind].imag, statistic='sum', bins=NP.append(uniq_gridind_raveled_around_bl, uniq_gridind_raveled_around_bl.max()+1))[0]
else:
self.grid_mapper[cpol]['labels'][label]['gridind'] = self.grid_mapper[cpol]['grid']['ind_all'][select_bl_ind]
self.grid_mapper[cpol]['labels'][label]['Vf'] = contributed_bl_grid_Vf[select_bl_ind]
self.grid_mapper[cpol]['labels'][label]['illumination'] = self.grid_mapper[cpol]['bl']['illumination'][select_bl_ind]
if verbose:
progress.update(j+1)
if verbose:
progress.finish()
else: # Only re-determine gridded visibilities
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} Frequency channels '.format(self.f.size), PGB.ETA()], maxval=self.f.size).start()
for i in xrange(self.f.size): # Only re-estimate visibilities contributed by baselines
bl_refwts = self.grid_mapper[cpol]['refwts'][self.grid_mapper[cpol]['refind'][i]]
bl_Vf = Vf[self.grid_mapper[cpol]['bl']['ind_freq'][i],i]
if i == 0:
contributed_bl_grid_Vf = bl_refwts * bl_Vf
else:
contributed_bl_grid_Vf = NP.append(contributed_bl_grid_Vf, bl_refwts * bl_Vf)
if verbose:
progress.update(i+1)
if verbose:
progress.finish()
if parallel and (mapping == 'weighted'): # Use parallel processing
if nproc is None:
nproc = max(MP.cpu_count()-1, 1)
else:
nproc = min(nproc, max(MP.cpu_count()-1, 1))
if pp_method == 'queue': ## Use MP.Queue(): useful for memory intensive parallelizing but can be slow
num_bl = self.grid_mapper[cpol]['bl']['uniq_ind_all'].size
job_chunk_begin = range(0,num_bl,nproc)
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} job chunks '.format(len(job_chunk_begin)), PGB.ETA()], maxval=len(job_chunk_begin)).start()
for ijob, job_start in enumerate(job_chunk_begin):
pjobs = []
out_q = MP.Queue()
for job_ind in xrange(job_start, min(job_start+nproc, num_bl)): # Start the parallel processes and store the outputs in a queue
label = self.ordered_labels[self.grid_mapper[cpol]['bl']['uniq_ind_all'][job_ind]]
self.grid_mapper[cpol]['labels'][label]['twts'] = twts[bl_labels.index(label)]
if self.grid_mapper[cpol]['bl']['rev_ind_all'][job_ind] < self.grid_mapper[cpol]['bl']['rev_ind_all'][job_ind+1]:
select_bl_ind = self.grid_mapper[cpol]['bl']['rev_ind_all'][self.grid_mapper[cpol]['bl']['rev_ind_all'][job_ind]:self.grid_mapper[cpol]['bl']['rev_ind_all'][job_ind+1]]
gridind_raveled_around_bl = self.grid_mapper[cpol]['grid']['ind_all'][select_bl_ind]
uniq_gridind_raveled_around_bl = self.grid_mapper[cpol]['labels'][label]['gridind']
pjob = MP.Process(target=baseline_grid_mapper, args=(gridind_raveled_around_bl, contributed_bl_grid_Vf[select_bl_ind], NP.append(uniq_gridind_raveled_around_bl, uniq_gridind_raveled_around_bl.max()+1), label, out_q), name='process-{0:0d}-{1}-visibility'.format(job_ind, label))
pjob.start()
pjobs.append(pjob)
for p in xrange(len(pjobs)): # Unpack the gridded visibility information from the queue
outdict = out_q.get()
label = outdict.keys()[0]
self.grid_mapper[cpol]['labels'][label]['Vf'] = outdict[label]
for pjob in pjobs:
pjob.join()
del out_q
if verbose:
progress.update(ijob+1)
if verbose:
progress.finish()
else: ## Use MP.Pool.map(): Can be faster if parallelizing is not memory intensive
list_of_gridind_raveled_around_bl = []
list_of_bl_Vf_contribution = []
list_of_uniq_gridind_raveled_around_bl = []
list_of_bl_labels = []
for j in xrange(self.grid_mapper[cpol]['bl']['uniq_ind_all'].size): # re-determine gridded visibilities due to each baseline
if self.grid_mapper[cpol]['bl']['rev_ind_all'][j] < self.grid_mapper[cpol]['bl']['rev_ind_all'][j+1]:
select_bl_ind = self.grid_mapper[cpol]['bl']['rev_ind_all'][self.grid_mapper[cpol]['bl']['rev_ind_all'][j]:self.grid_mapper[cpol]['bl']['rev_ind_all'][j+1]]
label = self.ordered_labels[self.grid_mapper[cpol]['bl']['uniq_ind_all'][j]]
self.grid_mapper[cpol]['labels'][label]['twts'] = twts[bl_labels.index(label)]
gridind_raveled_around_bl = self.grid_mapper[cpol]['grid']['ind_all'][select_bl_ind]
uniq_gridind_raveled_around_bl = NP.unique(gridind_raveled_around_bl)
list_of_bl_labels += [label]
list_of_gridind_raveled_around_bl += [gridind_raveled_around_bl]
list_of_uniq_gridind_raveled_around_bl += [NP.append(uniq_gridind_raveled_around_bl, uniq_gridind_raveled_around_bl.max()+1)]
list_of_bl_Vf_contribution += [contributed_bl_grid_Vf[select_bl_ind]]
if nproc is None:
nproc = max(MP.cpu_count()-1, 1)
else:
nproc = min(nproc, max(MP.cpu_count()-1, 1))
pool = MP.Pool(processes=nproc)
list_of_grid_Vf = pool.map(baseline_grid_mapping_arg_splitter, IT.izip(list_of_gridind_raveled_around_bl, list_of_bl_Vf_contribution, list_of_uniq_gridind_raveled_around_bl))
pool.close()
pool.join()
for label,grid_Vf in IT.izip(list_of_bl_labels, list_of_grid_Vf): # Unpack the gridded visibility information from the pool output
self.grid_mapper[cpol]['labels'][label]['Vf'] = grid_Vf
del list_of_gridind_raveled_around_bl, list_of_grid_Vf, list_of_bl_Vf_contribution, list_of_uniq_gridind_raveled_around_bl, list_of_bl_labels
else: # use serial processing
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} Baselines '.format(self.grid_mapper[cpol]['bl']['uniq_ind_all'].size), PGB.ETA()], maxval=self.grid_mapper[cpol]['bl']['uniq_ind_all'].size).start()
for j in xrange(self.grid_mapper[cpol]['bl']['uniq_ind_all'].size): # re-determine gridded visibilities due to each baseline
if self.grid_mapper[cpol]['bl']['rev_ind_all'][j] < self.grid_mapper[cpol]['bl']['rev_ind_all'][j+1]:
select_bl_ind = self.grid_mapper[cpol]['bl']['rev_ind_all'][self.grid_mapper[cpol]['bl']['rev_ind_all'][j]:self.grid_mapper[cpol]['bl']['rev_ind_all'][j+1]]
label = self.ordered_labels[self.grid_mapper[cpol]['bl']['uniq_ind_all'][j]]
self.grid_mapper[cpol]['labels'][label]['twts'] = twts[bl_labels.index(label)]
self.grid_mapper[cpol]['labels'][label]['Vf'] = {}
if mapping == 'weighted':
gridind_raveled_around_bl = self.grid_mapper[cpol]['grid']['ind_all'][select_bl_ind]
uniq_gridind_raveled_around_bl = self.grid_mapper[cpol]['labels'][label]['gridind']
# uniq_gridind_raveled_around_bl = NP.unique(gridind_raveled_around_bl)
self.grid_mapper[cpol]['labels'][label]['Vf'] = OPS.binned_statistic(gridind_raveled_around_bl, contributed_bl_grid_Vf[select_bl_ind].real, statistic='sum', bins=NP.append(uniq_gridind_raveled_around_bl, uniq_gridind_raveled_around_bl.max()+1))[0]
self.grid_mapper[cpol]['labels'][label]['Vf'] = self.grid_mapper[cpol]['labels'][label]['Vf'].astype(NP.complex64)
self.grid_mapper[cpol]['labels'][label]['Vf'] += 1j * OPS.binned_statistic(gridind_raveled_around_bl, contributed_bl_grid_Vf[select_bl_ind].imag, statistic='sum', bins=NP.append(uniq_gridind_raveled_around_bl, uniq_gridind_raveled_around_bl.max()+1))[0]
else:
self.grid_mapper[cpol]['labels'][label]['Vf'] = contributed_bl_grid_Vf[select_bl_ind]
if verbose:
progress.update(j+1)
if verbose:
progress.finish()
############################################################################
def grid_convolve_new(self, pol=None, normalize=False, method='NN',
distNN=NP.inf, identical_interferometers=True,
cal_loop=False, gridfunc_freq=None, wts_change=False,
parallel=False, nproc=None, pp_method='pool',
verbose=True):
"""
------------------------------------------------------------------------
Routine to project the complex illumination power pattern and the
visibilities on the grid from the interferometer array
Inputs:
pol [String] The polarization to be gridded. Can be set to
'P11', 'P12', 'P21' or 'P22'. If set to None, gridding for all
the polarizations is performed. Default = None
normalize [Boolean] Default = False. If set to True, the gridded
weights are divided by the sum of weights so that the gridded
weights add up to unity. (Need to work on normalization)
method [string] The gridding method to be used in applying the
interferometer weights on to the interferometer array grid.
Accepted values are 'NN' (nearest neighbour - default), 'CS'
(cubic spline), or 'BL' (Bi-linear). In case of applying grid
weights by 'NN' method, an optional distance upper bound for
the nearest neighbour can be provided in the parameter distNN
to prune the search and make it efficient. Currently, only
the nearest neighbour method is operational.
distNN [scalar] A positive value indicating the upper bound on
distance to the nearest neighbour in the gridding process. It
has units of distance, the same units as the interferometer
attribute location and interferometer array attribute gridx
and gridy. Default is NP.inf (infinite distance). It will be
internally converted to have same units as interferometer
attributes wtspos (units in number of wavelengths). To ensure
all relevant grid pixels are included, the search distance used
internally will be a fraction larger than distNN
identical_interferometers
[boolean] indicates if all interferometer elements are to be
treated as identical. If True (default), they are identical
and their gridding kernels are identical. If False, they are
not identical and each one has its own gridding kernel.
cal_loop [boolean] If True, the calibration loop is assumed to be ON
and hence the calibrated electric fields are set in the
calibration loop. If False (default), the calibration loop is
assumed to be OFF and the current electric fields are assumed
to be the calibrated data to be mapped to the grid
via gridding convolution.
gridfunc_freq
[String scalar] If set to None (not provided) or to 'scale',
it is assumed that attribute wtspos is given for a
reference frequency which needs to be scaled for the frequency
channels. Will be ignored if the number of elements of the
list in this attribute under the specific polarization is the
same as the number of frequency channels.
wts_change [boolean] indicates if weights and/or their locations have
changed from the previous integration or snapshot.
Default=False means they have not changed. In such a case the
interferometer-to-grid mapping and grid illumination pattern do not
have to be determined, and mapping and values from the
previous snapshot can be used. If True, a new mapping has to
be determined.
parallel [boolean] specifies if parallelization is to be invoked.
False (default) means only serial processing
nproc [integer] specifies number of independent processes to spawn.
Default = None, means automatically determines the number of
process cores in the system and use one less than that to
avoid locking the system for other processes. Applies only
if input parameter 'parallel' (see above) is set to True.
If nproc is set to a value more than the number of process
cores in the system, it will be reset to number of process
cores in the system minus one to avoid locking the system out
for other processes
pp_method [string] specifies if the parallelization method is handled
automatically using a multiprocessing pool or managed manually
by individual processes and collecting results in a queue.
The former is specified by 'pool' (default) and the latter
by 'queue'. These are the two allowed values. The pool method
has easier bookkeeping and can be fast if the computations are
not expected to be memory bound. The queue method is more
suited for memory bound processes but can be slower or
inefficient in terms of CPU management.
verbose [boolean] If True, prints diagnostic and progress messages.
If False (default), suppress printing such messages.
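
Example (illustrative sketch only; the numerical values are arbitrary
and iar denotes a hypothetical InterferometerArray instance whose
averaged visibilities have already been computed):
    iar.grid()   # set up the uv-grid (also done automatically if not ready)
    iar.grid_convolve_new(pol='P11', method='NN', distNN=6.0,
                          identical_interferometers=True,
                          gridfunc_freq='scale', wts_change=False,
                          parallel=False, verbose=True)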
------------------------------------------------------------------------
"""
eps = 1.0e-10
if pol is None:
pol = ['P11', 'P12', 'P21', 'P22']
elif not isinstance(pol, list):
pol = [pol]
if not self.grid_ready:
self.grid()
du = self.gridu[0,1] - self.gridu[0,0]
dv = self.gridv[1,0] - self.gridv[0,0]
wavelength = FCNST.c / self.f
min_lambda = NP.abs(wavelength).min()
rmaxNN = 0.5 * NP.sqrt(du**2 + dv**2) * min_lambda
krn = {}
crosspol = ['P11', 'P12', 'P21', 'P22']
for cpol in crosspol:
krn[cpol] = None
if cpol in pol:
bl_dict = self.baseline_vectors(pol=cpol, flag=None, sort=True)
self.ordered_labels = bl_dict['labels']
bl_xy = bl_dict['baselines'][:,:2] # n_bl x 2
n_bl = bl_xy.shape[0]
Vf_dict = self.get_visibilities(cpol, flag=None, tselect=-1, fselect=None, bselect=None, datapool='avg', sort=True)
Vf = Vf_dict['visibilities'].astype(NP.complex64) # (n_ts=1) x n_bl x nchan
Vf = NP.squeeze(Vf, axis=0) # n_bl x nchan
if Vf.shape[0] != n_bl:
raise ValueError('Encountered unexpected behavior. Need to debug.')
bl_labels = Vf_dict['labels']
twts = Vf_dict['twts'] # (n_ts=1) x n_bl x (nchan=1)
twts = NP.squeeze(twts, axis=(0,2)) # n_bl
if verbose:
print 'Gathered interferometer data for gridding convolution for timestamp {0}'.format(self.timestamp)
if wts_change or (not self.grid_mapper[cpol]['all_bl2grid']):
self.grid_mapper[cpol]['per_bl2grid'] = []
self.grid_mapper[cpol]['all_bl2grid'] = {}
gridlocs = NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
if gridfunc_freq == 'scale':
grid_xy = gridlocs[NP.newaxis,:,:] * wavelength.reshape(-1,1,1) # nchan x (nv x nu) x 2
wl = NP.ones(gridlocs.shape[0])[NP.newaxis,:] * wavelength.reshape(-1,1)
grid_xy = grid_xy.reshape(-1,2)
wl = wl.reshape(-1)
indNN_list, blind, fvu_gridind = LKP.find_NN(bl_xy, grid_xy, distance_ULIM=2.0*distNN, flatten=True, parallel=False)
dxy = grid_xy[fvu_gridind,:] - bl_xy[blind,:]
fvu_gridind_unraveled = NP.unravel_index(fvu_gridind, (self.f.size,)+self.gridu.shape) # f-v-u order since temporary grid was created as nchan x nv x nu
self.grid_mapper[cpol]['all_bl2grid']['blind'] = NP.copy(blind)
self.grid_mapper[cpol]['all_bl2grid']['u_gridind'] = NP.copy(fvu_gridind_unraveled[2])
self.grid_mapper[cpol]['all_bl2grid']['v_gridind'] = NP.copy(fvu_gridind_unraveled[1])
self.grid_mapper[cpol]['all_bl2grid']['f_gridind'] = NP.copy(fvu_gridind_unraveled[0])
self.grid_mapper[cpol]['all_bl2grid']['indNN_list'] = copy.deepcopy(indNN_list)
self.grid_mapper[cpol]['all_bl2grid']['twts'] = copy.deepcopy(twts)
if identical_interferometers:
arbitrary_interferometer_aperture = self.interferometers.itervalues().next().aperture
krn = arbitrary_interferometer_aperture.compute(dxy, wavelength=wl[fvu_gridind], pol=cpol, rmaxNN=rmaxNN, load_lookup=False)
else:
# This block #1 is one way to compute the gridding kernel per interferometer
for bi,gi in enumerate(indNN_list):
if len(gi) > 0:
label = self.ordered_labels[bi]
ind = NP.asarray(gi)
diffxy = grid_xy[ind,:].reshape(-1,2) - bl_xy[bi,:].reshape(-1,2)
krndict = self.interferometers[label].aperture.compute(diffxy, wavelength=wl[ind], pol=cpol, rmaxNN=rmaxNN, load_lookup=False)
if krn[cpol] is None:
krn[cpol] = NP.copy(krndict[cpol])
else:
krn[cpol] = NP.append(krn[cpol], krndict[cpol])
# # This block #2 is another way equivalent to above block #1
# uniq_blind = NP.unique(blind)
# blhist, blbe, blbn, blri = OPS.binned_statistic(blind, statistic='count', bins=NP.append(uniq_blind, uniq_blind.max()+1))
# for i,ublind in enumerate(uniq_blind):
# label = self.ordered_labels[ublind]
# ind = blri[blri[i]:blri[i+1]]
# krndict = self.interferometers[label].aperture.compute(dxy[ind,:], wavelength=wl[ind], pol=cpol, rmaxNN=rmaxNN, load_lookup=False)
# if krn[cpol] is None:
# krn[cpol] = NP.copy(krndict[cpol])
# else:
# krn[cpol] = NP.append(krn[cpol], krndict[cpol])
self.grid_mapper[cpol]['all_bl2grid']['illumination'] = NP.copy(krn[cpol])
else: # Weights do not scale with frequency (needs serious development)
pass
# Determine weights that can normalize sum of kernel per interferometer per frequency to unity
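# Added note: the normalization below works per baseline and per frequency
# channel -- for each baseline, the kernel samples are binned by channel
# (OPS.binned_statistic on f_ind), and every sample in a channel gets a weight
# equal to the reciprocal of that channel's kernel sum, so each baseline's
# gridded kernel sums to unity in every channel.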
# per_bl_per_freq_norm_wts = NP.ones(blind.size, dtype=NP.complex64)
per_bl_per_freq_norm_wts = NP.zeros(blind.size, dtype=NP.complex64)
runsum = 0
for bi,gi in enumerate(indNN_list):
if len(gi) > 0:
fvu_ind = NP.asarray(gi)
unraveled_fvu_ind = NP.unravel_index(fvu_ind, (self.f.size,)+self.gridu.shape)
f_ind = unraveled_fvu_ind[0]
v_ind = unraveled_fvu_ind[1]
u_ind = unraveled_fvu_ind[2]
chanhist, chanbe, chanbn, chanri = OPS.binned_statistic(f_ind, statistic='count', bins=NP.arange(self.f.size+1))
for ci in xrange(self.f.size):
if chanhist[ci] > 0.0:
select_chan_ind = chanri[chanri[ci]:chanri[ci+1]]
per_bl_per_freq_kernel_sum = NP.sum(krn[cpol][runsum:runsum+len(gi)][select_chan_ind])
per_bl_per_freq_norm_wts[runsum:runsum+len(gi)][select_chan_ind] = 1.0 / per_bl_per_freq_kernel_sum
per_bl2grid_info = {}
per_bl2grid_info['label'] = self.ordered_labels[bi]
per_bl2grid_info['twts'] = twts[bi]
per_bl2grid_info['f_gridind'] = NP.copy(f_ind)
per_bl2grid_info['u_gridind'] = NP.copy(u_ind)
per_bl2grid_info['v_gridind'] = NP.copy(v_ind)
# per_bl2grid_info['fvu_gridind'] = NP.copy(gi)
per_bl2grid_info['per_bl_per_freq_norm_wts'] = per_bl_per_freq_norm_wts[runsum:runsum+len(gi)]
per_bl2grid_info['illumination'] = krn[cpol][runsum:runsum+len(gi)]
self.grid_mapper[cpol]['per_bl2grid'] += [copy.deepcopy(per_bl2grid_info)]
runsum += len(gi)
self.grid_mapper[cpol]['all_bl2grid']['per_bl_per_freq_norm_wts'] = NP.copy(per_bl_per_freq_norm_wts)
# Determine the gridded electric fields
Vf_on_grid = Vf[(self.grid_mapper[cpol]['all_bl2grid']['blind'], self.grid_mapper[cpol]['all_bl2grid']['f_gridind'])]
self.grid_mapper[cpol]['all_bl2grid']['Vf'] = copy.deepcopy(Vf_on_grid)
runsum = 0
for bi,gi in enumerate(self.grid_mapper[cpol]['all_bl2grid']['indNN_list']):
if len(gi) > 0:
self.grid_mapper[cpol]['per_bl2grid'][bi]['Vf'] = Vf_on_grid[runsum:runsum+len(gi)]
runsum += len(gi)
############################################################################
def genMappingMatrix(self, pol=None, normalize=True, method='NN',
distNN=NP.inf, identical_interferometers=True,
gridfunc_freq=None, wts_change=False, parallel=False,
nproc=None, verbose=True):
"""
------------------------------------------------------------------------
Routine to construct sparse interferometer-to-grid mapping matrix that
will be used in projecting illumination and visibilities from the
array of interferometers onto the grid. It shares most of its machinery
with grid_convolve_new()
Inputs:
pol [String] The polarization to be gridded. Can be set to 'P11',
'P12', 'P21', or 'P22'. If set to None, gridding for all the
polarizations is performed. Default = None
normalize [Boolean] Default = True. If set to True, the gridded
weights are divided by the sum of weights so that the gridded
weights add up to unity. (Normalization still needs work)
method [string] The gridding method to be used in applying the
interferometer weights on to the interferometer array grid.
Accepted values are 'NN' (nearest neighbour - default), 'CS'
(cubic spline), or 'BL' (Bi-linear). In case of applying grid
weights by 'NN' method, an optional distance upper bound for
the nearest neighbour can be provided in the parameter distNN
to prune the search and make it efficient. Currently, only
the nearest neighbour method is operational.
distNN [scalar] A positive value indicating the upper bound on
distance to the nearest neighbour in the gridding process. It
has units of distance, the same units as the interferometer
attribute location and interferometer array attribute gridx
and gridy. Default is NP.inf (infinite distance). It will be
internally converted to have same units as interferometer
attributes wtspos (units in number of wavelengths). To ensure
all relevant pixels in the grid are included, the search distance
used internally will be a fraction more than distNN
identical_interferometers
[boolean] indicates if all interferometer elements are to be
treated as identical. If True (default), they are identical
and their gridding kernels are identical. If False, they are
not identical and each one has its own gridding kernel.
gridfunc_freq
[String scalar] If set to None (not provided) or to 'scale',
assumes that attribute wtspos is given for a
reference frequency which needs to be scaled for the frequency
channels. Will be ignored if the number of elements of the list
in this attribute under the specific polarization is the
same as the number of frequency channels.
wts_change [boolean] indicates if weights and/or their locations have
changed from the previous integration or snapshot.
Default=False means they have not changed. In such a case the
interferometer-to-grid mapping and grid illumination pattern
do not have to be determined, and mapping and values from the
previous snapshot can be used. If True, a new mapping has to
be determined.
parallel [boolean] specifies if parallelization is to be invoked.
False (default) means only serial processing
nproc [integer] specifies number of independent processes to spawn.
Default = None, means automatically determines the number of
process cores in the system and use one less than that to
avoid locking the system for other processes. Applies only
if input parameter 'parallel' (see above) is set to True.
If nproc is set to a value more than the number of process
cores in the system, it will be reset to number of process
cores in the system minus one to avoid locking the system out
for other processes
verbose [boolean] If True (default), prints diagnostic and progress
messages. If False, suppresses such messages.
NOTE: Although certain portions are parallelizable, the overheads in
these processes seem to make it worse than serial processing. It is
advisable to stick to serialized version unless testing with larger
data sets clearly indicates otherwise.
------------------------------------------------------------------------
"""
if pol is None:
pol = ['P11', 'P12', 'P21', 'P22'] # default to all cross-polarizations
elif not isinstance(pol, list):
pol = [pol]
if not self.grid_ready:
self.grid()
du = self.gridu[0,1] - self.gridu[0,0]
dv = self.gridv[1,0] - self.gridv[0,0]
wavelength = FCNST.c / self.f
min_lambda = NP.abs(wavelength).min()
rmaxNN = 0.5 * NP.sqrt(du**2 + dv**2) * min_lambda
krn = {}
self.bl2grid_mapper = {}
crosspol = ['P11', 'P12', 'P21', 'P22']
for cpol in crosspol:
krn[cpol] = None
self.bl2grid_mapper[cpol] = None
if cpol in pol:
bl_dict = self.baseline_vectors(pol=cpol, flag=None, sort=True)
self.ordered_labels = bl_dict['labels']
bl_xy = bl_dict['baselines'][:,:2] # n_bl x 2
n_bl = bl_xy.shape[0]
if verbose:
print 'Gathered interferometer data for gridding convolution for timestamp {0}'.format(self.timestamp)
if wts_change or (not self.grid_mapper[cpol]['all_bl2grid']):
self.grid_mapper[cpol]['per_bl2grid'] = []
self.grid_mapper[cpol]['all_bl2grid'] = {}
gridlocs = NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
if gridfunc_freq == 'scale':
grid_xy = gridlocs[NP.newaxis,:,:] * wavelength.reshape(-1,1,1) # nchan x nv x nu
wl = NP.ones(gridlocs.shape[0])[NP.newaxis,:] * wavelength.reshape(-1,1)
grid_xy = grid_xy.reshape(-1,2)
wl = wl.reshape(-1)
indNN_list, blind, fvu_gridind = LKP.find_NN(bl_xy, grid_xy, distance_ULIM=2.0*distNN, flatten=True, parallel=False)
dxy = grid_xy[fvu_gridind,:] - bl_xy[blind,:]
fvu_gridind_unraveled = NP.unravel_index(fvu_gridind, (self.f.size,)+self.gridu.shape) # f-v-u order since temporary grid was created as nchan x nv x nu
self.grid_mapper[cpol]['all_bl2grid']['blind'] = NP.copy(blind)
self.grid_mapper[cpol]['all_bl2grid']['u_gridind'] = NP.copy(fvu_gridind_unraveled[2])
self.grid_mapper[cpol]['all_bl2grid']['v_gridind'] = NP.copy(fvu_gridind_unraveled[1])
self.grid_mapper[cpol]['all_bl2grid']['f_gridind'] = NP.copy(fvu_gridind_unraveled[0])
# self.grid_mapper[cpol]['all_bl2grid']['indNN_list'] = copy.deepcopy(indNN_list)
if identical_interferometers:
arbitrary_interferometer_aperture = self.interferometers.itervalues().next().aperture
krn = arbitrary_interferometer_aperture.compute(dxy, wavelength=wl[fvu_gridind], pol=cpol, rmaxNN=rmaxNN, load_lookup=False)
else:
# This block #1 is one way to compute the gridding kernel per interferometer
for ai,gi in enumerate(indNN_list):
if len(gi) > 0:
label = self.ordered_labels[ai]
ind = NP.asarray(gi)
diffxy = grid_xy[ind,:].reshape(-1,2) - bl_xy[ai,:].reshape(-1,2)
krndict = self.interferometers[label].aperture.compute(diffxy, wavelength=wl[ind], pol=cpol, rmaxNN=rmaxNN, load_lookup=False)
if krn[cpol] is None:
krn[cpol] = NP.copy(krndict[cpol])
else:
krn[cpol] = NP.append(krn[cpol], krndict[cpol])
# # This block #2 is another way equivalent to above block #1
# uniq_blind = NP.unique(blind)
# blhist, blbe, blbn, blri = OPS.binned_statistic(blind, statistic='count', bins=NP.append(uniq_blind, uniq_blind.max()+1))
# for i,ublind in enumerate(uniq_blind):
# label = self.ordered_labels[ublind]
# ind = blri[blri[i]:blri[i+1]]
# krndict = self.interferometers[label].aperture.compute(dxy[ind,:], wavelength=wl[ind], pol=cpol, rmaxNN=rmaxNN, load_lookup=False)
# if krn[cpol] is None:
# krn[cpol] = NP.copy(krndict[cpol])
# else:
# krn[cpol] = NP.append(krn[cpol], krndict[cpol])
self.grid_mapper[cpol]['all_bl2grid']['illumination'] = NP.copy(krn[cpol])
else: # Weights do not scale with frequency (needs serious development)
pass
# Determine weights that can normalize sum of kernel per interferometer per frequency to unity
per_bl_per_freq_norm_wts = NP.zeros(blind.size, dtype=NP.complex64)
# per_bl_per_freq_norm_wts = NP.ones(blind.size, dtype=NP.complex64)
if parallel or (nproc is not None):
list_of_val = []
list_of_rowcol_tuple = []
else:
spval = []
sprow = []
spcol = []
runsum = 0
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} Baselines '.format(n_bl), PGB.ETA()], maxval=n_bl).start()
for bi,gi in enumerate(indNN_list):
if len(gi) > 0:
fvu_ind = NP.asarray(gi)
unraveled_fvu_ind = NP.unravel_index(fvu_ind, (self.f.size,)+self.gridu.shape)
f_ind = unraveled_fvu_ind[0]
v_ind = unraveled_fvu_ind[1]
u_ind = unraveled_fvu_ind[2]
chanhist, chanbe, chanbn, chanri = OPS.binned_statistic(f_ind, statistic='count', bins=NP.arange(self.f.size+1))
for ci in xrange(self.f.size):
if chanhist[ci] > 0.0:
select_chan_ind = chanri[chanri[ci]:chanri[ci+1]]
per_bl_per_freq_kernel_sum = NP.sum(krn[cpol][runsum:runsum+len(gi)][select_chan_ind])
per_bl_per_freq_norm_wts[runsum:runsum+len(gi)][select_chan_ind] = 1.0 / per_bl_per_freq_kernel_sum
per_bl2grid_info = {}
per_bl2grid_info['label'] = self.ordered_labels[bi]
per_bl2grid_info['f_gridind'] = NP.copy(f_ind)
per_bl2grid_info['u_gridind'] = NP.copy(u_ind)
per_bl2grid_info['v_gridind'] = NP.copy(v_ind)
# per_bl2grid_info['fvu_gridind'] = NP.copy(gi)
per_bl2grid_info['per_bl_per_freq_norm_wts'] = per_bl_per_freq_norm_wts[runsum:runsum+len(gi)]
per_bl2grid_info['illumination'] = krn[cpol][runsum:runsum+len(gi)]
self.grid_mapper[cpol]['per_bl2grid'] += [copy.deepcopy(per_bl2grid_info)]
runsum += len(gi)
# determine the sparse interferometer-to-grid mapping matrix pre-requisites
val = per_bl2grid_info['per_bl_per_freq_norm_wts']*per_bl2grid_info['illumination']
vuf_gridind_unraveled = (per_bl2grid_info['v_gridind'],per_bl2grid_info['u_gridind'],per_bl2grid_info['f_gridind'])
vuf_gridind_raveled = NP.ravel_multi_index(vuf_gridind_unraveled, (self.gridu.shape+(self.f.size,)))
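# Added note: each sparse entry maps one kernel sample of one baseline to one
# grid cell. The row index ravels (v, u, f) over a cube of shape
# (nv, nu, nchan); the column index is the baseline's own channel block,
# f_gridind offset by bi*nchan, so columns are ordered baseline-major.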
if (not parallel) and (nproc is None):
spval += val.tolist()
sprow += vuf_gridind_raveled.tolist()
spcol += (per_bl2grid_info['f_gridind'] + bi*self.f.size).tolist()
else:
list_of_val += [per_bl2grid_info['per_bl_per_freq_norm_wts']*per_bl2grid_info['illumination']]
list_of_rowcol_tuple += [(vuf_gridind_raveled, per_bl2grid_info['f_gridind'])]
if verbose:
progress.update(bi+1)
if verbose:
progress.finish()
# determine the sparse interferometer-to-grid mapping matrix
if parallel or (nproc is not None):
list_of_shapes = [(self.gridu.size*self.f.size, self.f.size)] * n_bl
if nproc is None:
nproc = max(MP.cpu_count()-1, 1)
else:
nproc = min(nproc, max(MP.cpu_count()-1, 1))
pool = MP.Pool(processes=nproc)
list_of_spmat = pool.map(genMatrixMapper_arg_splitter, IT.izip(list_of_val, list_of_rowcol_tuple, list_of_shapes))
self.bl2grid_mapper[cpol] = SpM.hstack(list_of_spmat, format='csr')
else:
spval = NP.asarray(spval)
sprowcol = (NP.asarray(sprow), NP.asarray(spcol))
self.bl2grid_mapper[cpol] = SpM.csr_matrix((spval, sprowcol), shape=(self.gridu.size*self.f.size, n_bl*self.f.size))
self.grid_mapper[cpol]['all_bl2grid']['per_bl_per_freq_norm_wts'] = NP.copy(per_bl_per_freq_norm_wts)
############################################################################
def applyMappingMatrix(self, pol=None, verbose=True):
"""
------------------------------------------------------------------------
Constructs the grid of complex illumination and visibilities
using the sparse baseline-to-grid mapping matrix. Intended to serve as a
"matrix" alternative to make_grid_cube_new()
Inputs:
pol [String] The polarization to be gridded. Can be set to 'P11',
'P12', 'P21', or 'P22'. If set to None, gridding for all the
polarizations is performed. Default=None
verbose [boolean] If True (default), prints diagnostic and progress
messages. If False, suppresses such messages.
------------------------------------------------------------------------
"""
if pol is None:
pol = ['P11', 'P12', 'P21', 'P22']
pol = NP.unique(NP.asarray(pol))
for cpol in pol:
if verbose:
print 'Gridding aperture illumination and visibilities for polarization {0} ...'.format(cpol)
if cpol not in ['P11', 'P12', 'P21', 'P22']:
raise ValueError('Invalid specification for input parameter pol')
Vf_dict = self.get_visibilities(cpol, flag=None, tselect=-1, fselect=None, bselect=None, datapool='avg', sort=True)
Vf = Vf_dict['visibilities'].astype(NP.complex64) # (n_ts=1) x n_bl x nchan
Vf = NP.squeeze(Vf, axis=0) # n_bl x nchan
twts = Vf_dict['twts'] # (n_ts=1) x n_bl x 1
twts = NP.squeeze(twts, axis=0) # n_bl x 1
unflagged = twts > 0.0
unflagged = unflagged.astype(int)
Vf = Vf * unflagged # applies baseline flagging, n_bl x nchan
wts = unflagged * NP.ones(self.f.size).reshape(1,-1) # n_bl x nchan
wts[NP.isnan(Vf)] = 0.0
Vf[NP.isnan(Vf)] = 0.0
Vf = Vf.ravel()
wts = wts.ravel()
sparse_Vf = SpM.csr_matrix(Vf)
sparse_wts = SpM.csr_matrix(wts)
# Store as sparse matrices
self.grid_illumination[cpol] = self.bl2grid_mapper[cpol].dot(sparse_wts.T)
self.grid_Vf[cpol] = self.bl2grid_mapper[cpol].dot(sparse_Vf.T)
# # Store as dense matrices
# self.grid_illumination[cpol] = self.bl2grid_mapper[cpol].dot(wts).reshape(self.gridu.shape+(self.f.size,))
# self.grid_Vf[cpol] = self.bl2grid_mapper[cpol].dot(Vf).reshape(self.gridu.shape+(self.f.size,))
if verbose:
print 'Gridded aperture illumination and visibilities for polarization {0} from {1:0d} unflagged contributing baselines'.format(cpol, NP.sum(unflagged).astype(int))
############################################################################
def make_grid_cube(self, pol=None, verbose=True):
"""
------------------------------------------------------------------------
Constructs the grid of complex power illumination and visibilities using
the gridding information determined for every baseline. Flags are taken
into account while constructing this grid.
Inputs:
pol [String] The polarization to be gridded. Can be set to 'P11',
'P12', 'P21' or 'P22'. If set to None, gridding for all the
polarizations is performed. Default = None
verbose [boolean] If True (default), prints diagnostic and progress
messages. If False, suppresses such messages.
------------------------------------------------------------------------
"""
if pol is None:
pol = ['P11', 'P12', 'P21', 'P22']
pol = NP.unique(NP.asarray(pol))
for cpol in pol:
if verbose:
print 'Gridding aperture illumination and visibilities for polarization {0} ...'.format(cpol)
if cpol not in ['P11', 'P12', 'P21', 'P22']:
raise ValueError('Invalid specification for input parameter pol')
if cpol not in self._bl_contribution:
raise KeyError('Key {0} not found in attribute _bl_contribution'.format(cpol))
self.grid_illumination[cpol] = NP.zeros((self.gridu.shape + (self.f.size,)), dtype=NP.complex_)
self.grid_Vf[cpol] = NP.zeros((self.gridu.shape + (self.f.size,)), dtype=NP.complex_)
labels = self.grid_mapper[cpol]['labels'].keys()
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} Baselines '.format(len(labels)), PGB.ETA()], maxval=len(labels)).start()
loopcount = 0
num_unflagged = 0
sum_twts = 0.0
for bllabel, blinfo in self.grid_mapper[cpol]['labels'].iteritems():
# if not self.interferometers[bllabel].crosspol.flag[cpol]:
if blinfo['twts'] > 0.0:
num_unflagged += 1
sum_twts += blinfo['twts']
gridind_unraveled = NP.unravel_index(blinfo['gridind'], self.gridu.shape+(self.f.size,))
# self.grid_illumination[cpol][gridind_unraveled] += blinfo['illumination'] * blinfo['twts']
# self.grid_Vf[cpol][gridind_unraveled] += blinfo['Vf'] * blinfo['twts']
self.grid_illumination[cpol][gridind_unraveled] += blinfo['illumination']
self.grid_Vf[cpol][gridind_unraveled] += blinfo['Vf']
progress.update(loopcount+1)
loopcount += 1
progress.finish()
# self.grid_Vf[cpol] *= num_unflagged/sum_twts
if verbose:
print 'Gridded aperture illumination and visibilities for polarization {0} from {1:0d} unflagged contributing baselines'.format(cpol, num_unflagged)
############################################################################
def make_grid_cube_new(self, pol=None, verbose=True):
"""
------------------------------------------------------------------------
Constructs the grid of complex power illumination and visibilities using
the gridding information determined for every baseline. Flags are taken
into account while constructing this grid.
Inputs:
pol [String] The polarization to be gridded. Can be set to 'P11',
'P12', 'P21' or 'P22'. If set to None, gridding for all the
polarizations is performed. Default = None
verbose [boolean] If True (default), prints diagnostic and progress
messages. If False, suppresses such messages.
------------------------------------------------------------------------
"""
if pol is None:
pol = ['P11', 'P12', 'P21', 'P22']
pol = NP.unique(NP.asarray(pol))
for cpol in pol:
if verbose:
print 'Gridding aperture illumination and visibilities for polarization {0} ...'.format(cpol)
if cpol not in ['P11', 'P12', 'P21', 'P22']:
raise ValueError('Invalid specification for input parameter pol')
if cpol not in self._bl_contribution:
raise KeyError('Key {0} not found in attribute _bl_contribution'.format(cpol))
self.grid_illumination[cpol] = NP.zeros((self.gridu.shape + (self.f.size,)), dtype=NP.complex_)
self.grid_Vf[cpol] = NP.zeros((self.gridu.shape + (self.f.size,)), dtype=NP.complex_)
nlabels = len(self.grid_mapper[cpol]['per_bl2grid'])
loopcount = 0
num_unflagged = 0
sum_twts = 0.0
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} Baselines '.format(nlabels), PGB.ETA()], maxval=nlabels).start()
for bi, per_bl2grid_info in enumerate(self.grid_mapper[cpol]['per_bl2grid']):
bllabel = per_bl2grid_info['label']
if per_bl2grid_info['twts'] > 0.0:
num_unflagged += 1
sum_twts += per_bl2grid_info['twts']
vuf_gridind_unraveled = (per_bl2grid_info['v_gridind'],per_bl2grid_info['u_gridind'],per_bl2grid_info['f_gridind'])
self.grid_illumination[cpol][vuf_gridind_unraveled] += per_bl2grid_info['per_bl_per_freq_norm_wts'] * per_bl2grid_info['illumination']
self.grid_Vf[cpol][vuf_gridind_unraveled] += per_bl2grid_info['per_bl_per_freq_norm_wts'] * per_bl2grid_info['Vf'] * per_bl2grid_info['illumination']
# self.grid_illumination[cpol][vuf_gridind_unraveled] += per_bl2grid_info['per_bl_per_freq_norm_wts'] * per_bl2grid_info['illumination'] * per_bl2grid_info['twts']
# self.grid_Vf[cpol][vuf_gridind_unraveled] += per_bl2grid_info['per_bl_per_freq_norm_wts'] * per_bl2grid_info['Vf'] * per_bl2grid_info['twts']
if verbose:
progress.update(loopcount+1)
loopcount += 1
if verbose:
progress.finish()
# self.grid_illumination[cpol] *= num_unflagged/sum_twts
# self.grid_Vf[cpol] *= num_unflagged/sum_twts
if verbose:
print 'Gridded aperture illumination and visibilities for polarization {0} from {1:0d} unflagged contributing baselines'.format(cpol, num_unflagged)
############################################################################
def quick_beam_synthesis(self, pol=None):
"""
------------------------------------------------------------------------
A quick generator of the synthesized beam from the interferometer array grid
illumination pattern at the center frequency. Not intended to be used
rigorously but rather for comparison purposes and making quick plots
Inputs:
pol [String] The polarization of the synthesized beam. Can be set
to 'P11', 'P12', 'P21' or 'P22'. If set to None, synthesized beams
for all the polarizations are generated. Default=None
Outputs:
Dictionary with the following keys and information:
'syn_beam' [numpy array] synthesized beam of same size as that of the
interferometer array grid. It is FFT-shifted to place the
origin at the center of the array. The peak value of the
synthesized beam is fixed at unity
'grid_power_illumination'
[numpy array] complex grid illumination obtained from
inverse Fourier transform of the synthesized beam in
'syn_beam' and has size same as that of the interferometer
array grid. It is FFT-shifted to have the origin at the
center. The sum of this array is set to unity to match the
peak of the synthesized beam
'l' [numpy vector] x-values of the direction cosine grid
corresponding to x-axis (axis=1) of the synthesized beam
'm' [numpy vector] y-values of the direction cosine grid
corresponding to y-axis (axis=0) of the synthesized beam
------------------------------------------------------------------------
"""
if not self.grid_ready:
raise ValueError('Need to perform gridding of the interferometer array before an equivalent UV grid can be simulated')
if pol is None:
pol = ['P11', 'P12', 'P21', 'P22']
elif isinstance(pol, str):
if pol in ['P11', 'P12', 'P21', 'P22']:
pol = [pol]
else:
raise ValueError('Invalid polarization specified')
elif isinstance(pol, list):
p = [cpol for cpol in pol if cpol in ['P11', 'P12', 'P21', 'P22']]
if len(p) == 0:
raise ValueError('Invalid polarization specified')
pol = p
else:
raise TypeError('Input keyword pol must be string, list or set to None')
pol = sorted(pol)
for cpol in pol:
if self.grid_illumination[cpol] is None:
raise ValueError('Grid illumination for the specified polarization is not determined yet. Must use make_grid_cube()')
chan = NP.argmin(NP.abs(self.f - self.f0))
orig_syn_beam_in_uv = NP.empty(self.gridu.shape+(len(pol),), dtype=NP.complex)
for pind, cpol in enumerate(pol):
orig_syn_beam_in_uv[:,:,pind] = self.grid_illumination[cpol][:,:,chan]
# # Pad it with zeros to be twice the size
# padded_syn_beam_in_uv = NP.pad(orig_syn_beam_in_uv, ((0,orig_syn_beam_in_uv.shape[0]),(0,orig_syn_beam_in_uv.shape[1]),(0,0)), mode='constant', constant_values=0)
# # The NP.roll statements emulate a fftshift but by 1/4 of the size of the padded array
# padded_syn_beam_in_uv = NP.roll(padded_syn_beam_in_uv, -orig_syn_beam_in_uv.shape[0]/2, axis=0)
# padded_syn_beam_in_uv = NP.roll(padded_syn_beam_in_uv, -orig_syn_beam_in_uv.shape[1]/2, axis=1)
# Pad it with zeros on either side to be twice the size
padded_syn_beam_in_uv = NP.pad(orig_syn_beam_in_uv, ((orig_syn_beam_in_uv.shape[0]/2,orig_syn_beam_in_uv.shape[0]/2),(orig_syn_beam_in_uv.shape[1]/2,orig_syn_beam_in_uv.shape[1]/2),(0,0)), mode='constant', constant_values=0)
# Shift to be centered
padded_syn_beam_in_uv = NP.fft.ifftshift(padded_syn_beam_in_uv)
# Compute the synthesized beam. It is at a finer resolution due to padding
syn_beam = NP.fft.fft2(padded_syn_beam_in_uv, axes=(0,1))
# Select only the real part, equivalent to adding conjugate baselines
syn_beam = 2 * syn_beam.real
syn_beam /= syn_beam.max()
# Inverse Fourier Transform to obtain real and symmetric uv-grid illumination
syn_beam_in_uv = NP.fft.ifft2(syn_beam, axes=(0,1))
# shift the array to be centered
syn_beam_in_uv = NP.fft.ifftshift(syn_beam_in_uv, axes=(0,1))
# Discard pads at either end and select only the central values of original size
syn_beam_in_uv = syn_beam_in_uv[orig_syn_beam_in_uv.shape[0]/2:orig_syn_beam_in_uv.shape[0]/2+orig_syn_beam_in_uv.shape[0],orig_syn_beam_in_uv.shape[1]/2:orig_syn_beam_in_uv.shape[1]/2+orig_syn_beam_in_uv.shape[1],:]
syn_beam = NP.fft.fftshift(syn_beam[::2,::2,:], axes=(0,1)) # Downsample by factor 2 to get native resolution and shift to be centered
du = self.gridu[0,1] - self.gridu[0,0]
dv = self.gridv[1,0] - self.gridv[0,0]
l = DSP.spectax(self.gridu.shape[1], resolution=du, shift=True)
m = DSP.spectax(self.gridv.shape[0], resolution=dv, shift=True)
return {'syn_beam': syn_beam, 'grid_power_illumination': syn_beam_in_uv, 'l': l, 'm': m}
############################################################################
def grid_convolve_old(self, pol=None, antpairs=None, unconvolve_existing=False,
normalize=False, method='NN', distNN=NP.inf, tol=None,
maxmatch=None):
"""
------------------------------------------------------------------------
Routine to project the visibility illumination pattern and the
visibilities on the grid. It can operate on the entire interferometer array
or incrementally project the visibilities and illumination patterns from
specific antenna pairs onto an already existing grid.
Inputs:
pol [String] The polarization to be gridded. Can be set to 'P11'
or 'P22'. If set to None, gridding for both 'P11' and 'P22'
is performed. Default = None
antpairs [instance of class InterferometerArray, single instance or list
of instances of class Interferometer, or a dictionary holding
instances of class Interferometer] If a dictionary is provided,
the keys should be the interferometer labels and the values
should be instances of class Interferometer. If a list is
provided, it should be a list of valid instances of class
Interferometer. These instance(s) of class Interferometer will
be merged to the existing grid contained in the instance of
InterferometerArray class. If antpairs is not provided (set to
None), the gridding operations will be performed on the entire
set of interferometers contained in the instance of class
InterferometerArray. Default = None.
unconvolve_existing
[Boolean] Default = False. If set to True, the effects of
gridding convolution contributed by the interferometer(s)
specified will be undone before updating their measurements on
the grid, if the interferometer(s) is/are already found to be in
the set of interferometers held by the instance of
InterferometerArray. If False and if one or more interferometer
instances specified are already found to be held in the instance
of class InterferometerArray, the code will stop, raising an
error indicating the gridding operation cannot proceed.
normalize [Boolean] Default = False. If set to True, the gridded
weights are divided by the sum of weights so that the gridded
weights add up to unity.
method [string] The gridding method to be used in applying the
antenna weights on to the antenna array grid. Accepted values
are 'NN' (nearest neighbour - default), 'CS' (cubic spline),
or 'BL' (Bi-linear). In case of applying grid weights by 'NN'
method, an optional distance upper bound for the nearest
neighbour can be provided in the parameter distNN to prune
the search and make it efficient
distNN [scalar] A positive value indicating the upper bound on
distance to the nearest neighbour in the gridding process.
It has units of distance, the same units as the antenna
attribute location and antenna array attribute gridx and
gridy. Default is NP.inf (infinite distance). It will be
internally converted to have same units as antenna attributes
wtspos (units in number of wavelengths)
maxmatch [scalar] A positive value indicating maximum number of input
locations in the antenna grid to be assigned. Default = None.
If set to None, all the antenna array grid elements specified
are assigned values for each antenna. For instance, to have
only one antenna array grid element to be populated per
antenna, use maxmatch=1.
tol [scalar] If set, only lookup data with abs(val) > tol will be
considered for nearest neighbour lookup. Default = None
implies all lookup values will be considered for nearest
neighbour determination. tol is to be interpreted as a
minimum value considered as significant in the lookup table.
------------------------------------------------------------------------
"""
eps = 1.0e-10
if not self.grid_ready:
self.grid()
if (pol is None) or (pol == 'P11'):
if antpairs is not None:
if isinstance(antpairs, Interferometer):
antpairs = [antpairs]
if isinstance(antpairs, (dict, InterferometerArray)):
# Check if these interferometers are new or old and compatible
for key in antpairs:
if isinstance(antpairs[key], Interferometer): # required if antpairs is a dictionary and not instance of InterferometerArray
if key in self.interferometers:
if unconvolve_existing: # Effects on the grid of interferometers already existing must be removed
if self.interferometers[key]._gridinfo['P11']: # if gridding info is not empty
for i in range(len(self.f)):
self.grid_unconvolve(antpairs[key].label)
else:
raise KeyError('Interferometer {0} already found to exist in the dictionary of interferometers but cannot proceed grid_convolve() without unconvolving first.'.format(antpairs[key].label))
else:
del antpairs[key] # remove the dictionary element since it is not an Interferometer instance
for key in antpairs:
if not antpairs[key].crosspol.flag['P11']:
for i in range(len(self.f)):
if method == 'NN':
if antpairs[key].wtspos_scale['P11'] is None:
reflocs = antpairs[key].wtspos['P11'][i] + (self.f[i]/FCNST.c) * NP.asarray([antpairs[key].location.x, antpairs[key].location.y]).reshape(1,-1)
inplocs = NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
ibind, nnval = LKP.lookup_1NN(reflocs, antpairs[key].wts['P11'][i], inplocs,
distance_ULIM=distNN*self.f[i]/FCNST.c,
remove_oob=True, tol=tol, maxmatch=maxmatch)[:2]
roi_ind = NP.unravel_index(ibind, self.gridu.shape)
if normalize:
nnval /= NP.sum(nnval)
elif antpairs[key].wtspos_scale['P11'] == 'scale':
if i == 0: # Determine some parameters only for zeroth channel if scaling is set
reflocs = antpairs[key].wtspos['P11'][0] + (self.f[0]/FCNST.c) * NP.asarray([antpairs[key].location.x, antpairs[key].location.y]).reshape(1,-1)
inplocs = NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
ibind, nnval = LKP.lookup_1NN(reflocs, antpairs[key].wts['P11'][0], inplocs,
distance_ULIM=distNN*self.f[0]/FCNST.c,
remove_oob=True, tol=tol, maxmatch=maxmatch)[:2]
roi_ind = NP.unravel_index(ibind, self.gridu.shape)
if normalize:
nnval /= NP.sum(nnval)
else:
raise ValueError('Invalid scale option specified. Aborting grid_convolve().')
self.grid_illumination['P11'][roi_ind+(i+NP.zeros(ibind.size, dtype=NP.int),)] += nnval
self.grid_Vf['P11'][roi_ind+(i+NP.zeros(ibind.size, dtype=NP.int),)] += antpairs[key].crosspol.Vf['P11'][i] * nnval
else:
if antpairs[key].wtspos_scale['P11'] is None:
grid_illumination['P11'] = GRD.conv_grid2d(antpairs[key].location.x * (self.f[i]/FCNST.c),
antpairs[key].location.y * (self.f[i]/FCNST.c),
antpairs[key].wtspos['P11'][i][:,0],
antpairs[key].wtspos['P11'][i][:,1],
antpairs[key].wts['P11'][i],
self.gridu,
self.gridv,
method=method)
grid_illumination['P11'] = grid_illumination['P11'].reshape(self.gridu.shape)
if normalize:
grid_illumination['P11'] = grid_illumination['P11'] / NP.sum(grid_illumination['P11'])
roi_ind = NP.where(NP.abs(grid_illumination['P11']) >= eps)
elif antpairs[key].wtspos_scale['P11'] == 'scale':
if i == 0: # Determine some parameters only for zeroth channel if scaling is set
grid_illumination['P11'] = GRD.conv_grid2d(antpairs[key].location.x * (self.f[0]/FCNST.c),
antpairs[key].location.y * (self.f[0]/FCNST.c),
antpairs[key].wtspos['P11'][0][:,0],
antpairs[key].wtspos['P11'][0][:,1],
antpairs[key].wts['P11'][0],
self.gridu,
self.gridv,
method=method)
grid_illumination['P11'] = grid_illumination['P11'].reshape(self.gridu.shape)
if normalize:
grid_illumination['P11'] = grid_illumination['P11'] / NP.sum(grid_illumination['P11'])
roi_ind = NP.where(NP.abs(grid_illumination['P11']) >= eps)
else:
raise ValueError('Invalid scale option specified. Aborting grid_convolve().')
self.grid_illumination['P11'][:,:,i] += grid_illumination['P11']
self.grid_Vf['P11'][:,:,i] += antpairs[key].crosspol.Vf['P11'][i] * grid_illumination['P11']
if key in self.interferometers:
if i not in self.interferometers[key]._gridinfo['P11']:
self.interferometers[key]._gridinfo['P11'] = {} # Create an empty dictionary for each channel to hold grid info
self.interferometers[key]._gridinfo['P11'][i]['f'] = self.f[i]
self.interferometers[key]._gridinfo['P11'][i]['flag'] = False
self.interferometers[key]._gridinfo['P11'][i]['gridxy_ind'] = zip(*roi_ind)
self.interferometers[key].wtspos_scale['P11'] = antpairs[key].wtspos_scale['P11']
if method == 'NN':
self.interferometers[key]._gridinfo['P11'][i]['illumination'] = nnval
self.interferometers[key]._gridinfo['P11'][i]['Vf'] = antpairs[key].crosspol.Vf['P11'][i] * nnval
else:
self.interferometers[key]._gridinfo['P11'][i]['illumination'] = grid_illumination['P11'][roi_ind]
self.interferometers[key]._gridinfo['P11'][i]['Vf'] = antpairs[key].crosspol.Vf['P11'][i] * grid_illumination['P11'][roi_ind]
elif isinstance(antpairs, list):
# Check if these interferometers are new or old and compatible
for key in range(len(antpairs)):
if isinstance(antpairs[key], Interferometer): # required if antpairs is a dictionary and not instance of InterferometerArray
if antpairs[key].label in self.interferometers:
if unconvolve_existing: # Effects on the grid of interferometers already existing must be removed
if self.interferometers[antpairs[key].label]._gridinfo['P11']: # if gridding info is not empty
for i in range(len(self.f)):
self.grid_unconvolve(antpairs[key].label)
else:
raise KeyError('Interferometer {0} already found to exist in the dictionary of interferometers but cannot proceed grid_convolve() without unconvolving first.'.format(antpairs[key].label))
else:
del antpairs[key] # remove the dictionary element since it is not an Interferometer instance
for key in range(len(antpairs)):
if not antpairs[key].crosspol.flag['P11']:
for i in range(len(self.f)):
if method == 'NN':
if antpairs[key].wtspos_scale['P11'] is None:
reflocs = antpairs[key].wtspos['P11'][i] + (self.f[i]/FCNST.c) * NP.asarray([antpairs[key].location.x, antpairs[key].location.y]).reshape(1,-1)
inplocs = NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
ibind, nnval = LKP.lookup_1NN(reflocs, antpairs[key].wts['P11'][i], inplocs,
distance_ULIM=distNN*self.f[i]/FCNST.c,
remove_oob=True, tol=tol, maxmatch=maxmatch)[:2]
roi_ind = NP.unravel_index(ibind, self.gridu.shape)
if normalize:
nnval /= NP.sum(nnval)
elif antpairs[key].wtspos_scale['P11'] == 'scale':
if i == 0: # Determine some parameters only for zeroth channel if scaling is set
reflocs = antpairs[key].wtspos['P11'][0] + (self.f[0]/FCNST.c) * NP.asarray([antpairs[key].location.x, antpairs[key].location.y]).reshape(1,-1)
inplocs = NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
ibind, nnval = LKP.lookup_1NN(reflocs, antpairs[key].wts['P11'][0], inplocs,
distance_ULIM=distNN*self.f[0]/FCNST.c,
remove_oob=True, tol=tol, maxmatch=maxmatch)[:2]
roi_ind = NP.unravel_index(ibind, self.gridu.shape)
if normalize:
nnval /= NP.sum(nnval)
else:
raise ValueError('Invalid scale option specified. Aborting grid_convolve().')
self.grid_illumination['P11'][roi_ind+(i+NP.zeros(ibind.size, dtype=NP.int),)] += nnval
self.grid_Vf['P11'][roi_ind+(i+NP.zeros(ibind.size, dtype=NP.int),)] += antpairs[key].crosspol.Vf['P11'][i] * nnval
else:
if antpairs[key].wtspos_scale['P11'] is None:
grid_illumination['P11'] = GRD.conv_grid2d(antpairs[key].location.x * (self.f[i]/FCNST.c),
antpairs[key].location.y * (self.f[i]/FCNST.c),
antpairs[key].wtspos['P11'][i][:,0],
antpairs[key].wtspos['P11'][i][:,1],
antpairs[key].wts['P11'][i],
self.gridu,
self.gridv,
method=method)
grid_illumination['P11'] = grid_illumination['P11'].reshape(self.gridu.shape)
if normalize:
grid_illumination['P11'] = grid_illumination['P11'] / NP.sum(grid_illumination['P11'])
roi_ind = NP.where(NP.abs(grid_illumination['P11']) >= eps)
elif antpairs[key].wtspos_scale['P11'] == 'scale':
if i == 0: # Determine some parameters only for zeroth channel if scaling is set
grid_illumination['P11'] = GRD.conv_grid2d(antpairs[key].location.x * (self.f[0]/FCNST.c),
antpairs[key].location.y * (self.f[0]/FCNST.c),
antpairs[key].wtspos['P11'][0][:,0],
antpairs[key].wtspos['P11'][0][:,1],
antpairs[key].wts['P11'][0],
self.gridu,
self.gridv,
method=method)
grid_illumination['P11'] = grid_illumination['P11'].reshape(self.gridu.shape)
if normalize:
grid_illumination['P11'] = grid_illumination['P11'] / NP.sum(grid_illumination['P11'])
roi_ind = NP.where(NP.abs(grid_illumination['P11']) >= eps)
else:
raise ValueError('Invalid scale option specified. Aborting grid_convolve().')
self.grid_illumination['P11'][:,:,i] += grid_illumination['P11']
self.grid_Vf['P11'][:,:,i] += antpairs[key].crosspol.Vf['P11'][i] * grid_illumination['P11']
if antpairs[key].label in self.interferometers:
if i not in self.interferometers[key]._gridinfo['P11']:
self.interferometers[key]._gridinfo['P11'] = {} # Create an empty dictionary for each channel to hold grid info
self.interferometers[antpairs[key].label]._gridinfo['P11'][i]['f'] = self.f[i]
self.interferometers[antpairs[key].label]._gridinfo['P11'][i]['flag'] = False
self.interferometers[antpairs[key].label]._gridinfo['P11'][i]['gridxy_ind'] = zip(*roi_ind)
self.interferometers[key].wtspos_scale['P11'] = antpairs[key].wtspos_scale['P11']
if method == 'NN':
self.interferometers[antpairs[key].label]._gridinfo['P11'][i]['illumination'] = nnval
self.interferometers[antpairs[key].label]._gridinfo['P11'][i]['Vf'] = antpairs[key].crosspol.Vf['P11'][i] * nnval
else:
self.interferometers[antpairs[key].label]._gridinfo['P11'][i]['illumination'] = grid_illumination['P11'][roi_ind]
self.interferometers[antpairs[key].label]._gridinfo['P11'][i]['Vf'] = antpairs[key].crosspol.Vf['P11'][i] * grid_illumination['P11'][roi_ind]
else:
raise TypeError('antpairs must be an instance of InterferometerArray, a dictionary of Interferometer instances, a list of Interferometer instances or an Interferometer instance.')
else:
self.grid_illumination['P11'] = NP.zeros((self.gridu.shape[0],
self.gridu.shape[1],
len(self.f)),
dtype=NP.complex_)
self.grid_Vf['P11'] = NP.zeros((self.gridu.shape[0],
self.gridu.shape[1],
len(self.f)), dtype=NP.complex_)
for key in self.interferometers:
if not self.interferometers[key].crosspol.flag['P11']:
for i in range(len(self.f)):
if method == 'NN':
if self.interferometers[key].wtspos_scale['P11'] is None:
reflocs = self.interferometers[key].wtspos['P11'][i] + (self.f[i]/FCNST.c) * NP.asarray([self.interferometers[key].location.x, self.interferometers[key].location.y]).reshape(1,-1)
inplocs = NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
ibind, nnval = LKP.lookup_1NN(reflocs, self.interferometers[key].wts['P11'][i], inplocs,
distance_ULIM=distNN*self.f[i]/FCNST.c,
remove_oob=True, tol=tol, maxmatch=maxmatch)[:2]
roi_ind = NP.unravel_index(ibind, self.gridu.shape)
if normalize:
nnval /= NP.sum(nnval)
elif self.interferometers[key].wtspos_scale['P11'] == 'scale':
if i == 0: # Determine some parameters only for zeroth channel if scaling is set
reflocs = self.interferometers[key].wtspos['P11'][0] + (self.f[0]/FCNST.c) * NP.asarray([self.interferometers[key].location.x, self.interferometers[key].location.y]).reshape(1,-1)
inplocs = NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
ibind, nnval = LKP.lookup_1NN(reflocs, self.interferometers[key].wts['P11'][0], inplocs,
distance_ULIM=distNN*self.f[0]/FCNST.c,
remove_oob=True, tol=tol, maxmatch=maxmatch)[:2]
roi_ind = NP.unravel_index(ibind, self.gridu.shape)
if normalize:
nnval /= NP.sum(nnval)
else:
raise ValueError('Invalid scale option specified. Aborting grid_convolve().')
self.grid_illumination['P11'][roi_ind+(i+NP.zeros(ibind.size, dtype=NP.int),)] += nnval
self.grid_Vf['P11'][roi_ind+(i+NP.zeros(ibind.size, dtype=NP.int),)] += self.interferometers[key].crosspol.Vf['P11'][i] * nnval
else:
if self.interferometers[key].wtspos_scale['P11'] is None:
grid_illumination['P11'] = GRD.conv_grid2d(self.interferometers[key].location.x * (self.f[i]/FCNST.c),
self.interferometers[key].location.y * (self.f[i]/FCNST.c),
self.interferometers[key].wtspos['P11'][i][:,0],
self.interferometers[key].wtspos['P11'][i][:,1],
self.interferometers[key].wts['P11'][i],
self.gridu,
self.gridv,
method=method)
grid_illumination['P11'] = grid_illumination['P11'].reshape(self.gridu.shape)
if normalize:
grid_illumination['P11'] = grid_illumination['P11'] / NP.sum(grid_illumination['P11'])
roi_ind = NP.where(NP.abs(grid_illumination['P11']) >= eps)
elif self.interferometers[key].wtspos_scale['P11'] == 'scale':
if i == 0:
grid_illumination['P11'] = GRD.conv_grid2d(self.interferometers[key].location.x * (self.f[0]/FCNST.c),
self.interferometers[key].location.y * (self.f[0]/FCNST.c),
self.interferometers[key].wtspos['P11'][0][:,0],
self.interferometers[key].wtspos['P11'][0][:,1],
self.interferometers[key].wts['P11'][0],
self.gridu,
self.gridv,
method=method)
grid_illumination['P11'] = grid_illumination['P11'].reshape(self.gridu.shape)
if normalize:
grid_illumination['P11'] = grid_illumination['P11'] / NP.sum(grid_illumination['P11'])
roi_ind = NP.where(NP.abs(grid_illumination['P11']) >= eps)
else:
raise ValueError('Invalid scale option specified. Aborting grid_convolve().')
self.grid_illumination['P11'][:,:,i] += grid_illumination['P11']
self.grid_Vf['P11'][:,:,i] += self.interferometers[key].crosspol.Vf['P11'][i] * grid_illumination['P11']
self.interferometers[key]._gridinfo['P11'][i] = {} # Create a nested dictionary to hold channel info
self.interferometers[key]._gridinfo['P11'][i]['f'] = self.f[i]
self.interferometers[key]._gridinfo['P11'][i]['flag'] = False
self.interferometers[key]._gridinfo['P11'][i]['gridxy_ind'] = zip(*roi_ind)
if method == 'NN':
self.interferometers[key]._gridinfo['P11'][i]['illumination'] = nnval
self.interferometers[key]._gridinfo['P11'][i]['Vf'] = self.interferometers[key].crosspol.Vf['P11'][i] * nnval
else:
self.interferometers[key]._gridinfo['P11'][i]['illumination'] = grid_illumination['P11'][roi_ind]
self.interferometers[key]._gridinfo['P11'][i]['Vf'] = self.interferometers[key].crosspol.Vf['P11'][i] * grid_illumination['P11'][roi_ind]
if (pol is None) or (pol == 'P22'):
if antpairs is not None:
if isinstance(antpairs, Interferometer):
antpairs = [antpairs]
if isinstance(antpairs, (dict, InterferometerArray)):
# Check if these interferometers are new or old and compatible
for key in antpairs:
if isinstance(antpairs[key], Interferometer): # required if antpairs is a dictionary and not instance of InterferometerArray
if key in self.interferometers:
if unconvolve_existing: # Effects on the grid of interferometers already existing must be removed
if self.interferometers[key]._gridinfo_P22: # if gridding info is not empty
for i in range(len(self.f)):
self.grid_unconvolve(antpairs[key].label)
else:
raise KeyError('Interferometer {0} already found to exist in the dictionary of interferometers but cannot proceed grid_convolve() without unconvolving first.'.format(antpairs[key].label))
else:
del antpairs[key] # remove the dictionary element since it is not an Interferometer instance
for key in antpairs:
if not antpairs[key].crosspol.flag_P22:
for i in range(len(self.f)):
if method == 'NN':
if antpairs[key].wtspos_P22_scale is None:
reflocs = antpairs[key].wtspos_P22[i] + (self.f[i]/FCNST.c) * NP.asarray([antpairs[key].location.x, antpairs[key].location.y]).reshape(1,-1)
inplocs = (self.f[i]/FCNST.c) * NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
ibind, nnval = LKP.lookup_1NN(reflocs, antpairs[key].wts_P22[i], inplocs,
distance_ULIM=distNN*self.f[i]/FCNST.c,
remove_oob=True, tol=tol, maxmatch=maxmatch)[:2]
roi_ind = NP.unravel_index(ibind, self.gridu.shape)
if normalize:
nnval /= NP.sum(nnval)
elif antpairs[key].wtspos_P22_scale == 'scale':
if i == 0: # Determine some parameters only for zeroth channel if scaling is set
reflocs = antpairs[key].wtspos_P22[0] + (self.f[0]/FCNST.c) * NP.asarray([antpairs[key].location.x, antpairs[key].location.y]).reshape(1,-1)
inplocs = (self.f[0]/FCNST.c) * NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
ibind, nnval = LKP.lookup_1NN(reflocs, antpairs[key].wts_P22[0], inplocs,
distance_ULIM=distNN*self.f[0]/FCNST.c,
remove_oob=True, tol=tol, maxmatch=maxmatch)[:2]
roi_ind = NP.unravel_index(ibind, self.gridu.shape)
if normalize:
nnval /= NP.sum(nnval)
else:
raise ValueError('Invalid scale option specified. Aborting grid_convolve().')
self.grid_illumination_P22[roi_ind+(i+NP.zeros(ibind.size, dtype=NP.int),)] += nnval
self.grid_Vf_P22[roi_ind+(i+NP.zeros(ibind.size, dtype=NP.int),)] += antpairs[key].crosspol.Vf_P22[i] * nnval
else:
if antpairs[key].wtspos_P22_scale is None:
grid_illumination_P22 = GRD.conv_grid2d(antpairs[key].location.x * (self.f[i]/FCNST.c),
antpairs[key].location.y * (self.f[i]/FCNST.c),
antpairs[key].wtspos_P22[i][:,0],
antpairs[key].wtspos_P22[i][:,1],
antpairs[key].wts_P22[i],
self.gridu * (self.f[i]/FCNST.c),
self.gridv * (self.f[i]/FCNST.c),
method=method)
grid_illumination_P22 = grid_illumination_P22.reshape(self.gridu.shape)
if normalize:
grid_illumination_P22 = grid_illumination_P22 / NP.sum(grid_illumination_P22)
roi_ind = NP.where(NP.abs(grid_illumination_P22) >= eps)
elif antpairs[key].wtspos_P22_scale == 'scale':
if i == 0: # Determine some parameters only for zeroth channel if scaling is set
grid_illumination_P22 = GRD.conv_grid2d(antpairs[key].location.x * (self.f[0]/FCNST.c),
antpairs[key].location.y * (self.f[0]/FCNST.c),
antpairs[key].wtspos_P22[0][:,0],
antpairs[key].wtspos_P22[0][:,1],
antpairs[key].wts_P22[0],
self.gridu * (self.f[0]/FCNST.c),
self.gridv * (self.f[0]/FCNST.c),
method=method)
grid_illumination_P22 = grid_illumination_P22.reshape(self.gridu.shape)
if normalize:
grid_illumination_P22 = grid_illumination_P22 / NP.sum(grid_illumination_P22)
roi_ind = NP.where(NP.abs(grid_illumination_P22) >= eps)
else:
raise ValueError('Invalid scale option specified. Aborting grid_convolve().')
self.grid_illumination_P22[:,:,i] += grid_illumination_P22
self.grid_Vf_P22[:,:,i] += antpairs[key].crosspol.Vf_P22[i] * grid_illumination_P22
if key in self.interferometers:
if i not in self.interferometers[key]._gridinfo_P22:
self.interferometers[key]._gridinfo_P22 = {} # Create an empty dictionary for each channel to hold grid info
self.interferometers[key]._gridinfo_P22[i]['f'] = self.f[i]
self.interferometers[key]._gridinfo_P22[i]['flag'] = False
self.interferometers[key]._gridinfo_P22[i]['gridxy_ind'] = zip(*roi_ind)
self.interferometers[key].wtspos_P22_scale = antpairs[key].wtspos_P22_scale
if method == 'NN':
self.interferometers[key]._gridinfo_P22[i]['illumination'] = nnval
self.interferometers[key]._gridinfo_P22[i]['Vf'] = antpairs[key].crosspol.Vf_P22[i] * nnval
else:
self.interferometers[key]._gridinfo_P22[i]['illumination'] = grid_illumination_P22[roi_ind]
self.interferometers[key]._gridinfo_P22[i]['Vf'] = antpairs[key].crosspol.Vf_P22[i] * grid_illumination_P22[roi_ind]
elif isinstance(antpairs, list):
# Check if these interferometers are new or old and compatible
for key in range(len(antpairs)):
if isinstance(antpairs[key], Interferometer): # required if antpairs is a dictionary and not instance of InterferometerArray
if antpairs[key].label in self.interferometers:
if unconvolve_existing: # Effects on the grid of interferometers already existing must be removed
if self.interferometers[antpairs[key].label]._gridinfo_P22: # if gridding info is not empty
for i in range(len(self.f)):
self.grid_unconvolve(antpairs[key].label)
else:
raise KeyError('Interferometer {0} already found to exist in the dictionary of interferometers but cannot proceed grid_convolve() without unconvolving first.'.format(antpairs[key].label))
else:
del antpairs[key] # remove the dictionary element since it is not an Interferometer instance
for key in range(len(antpairs)):
if not antpairs[key].crosspol.flag_P22:
for i in range(len(self.f)):
if method == 'NN':
if antpairs[key].wtspos_P22_scale is None:
reflocs = antpairs[key].wtspos_P22[i] + (self.f[i]/FCNST.c) * NP.asarray([antpairs[key].location.x, antpairs[key].location.y]).reshape(1,-1)
inplocs = (self.f[i]/FCNST.c) * NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
ibind, nnval = LKP.lookup_1NN(reflocs, antpairs[key].wts_P22[i], inplocs,
distance_ULIM=distNN*self.f[i]/FCNST.c,
remove_oob=True, tol=tol, maxmatch=maxmatch)[:2]
roi_ind = NP.unravel_index(ibind, self.gridu.shape)
if normalize:
nnval /= NP.sum(nnval)
elif antpairs[key].wtspos_P22_scale == 'scale':
if i == 0: # Determine some parameters only for zeroth channel if scaling is set
reflocs = antpairs[key].wtspos_P22[0] + (self.f[0]/FCNST.c) * NP.asarray([antpairs[key].location.x, antpairs[key].location.y]).reshape(1,-1)
inplocs = (self.f[0]/FCNST.c) * NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
ibind, nnval = LKP.lookup_1NN(reflocs, antpairs[key].wts_P22[0], inplocs,
distance_ULIM=distNN*self.f[0]/FCNST.c,
remove_oob=True, tol=tol, maxmatch=maxmatch)[:2]
roi_ind = NP.unravel_index(ibind, self.gridu.shape)
if normalize:
nnval /= NP.sum(nnval)
else:
raise ValueError('Invalid scale option specified. Aborting grid_convolve().')
self.grid_illumination_P22[roi_ind+(i+NP.zeros(ibind.size, dtype=NP.int),)] += nnval
self.grid_Vf_P22[roi_ind+(i+NP.zeros(ibind.size, dtype=NP.int),)] += antpairs[key].crosspol.Vf_P22[i] * nnval
else:
if antpairs[key].wtspos_P22_scale is None:
grid_illumination_P22 = GRD.conv_grid2d(antpairs[key].location.x * (self.f[i]/FCNST.c),
antpairs[key].location.y * (self.f[i]/FCNST.c),
antpairs[key].wtspos_P22[i][:,0],
antpairs[key].wtspos_P22[i][:,1],
antpairs[key].wts_P22[i],
self.gridu * (self.f[i]/FCNST.c),
self.gridv * (self.f[i]/FCNST.c),
method=method)
grid_illumination_P22 = grid_illumination_P22.reshape(self.gridu.shape)
if normalize:
grid_illumination_P22 = grid_illumination_P22 / NP.sum(grid_illumination_P22)
roi_ind = NP.where(NP.abs(grid_illumination_P22) >= eps)
elif antpairs[key].wtspos_P22_scale == 'scale':
if i == 0: # Determine some parameters only for zeroth channel if scaling is set
grid_illumination_P22 = GRD.conv_grid2d(antpairs[key].location.x * (self.f[0]/FCNST.c),
antpairs[key].location.y * (self.f[0]/FCNST.c),
antpairs[key].wtspos_P22[0][:,0],
antpairs[key].wtspos_P22[0][:,1],
antpairs[key].wts_P22[0],
self.gridu * (self.f[0]/FCNST.c),
self.gridv * (self.f[0]/FCNST.c),
method=method)
grid_illumination_P22 = grid_illumination_P22.reshape(self.gridu.shape)
if normalize:
grid_illumination_P22 = grid_illumination_P22 / NP.sum(grid_illumination_P22)
roi_ind = NP.where(NP.abs(grid_illumination_P22) >= eps)
else:
raise ValueError('Invalid scale option specified. Aborting grid_convolve().')
self.grid_illumination_P22[:,:,i] += grid_illumination_P22
self.grid_Vf_P22[:,:,i] += antpairs[key].crosspol.Vf_P22[i] * grid_illumination_P22
if antpairs[key].label in self.interferometers:
if i not in self.interferometers[key]._gridinfo_P22:
self.interferometers[key]._gridinfo_P22 = {} # Create an empty dictionary for each channel to hold grid info
self.interferometers[antpairs[key].label]._gridinfo_P22[i]['f'] = self.f[i]
self.interferometers[antpairs[key].label]._gridinfo_P22[i]['flag'] = False
self.interferometers[antpairs[key].label]._gridinfo_P22[i]['gridxy_ind'] = zip(*roi_ind)
self.interferometers[key].wtspos_P22_scale = antpairs[key].wtspos_P22_scale
if method == 'NN':
self.interferometers[antpairs[key].label]._gridinfo_P22[i]['illumination'] = nnval
self.interferometers[antpairs[key].label]._gridinfo_P22[i]['Vf'] = antpairs[key].crosspol.Vf_P22[i] * nnval
else:
self.interferometers[antpairs[key].label]._gridinfo_P22[i]['illumination'] = grid_illumination_P22[roi_ind]
self.interferometers[antpairs[key].label]._gridinfo_P22[i]['Vf'] = antpairs[key].crosspol.Vf_P22[i] * grid_illumination_P22[roi_ind]
else:
raise TypeError('antpairs must be an instance of InterferometerArray, a dictionary of Interferometer instances, a list of Interferometer instances or an Interferometer instance.')
else:
self.grid_illumination_P22 = NP.zeros((self.gridu.shape[0],
self.gridu.shape[1],
len(self.f)),
dtype=NP.complex_)
self.grid_Vf_P22 = NP.zeros((self.gridu.shape[0],
self.gridu.shape[1],
len(self.f)), dtype=NP.complex_)
for key in self.interferometers:
if not self.interferometers[key].crosspol.flag_P22:
for i in range(len(self.f)):
if method == 'NN':
if self.interferometers[key].wtspos_P22_scale is None:
reflocs = self.interferometers[key].wtspos_P22[i] + (self.f[i]/FCNST.c) * NP.asarray([self.interferometers[key].location.x, self.interferometers[key].location.y]).reshape(1,-1)
inplocs = (self.f[i]/FCNST.c) * NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
ibind, nnval = LKP.lookup_1NN(reflocs, self.interferometers[key].wts_P22[i], inplocs,
distance_ULIM=distNN*self.f[i]/FCNST.c,
remove_oob=True, tol=tol, maxmatch=maxmatch)[:2]
roi_ind = NP.unravel_index(ibind, self.gridu.shape)
if normalize:
nnval /= NP.sum(nnval)
elif self.interferometers[key].wtspos_P22_scale == 'scale':
if i == 0: # Determine some parameters only for zeroth channel if scaling is set
reflocs = self.interferometers[key].wtspos_P22[0] + (self.f[0]/FCNST.c) * NP.asarray([self.interferometers[key].location.x, self.interferometers[key].location.y]).reshape(1,-1)
inplocs = (self.f[0]/FCNST.c) * NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
ibind, nnval = LKP.lookup_1NN(reflocs, self.interferometers[key].wts_P22[0], inplocs,
distance_ULIM=distNN*self.f[0]/FCNST.c,
remove_oob=True, tol=tol, maxmatch=maxmatch)[:2]
roi_ind = NP.unravel_index(ibind, self.gridu.shape)
if normalize:
nnval /= NP.sum(nnval)
else:
raise ValueError('Invalid scale option specified. Aborting grid_convolve().')
self.grid_illumination_P22[roi_ind+(i+NP.zeros(ibind.size, dtype=NP.int),)] += nnval
self.grid_Vf_P22[roi_ind+(i+NP.zeros(ibind.size, dtype=NP.int),)] += self.interferometers[key].crosspol.Vf_P22[i] * nnval
else:
if self.interferometers[key].wtspos_P22_scale is None:
grid_illumination_P22 = GRD.conv_grid2d(self.interferometers[key].location.x * (self.f[i]/FCNST.c),
self.interferometers[key].location.y * (self.f[i]/FCNST.c),
self.interferometers[key].wtspos_P22[i][:,0],
self.interferometers[key].wtspos_P22[i][:,1],
self.interferometers[key].wts_P22[i],
self.gridu * (self.f[i]/FCNST.c),
self.gridv * (self.f[i]/FCNST.c),
method=method)
grid_illumination_P22 = grid_illumination_P22.reshape(self.gridu.shape)
if normalize:
grid_illumination_P22 = grid_illumination_P22 / NP.sum(grid_illumination_P22)
roi_ind = NP.where(NP.abs(grid_illumination_P22) >= eps)
elif self.interferometers[key].wtspos_P22_scale == 'scale':
if i == 0:
grid_illumination_P22 = GRD.conv_grid2d(self.interferometers[key].location.x * (self.f[0]/FCNST.c),
self.interferometers[key].location.y * (self.f[0]/FCNST.c),
self.interferometers[key].wtspos_P22[0][:,0],
self.interferometers[key].wtspos_P22[0][:,1],
self.interferometers[key].wts_P22[0],
self.gridu * (self.f[0]/FCNST.c),
self.gridv * (self.f[0]/FCNST.c),
method=method)
grid_illumination_P22 = grid_illumination_P22.reshape(self.gridu.shape)
if normalize:
grid_illumination_P22 = grid_illumination_P22 / NP.sum(grid_illumination_P22)
roi_ind = NP.where(NP.abs(grid_illumination_P22) >= eps)
else:
raise ValueError('Invalid scale option specified. Aborting grid_convolve().')
self.grid_illumination_P22[:,:,i] += grid_illumination_P22
self.grid_Vf_P22[:,:,i] += self.interferometers[key].crosspol.Vf_P22[i] * grid_illumination_P22
self.interferometers[key]._gridinfo_P22[i] = {} # Create a nested dictionary to hold channel info
self.interferometers[key]._gridinfo_P22[i]['f'] = self.f[i]
self.interferometers[key]._gridinfo_P22[i]['flag'] = False
self.interferometers[key]._gridinfo_P22[i]['gridxy_ind'] = zip(*roi_ind)
if method == 'NN':
self.interferometers[key]._gridinfo_P22[i]['illumination'] = nnval
self.interferometers[key]._gridinfo_P22[i]['Vf'] = self.interferometers[key].crosspol.Vf_P22[i] * nnval
else:
self.interferometers[key]._gridinfo_P22[i]['illumination'] = grid_illumination_P22[roi_ind]
self.interferometers[key]._gridinfo_P22[i]['Vf'] = self.interferometers[key].crosspol.Vf_P22[i] * grid_illumination_P22[roi_ind]
############################################################################
def grid_unconvolve(self, antpairs, pol=None):
"""
------------------------------------------------------------------------
[Needs to be re-written]
Routine to de-project the visibility illumination pattern and the
visibilities on the grid. It can operate on the entire interferometer
array or incrementally de-project the visibilities and illumination
patterns of specific antenna pairs from an already existing grid.
Inputs:
antpairs [instance of class InterferometerArray, single instance or
list of instances of class Interferometer, or a dictionary
holding instances of class Interferometer] If a dictionary
is provided, the keys should be the interferometer labels
and the values should be instances of class Interferometer.
If a list is provided, it should be a list of valid
instances of class Interferometer. These instance(s) of
class Interferometer will be merged to the existing grid
contained in the instance of InterferometerArray class. If
any of the interferometers are not found to be in the
already existing set of interferometers, an exception is
raised accordingly and code execution stops.
pol [String] The polarization whose gridded contribution is to be
removed. Can be set to 'P11', 'P12', 'P21', or 'P22'. If set to
None, de-projection is performed for all polarizations. Default=None.
------------------------------------------------------------------------
"""
try:
antpairs
except NameError:
raise NameError('No antenna pair(s) supplied.')
if (pol is None) or (pol == 'P11'):
if isinstance(antpairs, (Interferometer, str)):
antpairs = [antpairs]
if isinstance(antpairs, (dict, InterferometerArray)):
# Check if these interferometers are new or old and compatible
for key in antpairs:
if isinstance(antpairs[key], Interferometer): # required if antpairs is a dictionary and not instance of InterferometerArray
if key in self.interferometers:
if self.interferometers[key]._gridinfo_P11: # if gridding info is not empty
for i in range(len(self.f)):
xind, yind = zip(*self.interferometers[key]._gridinfo_P11[i]['gridxy_ind'])
self.grid_illumination_P11[xind, yind, i] -= self.interferometers[key]._gridinfo_P11[i]['illumination']
self.grid_Vf_P11[xind, yind, i] -= self.interferometers[key]._gridinfo_P11[i]['Vf']
self.interferometers[key]._gridinfo_P11 = {}
else:
raise KeyError('Interferometer {0} not found to exist in the dictionary of interferometers.'.format(antpairs[key].label))
elif isinstance(antpairs, list):
# Check if these interferometers are new or old and compatible
for key in range(len(antpairs)):
if isinstance(antpairs[key], Interferometer): # required if antpairs is a dictionary and not instance of InterferometerArray
if antpairs[key].label in self.interferometers:
if self.interferometers[antpairs[key].label]._gridinfo_P11: # if gridding info is not empty
for i in range(len(self.f)):
xind, yind = zip(*self.interferometers[antpairs[key].label]._gridinfo_P11[i]['gridxy_ind'])
self.grid_illumination_P11[xind, yind, i] -= self.interferometers[antpairs[key].label]._gridinfo_P11[i]['illumination']
self.grid_Vf_P11[xind, yind, i] -= self.interferometers[antpairs[key].label]._gridinfo_P11[i]['Vf']
self.interferometers[antpairs[key].label]._gridinfo_P11 = {}
else:
raise KeyError('Interferometer {0} not found to exist in the dictionary of interferometers.'.format(antpairs[key].label))
elif isinstance(antpairs[key], str):
if antpairs[key] in self.interferometers:
if self.interferometers[antpairs[key]]._gridinfo_P11: # if gridding info is not empty
for i in range(len(self.f)):
xind, yind = zip(*self.interferometers[antpairs[key]]._gridinfo_P11[i]['gridxy_ind'])
self.grid_illumination_P11[xind, yind, i] -= self.interferometers[antpairs[key]]._gridinfo_P11[i]['illumination']
self.grid_Vf_P11[xind, yind, i] -= self.interferometers[antpairs[key]]._gridinfo_P11[i]['Vf']
self.interferometers[antpairs[key]]._gridinfo_P11 = {}
else:
raise KeyError('Interferometer {0} not found to exist in the dictionary of interferometers.'.format(antpairs[key]))
else:
raise TypeError('antpairs must be an instance of class InterferometerArray, a list of instances of class Interferometer, a dictionary of instances of class Interferometer or a list of antenna labels.')
else:
raise TypeError('antpairs must be an instance of InterferometerArray, a dictionary of Interferometer instances, a list of Interferometer instances, an Interferometer instance, or a list of antenna labels.')
if (pol is None) or (pol == 'P22'):
if isinstance(antpairs, (Interferometer, str)):
antpairs = [antpairs]
if isinstance(antpairs, (dict, InterferometerArray)):
# Check if these interferometers are new or old and compatible
for key in antpairs:
if isinstance(antpairs[key], Interferometer): # required if antpairs is a dictionary and not instance of InterferometerArray
if key in self.interferometers:
if self.interferometers[key]._gridinfo_P22: # if gridding info is not empty
for i in range(len(self.f)):
xind, yind = zip(*self.interferometers[key]._gridinfo_P22[i]['gridxy_ind'])
self.grid_illumination_P22[xind, yind, i] -= self.interferometers[key]._gridinfo_P22[i]['illumination']
self.grid_Vf_P22[xind, yind, i] -= self.interferometers[key]._gridinfo_P22[i]['Vf']
self.interferometers[key]._gridinfo_P22 = {}
else:
raise KeyError('Interferometer {0} not found to exist in the dictionary of interferometers.'.format(antpairs[key].label))
elif isinstance(antpairs, list):
# Check if these interferometers are new or old and compatible
for key in range(len(antpairs)):
if isinstance(antpairs[key], Interferometer): # required if antpairs is a dictionary and not instance of InterferometerArray
if antpairs[key].label in self.interferometers:
if self.interferometers[antpairs[key].label]._gridinfo_P22: # if gridding info is not empty
for i in range(len(self.f)):
xind, yind = zip(*self.interferometers[antpairs[key].label]._gridinfo_P22[i]['gridxy_ind'])
self.grid_illumination_P22[xind, yind, i] -= self.interferometers[antpairs[key].label]._gridinfo_P22[i]['illumination']
self.grid_Vf_P22[xind, yind, i] -= self.interferometers[antpairs[key].label]._gridinfo_P22[i]['Vf']
self.interferometers[antpairs[key].label]._gridinfo_P22 = {}
else:
raise KeyError('Interferometer {0} not found to exist in the dictionary of interferometers.'.format(antpairs[key].label))
elif isinstance(antpairs[key], str):
if antpairs[key] in self.interferometers:
if self.interferometers[antpairs[key]]._gridinfo_P22: # if gridding info is not empty
for i in range(len(self.f)):
xind, yind = zip(*self.interferometers[antpairs[key]]._gridinfo_P22[i]['gridxy_ind'])
self.grid_illumination_P22[xind, yind, i] -= self.interferometers[antpairs[key]]._gridinfo_P22[i]['illumination']
self.grid_Vf_P22[xind, yind, i] -= self.interferometers[antpairs[key]]._gridinfo_P22[i]['Vf']
self.interferometers[antpairs[key]]._gridinfo_P22 = {}
else:
raise KeyError('Interferometer {0} not found to exist in the dictionary of interferometers.'.format(antpairs[key]))
else:
raise TypeError('antpairs must be an instance of class InterferometerArray, a list of instances of class Interferometer, a dictionary of instances of class Interferometer or a list of antenna labels.')
else:
raise TypeError('antpairs must be an instance of InterferometerArray, a dictionary of Interferometer instances, a list of Interferometer instances, an Interferometer instance, or a list of antenna labels.')
if (pol is None) or (pol == 'P12'):
if isinstance(antpairs, (Interferometer, str)):
antpairs = [antpairs]
if isinstance(antpairs, (dict, InterferometerArray)):
# Check if these interferometers are new or old and compatible
for key in antpairs:
if isinstance(antpairs[key], Interferometer): # required if antpairs is a dictionary and not instance of InterferometerArray
if key in self.interferometers:
if self.interferometers[key]._gridinfo_P12: # if gridding info is not empty
for i in range(len(self.f)):
xind, yind = zip(*self.interferometers[key]._gridinfo_P12[i]['gridxy_ind'])
self.grid_illumination_P12[xind, yind, i] -= self.interferometers[key]._gridinfo_P12[i]['illumination']
self.grid_Vf_P12[xind, yind, i] -= self.interferometers[key]._gridinfo_P12[i]['Vf']
self.interferometers[key]._gridinfo_P12 = {}
else:
raise KeyError('Interferometer {0} not found to exist in the dictionary of interferometers.'.format(antpairs[key].label))
elif isinstance(antpairs, list):
# Check if these interferometers are new or old and compatible
for key in range(len(antpairs)):
if isinstance(antpairs[key], Interferometer): # required if antpairs is a dictionary and not instance of InterferometerArray
if antpairs[key].label in self.interferometers:
if self.interferometers[antpairs[key].label]._gridinfo_P12: # if gridding info is not empty
for i in range(len(self.f)):
xind, yind = zip(*self.interferometers[antpairs[key].label]._gridinfo_P12[i]['gridxy_ind'])
self.grid_illumination_P12[xind, yind, i] -= self.interferometers[antpairs[key].label]._gridinfo_P12[i]['illumination']
self.grid_Vf_P12[xind, yind, i] -= self.interferometers[antpairs[key].label]._gridinfo_P12[i]['Vf']
self.interferometers[antpairs[key].label]._gridinfo_P12 = {}
else:
raise KeyError('Interferometer {0} not found to exist in the dictionary of interferometers.'.format(antpairs[key].label))
elif isinstance(antpairs[key], str):
if antpairs[key] in self.interferometers:
if self.interferometers[antpairs[key]]._gridinfo_P12: # if gridding info is not empty
for i in range(len(self.f)):
xind, yind = zip(*self.interferometers[antpairs[key]]._gridinfo_P12[i]['gridxy_ind'])
self.grid_illumination_P12[xind, yind, i] -= self.interferometers[antpairs[key]]._gridinfo_P12[i]['illumination']
self.grid_Vf_P12[xind, yind, i] -= self.interferometers[antpairs[key]]._gridinfo_P12[i]['Vf']
self.interferometers[antpairs[key]]._gridinfo_P12 = {}
else:
raise KeyError('Interferometer {0} not found to exist in the dictionary of interferometers.'.format(antpairs[key]))
else:
raise TypeError('antpairs must be an instance of class InterferometerArray, a list of instances of class Interferometer, a dictionary of instances of class Interferometer or a list of antenna labels.')
else:
raise TypeError('antpairs must be an instance of InterferometerArray, a dictionary of Interferometer instances, a list of Interferometer instances, an Interferometer instance, or a list of antenna labels.')
if (pol is None) or (pol == 'P21'):
if isinstance(antpairs, (Interferometer, str)):
antpairs = [antpairs]
if isinstance(antpairs, (dict, InterferometerArray)):
# Check if these interferometers are new or old and compatible
for key in antpairs:
if isinstance(antpairs[key], Interferometer): # required if antpairs is a dictionary and not instance of InterferometerArray
if key in self.interferometers:
if self.interferometers[key]._gridinfo_P21: # if gridding info is not empty
for i in range(len(self.f)):
xind, yind = zip(*self.interferometers[key]._gridinfo_P21[i]['gridxy_ind'])
self.grid_illumination_P21[xind, yind, i] -= self.interferometers[key]._gridinfo_P21[i]['illumination']
self.grid_Vf_P21[xind, yind, i] -= self.interferometers[key]._gridinfo_P21[i]['Vf']
self.interferometers[key]._gridinfo_P21 = {}
else:
raise KeyError('Interferometer {0} not found to exist in the dictionary of interferometers.'.format(antpairs[key].label))
elif isinstance(antpairs, list):
# Check if these interferometers are new or old and compatible
for key in range(len(antpairs)):
if isinstance(antpairs[key], Interferometer): # required if antpairs is a dictionary and not instance of InterferometerArray
if antpairs[key].label in self.interferometers:
if self.interferometers[antpairs[key].label]._gridinfo_P21: # if gridding info is not empty
for i in range(len(self.f)):
xind, yind = zip(*self.interferometers[antpairs[key].label]._gridinfo_P21[i]['gridxy_ind'])
self.grid_illumination_P21[xind, yind, i] -= self.interferometers[antpairs[key].label]._gridinfo_P21[i]['illumination']
self.grid_Vf_P21[xind, yind, i] -= self.interferometers[antpairs[key].label]._gridinfo_P21[i]['Vf']
self.interferometers[antpairs[key].label]._gridinfo_P21 = {}
else:
raise KeyError('Interferometer {0} not found to exist in the dictionary of interferometers.'.format(antpairs[key].label))
elif isinstance(antpairs[key], str):
if antpairs[key] in self.interferometers:
if self.interferometers[antpairs[key]]._gridinfo_P21: # if gridding info is not empty
for i in range(len(self.f)):
xind, yind = zip(*self.interferometers[antpairs[key]]._gridinfo_P21[i]['gridxy_ind'])
self.grid_illumination_P21[xind, yind, i] -= self.interferometers[antpairs[key]]._gridinfo_P21[i]['illumination']
self.grid_Vf_P21[xind, yind, i] -= self.interferometers[antpairs[key]]._gridinfo_P21[i]['Vf']
self.interferometers[antpairs[key]]._gridinfo_P21 = {}
else:
raise KeyError('Interferometer {0} not found to exist in the dictionary of interferometers.'.format(antpairs[key]))
else:
raise TypeError('antpairs must be an instance of class InterferometerArray, a list of instances of class Interferometer, a dictionary of instances of class Interferometer or a list of antenna labels.')
else:
raise TypeError('antpairs must be an instance of InterferometerArray, a dictionary of Interferometer instances, a list of Interferometer instances, an Interferometer instance, or a list of antenna labels.')
############################################################################
def update_flags(self, dictflags=None, stack=True, verify=False):
"""
------------------------------------------------------------------------
Updates all flags in the interferometer array followed by any flags that
need overriding through inputs of specific flag information
Inputs:
dictflags [dictionary] contains flag information that overrides the
default flag updates after they are determined. Baseline-based flags
are given as further dictionaries, each under a
key which is the same as the interferometer label. Flags for
each baseline are specified as a dictionary holding boolean
flags for each of the four cross-polarizations which are
stored under keys 'P11', 'P12', 'P21', and 'P22'. An absent
key just means it is not a part of the update. Flag
information under each baseline must be of same type as
input parameter flags in member function update_flags() of
class CrossPolInfo
stack [boolean] If True (default), appends the updated flag to the
end of the stack of flags as a function of timestamp. If
False, updates the last flag in the stack with the updated
flag and does not append
verify [boolean] If True, verify and update the flags, if necessary.
Visibilities are checked for NaN values and if found, the
flag in the corresponding polarization is set to True.
Default=False.
------------------------------------------------------------------------
"""
for label in self.interferometers:
self.interferometers[label].update_flags(stack=stack, verify=verify)
if dictflags is not None: # Performs flag overriding. Use stack=False
if not isinstance(dictflags, dict):
raise TypeError('Input parameter dictflags must be a dictionary')
for label in dictflags:
if label in self.interferometers:
self.interferometers[label].update_flags(flags=dictflags[label], stack=False, verify=True)
############################################################################
def update(self, interferometer_level_updates=None,
antenna_level_updates=None, do_correlate=None, parallel=False,
nproc=None, verbose=False):
"""
------------------------------------------------------------------------
Updates the interferometer array instance with newer attribute values.
Can also be used to add and/or remove interferometers with/without
affecting the existing grid.
Inputs:
antenna_level_updates
[Dictionary] Provides update information on individual
antennas and antenna array as a whole. Should be of same
type as input parameter updates in member function update()
of class AntennaArray. It consists of information updates
under the following principal keys:
'antenna_array': Consists of updates for the AntennaArray
instance. This is a dictionary which consists of
the following keys:
'timestamp' Unique identifier of the time
series. It is optional to set this
to a scalar. If not given, no
change is made to the existing
timestamp attribute
'do_grid' [boolean] If set to True, create
or recreate a grid. To be
specified when the antenna
locations are updated.
'antennas': Holds a list of dictionaries consisting of
updates for individual antennas. Each element
in the list contains update for one antenna.
For each of these dictionaries, one of the keys
is 'label' which indicates an antenna label. If
absent, the code execution stops by throwing an
exception. The other optional keys and the
information they hold are listed below:
'action' [String scalar] Indicates the type
of update operation. 'add' adds
the Antenna instance to the
AntennaArray instance. 'remove'
removes the antenna from the
antenna array instance. 'modify'
modifies the antenna attributes in
the antenna array instance. This
key has to be set. No default.
'grid_action' [Boolean] If set to True, will
apply the gridding operations
(grid(), grid_convolve(), and
grid_unconvolve()) appropriately
according to the value of the
'action' key. If set to None or
False, gridding effects will
remain unchanged. Default=None
(=False).
'antenna' [instance of class Antenna]
Updated Antenna class instance.
Can work for action key 'remove'
even if not set (=None) or set to
an empty string '' as long as
'label' key is specified.
'gridpol' [Optional. String scalar]
Initiates the specified action on
polarization 'P1' or 'P2'. Can be
set to 'P1' or 'P2'. If not
provided (=None), then the
specified action applies to both
polarizations. Default = None.
'Et' [Optional. Dictionary] Complex
Electric field time series under
two polarizations which are under
keys 'P1' and 'P2'. Is used only
if set and if 'action' key value
is set to 'modify'.
Default = None.
'stack' [boolean] If True (default),
appends the updated flag and data
to the end of the stack as a
function of timestamp. If False,
updates the last flag and data in
the stack and does not append
't' [Optional. Numpy array] Time axis
of the time series. Is used only
if set and if 'action' key value
is set to 'modify'. Default=None.
'timestamp' [Optional. Scalar] Unique
identifier of the time series. Is
used only if set and if 'action'
key value is set to 'modify'.
Default = None.
'location' [Optional. instance of GEOM.Point
class]
Antenna location in the local ENU
coordinate system. Used only if
set and if 'action' key value is
set to 'modify'. Default = None.
'aperture' [instance of class
APR.Aperture] aperture
information for the antenna. Read
docstring of class
Aperture for details
'wtsinfo' [Optional. Dictionary]
See description in Antenna class
member function update(). Is used
only if set and if 'action' key
value is set to 'modify'.
Default = None.
'flags' [Optional. Dictionary] holds
boolean flags for each of the 2
polarizations which are stored
under keys 'P1' and 'P2'.
Default=None means no updates for
flags. If True, that polarization
will be flagged. If not set
(=None), the previous or default
flag status will continue to
apply. If set to False, the
antenna status will be updated to
become unflagged.
'gridfunc_freq'
[Optional. String scalar] Read the
description of inputs to Antenna
class member function update(). If
set to None (not provided), this
attribute is determined based on
the size of wtspos_P1 and
wtspos_P2. It is applicable only
when 'action' key is set to
'modify'. Default = None.
'delaydict' [Dictionary] contains information
on delay compensation to be
applied to the fourier transformed
electric fields under each
polarization which are stored
under keys 'P1' and 'P2'. Default
is None (no delay compensation to
be applied). Refer to the
docstring of member function
delay_compensation() of class
PolInfo for more details.
'ref_freq' [Optional. Scalar] Positive value
(in Hz) of reference frequency
(used if gridfunc_freq is set to
'scale') at which wtspos_P1 and
wtspos_P2 in wtsinfo_P1 and
wtsinfo_P2, respectively, are
provided. If set to None, the
reference frequency already set in
antenna array instance remains
unchanged. Default = None.
'pol_type' [Optional. String scalar] 'Linear'
or 'Circular'. Used only when
action key is set to 'modify'. If
not provided, then the previous
value remains in effect.
Default = None.
'norm_wts' [Optional. Boolean] Default=False.
If set to True, the gridded
weights are divided by the sum of
weights so that the gridded
weights add up to unity. This is
used only when grid_action keyword
is set when action keyword is set
to 'add' or 'modify'
'gridmethod' [Optional. String] Indicates
gridding method. It accepts the
following values 'NN' (nearest
neighbour), 'BL' (Bi-linear
interpolation), and 'CS' (Cubic
Spline interpolation).
Default='NN'
'distNN' [Optional. Scalar] Indicates the
upper bound on distance for a
nearest neighbour search if the
value of 'gridmethod' is set to
'NN'. The units are of physical
distance, the same as what is
used for antenna locations.
Default = NP.inf
'maxmatch' [scalar] A positive value
indicating maximum number of input
locations in the antenna grid to
be assigned. Default = None. If
set to None, all the antenna array
grid elements specified are
assigned values for each antenna.
For instance, to have only one
antenna array grid element to be
populated per antenna, use
maxmatch=1.
'tol' [scalar] If set, only lookup data
with abs(val) > tol will be
considered for nearest neighbour
lookup. Default = None implies
all lookup values will be
considered for nearest neighbour
determination. tol is to be
interpreted as a minimum value
considered as significant in the
lookup table.
interferometer_level_updates
[Dictionary] Consists of information updates for individual
interferometers and interferometer array as a whole under
the following principal keys:
'interferometer_array': Consists of updates for the
InterferometerArray instance. This is a
dictionary which consists of the following keys:
'timestamp': Unique identifier of the time
series. It is optional to set this to a
scalar. If not given, no change is made
to the existing timestamp attribute
'interferometers': Holds a list of dictionaries where
each element consists of updates for individual
interferometers. Each dictionary must contain a
key 'label' which indicates an interferometer
label. If absent, the code execution stops by
throwing an exception. The other optional keys
and the information they hold are listed below:
'action' [String scalar] Indicates the type
of update operation. 'add' adds
the Interferometer instance to the
InterferometerArray instance.
'remove' removes the
interferometer from the
interferometer array instance.
'modify' modifies the
interferometer attributes in the
interferometer array instance.
This key has to be set. No default
'grid_action' [Boolean] If set to True, will
apply the gridding operations
(grid(), grid_convolve(), and
grid_unconvolve()) appropriately
according to the value of the
'action' key. If set to None or
False, gridding effects will
remain unchanged. Default=None
(=False).
'interferometer'
[instance of class Interferometer]
Updated Interferometer class
instance. Can work for action key
'remove' even if not set (=None)
or set to an empty string '' as
long as 'label' key is specified.
'gridpol' [Optional. String scalar]
Initiates the specified action on
polarization 'P11' or 'P22'. Can
be set to 'P11' or 'P22'. If not
provided (=None), then the
specified action applies to both
polarizations. Default = None.
'Vt' [Optional. Dictionary] Complex
visibility time series for each
of the four cross-polarization
specified as keys 'P11', 'P12',
'P21' and 'P22'. Is used only if
set and if 'action' key value is
set to 'modify'. Default = None.
't' [Optional. Numpy array] Time axis
of the time series. Is used only
if set and if 'action' key value
is set to 'modify'. Default=None.
'timestamp' [Optional. Scalar] Unique
identifier of the time series. Is
used only if set and if 'action'
key value is set to 'modify'.
Default = None.
'stack' [boolean] If True (default),
appends the updated flag and data
to the end of the stack as a
function of timestamp. If False,
updates the last flag and data in
the stack and does not append
'location' [Optional. instance of GEOM.Point
class] Interferometer location in
the local ENU coordinate system.
Used only if set and if 'action'
key value is set to 'modify'.
Default=None.
'aperture' [instance of class
APR.Aperture] aperture
information for the
interferometer. Read docstring of
class Aperture for details
'wtsinfo' [Optional. Dictionary] See
description in Interferometer
class member function update().
Is used only if set and if
'action' key value is set to
'modify'. Default = None.
'flags' [Optional. Dictionary] holds
boolean flags for each of the 4
cross-polarizations which are
stored under keys 'P11', 'P12',
'P21' and 'P22'. Default=None means
no updates for flags. If True,
that polarization will be flagged.
If not set (=None), the previous
or default flag status will
continue to apply. If set to
False, the interferometer status will be
updated to become unflagged.
'gridfunc_freq'
[Optional. String scalar] Read the
description of inputs to
Interferometer class member
function update(). If set to None
(not provided), this attribute is
determined based on the size of
wtspos under each polarization.
It is applicable only when
'action' key is set to 'modify'.
Default = None.
'ref_freq' [Optional. Scalar] Positive value
(in Hz) of reference frequency
(used if gridfunc_freq is set to
'scale') at which wtspos in
wtsinfo are provided. If set to
None, the reference frequency
already set in interferometer
array instance remains unchanged.
Default = None.
'pol_type' [Optional. String scalar] 'Linear'
or 'Circular'. Used only when
action key is set to 'modify'. If
not provided, then the previous
value remains in effect.
Default = None.
'norm_wts' [Optional. Boolean] Default=False.
If set to True, the gridded
weights are divided by the sum of
weights so that the gridded
weights add up to unity. This is
used only when grid_action keyword
is set when action keyword is set
to 'add' or 'modify'
'gridmethod' [Optional. String] Indicates
gridding method. It accepts the
following values 'NN' (nearest
neighbour), 'BL' (Bi-linear
interpolation), and 'CS' (Cubic
Spline interpolation).
Default='NN'
'distNN' [Optional. Scalar] Indicates the
upper bound on distance for a
nearest neighbour search if the
value of 'gridmethod' is set to
'NN'. The units are of physical
distance, the same as what is
used for interferometer locations.
Default = NP.inf
'maxmatch' [scalar] A positive value
indicating maximum number of input
locations in the interferometer
grid to be assigned. Default=None.
If set to None, all the
interferometer array grid elements
specified are assigned values for
each interferometer. For instance,
to have only one interferometer
array grid element to be populated
per interferometer, use maxmatch=1
'tol' [scalar] If set, only lookup data
with abs(val) > tol will be
considered for nearest neighbour
lookup. Default = None implies
all lookup values will be
considered for nearest neighbour
determination. tol is to be
interpreted as a minimum value
considered as significant in the
lookup table.
do_correlate
[string] Indicates whether correlation operation is to be
performed after updates. Accepted values are 'FX' (for FX
operation) and 'XF' (for XF operation). Default=None means
no correlating operation is to be performed after updates.
parallel [boolean] specifies if parallelization is to be invoked.
False (default) means only serial processing
nproc [integer] specifies number of independent processes to
spawn. Default = None, means automatically determines the
number of process cores in the system and use one less than
that to avoid locking the system for other processes.
Applies only if input parameter 'parallel' (see above) is
set to True. If nproc is set to a value more than the number
of process cores in the system, it will be reset to number
of process cores in the system minus one to avoid locking
the system out for other processes
verbose [Boolean] Default = False. If set to True, prints some
diagnostic or progress messages.
------------------------------------------------------------------------
"""
if antenna_level_updates is not None:
if verbose:
print 'Updating antenna array...'
self.antenna_array.update(updates=antenna_level_updates)
if verbose:
print 'Updated antenna array. Refreshing interferometer flags from antenna flags...'
self.update_flags(dictflags=None, stack=False, verify=False) # Update interferometer flags using antenna level flags
if verbose:
print 'Refreshed interferometer flags. Refreshing antenna pairs...'
self.refresh_antenna_pairs()
if verbose:
print 'Refreshed antenna pairs...'
if verbose:
print 'Updating interferometer array ...'
self.timestamp = self.antenna_array.timestamp
self.t = self.antenna_array.t
if interferometer_level_updates is not None:
if not isinstance(interferometer_level_updates, dict):
raise TypeError('Input parameter interferometer_level_updates must be a dictionary')
if 'interferometers' in interferometer_level_updates:
if not isinstance(interferometer_level_updates['interferometers'], list):
interferometer_level_updates['interferometers'] = [interferometer_level_updates['interferometers']]
if parallel:
list_of_interferometer_updates = []
list_of_interferometers = []
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} Interferometers '.format(len(interferometer_level_updates['interferometers'])), PGB.ETA()], maxval=len(interferometer_level_updates['interferometers'])).start()
loopcount = 0
for dictitem in interferometer_level_updates['interferometers']:
if not isinstance(dictitem, dict):
raise TypeError('Interferometer-level updates to {0} instance should be provided in the form of a list of dictionaries.'.format(self.__class__.__name__))
elif 'label' not in dictitem:
raise KeyError('No interferometer label specified in the dictionary item to be updated.')
if 'action' not in dictitem:
raise KeyError('No action specified for update. Action key should be set to "add", "remove" or "modify".')
elif dictitem['action'] == 'add':
if dictitem['label'] in self.interferometers:
if verbose:
print 'Interferometer {0} for adding already exists in current instance of {1}. Skipping over to the next item to be updated.'.format(dictitem['label'], self.__class__.__name__)
else:
if verbose:
print 'Adding interferometer {0}...'.format(dictitem['label'])
self.add_interferometers(dictitem['interferometer'])
if 'grid_action' in dictitem:
self.grid_convolve(pol=dictitem['gridpol'], antpairs=dictitem['interferometer'], unconvolve_existing=False)
elif dictitem['action'] == 'remove':
if dictitem['label'] not in self.interferometers:
if verbose:
print 'Interferometer {0} for removal not found in current instance of {1}. Skipping over to the next item to be updated.'.format(dictitem['label'], self.__class__.__name__)
else:
if verbose:
print 'Removing interferometer {0}...'.format(dictitem['label'])
if 'grid_action' in dictitem:
self.grid_unconvolve(dictitem['label'], dictitem['gridpol'])
self.remove_interferometers(dictitem['label'])
elif dictitem['action'] == 'modify':
if dictitem['label'] not in self.interferometers:
if verbose:
print 'Interferometer {0} for modification not found in current instance of {1}. Skipping over to the next item to be updated.'.format(dictitem['label'], self.__class__.__name__)
else:
if verbose:
print 'Modifying interferometer {0}...'.format(dictitem['label'])
if 'Vt' not in dictitem: dictitem['Vt']=None
if 't' not in dictitem: dictitem['t']=None
if 'timestamp' not in dictitem: dictitem['timestamp']=None
if 'location' not in dictitem: dictitem['location']=None
if 'wtsinfo' not in dictitem: dictitem['wtsinfo']=None
if 'flags' not in dictitem: dictitem['flags']=None
if 'stack' not in dictitem: dictitem['stack']=False
if 'gridfunc_freq' not in dictitem: dictitem['gridfunc_freq']=None
if 'ref_freq' not in dictitem: dictitem['ref_freq']=None
if 'pol_type' not in dictitem: dictitem['pol_type']=None
if 'norm_wts' not in dictitem: dictitem['norm_wts']=False
if 'gridmethod' not in dictitem: dictitem['gridmethod']='NN'
if 'distNN' not in dictitem: dictitem['distNN']=NP.inf
if 'maxmatch' not in dictitem: dictitem['maxmatch']=None
if 'tol' not in dictitem: dictitem['tol']=None
if 'do_correlate' not in dictitem: dictitem['do_correlate']=None
if 'aperture' not in dictitem: dictitem['aperture']=None
if not parallel:
# self.interferometers[dictitem['label']].update_old(dictitem['label'], dictitem['Vt'], dictitem['t'], dictitem['timestamp'], dictitem['location'], dictitem['wtsinfo'], dictitem['flags'], dictitem['gridfunc_freq'], dictitem['ref_freq'], dictitem['do_correlate'], verbose)
self.interferometers[dictitem['label']].update(dictitem, verbose)
else:
list_of_interferometers += [self.interferometers[dictitem['label']]]
list_of_interferometer_updates += [dictitem]
if 'grid_action' in dictitem:
self.grid_convolve(pol=dictitem['gridpol'], antpairs=dictitem['interferometer'], unconvolve_existing=True, normalize=dictitem['norm_wts'], method=dictitem['gridmethod'], distNN=dictitem['distNN'], tol=dictitem['tol'], maxmatch=dictitem['maxmatch'])
else:
raise ValueError('Update action should be set to "add", "remove" or "modify".')
if verbose:
progress.update(loopcount+1)
loopcount += 1
if verbose:
progress.finish()
if parallel:
if nproc is None:
nproc = max(MP.cpu_count()-1, 1)
else:
nproc = min(nproc, max(MP.cpu_count()-1, 1))
pool = MP.Pool(processes=nproc)
updated_interferometers = pool.map(unwrap_interferometer_update, IT.izip(list_of_interferometers, list_of_interferometer_updates))
pool.close()
pool.join()
# Necessary to make the returned and updated interferometers current, otherwise they stay unrelated
for interferometer in updated_interferometers:
self.interferometers[interferometer.label] = interferometer
del updated_interferometers
################################################################################
class Image(object):
"""
----------------------------------------------------------------------------
Class to manage image information and processing pertaining to the class
holding antenna array or interferometer array information.
[Docstring is outdated. Needs to be updated definitely]
Attributes:
timestamp: [Scalar] String or float representing the timestamp for the
current attributes
f: [vector] Frequency channels (in Hz)
f0: [Scalar] Positive value for the center frequency in Hz.
autocorr_wts_vuf
[dictionary] dictionary with polarization keys 'P1' and 'P2'.
Under each key is a matrix of size nt x nv x nu x nchan
autocorr_data_vuf
[dictionary] dictionary with polarization keys 'P1' and 'P2'.
Under each key is a matrix of size nt x nv x nu x nchan
where nt=1, nt=n_timestamps, or nt=n_tavg if datapool is set
to 'current', 'stack' or 'avg' respectively
gridx_P1 [Numpy array] x-locations of the grid lattice for P1
polarization
gridy_P1 [Numpy array] y-locations of the grid lattice for P1
polarization
gridx_P2 [Numpy array] x-locations of the grid lattice for P2
polarization
gridy_P2 [Numpy array] y-locations of the grid lattice for P2
polarization
grid_illumination_P1
[Numpy array] Electric field illumination for P1 polarization
on the grid. Could be complex. Same size as the grid
grid_illumination_P2
[Numpy array] Electric field illumination for P2 polarization
on the grid. Could be complex. Same size as the grid
grid_Ef_P1 [Numpy array] Complex Electric field of polarization P1
projected on the grid.
grid_Ef_P2 [Numpy array] Complex Electric field of polarization P2
projected on the grid.
holograph_PB_P1
[Numpy array] Complex holographic electric field pattern on sky
for polarization P1. Obtained by inverse fourier transforming
grid_illumination_P1. It is 3-dimensional (third dimension is
the frequency axis)
holograph_P1 [Numpy array] Complex holographic image cube for polarization
P1 obtained by inverse fourier transforming Ef_P1
PB_P1 [Numpy array] Power pattern of the antenna obtained by squaring
the absolute value of holograph_PB_P1. It is 3-dimensional
(third dimension is the frequency axis)
lf_P1 [Numpy array] 3D grid of l-axis in the direction cosines
coordinate system corresponding to polarization P1, the third
axis being along frequency.
mf_P1 [Numpy array] 3D grid of m-axis in the direction cosines
coordinate system corresponding to polarization P1, the third
axis being along frequency.
img_P1 [Numpy array] 3D image cube obtained by squaring the absolute
value of holograph_P1. The third dimension is along frequency.
holograph_PB_P2
[Numpy array] Complex holographic electric field pattern on sky
for polarization P2. Obtained by inverse fourier transforming
grid_illumination_P2. It is 3-dimensional (third dimension is
the frequency axis)
holograph_P2 [Numpy array] Complex holographic image cube for polarization
P2 obtained by inverse fourier transforming Ef_P2
PB_P2 [Numpy array] Power pattern of the antenna obtained by squaring
the absolute value of holograph_PB_P2. It is 3-dimensional
(third dimension is the frequency axis)
lf_P2 [Numpy array] 3D grid of l-axis in the direction cosines
coordinate system corresponding to polarization P2, the third
axis being along frequency.
mf_P2 [Numpy array] 3D grid of m-axis in the direction cosines
coordinate system corresponding to polarization P2, the third
axis being along frequency.
img_P2 [Numpy array] 3D image cube obtained by squaring the absolute
value of holograph_P2. The third dimension is along frequency.
extfile [string] external filename under which images and associated
info will be stored
Member Functions:
__init__() Initializes an instance of class Image which manages
information and processing of images from data obtained by an
antenna array. It can be initialized either by values in an
instance of class AntennaArray, by values in a fits file
containing information about the antenna array, or to defaults.
imagr() Imaging engine that performs inverse fourier transforms of
appropriate electric field quantities associated with the
antenna array.
stack() Stacks current images and UV-grid information onto a stack
accumulate_inplace()
Accumulates (adds) in-place the image, synthesized beam,
gridded visibilities and aperture plane weights in the
external file.
reset_extfile()
Reset/initialize the extfile under specified datapool(s)
accumulate() Accumulates and averages gridded quantities that are
statistically stationary such as images and visibilities
average() Averages the image, synthesized beam, gridded visibilities,
aperture plane weights, autocorrelation data and weights in
the external file.
evalAutoCorr()
Evaluate sum of auto-correlations of all antenna weights on
the UV-plane.
evalPowerPattern()
Evaluate power pattern for the antenna from its zero-centered
cross-correlated footprint
getStats() Get statistics from images from inside specified boxes
save() Saves the image information to disk
Read the member function docstrings for more details
----------------------------------------------------------------------------
"""
def __init__(self, f0=None, f=None, pol=None, antenna_array=None,
interferometer_array=None, infile=None, timestamp=None,
extfile=None, verbose=True):
"""
------------------------------------------------------------------------
Initializes an instance of class Image which manages information and
processing of images from data obtained by an antenna array or
interferometer array. It can be initialized either by values in an
instance of class AntennaArray, by values in an instance of class
InterferometerArray, or by values in a fits file containing information
about the antenna array or interferometer array, or to defaults.
Class attributes initialized are:
timestamp, f, f0, gridx_P1, gridy_P1, grid_illumination_P1, grid_Ef_P1,
holograph_P1, holograph_PB_P1, img_P1, PB_P1, lf_P1, mf_P1, gridx_P2,
gridy_P2, grid_illumination_P2, grid_Ef_P2, holograph_P2,
holograph_PB_P2, img_P2, PB_P2, lf_P2, mf_P2, autocorr_wts_vuf,
autocorr_data_vuf, extfile
Read docstring of class Image for details on these attributes.
------------------------------------------------------------------------
"""
if verbose:
print '\nInitializing an instance of class Image...\n'
print '\tVerifying for compatible arguments...'
if timestamp is not None:
self.timestamp = timestamp
if verbose:
print '\t\tInitialized time stamp.'
self.timestamps = []
self.tbinsize = None
if f0 is not None:
self.f0 = f0
if verbose:
print '\t\tInitialized center frequency.'
if f is not None:
self.f = NP.asarray(f)
if verbose:
print '\t\tInitialized frequency channels.'
self.measured_type = None
self.antenna_array = None
self.interferometer_array = None
self.autocorr_set = False
self.autocorr_removed = False
if (infile is None) and (antenna_array is None) and (interferometer_array is None):
self.extfile = None
self.gridx_P1 = None
self.gridy_P1 = None
self.grid_illumination_P1 = None
self.grid_Ef_P1 = None
self.holograph_P1 = None
self.holograph_PB_P1 = None
self.img_P1 = None
self.PB_P1 = None
self.lf_P1 = None
self.mf_P1 = None
self.gridx_P2 = None
self.gridy_P2 = None
self.grid_illumination_P2 = None
self.grid_Ef_P2 = None
self.holograph_P2 = None
self.holograph_PB_P2 = None
self.img_P2 = None
self.PB_P2 = None
self.lf_P2 = None
self.mf_P2 = None
if verbose:
print '\t\tInitialized gridx_P1, gridy_P1, grid_illumination_P1, and grid_Ef_P1'
print '\t\tInitialized lf_P1, mf_P1, holograph_PB_P1, PB_P1, holograph_P1, and img_P1'
print '\t\tInitialized gridx_P2, gridy_P2, grid_illumination_P2, and grid_Ef_P2'
print '\t\tInitialized lf_P2, mf_P2, holograph_PB_P2, PB_P2, holograph_P2, and img_P2'
print '\t\tInitialized extfile'
if (infile is not None) and (antenna_array is not None):
raise ValueError('Both gridded data file and antenna array information are specified. One and only one of these should be specified. Cannot initialize an instance of class Image.')
if (infile is not None) and (interferometer_array is not None):
raise ValueError('Both gridded data file and interferometer array information are specified. One and only one of these should be specified. Cannot initialize an instance of class Image.')
if (antenna_array is not None) and (interferometer_array is not None):
raise ValueError('Both antenna array and interferometer array information are specified. One and only one of these should be specified. Cannot initialize an instance of class Image.')
if verbose:
print '\tArguments verified for initialization.'
if infile is not None:
if verbose:
print '\tInitializing from input file...'
try:
hdulist = fits.open(infile)
except IOError:
raise IOError('File not found. Image instance not initialized.')
except EOFError:
raise EOFError('EOF encountered. File cannot be read. Image instance not initialized.')
else:
extnames = [hdu.header['EXTNAME'] for hdu in hdulist]
if verbose:
print '\t\tFITS file opened successfully. The extensions have been read.'
if 'FREQ' in extnames:
self.f = hdulist['FREQ'].data
if verbose:
print '\t\t\tInitialized frequency channels.'
else:
raise KeyError('Frequency information unavailable in the input file.')
if 'f0' in hdulist[0].header:
self.f0 = hdulist[0].header['f0']
if verbose:
print '\t\t\tInitialized center frequency to {0} Hz from FITS header.'.format(self.f0)
else:
self.f0 = self.f[int(len(self.f)/2)]
if verbose:
print '\t\t\tNo center frequency found in FITS header. Setting it to \n\t\t\t\tthe center of frequency channels: {0} Hz'.format(self.f0)
if 'tobs' in hdulist[0].header:
self.timestamp = hdulist[0].header['tobs']
if verbose:
print '\t\t\tInitialized time stamp.'
if (pol is None) or (pol == 'P1'):
if verbose:
print '\n\t\t\tWorking on polarization P1...'
if ('GRIDX_P1' not in extnames) or ('GRIDY_P1' not in extnames) or ('GRID_ILLUMINATION_P1_REAL' not in extnames) or ('GRID_ILLUMINATION_P1_IMAG' not in extnames) or ('GRID_EF_P1_REAL' not in extnames) or ('GRID_EF_P1_IMAG' not in extnames):
raise KeyError('One or more pieces of gridding information are missing in the input file for polarization P1. Verify the file contains appropriate data.')
self.gridx_P1 = hdulist['GRIDX_P1'].data
self.gridy_P1 = hdulist['GRIDY_P1'].data
self.grid_illumination_P1 = hdulist['GRID_ILLUMINATION_P1_REAL'].data + 1j * hdulist['GRID_ILLUMINATION_P1_IMAG'].data
self.grid_Ef_P1 = hdulist['GRID_EF_P1_REAL'].data + 1j * hdulist['GRID_EF_P1_IMAG'].data
self.holograph_P1 = None
self.img_P1 = None
self.holograph_PB_P1 = None
self.PB_P1 = None
self.lf_P1 = None
self.mf_P1 = None
if verbose:
print '\t\t\tInitialized gridx_P1, gridy_P1, grid_illumination_P1, and grid_Ef_P1'
print '\t\t\tInitialized lf_P1, mf_P1, holograph_PB_P1, PB_P1, holograph_P1, and img_P1'
if (pol is None) or (pol == 'P2'):
if verbose:
print '\n\t\t\tWorking on polarization P2...'
if ('GRIDX_P2' not in extnames) or ('GRIDY_P2' not in extnames) or ('GRID_ILLUMINATION_P2_REAL' not in extnames) or ('GRID_ILLUMINATION_P2_IMAG' not in extnames) or ('GRID_EF_P2_REAL' not in extnames) or ('GRID_EF_P2_IMAG' not in extnames):
raise KeyError('One or more pieces of gridding information are missing in the input file for polarization P2. Verify the file contains appropriate data.')
self.gridx_P2 = hdulist['GRIDX_P2'].data
self.gridy_P2 = hdulist['GRIDY_P2'].data
self.grid_illumination_P2 = hdulist['GRID_ILLUMINATION_P2_REAL'].data + 1j * hdulist['GRID_ILLUMINATION_P2_IMAG'].data
self.grid_Ef_P2 = hdulist['GRID_EF_P2_REAL'].data + 1j * hdulist['GRID_EF_P2_IMAG'].data
self.holograph_P2 = None
self.img_P2 = None
self.holograph_PB_P2 = None
self.PB_P2 = None
self.lf_P2 = None
self.mf_P2 = None
if verbose:
print '\t\t\tInitialized gridx_P2, gridy_P2, grid_illumination_P2, and grid_Ef_P2'
print '\t\t\tInitialized lf_P2, mf_P2, holograph_PB_P2, PB_P2, holograph_P2, and img_P2'
hdulist.close()
if verbose:
print '\t\tClosed input FITS file.'
self.grid_illumination = {}
self.holimg = {}
self.holbeam = {}
self.img = {}
self.beam = {}
self.pbeam = {}
self.gridl = {}
self.gridm = {}
self.grid_wts = {}
self.grid_Ef = {}
self.grid_Vf = {}
self.holimg_stack = {}
self.holbeam_stack = {}
self.img_stack = {}
self.beam_stack = {}
self.grid_illumination_stack = {}
self.grid_vis_stack = {}
self.img_avg = {}
self.beam_avg = {}
self.grid_vis_avg = {}
self.grid_illumination_avg = {}
self.wts_vuf = {}
self.vis_vuf = {}
self.twts = {}
self.autocorr_wts_vuf = {}
self.autocorr_data_vuf = {}
self.nzsp_grid_vis_avg = {}
self.nzsp_grid_illumination_avg = {}
self.nzsp_wts_vuf = {}
self.nzsp_vis_vuf = {}
self.nzsp_img_avg = {}
self.nzsp_beam_avg = {}
self.nzsp_img = {}
self.nzsp_beam = {}
if antenna_array is not None:
if verbose:
print '\tInitializing from an instance of class AntennaArray...'
if isinstance(antenna_array, AntennaArray):
self.f = antenna_array.f
if verbose:
print '\t\tInitialized frequency channels.'
self.f0 = antenna_array.f0
if verbose:
print '\t\tInitialized center frequency to {0} Hz from antenna array info.'.format(self.f0)
self.timestamp = antenna_array.timestamp
if verbose:
print '\t\tInitialized time stamp to {0} from antenna array info.'.format(self.timestamp)
if pol is None:
pol = ['P1', 'P2']
pol = NP.unique(NP.asarray(pol))
self.gridu, self.gridv = antenna_array.gridu, antenna_array.gridv
for apol in ['P1', 'P2']:
self.holimg[apol] = None
self.holbeam[apol] = None
self.img[apol] = None
self.beam[apol] = None
self.grid_illumination[apol] = None
self.grid_Ef[apol] = None
self.grid_wts[apol] = None
self.holimg_stack[apol] = None
self.holbeam_stack[apol] = None
self.img_stack[apol] = None
self.beam_stack[apol] = None
self.grid_illumination_stack[apol] = None
self.grid_vis_stack[apol] = None
self.grid_vis_avg[apol] = None
self.grid_illumination_avg[apol] = None
self.img_avg[apol] = None
self.beam_avg[apol] = None
self.twts[apol] = None
self.wts_vuf[apol] = None
self.vis_vuf[apol] = None
self.autocorr_wts_vuf[apol] = None
self.autocorr_data_vuf[apol] = None
self.nzsp_grid_vis_avg[apol] = None
self.nzsp_grid_illumination_avg[apol] = None
self.nzsp_wts_vuf[apol] = None
self.nzsp_vis_vuf[apol] = None
self.nzsp_img_avg[apol] = None
self.nzsp_beam_avg[apol] = None
self.nzsp_img[apol] = None
self.nzsp_beam[apol] = None
self.pbeam[apol] = None
self.antenna_array = antenna_array
self.measured_type = 'E-field'
if verbose:
print '\t\tInitialized gridded attributes for image object'
else:
raise TypeError('antenna_array is not an instance of class AntennaArray. Cannot initiate instance of class Image.')
if extfile is not None:
if not isinstance(extfile, str):
raise TypeError('Input extfile name must be a string')
self.extfile = extfile
with h5py.File(self.extfile, 'w') as fext:
hdr_group = fext.create_group('header')
hdr_group['f'] = self.f
hdr_group['f'].attrs['units'] = 'Hz'
hdr_group['f0'] = self.f0
hdr_group['f0'].attrs['units'] = 'Hz'
hdr_group['pol'] = pol
if verbose:
print '\t\tInitialized extfile'
if interferometer_array is not None:
if verbose:
print '\tInitializing from an instance of class InterferometerArray...'
if isinstance(interferometer_array, InterferometerArray):
self.f = interferometer_array.f
if verbose:
print '\t\tInitialized frequency channels.'
self.f0 = interferometer_array.f0
if verbose:
print '\t\tInitialized center frequency to {0} Hz from interferometer array info.'.format(self.f0)
self.timestamp = interferometer_array.timestamp
if verbose:
print '\t\tInitialized time stamp to {0} from interferometer array info.'.format(self.timestamp)
if pol is None:
pol = ['P11', 'P12', 'P21', 'P22']
pol = NP.unique(NP.asarray(pol))
self.gridu, self.gridv = interferometer_array.gridu, interferometer_array.gridv
for cpol in ['P11', 'P12', 'P21', 'P22']:
self.holimg[cpol] = None
self.holbeam[cpol] = None
self.img[cpol] = None
self.beam[cpol] = None
self.grid_illumination[cpol] = None
self.grid_Vf[cpol] = None
self.grid_wts[cpol] = None
self.holimg_stack[cpol] = None
self.holbeam_stack[cpol] = None
self.img_stack[cpol] = None
self.beam_stack[cpol] = None
self.grid_illumination_stack[cpol] = None
self.grid_vis_stack[cpol] = None
self.grid_vis_avg[cpol] = None
self.grid_illumination_avg[cpol] = None
self.img_avg[cpol] = None
self.beam_avg[cpol] = None
self.twts[cpol] = None
self.wts_vuf[cpol] = None
self.vis_vuf[cpol] = None
self.autocorr_wts_vuf[cpol] = None
self.nzsp_grid_vis_avg[cpol] = None
self.nzsp_grid_illumination_avg[cpol] = None
self.nzsp_wts_vuf[cpol] = None
self.nzsp_vis_vuf[cpol] = None
self.nzsp_img_avg[cpol] = None
self.nzsp_beam_avg[cpol] = None
self.nzsp_img[cpol] = None
self.nzsp_beam[cpol] = None
self.pbeam[cpol] = None
self.interferometer_array = interferometer_array
self.measured_type = 'visibility'
if verbose:
print '\t\tInitialized gridded attributes for image object'
else:
raise TypeError('interferometer_array is not an instance of class InterferometerArray. Cannot initiate instance of class Image.')
if verbose:
print '\nSuccessfully initialized an instance of class Image\n'
############################################################################
def reset(self, verbose=True):
"""
------------------------------------------------------------------------
Reset some grid level attributes of image object to init values
Inputs:
verbose [boolean] If True (default), prints diagnostic and progress
messages. If False, suppress printing such messages.
The attributes reset to init values are grid_illumination, holbeam,
grid_Vf, grid_Ef, interferometer_array, antenna_array, holimg, gridl,
gridm, img, beam, grid_wts
------------------------------------------------------------------------
"""
if verbose:
print 'Resetting grid level attributes of image object...'
self.antenna_array = None
self.interferometer_array = None
self.timestamp = None
self.grid_illumination = {}
self.holimg = {}
self.holbeam = {}
self.img = {}
self.beam = {}
self.gridl = {}
self.gridm = {}
self.grid_wts = {}
self.grid_Ef = {}
self.grid_Vf = {}
self.wts_vuf = {}
self.vis_vuf = {}
if self.measured_type == 'E-field':
for apol in ['P1', 'P2']:
self.holimg[apol] = None
self.holbeam[apol] = None
self.img[apol] = None
self.beam[apol] = None
self.grid_illumination[apol] = None
self.grid_Ef[apol] = None
self.grid_wts[apol] = None
self.wts_vuf[apol] = None
self.vis_vuf[apol] = None
else:
for cpol in ['P11', 'P12', 'P21', 'P22']:
self.holimg[cpol] = None
self.holbeam[cpol] = None
self.img[cpol] = None
self.beam[cpol] = None
self.grid_illumination[cpol] = None
self.grid_Vf[cpol] = None
self.grid_wts[cpol] = None
self.wts_vuf[cpol] = None
self.vis_vuf[cpol] = None
############################################################################
def update(self, antenna_array=None, interferometer_array=None, reset=True,
verbose=True):
"""
------------------------------------------------------------------------
Updates the image object with newer instance of class AntennaArray or
InterferometerArray
Inputs:
antenna_array [instance of class AntennaArray] Update the image object
with this new instance of class AntennaArray (if attribute
measured_type is 'E-field')
interferometer_array
[instance of class InterferometerArray] Update the image
object with this new instance of class InterferometerArray
(if attribute measured_type is 'visibility')
reset [boolean] if set to True (default), resets some of the
image object attributes by calling member function reset()
verbose [boolean] If True (default), prints diagnostic and progress
messages. If False, suppress printing such messages.
------------------------------------------------------------------------
"""
if not isinstance(reset, bool):
raise TypeError('reset keyword must be of boolean type')
if not isinstance(verbose, bool):
raise TypeError('verbose keyword must be of boolean type')
if self.measured_type == 'E-field':
if antenna_array is not None:
if isinstance(antenna_array, AntennaArray):
if reset:
self.reset(verbose=verbose)
self.gridu, self.gridv = antenna_array.gridu, antenna_array.gridv
self.antenna_array = antenna_array
else:
raise TypeError('Input antenna_array must be an instance of class AntennaArray')
self.timestamp = antenna_array.timestamp
if verbose:
print 'Updated antenna array attributes of the image instance'
else:
if interferometer_array is not None:
if isinstance(interferometer_array, InterferometerArray):
if reset:
self.reset(verbose=verbose)
self.gridu, self.gridv = interferometer_array.gridu, interferometer_array.gridv
self.interferometer_array = interferometer_array
else:
raise TypeError('Input interferometer_array must be an instance of class InterferometerArray')
self.timestamp = interferometer_array.timestamp
if verbose:
print 'Updated interferometer array attributes of the image instance'
############################################################################
def imagr(self, pol=None, weighting='natural', pad=0, stack=True,
grid_map_method='sparse', cal_loop=False, nproc=None,
verbose=True):
"""
------------------------------------------------------------------------
Imaging engine that performs inverse fourier transforms of appropriate
electric fields or visibilities associated with the antenna array or
interferometer array respectively.
Keyword Inputs:
pol [string] indicates which polarization information to be
imaged. Allowed values are 'P1', 'P2' or None (default). If
None, both polarizations are imaged.
weighting [string] indicates weighting scheme. Default='natural'.
Accepted values are 'natural' and 'uniform'
pad [integer] indicates the amount of padding before imaging.
In case of MOFF imaging the output image will be of size
2**(pad+1) times the size of the antenna array grid along u-
and v-axes. In case of FX imaging, the output image will be of
size 2**pad times the size of interferometer array grid along
u- and v-axes. Value must not be negative. Default=0 (implies
padding by factor 2 along u- and v-axes for MOFF, and no
padding for FX)
stack [boolean] If True (default), stacks the imaged and uv-gridded
data to the stack for batch processing later. If False, it
will accumulate these in-place
grid_map_method
[string] Accepted values are 'regular' and 'sparse' (default).
If 'regular' it applies the regular grid mapping while
'sparse' applies the grid mapping based on sparse matrix
methods
cal_loop [boolean] Applicable only in case when attribute
measured_type is set to 'E-field' (MOFF imaging) and
grid_map_method is set to 'sparse'. If True, the calibration
loop is assumed to be ON and hence the calibrated electric
fields are used in imaging. If False (default), the
calibration loop is assumed to be OFF and the current stream
of electric fields are assumed to be the calibrated data to
be mapped to the grid
nproc [integer] specifies number of independent processes to spawn.
Default = None means the number of processor cores in the
system is determined automatically and one less than that
is used so as not to lock up the system for other
processes. Parallel processing applies only when the
resulting value of nproc exceeds 1. If nproc is set to a
value larger than the number of processor cores in the
system, it will be reset to the number of processor cores
minus one for the same reason
verbose [boolean] If True (default), prints diagnostic and progress
messages. If False, suppress printing such messages.
------------------------------------------------------------------------
"""
if verbose:
print '\nPreparing to image...\n'
if self.f is None:
raise ValueError('Frequency channels have not been initialized. Cannot proceed with imaging.')
if self.measured_type is None:
raise ValueError('Measured type is unknown.')
if not isinstance(pad, int):
raise TypeError('Input keyword pad must be an integer')
elif pad < 0:
raise ValueError('Input keyword pad must not be negative')
du = self.gridu[0,1] - self.gridu[0,0]
dv = self.gridv[1,0] - self.gridv[0,0]
grid_shape = self.gridu.shape
if self.measured_type == 'E-field':
if pol is None: pol = ['P1', 'P2']
pol = NP.unique(NP.asarray(pol)).tolist()
for apol in pol:
if apol in ['P1', 'P2']:
if grid_map_method == 'regular':
self.antenna_array.make_grid_cube_new(pol=apol, verbose=verbose)
elif grid_map_method == 'sparse':
self.antenna_array.applyMappingMatrix(pol=apol, cal_loop=cal_loop, verbose=verbose)
else:
raise ValueError('Invalid value specified for input parameter grid_map_method')
self.grid_wts[apol] = NP.zeros(self.gridu.shape+(self.f.size,))
if apol in self.antenna_array.grid_illumination:
if SpM.issparse(self.antenna_array.grid_illumination[apol]):
self.grid_illumination[apol] = self.antenna_array.grid_illumination[apol].A.reshape(self.gridu.shape+(self.f.size,))
self.grid_Ef[apol] = self.antenna_array.grid_Ef[apol].A.reshape(self.gridu.shape+(self.f.size,))
else:
self.grid_illumination[apol] = self.antenna_array.grid_illumination[apol]
self.grid_Ef[apol] = self.antenna_array.grid_Ef[apol]
if verbose: print 'Preparing to Inverse Fourier Transform...'
if weighting == 'uniform':
self.grid_wts[apol][NP.abs(self.grid_illumination[apol]) > 0.0] = 1.0/NP.abs(self.grid_illumination[apol][NP.abs(self.grid_illumination[apol]) > 0.0])
else:
self.grid_wts[apol][NP.abs(self.grid_illumination[apol]) > 0.0] = 1.0
sum_wts = NP.sum(NP.abs(self.grid_wts[apol] * self.grid_illumination[apol]), axis=(0,1), keepdims=True)
if nproc is None:
nproc = max(MP.cpu_count()-1, 1)
else:
nproc = min(nproc, max(MP.cpu_count()-1, 1))
if nproc > 1:
s_list = [(2**(pad+1) * self.gridu.shape[0], 2**(pad+1) * self.gridv.shape[1])] * nproc
axes_list = [(0,1)] * nproc
for qty in ['psf', 'image']:
if qty == 'psf':
qtylist = NP.array_split(self.grid_wts[apol]*self.grid_illumination[apol], nproc, axis=2)
else:
qtylist = NP.array_split(self.grid_wts[apol]*self.grid_Ef[apol], nproc, axis=2)
pool = MP.Pool(processes=nproc)
outqtylist = pool.map(DSP.unwrap_FFT2D, IT.izip(qtylist, s_list, axes_list))
pool.close()
pool.join()
if qty == 'psf':
syn_beam = NP.concatenate(tuple(outqtylist), axis=2)
else:
dirty_image = NP.concatenate(tuple(outqtylist), axis=2)
del outqtylist
else:
syn_beam = NP.fft.fft2(self.grid_wts[apol]*self.grid_illumination[apol], s=[2**(pad+1) * self.gridu.shape[0], 2**(pad+1) * self.gridv.shape[1]], axes=(0,1))
dirty_image = NP.fft.fft2(self.grid_wts[apol]*self.grid_Ef[apol], s=[2**(pad+1) * self.gridu.shape[0], 2**(pad+1) * self.gridv.shape[1]], axes=(0,1))
self.gridl, self.gridm = NP.meshgrid(NP.fft.fftshift(NP.fft.fftfreq(2**(pad+1) * self.gridu.shape[1], du)), NP.fft.fftshift(NP.fft.fftfreq(2**(pad+1) * self.gridv.shape[0], dv)))
self.holbeam[apol] = NP.fft.fftshift(syn_beam/sum_wts, axes=(0,1))
self.holimg[apol] = NP.fft.fftshift(dirty_image/sum_wts, axes=(0,1))
syn_beam = NP.abs(syn_beam)**2
sum_wts2 = sum_wts**2
dirty_image = NP.abs(dirty_image)**2
self.beam[apol] = NP.fft.fftshift(syn_beam/sum_wts2, axes=(0,1))
self.img[apol] = NP.fft.fftshift(dirty_image/sum_wts2, axes=(0,1))
if nproc > 1:
s_list = [None] * nproc
axes_list = [(0,1)] * nproc
for qty in ['wts', 'vis']:
if qty == 'wts':
qtylist = NP.array_split(syn_beam/sum_wts2, nproc, axis=2)
else:
qtylist = NP.array_split(dirty_image/sum_wts2, nproc, axis=2)
pool = MP.Pool(processes=nproc)
outqtylist = pool.map(DSP.unwrap_IFFT2D, IT.izip(qtylist, s_list, axes_list))
pool.close()
pool.join()
qty_vuf = NP.concatenate(tuple(outqtylist), axis=2)
qty_vuf = NP.fft.ifftshift(qty_vuf, axes=(0,1)) # Shift array to be centered
if qty == 'wts':
self.wts_vuf[apol] = qty_vuf[qty_vuf.shape[0]/2-self.gridv.shape[0]:qty_vuf.shape[0]/2+self.gridv.shape[0], qty_vuf.shape[1]/2-self.gridu.shape[1]:qty_vuf.shape[1]/2+self.gridu.shape[1], :]
else:
self.vis_vuf[apol] = qty_vuf[qty_vuf.shape[0]/2-self.gridv.shape[0]:qty_vuf.shape[0]/2+self.gridv.shape[0], qty_vuf.shape[1]/2-self.gridu.shape[1]:qty_vuf.shape[1]/2+self.gridu.shape[1], :]
else:
qty_vuf = NP.fft.ifft2(syn_beam/sum_wts2, axes=(0,1)) # Inverse FT
qty_vuf = NP.fft.ifftshift(qty_vuf, axes=(0,1)) # Shift array to be centered
# self.wts_vuf[apol] = qty_vuf[self.gridv.shape[0]:3*self.gridv.shape[0],self.gridu.shape[1]:3*self.gridu.shape[1],:]
self.wts_vuf[apol] = qty_vuf[qty_vuf.shape[0]/2-self.gridv.shape[0]:qty_vuf.shape[0]/2+self.gridv.shape[0], qty_vuf.shape[1]/2-self.gridu.shape[1]:qty_vuf.shape[1]/2+self.gridu.shape[1], :]
qty_vuf = NP.fft.ifft2(dirty_image/sum_wts2, axes=(0,1)) # Inverse FT
qty_vuf = NP.fft.ifftshift(qty_vuf, axes=(0,1)) # Shift array to be centered
self.vis_vuf[apol] = qty_vuf[qty_vuf.shape[0]/2-self.gridv.shape[0]:qty_vuf.shape[0]/2+self.gridv.shape[0], qty_vuf.shape[1]/2-self.gridu.shape[1]:qty_vuf.shape[1]/2+self.gridu.shape[1], :]
if self.measured_type == 'visibility':
if pol is None: pol = ['P11', 'P12', 'P21', 'P22']
pol = NP.unique(NP.asarray(pol)).tolist()
for cpol in pol:
if cpol in ['P11', 'P12', 'P21', 'P22']:
if grid_map_method == 'regular':
self.interferometer_array.make_grid_cube_new(verbose=verbose, pol=cpol)
elif grid_map_method == 'sparse':
self.interferometer_array.applyMappingMatrix(pol=cpol, verbose=verbose)
else:
raise ValueError('Invalid value specified for input parameter grid_map_method')
self.grid_wts[cpol] = NP.zeros(self.gridu.shape+(self.f.size,))
if cpol in self.interferometer_array.grid_illumination:
if SpM.issparse(self.interferometer_array.grid_illumination[cpol]):
self.grid_illumination[cpol] = self.interferometer_array.grid_illumination[cpol].A.reshape(self.gridu.shape+(self.f.size,))
self.grid_Vf[cpol] = self.interferometer_array.grid_Vf[cpol].A.reshape(self.gridu.shape+(self.f.size,))
else:
self.grid_illumination[cpol] = self.interferometer_array.grid_illumination[cpol]
self.grid_Vf[cpol] = self.interferometer_array.grid_Vf[cpol]
if verbose: print 'Preparing to Inverse Fourier Transform...'
if weighting == 'uniform':
self.grid_wts[cpol][NP.abs(self.grid_illumination[cpol]) > 0.0] = 1.0/NP.abs(self.grid_illumination[cpol][NP.abs(self.grid_illumination[cpol]) > 0.0])
else:
self.grid_wts[cpol][NP.abs(self.grid_illumination[cpol]) > 0.0] = 1.0
sum_wts = NP.sum(NP.abs(self.grid_wts[cpol] * self.grid_illumination[cpol]), axis=(0,1), keepdims=True)
padded_syn_beam_in_uv = NP.pad(self.grid_wts[cpol]*self.grid_illumination[cpol], (((2**pad-1)*self.gridv.shape[0]/2,(2**pad-1)*self.gridv.shape[0]/2),((2**pad-1)*self.gridu.shape[1]/2,(2**pad-1)*self.gridu.shape[1]/2),(0,0)), mode='constant', constant_values=0)
padded_grid_Vf = NP.pad(self.grid_wts[cpol]*self.grid_Vf[cpol], (((2**pad-1)*self.gridv.shape[0]/2,(2**pad-1)*self.gridv.shape[0]/2),((2**pad-1)*self.gridu.shape[1]/2,(2**pad-1)*self.gridu.shape[1]/2),(0,0)), mode='constant', constant_values=0)
self.gridl, self.gridm = NP.meshgrid(NP.fft.fftshift(NP.fft.fftfreq(2**pad * grid_shape[1], du)), NP.fft.fftshift(NP.fft.fftfreq(2**pad * grid_shape[0], dv)))
# Shift to be centered
padded_syn_beam_in_uv = NP.fft.ifftshift(padded_syn_beam_in_uv, axes=(0,1))
padded_grid_Vf = NP.fft.ifftshift(padded_grid_Vf, axes=(0,1))
# Compute the synthesized beam. It is at a finer resolution due to padding
syn_beam = NP.fft.fft2(padded_syn_beam_in_uv, axes=(0,1))
dirty_image = NP.fft.fft2(padded_grid_Vf, axes=(0,1))
# Select only the real part, equivalent to adding conjugate baselines
dirty_image = dirty_image.real
syn_beam = syn_beam.real
self.beam[cpol] = NP.fft.fftshift(syn_beam/sum_wts, axes=(0,1))
self.img[cpol] = NP.fft.fftshift(dirty_image/sum_wts, axes=(0,1))
qty_vuf = NP.fft.ifft2(syn_beam/sum_wts, axes=(0,1)) # Inverse FT
qty_vuf = NP.fft.ifftshift(qty_vuf, axes=(0,1)) # Shift array to be centered
# self.wts_vuf[cpol] = qty_vuf[self.gridv.shape[0]/2:3*self.gridv.shape[0]/2,self.gridu.shape[1]/2:3*self.gridu.shape[1]/2,:]
self.wts_vuf[cpol] = qty_vuf[qty_vuf.shape[0]/2-self.gridv.shape[0]/2:qty_vuf.shape[0]/2+self.gridv.shape[0]/2, qty_vuf.shape[1]/2-self.gridu.shape[1]/2:qty_vuf.shape[1]/2+self.gridu.shape[1]/2,:]
qty_vuf = NP.fft.ifft2(dirty_image/sum_wts, axes=(0,1)) # Inverse FT
qty_vuf = NP.fft.ifftshift(qty_vuf, axes=(0,1)) # Shift array to be centered
# self.vis_vuf[cpol] = qty_vuf[self.gridv.shape[0]/2:3*self.gridv.shape[0]/2,self.gridu.shape[1]/2:3*self.gridu.shape[1]/2,:]
self.vis_vuf[cpol] = qty_vuf[qty_vuf.shape[0]/2-self.gridv.shape[0]/2:qty_vuf.shape[0]/2+self.gridv.shape[0]/2, qty_vuf.shape[1]/2-self.gridu.shape[1]/2:qty_vuf.shape[1]/2+self.gridu.shape[1]/2,:]
nan_ind = NP.where(self.gridl**2 + self.gridm**2 > 1.0)
# nan_ind_unraveled = NP.unravel_index(nan_ind, self.gridl.shape)
# self.beam[cpol][nan_ind_unraveled,:] = NP.nan
# self.img[cpol][nan_ind_unraveled,:] = NP.nan
if verbose:
print 'Successfully imaged.'
# self.evalAutoCorr(datapool='current', forceeval=False)
with h5py.File(self.extfile, 'a') as fext:
if 'image-plane' not in fext:
planes = ['image-plane', 'aperture-plane']
arraytypes = ['stack', 'accumulate', 'avg']
reim_list = ['real', 'imag']
for p in pol:
dset = fext.create_dataset('twts/{0}'.format(p), data=NP.zeros(1), dtype='f4')
for plane in planes:
if plane == 'image-plane':
for arraytype in arraytypes:
if arraytype == 'avg':
tdt = h5py.special_dtype(vlen=NP.dtype('f8'))
tshape = (1,)
else:
tdt = 'f8'
tshape = (0,)
tdset = fext.create_dataset('{0}/{1}/timestamps'.format(plane,arraytype), shape=tshape, maxshape=(None,), dtype=tdt)
qtytypes = ['image', 'psf']
for lm in ['l', 'm']:
if '{0}/{1}'.format(plane, lm) not in fext:
if lm == 'l':
vect = self.gridl[0,:]
l_ind = NP.where(NP.abs(vect) <= 1.05)[0]
dset = fext.create_dataset('{0}/{1}_ind'.format(plane, lm), data=l_ind)
dset = fext.create_dataset('{0}/{1}'.format(plane, lm), data=vect[l_ind])
else:
vect = self.gridm[:,0]
m_ind = NP.where(NP.abs(vect) <= 1.05)[0]
dset = fext.create_dataset('{0}/{1}_ind'.format(plane, lm), data=m_ind)
dset = fext.create_dataset('{0}/{1}'.format(plane, lm), data=vect[m_ind])
else:
dset = fext['{0}/{1}_ind'.format(plane, lm)]
if lm == 'l':
l_ind = dset.value
else:
m_ind = dset.value
else:
qtytypes = ['xcorr', 'acorr']
subqtytypes = ['vals', 'wts']
for qtytype in qtytypes:
for arraytype in arraytypes:
if arraytype == 'avg':
tdt = h5py.special_dtype(vlen=NP.dtype('f8'))
tshape = (1,)
else:
tdt = 'f8'
tshape = (0,)
tdset = fext.create_dataset('{0}/{1}/{2}/timestamps'.format(plane,qtytype,arraytype), shape=tshape, maxshape=(None,), dtype=tdt)
for uv in ['u', 'v']:
if '{0}/{1}'.format(plane, uv) not in fext:
if uv == 'u':
vect = self.gridu[0,:]
else:
vect = self.gridv[:,0]
dset = fext.create_dataset('{0}/{1}'.format(plane, uv), data=vect)
for qtytype in qtytypes:
for arraytype in arraytypes:
for p in pol:
if plane == 'image-plane':
if arraytype == 'stack':
dset = fext.create_dataset('{0}/{1}/{2}/{3}'.format(plane,qtytype,arraytype,p), data=NP.full((1,self.f.size,m_ind.size,l_ind.size), NP.nan), maxshape=(None,self.f.size,m_ind.size,l_ind.size), chunks=(1,1,m_ind.size,l_ind.size), dtype='f8', compression='gzip', compression_opts=9)
elif arraytype == 'accumulate':
dset = fext.create_dataset('{0}/{1}/{2}/{3}'.format(plane,qtytype,arraytype,p), data=NP.zeros((self.f.size,m_ind.size,l_ind.size)), maxshape=(self.f.size,m_ind.size,l_ind.size), chunks=(1,m_ind.size,l_ind.size), dtype='f8', compression='gzip', compression_opts=9)
elif arraytype == 'avg':
dset = fext.create_dataset('{0}/{1}/{2}/{3}'.format(plane,qtytype,arraytype,p), data=NP.full((1,self.f.size,m_ind.size,l_ind.size), NP.nan), maxshape=(None,self.f.size,m_ind.size,l_ind.size), chunks=(1,1,m_ind.size,l_ind.size), dtype='f8', compression='gzip', compression_opts=9)
else:
idxdt = h5py.special_dtype(vlen=NP.dtype('i8'))
valdt = h5py.special_dtype(vlen=NP.dtype('f8'))
for rowcol in ['freqind', 'ij']:
dset = fext.create_dataset('{0}/{1}/{2}/{3}/{4}'.format(plane,qtytype,rowcol,arraytype,p), shape=(1,), maxshape=(None,), dtype=idxdt, compression='gzip', compression_opts=9)
for subqty in subqtytypes:
for reim in reim_list:
dset = fext.create_dataset('{0}/{1}/{2}/{3}/{4}/{5}'.format(plane,qtytype,subqty,arraytype,p,reim), shape=(1,), maxshape=(None,), dtype=valdt, compression='gzip', compression_opts=9)
# Call stack() if required
if stack:
self.stack(pol=pol)
else:
self.accumulate_inplace(pol=pol)
############################################################################
def stack(self, pol=None):
"""
------------------------------------------------------------------------
Stacks current images and UV-grid information onto a stack
Inputs:
pol [string] indicates which polarization information to be saved.
Allowed values are 'P1', 'P2' in case of MOFF or 'P11', 'P12',
'P21', 'P22' in case of FX or None (default). If None,
information on all polarizations appropriate for MOFF or FX
are stacked
------------------------------------------------------------------------
"""
if self.timestamp not in self.timestamps:
if pol is None:
if self.measured_type == 'E-field':
pol = ['P1', 'P2']
else:
pol = ['P11', 'P12', 'P21', 'P22']
elif isinstance(pol, str):
pol = [pol]
elif isinstance(pol, list):
p = [item for item in pol if item in ['P1', 'P2', 'P11', 'P12', 'P21', 'P22']]
pol = p
else:
raise TypeError('Input pol must be a string or list specifying polarization(s)')
if self.extfile is not None:
with h5py.File(self.extfile, 'a') as fext:
planes = ['image-plane', 'aperture-plane']
arraytypes = ['stack']
reim_list = ['real', 'imag']
for plane in planes:
if plane == 'image-plane':
for arraytype in arraytypes:
tdset = fext['{0}/{1}/timestamps'.format(plane,arraytype)]
tdset.resize(tdset.size+1, axis=0)
tdset[-1:] = self.timestamp
qtytypes = ['image', 'psf']
for lm in ['l', 'm']:
dset = fext['{0}/{1}_ind'.format(plane, lm)]
if lm == 'l':
l_ind = dset.value
else:
m_ind = dset.value
mlf_ind = NP.ix_(m_ind, l_ind, NP.arange(self.f.size)) # m (row) first
else:
qtytypes = ['xcorr']
subqtytypes = ['vals', 'wts']
for qtytype in qtytypes:
for arraytype in arraytypes:
tdset = fext['{0}/{1}/{2}/timestamps'.format(plane,qtytype,arraytype)]
tdset.resize(tdset.size+1, axis=0)
tdset[-1:] = self.timestamp
for qtytype in qtytypes:
for arraytype in arraytypes:
for p in pol:
if plane == 'image-plane':
dset = fext['{0}/{1}/{2}/{3}'.format(plane,qtytype,arraytype,p)]
if NP.any(NP.isnan(dset.value)):
if NP.sum(NP.isnan(dset.value)) != dset.size:
raise ValueError('Inconsistent number of NaN found')
else:
dset.resize(dset.shape[0]+1, axis=0)
if qtytype == 'image':
dset[-1:] = NP.rollaxis(self.img[p][mlf_ind], 2, start=0)
elif qtytype == 'psf':
dset[-1:] = NP.rollaxis(self.beam[p][mlf_ind], 2, start=0)
else:
wts_vuf = NP.rollaxis(self.wts_vuf[p], 2, start=0)
xcorr_shape_3D = wts_vuf.shape
wts_vuf = wts_vuf.reshape(wts_vuf.shape[0], -1)
xcorr_shape_2D = wts_vuf.shape
sprow, spcol = NP.where(NP.abs(wts_vuf) > 1e-10)
vis_vuf = NP.rollaxis(self.vis_vuf[p], 2,start=0)
vis_vuf = vis_vuf.reshape(vis_vuf.shape[0], -1)
if '{0}/{1}/shape2D/{2}/{3}'.format(plane,qtytype,arraytype,p) not in fext:
dset = fext.create_dataset('{0}/{1}/shape2D/{2}/{3}'.format(plane,qtytype,arraytype,p), data=NP.asarray(xcorr_shape_2D))
if '{0}/{1}/shape3D/{2}/{3}'.format(plane,qtytype,arraytype,p) not in fext:
dset = fext.create_dataset('{0}/{1}/shape3D/{2}/{3}'.format(plane,qtytype,arraytype,p), data=NP.asarray(xcorr_shape_3D))
for rowcol in ['freqind', 'ij']:
dset = fext['{0}/{1}/{2}/{3}/{4}'.format(plane,qtytype,rowcol,arraytype,p)]
if dset[-1].size > 0:
dset.resize(dset.shape[0]+1, axis=0)
if rowcol == 'freqind':
dset[-1] = NP.copy(sprow)
else:
dset[-1] = NP.copy(spcol)
for subqty in subqtytypes:
for reim in ['real', 'imag']:
dset = fext['{0}/{1}/{2}/{3}/{4}/{5}'.format(plane,qtytype,subqty,arraytype,p,reim)]
if dset[-1].size > 0:
dset.resize(dset.shape[0]+1, axis=0)
if subqty == 'wts':
if reim == 'real':
dset[-1] = NP.copy(wts_vuf[sprow,spcol].real)
else:
dset[-1] = NP.copy(wts_vuf[sprow,spcol].imag)
else:
if reim == 'real':
dset[-1] = NP.copy(vis_vuf[sprow,spcol].real)
else:
dset[-1] = NP.copy(vis_vuf[sprow,spcol].imag)
for p in pol:
dset = fext['twts/{0}'.format(p)]
dset[...] += 1.0
else:
for p in pol:
if self.img_stack[p] is None:
self.img_stack[p] = self.img[p][NP.newaxis,:,:,:]
self.beam_stack[p] = self.beam[p][NP.newaxis,:,:,:]
self.grid_illumination_stack[p] = self.wts_vuf[p][NP.newaxis,:,:,:]
self.grid_vis_stack[p] = self.vis_vuf[p][NP.newaxis,:,:,:]
else:
self.img_stack[p] = NP.concatenate((self.img_stack[p], self.img[p][NP.newaxis,:,:,:]), axis=0)
self.beam_stack[p] = NP.concatenate((self.beam_stack[p], self.beam[p][NP.newaxis,:,:,:]), axis=0)
self.grid_illumination_stack[p] = NP.concatenate((self.grid_illumination_stack[p], self.wts_vuf[p][NP.newaxis,:,:,:]), axis=0)
self.grid_vis_stack[p] = NP.concatenate((self.grid_vis_stack[p], self.vis_vuf[p][NP.newaxis,:,:,:]), axis=0)
if self.measured_type == 'E-field':
if self.holimg_stack[p] is None:
self.holimg_stack[p] = self.holimg[p][NP.newaxis,:,:,:]
self.holbeam_stack[p] = self.holbeam[p][NP.newaxis,:,:,:]
else:
self.holimg_stack[p] = NP.concatenate((self.holimg_stack[p], self.holimg[p][NP.newaxis,:,:,:]), axis=0)
self.holbeam_stack[p] = NP.concatenate((self.holbeam_stack[p], self.holbeam[p][NP.newaxis,:,:,:]), axis=0)
self.timestamps += [self.timestamp]
############################################################################
def accumulate_inplace(self, pol=None, verbose=True):
"""
------------------------------------------------------------------------
Accumulates (adds) in-place the image, synthesized beam, gridded
visibilities and aperture plane weights in the external file.
Inputs:
pol [string] indicates which polarization information to be saved.
Allowed values are 'P1', 'P2' in case of MOFF or 'P11', 'P12',
'P21', 'P22' in case of FX or None (default). If None,
information on all polarizations appropriate for MOFF or FX
are accumulated
verbose [boolean] If True (default), prints diagnostic and progress
messages. If False, suppress printing such messages.
------------------------------------------------------------------------
"""
if self.timestamp not in self.timestamps:
if pol is None:
if self.measured_type == 'E-field':
pol = ['P1', 'P2']
else:
pol = ['P11', 'P12', 'P21', 'P22']
elif isinstance(pol, str):
pol = [pol]
elif isinstance(pol, list):
p = [item for item in pol if item in ['P1', 'P2', 'P11', 'P12', 'P21', 'P22']]
pol = p
else:
raise TypeError('Input pol must be a string or list specifying polarization(s)')
if self.extfile is not None:
with h5py.File(self.extfile, 'a') as fext:
planes = ['image-plane', 'aperture-plane']
arraytypes = ['accumulate']
reim_list = ['real', 'imag']
for plane in planes:
if plane == 'image-plane':
for arraytype in arraytypes:
tdset = fext['{0}/{1}/timestamps'.format(plane,arraytype)]
tdset.resize(tdset.size+1, axis=0)
tdset[-1] = self.timestamp
qtytypes = ['image', 'psf']
for lm in ['l', 'm']:
dset = fext['{0}/{1}_ind'.format(plane, lm)]
if lm == 'l':
l_ind = dset.value
else:
m_ind = dset.value
mlf_ind = NP.ix_(m_ind, l_ind, NP.arange(self.f.size)) # m (row) first
else:
qtytypes = ['xcorr']
subqtytypes = ['wts', 'vals']
for qtytype in qtytypes:
for arraytype in arraytypes:
tdset = fext['{0}/{1}/{2}/timestamps'.format(plane,qtytype,arraytype)]
tdset.resize(tdset.size+1, axis=0)
tdset[-1] = self.timestamp
for qtytype in qtytypes:
for arraytype in arraytypes:
for p in pol:
if plane == 'image-plane':
dset = fext['{0}/{1}/{2}/{3}'.format(plane,qtytype,arraytype,p)]
if qtytype == 'image':
dset[...] += NP.rollaxis(self.img[p][mlf_ind], 2, start=0)
else:
dset[...] += NP.rollaxis(self.beam[p][mlf_ind], 2, start=0)
else:
new_wts_vuf = NP.rollaxis(self.wts_vuf[p], 2, start=0)
xcorr_shape_3D = new_wts_vuf.shape
new_wts_vuf = new_wts_vuf.reshape(new_wts_vuf.shape[0], -1)
xcorr_shape_2D = new_wts_vuf.shape
new_sprow, new_spcol = NP.where(NP.abs(new_wts_vuf) > 1e-10)
new_vis_vuf = NP.rollaxis(self.vis_vuf[p], 2,start=0)
new_vis_vuf = new_vis_vuf.reshape(new_vis_vuf.shape[0], -1)
new_csc_wts_vuf = SpM.csc_matrix((new_wts_vuf[new_sprow,new_spcol], (new_sprow, new_spcol)), shape=xcorr_shape_2D)
new_csc_vis_vuf = SpM.csc_matrix((new_vis_vuf[new_sprow,new_spcol], (new_sprow, new_spcol)), shape=xcorr_shape_2D)
if '{0}/{1}/shape2D/{2}/{3}'.format(plane,qtytype,arraytype,p) not in fext:
dset = fext.create_dataset('{0}/{1}/shape2D/{2}/{3}'.format(plane,qtytype,arraytype,p), data=NP.asarray(xcorr_shape_2D))
if '{0}/{1}/shape3D/{2}/{3}'.format(plane,qtytype,arraytype,p) not in fext:
dset = fext.create_dataset('{0}/{1}/shape3D/{2}/{3}'.format(plane,qtytype,arraytype,p), data=NP.asarray(xcorr_shape_3D))
for rowcol in ['freqind', 'ij']:
dset = fext['{0}/{1}/{2}/{3}/{4}'.format(plane,qtytype,rowcol,arraytype,p)]
if dset[-1].size == 0:
if rowcol == 'freqind':
dset[-1] = NP.copy(new_sprow)
else:
dset[-1] = NP.copy(new_spcol)
else:
if rowcol == 'freqind':
acc_sprow = NP.copy(dset[-1])
else:
acc_spcol = NP.copy(dset[-1])
for subqty in subqtytypes:
for reim in ['real', 'imag']:
dset = fext['{0}/{1}/{2}/{3}/{4}/{5}'.format(plane,qtytype,subqty,arraytype,p,reim)]
if dset[-1].size == 0:
if subqty == 'wts':
if reim == 'real':
dset[-1] = NP.copy(new_wts_vuf[new_sprow, new_spcol].real)
else:
dset[-1] = NP.copy(new_wts_vuf[new_sprow, new_spcol].imag)
else:
if reim == 'real':
dset[-1] = NP.copy(new_vis_vuf[new_sprow, new_spcol].real)
else:
dset[-1] = NP.copy(new_vis_vuf[new_sprow, new_spcol].imag)
just_set = True
else:
if reim == 'real':
acc_qty = dset[-1].astype(NP.complex128)
else:
acc_qty += 1j * dset[-1]
just_set = False
if (dset[-1].size > 0) and not just_set:
acc_spmat = SpM.csc_matrix((acc_qty, (acc_sprow, acc_spcol)), shape=xcorr_shape_2D)
if subqty == 'wts':
acc_spmat += new_csc_wts_vuf
new_acc_sprow, new_acc_spcol = NP.where((NP.abs(acc_spmat) > 1e-10).toarray())
for rowcol in ['freqind', 'ij']:
dset = fext['{0}/{1}/{2}/{3}/{4}'.format(plane,qtytype,rowcol,arraytype,p)]
if rowcol == 'freqind':
dset[-1] = NP.copy(new_acc_sprow)
else:
dset[-1] = NP.copy(new_acc_spcol)
else:
acc_spmat += new_csc_vis_vuf
for reim in ['real', 'imag']:
dset = fext['{0}/{1}/{2}/{3}/{4}/{5}'.format(plane,qtytype,subqty,arraytype,p,reim)]
if reim == 'real':
dset[-1] = acc_spmat[new_acc_sprow, new_acc_spcol].real.A.ravel()
else:
dset[-1] = acc_spmat[new_acc_sprow, new_acc_spcol].imag.A.ravel()
for p in pol:
dset = fext['twts/{0}'.format(p)]
dset[...] += 1.0
self.timestamps += [self.timestamp]
if verbose:
print '\nIn-place accumulation of image, beam, visibility, and synthesis aperture weights completed for timestamp {0:.7f}.\n'.format(self.timestamp)
############################################################################
def average(self, pol=None, datapool='accumulate', autocorr_op='rmfit',
verbose=True):
"""
------------------------------------------------------------------------
Averages the image, synthesized beam, gridded visibilities, aperture
plane weights, autocorrelation data and weights in the external file,
with optional removal of autocorrelation weights and data.
Inputs:
pol [string] indicates which polarization information to be saved.
Allowed values are 'P1', 'P2' in case of MOFF or 'P11', 'P12',
'P21', 'P22' in case of FX or None (default). If None,
information on all polarizations appropriate for MOFF or FX
are averaged
datapool
[string] Data pool from which values will be used in the
averaging. Accepted values are 'accumulate' (default) and
'stack'.
autocorr_op
[string] indicates if autocorrelation weights and data are to
be removed. Accepted values are 'rmfit' (fit and remove an
estimate of autocorr weights and data), 'mask' (mask the
footprint of autocorr weights to zero), and 'none' (keep the
autocorr weights and data without any modification).
Default='rmfit'.
verbose [boolean] If True (default), prints diagnostic and progress
messages. If False, suppress printing such messages.
------------------------------------------------------------------------
"""
if pol is None:
if self.measured_type == 'E-field':
pol = ['P1', 'P2']
else:
pol = ['P11', 'P12', 'P21', 'P22']
elif isinstance(pol, str):
pol = [pol]
elif isinstance(pol, list):
p = [item for item in pol if item in ['P1', 'P2', 'P11', 'P12', 'P21', 'P22']]
pol = p
else:
raise TypeError('Input pol must be a string or list specifying polarization(s)')
if not isinstance(datapool, str):
raise TypeError('Input datapool must be a string')
else:
if datapool.lower() not in ['accumulate', 'stack']:
raise ValueError('Input datapool value not accepted')
if not isinstance(autocorr_op, str):
raise TypeError('Input autocorr_op must be a string')
if autocorr_op.lower() not in ['rmfit', 'mask', 'none']:
raise ValueError('Invalid value specified for input autocorr_op')
if self.extfile is not None:
with h5py.File(self.extfile, 'a') as fext:
plane = 'aperture-plane'
reim_list = ['real', 'imag']
qtytypes = ['xcorr']
subqtytypes = ['wts', 'vals']
for qtytype in qtytypes:
tdset = fext['{0}/{1}/avg/timestamps'.format(plane,qtytype)]
if tdset[-1].size > 0:
tdset.resize(tdset.size+1, axis=0)
tdset[-1] = fext['{0}/{1}/{2}/timestamps'.format(plane,qtytype,datapool)].value
for p in pol:
if '{0}/{1}/shape2D/{2}/{3}'.format(plane,qtytype,datapool,p) not in fext:
raise KeyError('Key {0}/{1}/shape2D/{2}/{3} not found in external file'.format(plane,qtytype,datapool,p))
else:
shape2D_dset = fext['{0}/{1}/shape2D/{2}/{3}'.format(plane,qtytype,datapool,p)]
shape2D = shape2D_dset.value
freqind_list = [arr for arr in fext['{0}/{1}/freqind/{2}/{3}'.format(plane,qtytype,datapool,p)]]
ijind_list = [arr for arr in fext['{0}/{1}/ij/{2}/{3}'.format(plane,qtytype,datapool,p)]]
for subqty in subqtytypes:
spmat = SpM.csc_matrix(tuple(shape2D), dtype=NP.complex128) # Create empty sparse matrix
for tind in range(len(freqind_list)):
for reim in reim_list:
dset = fext['{0}/{1}/{2}/{3}/{4}/{5}'.format(plane,qtytype,subqty,datapool,p,reim)][tind]
if reim == 'real':
spmat += SpM.csc_matrix((dset, (freqind_list[tind], ijind_list[tind])), shape=spmat.shape)
else:
spmat += 1j*SpM.csc_matrix((dset, (freqind_list[tind], ijind_list[tind])), shape=spmat.shape)
spmat /= fext['twts/{0}'.format(p)].value[0] # Average the accumulated sparse matrix
if autocorr_op in ['rmfit', 'mask']:
shape2D_auto = fext['{0}/acorr/shape2D/avg/{1}'.format(plane,p)].value
if not NP.array_equal(shape2D_auto,shape2D):
raise ValueError('Xcorr and Acorr shapes not equal')
sprow_auto = fext['{0}/acorr/freqind/avg/{1}'.format(plane,p)][-1]
spcol_auto = fext['{0}/acorr/ij/avg/{1}'.format(plane,p)][-1]
spmat_auto = SpM.csc_matrix(tuple(shape2D_auto), dtype=NP.complex128)
for reim in reim_list:
dset = fext['{0}/acorr/{1}/avg/{2}/{3}'.format(plane,subqty,p,reim)]
if reim == 'real':
spmat_auto += SpM.csc_matrix((dset[-1], (sprow_auto, spcol_auto)), shape=shape2D_auto)
else:
spmat_auto += 1j * SpM.csc_matrix((dset[-1], (sprow_auto, spcol_auto)), shape=shape2D_auto)
if autocorr_op.lower() == 'mask':
spmat -= SpM.csc_matrix((spmat[sprow_auto, spcol_auto].A.ravel(), (sprow_auto, spcol_auto)), shape=shape2D) # Force pixels present in auto footprint to zero
else:
spmat = spmat.A - (spmat[:,int(NP.floor(0.5*shape2D[1]))] / spmat_auto[:,int(NP.floor(0.5*shape2D[1]))]).A * spmat_auto.A # Force zero spacing pixel to match and then subtract the auto footprint, now a dense matrix
if subqty == 'wts':
sprow, spcol = NP.where((NP.abs(spmat) > 1e-10).toarray() if SpM.issparse(spmat) else NP.abs(spmat) > 1e-10) # spmat may already be dense after the 'rmfit' branch above
for rowcol in ['freqind', 'ij']:
rc_dset = fext['{0}/{1}/{2}/avg/{3}'.format(plane,qtytype,rowcol,p)]
if rc_dset[-1].size > 0:
rc_dset.resize(rc_dset.size+1, axis=0)
if rowcol == 'freqind':
rc_dset[-1] = NP.copy(sprow)
else:
rc_dset[-1] = NP.copy(spcol)
for reim in reim_list:
avg_dset = fext['{0}/{1}/{2}/avg/{3}/{4}'.format(plane,qtytype,subqty,p,reim)]
if avg_dset[-1].size > 0:
avg_dset.resize(avg_dset.size+1, axis=0)
if reim == 'real':
avg_dset[-1] = NP.copy(NP.asarray(spmat[sprow,spcol]).real.ravel()) # NP.asarray() handles both sparse and dense spmat
else:
avg_dset[-1] = NP.copy(NP.asarray(spmat[sprow,spcol]).imag.ravel())
plane = 'image-plane'
tdset = fext['{0}/avg/timestamps'.format(plane)]
if tdset[-1].size > 0:
tdset.resize(tdset.size+1, axis=0)
tdset[-1] = fext['{0}/{1}/timestamps'.format(plane,datapool)].value
qtytypes = ['psf', 'image']
for qtytype in qtytypes:
for p in pol:
if autocorr_op.lower() == 'none':
dset = fext['{0}/{1}/{2}/{3}'.format(plane,qtytype,datapool,p)]
if datapool == 'stack':
qty_fml = NP.mean(dset.value, axis=0) # Average across time
else:
qty_fml = dset.value / fext['twts/{0}'.format(p)].value[0] # Average across time
else:
for lm in ['l', 'm']:
dset = fext['{0}/{1}_ind'.format(plane, lm)]
if lm == 'l':
l_ind = dset.value
else:
m_ind = dset.value
fml_ind = NP.ix_(NP.arange(self.f.size), m_ind, l_ind) # m (row) first
shape2D = fext['aperture-plane/xcorr/shape2D/{0}/{1}'.format(datapool,p)].value
shape3D = fext['aperture-plane/xcorr/shape3D/{0}/{1}'.format(datapool,p)].value
for rowcol in ['freqind', 'ij']:
dset = fext['aperture-plane/xcorr/{0}/avg/{1}'.format(rowcol,p)]
if rowcol == 'freqind':
sprow = dset[-1]
else:
spcol = dset[-1]
if qtytype == 'psf':
apqty = 'wts'
else:
apqty = 'vals'
for reim in reim_list:
dset = fext['aperture-plane/xcorr/{0}/avg/{1}/{2}'.format(apqty,p,reim)]
if reim == 'real':
spmat = SpM.csc_matrix((dset[-1], (sprow, spcol)), shape=shape2D, dtype=NP.complex128)
else:
spmat += 1j * SpM.csc_matrix((dset[-1], (sprow, spcol)), shape=shape2D, dtype=NP.complex128)
mat = spmat.A.reshape(shape3D)
if apqty == 'wts':
sum_wts = NP.sum(mat, axis=(1,2), keepdims=True)
qty_fml = NP.fft.fftshift(NP.fft.fft2(NP.fft.ifftshift(mat, axes=(1,2)), axes=(1,2)), axes=(1,2)) / sum_wts
if NP.abs(qty_fml.imag).max() > 1e-10:
raise ValueError('Significant imaginary component found in the image-plane quantity.')
qty_fml = qty_fml[fml_ind].real
dset = fext['{0}/{1}/avg/{2}'.format(plane,qtytype,p)]
if NP.any(NP.isnan(dset.value)):
if NP.sum(NP.isnan(dset.value)) != dset.size:
raise ValueError('Inconsistent number of NaN found')
else:
dset.resize(dset.shape[0]+1, axis=0)
dset[-1] = qty_fml
############################################################################
def reset_extfile(self, datapool=None):
"""
------------------------------------------------------------------------
Reset/initialize the extfile under specified datapool(s)
datapool
[None or string or list] Data pool which will be reset or
initialized in the external file. Accepted values are
'accumulate' and 'stack'. If set to None (default), both
'accumulate' and 'stack' datapools will be reset/initialized in
the external file
------------------------------------------------------------------------
"""
if datapool is None:
datapool = ['accumulate', 'stack']
elif isinstance(datapool, str):
if datapool not in ['accumulate', 'stack']:
raise ValueError('Value "{0}" in input datapool not accepted.'.format(datapool))
datapool = [datapool]
elif isinstance(datapool, list):
for item in datapool:
if not isinstance(item, str):
raise TypeError('Item in datapool must be a string')
if item not in ['accumulate', 'stack']:
raise ValueError('Value "{0}" in input datapool not accepted.'.format(item))
else:
raise TypeError('Input datapool has invalid type')
pol = ['P1', 'P2']
if self.extfile is not None:
with h5py.File(self.extfile, 'a') as fext:
planes = ['image-plane', 'aperture-plane']
reim_list = ['real', 'imag']
for p in pol:
try:
dset = fext['twts/{0}'.format(p)]
dset[...] = NP.zeros(1)
except KeyError:
pass
for plane in planes:
if plane == 'image-plane':
qtytypes = ['image', 'psf']
for arraytype in datapool:
tdset = fext['{0}/{1}/timestamps'.format(plane,arraytype)]
tdset.resize(0, axis=0)
else:
qtytypes = ['xcorr', 'acorr']
subqtytypes = ['vals', 'wts']
for qtytype in qtytypes:
for arraytype in datapool:
tdset = fext['{0}/{1}/{2}/timestamps'.format(plane,qtytype,arraytype)]
tdset.resize(0, axis=0)
for qtytype in qtytypes:
for arraytype in datapool:
for p in pol:
if plane == 'image-plane':
try:
dset = fext['{0}/{1}/{2}/{3}'.format(plane,qtytype,arraytype,p)]
if arraytype == 'stack':
dset.resize(1, axis=0)
dset[-1] = NP.full((dset.shape[1], dset.shape[2], dset.shape[3]), NP.nan)
elif arraytype == 'accumulate':
dset[...] = NP.full(dset.shape, 0.0)
except KeyError:
pass
else:
for rowcol in ['freqind', 'ij']:
try:
dset = fext['{0}/{1}/{2}/{3}/{4}'.format(plane,qtytype,rowcol,arraytype,p)]
dset.resize(1, axis=0)
dset[-1] = NP.asarray([])
except KeyError:
pass
for subqty in subqtytypes:
for reim in reim_list:
try:
dset = fext['{0}/{1}/{2}/{3}/{4}/{5}'.format(plane,qtytype,subqty,arraytype,p,reim)]
dset.resize(1, axis=0)
dset[-1] = NP.asarray([])
except KeyError:
pass
############################################################################
def accumulate(self, tbinsize=None, verbose=True):
"""
------------------------------------------------------------------------
Accumulates and averages gridded quantities that are statistically
stationary such as images and visibilities
Input:
tbinsize [scalar or dictionary] Contains bin size of timestamps while
averaging. Default = None means gridded quantities over all
timestamps are averaged. If scalar, the same (positive) value
applies to all polarizations. If dictionary, timestamp bin size
(positive) is provided under each polarization key, namely,
'P1', 'P2' in case of MOFF or 'P11', 'P12', 'P21', 'P22' in
case of FX. If any of the keys is missing, the gridded
quantities for that polarization are averaged over all
timestamps.
verbose [boolean] If True (default), prints diagnostic and progress
messages. If False, suppress printing such messages.
------------------------------------------------------------------------
"""
if self.measured_type == 'E-field':
pol = ['P1', 'P2']
else:
pol = ['P11', 'P12', 'P21', 'P22']
timestamps = NP.asarray(self.timestamps).astype(NP.float)
twts = {}
img_acc = {}
beam_acc = {}
grid_vis_acc = {}
grid_illumination_acc = {}
for p in pol:
img_acc[p] = None
beam_acc[p] = None
grid_vis_acc[p] = None
grid_illumination_acc[p] = None
twts[p] = []
if tbinsize is None: # Average across all timestamps
for p in pol:
if self.img_stack[p] is not None:
img_acc[p] = NP.nansum(self.img_stack[p], axis=0, keepdims=True)
beam_acc[p] = NP.nansum(self.beam_stack[p], axis=0, keepdims=True)
grid_vis_acc[p] = NP.nansum(self.grid_vis_stack[p], axis=0, keepdims=True)
grid_illumination_acc[p] = NP.nansum(self.grid_illumination_stack[p], axis=0, keepdims=True)
twts[p] = NP.asarray(len(self.timestamps)).reshape(-1,1,1,1)
self.tbinsize = tbinsize
elif isinstance(tbinsize, (int, float)): # Apply same time bin size to all polarizations
eps = 1e-10
tbins = NP.arange(timestamps.min(), timestamps.max(), tbinsize)
tbins = NP.append(tbins, timestamps.max()+eps)
for p in pol:
counts, tbin_edges, tbinnum, ri = OPS.binned_statistic(timestamps, statistic='count', bins=tbins)
for binnum in range(counts.size):
ind = ri[ri[binnum]:ri[binnum+1]]
twts[p] += [counts]
if img_acc[p] is None:
if self.img_stack[p] is not None:
img_acc[p] = NP.nansum(self.img_stack[p][ind,:,:,:], axis=0, keepdims=True)
beam_acc[p] = NP.nansum(self.beam_stack[p][ind,:,:,:], axis=0, keepdims=True)
grid_vis_acc[p] = NP.nansum(self.grid_vis_stack[p][ind,:,:,:], axis=0, keepdims=True)
grid_illumination_acc[p] = NP.nansum(self.grid_illumination_stack[p][ind,:,:,:], axis=0, keepdims=True)
else:
if self.img_stack[p] is not None:
img_acc[p] = NP.vstack((img_acc[p], NP.nansum(self.img_stack[p][ind,:,:,:], axis=0, keepdims=True)))
beam_acc[p] = NP.vstack((beam_acc[p], NP.nansum(self.beam_stack[p][ind,:,:,:], axis=0, keepdims=True)))
grid_vis_acc[p] = NP.vstack((grid_vis_acc[p], NP.nansum(self.grid_vis_stack[p][ind,:,:,:], axis=0, keepdims=True)))
grid_illumination_acc[p] = NP.vstack((grid_illumination_acc[p], NP.nansum(self.grid_illumination_stack[p][ind,:,:,:], axis=0, keepdims=True)))
twts[p] = NP.asarray(twts[p]).astype(NP.float).reshape(-1,1,1,1)
self.tbinsize = tbinsize
elif isinstance(tbinsize, dict): # Apply different time binsizes to corresponding polarizations
tbsize = {}
for p in pol:
if p not in tbinsize:
if self.img_stack[p] is not None:
img_acc[p] = NP.nansum(self.img_stack[p], axis=0, keepdims=True)
beam_acc[p] = NP.nansum(self.beam_stack[p], axis=0, keepdims=True)
grid_vis_acc[p] = NP.nansum(self.grid_vis_stack[p], axis=0, keepdims=True)
grid_illumination_acc[p] = NP.nansum(self.grid_illumination_stack[p], axis=0, keepdims=True)
twts[p] = NP.asarray(len(self.timestamps)).reshape(-1,1,1,1)
tbsize[p] = None
elif isinstance(tbinsize[p], (int,float)):
eps = 1e-10
tbins = NP.arange(timestamps.min(), timestamps.max(), tbinsize[p])
tbins = NP.append(tbins, timestamps.max()+eps)
counts, tbin_edges, tbinnum, ri = OPS.binned_statistic(timestamps, statistic='count', bins=tbins)
for binnum in range(counts.size):
ind = ri[ri[binnum]:ri[binnum+1]]
twts[p] += [counts]
if img_acc[p] is None:
if self.img_stack[p] is not None:
img_acc[p] = NP.nansum(self.img_stack[p][ind,:,:,:], axis=0, keepdims=True)
beam_acc[p] = NP.nansum(self.beam_stack[p][ind,:,:,:], axis=0, keepdims=True)
grid_vis_acc[p] = NP.nansum(self.grid_vis_stack[p][ind,:,:,:], axis=0, keepdims=True)
grid_illumination_acc[p] = NP.nansum(self.grid_illumination_stack[p][ind,:,:,:], axis=0, keepdims=True)
else:
if self.img_stack[p] is not None:
img_acc[p] = NP.vstack((img_acc[p], NP.nansum(self.img_stack[p][ind,:,:,:], axis=0, keepdims=True)))
beam_acc[p] = NP.vstack((beam_acc[p], NP.nansum(self.beam_stack[p][ind,:,:,:], axis=0, keepdims=True)))
grid_vis_acc[p] = NP.vstack((grid_vis_acc[p], NP.nansum(self.grid_vis_stack[p][ind,:,:,:], axis=0, keepdims=True)))
grid_illumination_acc[p] = NP.vstack((grid_illumination_acc[p], NP.nansum(self.grid_illumination_stack[p][ind,:,:,:], axis=0, keepdims=True)))
twts[p] = NP.asarray(twts[p]).astype(NP.float).reshape(-1,1,1,1)
tbsize[p] = tbinsize[p]
else:
if self.img_stack[p] is not None:
img_acc[p] = NP.nansum(self.img_stack[p], axis=0, keepdims=True)
beam_acc[p] = NP.nansum(self.beam_stack[p], axis=0, keepdims=True)
grid_vis_acc[p] = NP.nansum(self.grid_vis_stack[p], axis=0, keepdims=True)
grid_illumination_acc[p] = NP.nansum(self.grid_illumination_stack[p], axis=0, keepdims=True)
twts[p] = NP.asarray(len(self.timestamps)).reshape(-1,1,1,1)
tbsize[p] = None
self.tbinsize = tbsize
# Compute the averaged grid quantities from the accumulated versions
for p in pol:
if img_acc[p] is not None:
self.img_avg[p] = img_acc[p] / twts[p]
self.beam_avg[p] = beam_acc[p] / twts[p]
self.grid_vis_avg[p] = grid_vis_acc[p] / twts[p]
self.grid_illumination_avg[p] = grid_illumination_acc[p] / twts[p]
self.twts = twts
############################################################################
def evalAutoCorr(self, pol=None, datapool='avg', forceeval_autowts=False,
forceeval_autocorr=True, nproc=None, save=True,
verbose=True):
"""
------------------------------------------------------------------------
Evaluate sum of auto-correlations of all antenna weights on the
UV-plane.
Inputs:
pol [string] indicates which polarization information to be saved.
Allowed values are 'P1', 'P2' in case of MOFF. If None,
information on all polarizations appropriate for MOFF are
evaluated
datapool [string] Specifies whether the E-fields used in determining
the auto-correlation come from 'stack', 'current', or 'avg'
(default). Squared electric fields will be used if set to
'current' or 'stack', and averaged squared electric fields
if set to 'avg'
forceeval_autowts
[boolean] When set to False (default) the auto-correlation
weights in the UV plane is not evaluated if it was already
evaluated earlier. If set to True, it will be forcibly
evaluated independent of whether they were already evaluated
or not
forceeval_autocorr
[boolean] When set to False (default) the auto-correlation
data in the UV plane is not evaluated if it was already
evaluated earlier. If set to True, it will be forcibly
evaluated independent of whether they were already evaluated
or not
nproc [integer] specifies number of independent processes to spawn.
Default = None means the number of processor cores in the
system is determined automatically and one less than that
is used so as not to lock up the system for other
processes. If nproc is set to a value larger than the
number of processor cores in the system, it will be reset
to the number of processor cores minus one for the same
reason
save [boolean] If True (default), save the autocorrelation weights
and data if an external file exists. It only applies when
datapool='avg', otherwise it does not save to external file.
verbose [boolean] When set to True (default), print diagnostic
messages, otherwise suppress messages
------------------------------------------------------------------------
"""
if pol is None:
pol = ['P1', 'P2']
elif isinstance(pol, str):
pol = [pol]
elif isinstance(pol, list):
p = [item for item in pol if item in ['P1', 'P2']]
pol = p
else:
raise TypeError('Input pol must be a string or list specifying polarization(s)')
if not isinstance(forceeval_autowts, bool):
raise TypeError('Input forceeval_autowts must be boolean')
if not isinstance(forceeval_autocorr, bool):
raise TypeError('Input forceeval_autocorr must be boolean')
if not isinstance(save, bool):
raise TypeError('Input save must be boolean')
if forceeval_autowts or forceeval_autocorr or (not self.autocorr_set):
self.autocorr_wts_vuf, self.autocorr_data_vuf = self.antenna_array.makeAutoCorrCube(pol=pol, datapool=datapool, tbinsize=self.tbinsize, forceeval_autowts=forceeval_autowts, forceeval_autocorr=forceeval_autocorr, nproc=nproc)
self.autocorr_set = True
if verbose:
print 'Determined auto-correlation weights and data...'
if save:
if datapool == 'avg':
if self.extfile is not None:
with h5py.File(self.extfile, 'a') as fext:
planes = ['aperture-plane']
arraytypes = ['avg']
reim_list = ['real', 'imag']
for plane in planes:
if plane == 'aperture-plane':
qtytypes = ['acorr']
subqtytypes = ['wts', 'vals']
for qtytype in qtytypes:
for arraytype in arraytypes:
if arraytype == 'avg':
tdset = fext['{0}/{1}/{2}/timestamps'.format(plane,qtytype,arraytype)]
if (tdset.size == 1) and (tdset[-1].size == 0):
tdset[-1] = NP.asarray(self.timestamps)
else:
prev_max_tstamp = tdset[-1].max()
if (len(self.timestamps)>tdset[-1].size) or (max(self.timestamps)>tdset[-1].max()):
tstamps = NP.asarray(self.timestamps)
nearest_ind = NP.argmin(NP.abs(tstamps - tdset[-1].max()))
new_tstamps = tstamps[nearest_ind+1:]
tdset.resize(tdset.size+1, axis=0)
tdset[-1] = NP.copy(new_tstamps)
for p in pol:
if plane == 'aperture-plane':
wts_vuf = NP.rollaxis(NP.squeeze(self.autocorr_wts_vuf[p]), 2, start=0)
acorr_shape_3D = wts_vuf.shape
wts_vuf = wts_vuf.reshape(wts_vuf.shape[0], -1)
acorr_shape_2D = wts_vuf.shape
sprow, spcol = NP.where(NP.abs(wts_vuf) > 1e-10)
vis_vuf = NP.rollaxis(NP.squeeze(self.autocorr_data_vuf[p]), 2,start=0)
vis_vuf = vis_vuf.reshape(vis_vuf.shape[0], -1)
if '{0}/{1}/shape2D/{2}/{3}'.format(plane,qtytype,arraytype,p) not in fext:
dset = fext.create_dataset('{0}/{1}/shape2D/{2}/{3}'.format(plane,qtytype,arraytype,p), data=NP.asarray(acorr_shape_2D))
if '{0}/{1}/shape3D/{2}/{3}'.format(plane,qtytype,arraytype,p) not in fext:
dset = fext.create_dataset('{0}/{1}/shape3D/{2}/{3}'.format(plane,qtytype,arraytype,p), data=NP.asarray(acorr_shape_3D))
for rowcol in ['freqind', 'ij']:
dset = fext['{0}/{1}/{2}/{3}/{4}'.format(plane,qtytype,rowcol,arraytype,p)]
if dset[-1].size > 0:
dset.resize(dset.shape[0]+1, axis=0)
if rowcol == 'freqind':
dset[-1] = NP.copy(sprow)
else:
dset[-1] = NP.copy(spcol)
for subqty in subqtytypes:
for reim in ['real', 'imag']:
dset = fext['{0}/{1}/{2}/{3}/{4}/{5}'.format(plane,qtytype,subqty,arraytype,p,reim)]
if dset[-1].size > 0:
dset.resize(dset.shape[0]+1, axis=0)
if subqty == 'wts':
if reim == 'real':
dset[-1] = NP.copy(wts_vuf[sprow,spcol].real)
else:
dset[-1] = NP.copy(wts_vuf[sprow,spcol].imag)
else:
if reim == 'real':
dset[-1] = NP.copy(vis_vuf[sprow,spcol].real)
else:
dset[-1] = NP.copy(vis_vuf[sprow,spcol].imag)
############################################################################
def evalPowerPattern(self, pad=0, skypos=None, datapool='avg'):
"""
------------------------------------------------------------------------
Evaluate power pattern for the antenna from its zero-centered
cross-correlated footprint
Input:
datapool
[string] Specifies whether weights to be used in determining
the power pattern come from 'stack', 'current', or 'avg'
(default).
skypos [numpy array] Positions on sky at which power pattern is
to be esimated. It is a 2- or 3-column numpy array in
direction cosine coordinates. It must be of size nsrc x 2
or nsrc x 3. If set to None (default), the power pattern is
estimated over a grid on the sky. If a numpy array is
specified, then power pattern at the given locations is
estimated.
pad [integer] indicates the amount of padding before estimating
power pattern image. Applicable only when attribute
measured_type is set to 'E-field' (MOFF imaging). The output
image of the power pattern will be of size 2**pad-1 times the
size of the antenna array grid along u- and v-axes. Value must
not be negative. Default=0 (implies no padding). pad=1 implies
padding by factor 2 along u- and v-axes for MOFF
Outputs:
pbinfo is a dictionary with the following keys and values:
'pb' [dictionary] Dictionary with keys 'P1' and 'P2' for
polarization. Under each key is a numpy array of estimated
power patterns. If skypos was set to None, the numpy array is
3D masked array of size nm x nl x nchan. The mask is based on
which parts of the grid are valid direction cosine coordinates
on the sky. If skypos was a numpy array denoting specific sky
locations, the value in this key is a 2D numpy array of size
nsrc x nchan
'llocs' [None or numpy array] If the power pattern estimated is a grid
(if input skypos was set to None), it contains the l-locations
of the grid on the sky. If input skypos was not set to None,
the value under this key is set to None
'mlocs' [None or numpy array] If the power pattern estimated is a grid
(if input skypos was set to None), it contains the m-locations
of the grid on the sky. If input skypos was not set to None,
the value under this key is set to None
------------------------------------------------------------------------
"""
if not isinstance(pad, int):
raise TypeError('Input keyword pad must be an integer')
if datapool not in ['current', 'stack', 'avg']:
raise ValueError('Invalid value specified for input datapool')
self.antenna_array.evalAllAntennaPairCorrWts()
centered_crosscorr_wts_vuf = self.antenna_array.makeCrossCorrWtsCube()
du = self.antenna_array.gridu[0,1] - self.antenna_array.gridu[0,0]
dv = self.antenna_array.gridv[1,0] - self.antenna_array.gridv[0,0]
ulocs = du*(NP.arange(2*self.antenna_array.gridu.shape[1])-self.antenna_array.gridu.shape[1])
vlocs = dv*(NP.arange(2*self.antenna_array.gridv.shape[0])-self.antenna_array.gridv.shape[0])
pol = ['P1', 'P2']
pbinfo = {'pb': {}}
for p in pol:
pb = evalApertureResponse(centered_crosscorr_wts_vuf[p], ulocs, vlocs, pad=pad, skypos=skypos)
pbinfo['pb'][p] = pb['pb']
pbinfo['llocs'] = pb['llocs']
pbinfo['mlocs'] = pb['mlocs']
return pbinfo
############################################################################
def removeAutoCorr(self, lkpinfo=None, forceeval=False, datapool='avg',
pad=0):
"""
------------------------------------------------------------------------
Remove auto-correlation of single antenna weights with itself from the
UV-plane.
Inputs:
lkpinfo [dictionary] consists of weights information for each of
the polarizations under polarization keys. Each of
the values under the keys is a string containing the full
path to a filename that contains the positions and
weights for the aperture illumination in the form of
a lookup table as columns (x-loc [float], y-loc
[float], wts[real], wts[imag if any]). In this case, the
lookup is for auto-correlation of antenna weights. It only
applies when the antenna aperture class is set to
lookup-based kernel estimation instead of a functional form
forceeval [boolean] When set to False (default) the auto-correlation in
the UV plane is not evaluated if it was already evaluated
earlier. If set to True, it will be forcibly evaluated
independent of whether they were already evaluated or not
datapool [string] When set to 'avg' (or None) (default),
auto-correlations from antennas (zero-spacing with a width)
are removed from the averaged data set. If set to 'current',
the latest timestamp is used in subtracting the zero-spacing
visibilities information
pad [integer] indicates the amount of padding before imaging.
Applicable only when attribute measured_type is set to
'E-field' (MOFF imaging). The output image will be of size
2**pad-1 times the size of the antenna array grid along u-
and v-axes. Value must not be negative. Default=0 (implies no
padding of the auto-correlated footprint). pad=1 implies
padding by factor 2 along u- and v-axes for MOFF
------------------------------------------------------------------------
"""
if self.measured_type == 'E-field':
if forceeval or (not self.autocorr_removed):
if datapool is None: datapool = 'avg'
if isinstance(datapool, str):
if datapool not in ['avg', 'current']:
raise ValueError('Input keyword datapool must be set to "avg" or "current"')
else:
raise TypeError('Input keyword datapool must be a string')
if forceeval or (not self.autocorr_set):
self.evalAutoCorr(forceeval_autowts=forceeval, forceeval_autocorr=forceeval)
# self.evalAutoCorr(lkpinfo=lkpinfo, forceeval=forceeval)
autocorr_wts_vuf = copy.deepcopy(self.autocorr_wts_vuf)
autocorr_data_vuf = copy.deepcopy(self.autocorr_data_vuf)
pol = ['P1', 'P2']
for p in pol:
if datapool == 'avg':
if self.grid_illumination_avg[p] is not None:
vis_vuf = NP.copy(self.grid_vis_avg[p])
wts_vuf = NP.copy(self.grid_illumination_avg[p])
# autocorr_wts_vuf[p] = autocorr_wts_vuf[p][NP.newaxis,:,:,:]
vis_vuf = vis_vuf - (vis_vuf[:,self.gridv.shape[0],self.gridu.shape[1],:][:,NP.newaxis,NP.newaxis,:] / autocorr_data_vuf[p][:,self.gridv.shape[0],self.gridu.shape[1],:][:,NP.newaxis,NP.newaxis,:]) * autocorr_data_vuf[p]
wts_vuf = wts_vuf - (wts_vuf[:,self.gridv.shape[0],self.gridu.shape[1],:][:,NP.newaxis,NP.newaxis,:] / autocorr_wts_vuf[p][:,self.gridv.shape[0],self.gridu.shape[1],:][:,NP.newaxis,NP.newaxis,:]) * autocorr_wts_vuf[p]
sum_wts = NP.sum(wts_vuf, axis=(1,2), keepdims=True)
padded_wts_vuf = NP.pad(wts_vuf, ((0,0),((2**pad-1)*self.gridv.shape[0],(2**pad-1)*self.gridv.shape[0]),((2**pad-1)*self.gridu.shape[1],(2**pad-1)*self.gridu.shape[1]),(0,0)), mode='constant', constant_values=0)
padded_wts_vuf = NP.fft.ifftshift(padded_wts_vuf, axes=(1,2))
wts_lmf = NP.fft.fft2(padded_wts_vuf, axes=(1,2)) / sum_wts
if NP.abs(wts_lmf.imag).max() > 1e-10:
raise ValueError('Significant imaginary component found in the synthesized beam.')
self.nzsp_beam_avg[p] = NP.fft.fftshift(wts_lmf.real, axes=(1,2))
padded_vis_vuf = NP.pad(vis_vuf, ((0,0),((2**pad-1)*self.gridv.shape[0],(2**pad-1)*self.gridv.shape[0]),((2**pad-1)*self.gridu.shape[1],(2**pad-1)*self.gridu.shape[1]),(0,0)), mode='constant', constant_values=0)
padded_vis_vuf = NP.fft.ifftshift(padded_vis_vuf, axes=(1,2))
vis_lmf = NP.fft.fft2(padded_vis_vuf, axes=(1,2)) / sum_wts
if NP.abs(vis_lmf.imag).max() > 1e-10:
raise ValueError('Significant imaginary component found in the synthesized dirty image.')
self.nzsp_img_avg[p] = NP.fft.fftshift(vis_lmf.real, axes=(1,2))
self.nzsp_grid_vis_avg[p] = vis_vuf
self.nzsp_grid_illumination_avg[p] = wts_vuf
else:
if self.wts_vuf[p] is not None:
vis_vuf = NP.copy(self.vis_vuf[p])
wts_vuf = NP.copy(self.wts_vuf[p])
vis_vuf = vis_vuf - (vis_vuf[self.gridv.shape[0],self.gridu.shape[1],:].reshape(1,1,self.f.size) / autocorr_data_vuf[p][self.gridv.shape[0],self.gridu.shape[1],:].reshape(1,1,self.f.size)) * autocorr_data_vuf[p]
wts_vuf = wts_vuf - (wts_vuf[self.gridv.shape[0],self.gridu.shape[1],:].reshape(1,1,self.f.size) / autocorr_wts_vuf[p][self.gridv.shape[0],self.gridu.shape[1],:].reshape(1,1,self.f.size)) * autocorr_wts_vuf[p]
sum_wts = NP.sum(wts_vuf, axis=(0,1), keepdims=True)
padded_wts_vuf = NP.pad(wts_vuf, (((2**pad-1)*self.gridv.shape[0],(2**pad-1)*self.gridv.shape[0]),((2**pad-1)*self.gridu.shape[1],(2**pad-1)*self.gridu.shape[1]),(0,0)), mode='constant', constant_values=0)
padded_wts_vuf = NP.fft.ifftshift(padded_wts_vuf, axes=(0,1))
wts_lmf = NP.fft.fft2(padded_wts_vuf, axes=(0,1)) / sum_wts
if NP.abs(wts_lmf.imag).max() > 1e-10:
raise ValueError('Significant imaginary component found in the synthesized beam.')
self.nzsp_beam[p] = NP.fft.fftshift(wts_lmf.real, axes=(0,1))
padded_vis_vuf = NP.pad(vis_vuf, (((2**pad-1)*self.gridv.shape[0],(2**pad-1)*self.gridv.shape[0]),((2**pad-1)*self.gridu.shape[1],(2**pad-1)*self.gridu.shape[1]),(0,0)), mode='constant', constant_values=0)
padded_vis_vuf = NP.fft.ifftshift(padded_vis_vuf, axes=(0,1))
vis_lmf = NP.fft.fft2(padded_vis_vuf, axes=(0,1)) / sum_wts
if NP.abs(vis_lmf.imag).max() > 1e-10:
raise ValueError('Significant imaginary component found in the synthesized dirty image.')
self.nzsp_img[p] = NP.fft.fftshift(vis_lmf.real, axes=(0,1))
self.nzsp_wts_vuf[p] = wts_vuf
self.nzsp_vis_vuf[p] = vis_vuf
self.autocorr_removed = True
else:
print 'Antenna auto-correlations have been removed already'
############################################################################
def getStats(self, box_type='square', box_center=None, box_size=None,
rms_box_scale_factor=10.0, coords='physical', datapool='avg'):
"""
------------------------------------------------------------------------
Get statistics from images from inside specified boxes
NEEDS FURTHER DEVELOPMENT !!!
Inputs:
box_type [string] Shape of box. Accepted values are 'square'
(default) and 'circle' on the celestial plane. In 3D the
the box will be a cube or cylinder.
box_center [list] Center locations of boxes specified as a list one for
each box. The centers will have units as specified in input
coords. Each element must be another list, tuple or numpy
array of two or three elements. The first element refers to
the x-coordinate of the box center, the second refers to
y-coordinate of the box center. The third element (optional)
refers to the center of frequency around which the 3D box
must be placed. If third element is not specified, it will
be assumed to be center of the band. If coords is set to
'physical', these three elements will have units of dircos,
dircos and frequency (Hz). If coords is set to 'index',
these three elements must be indices of the three axes.
box_size [list] Sizes of boxes specified as a list one for each box.
Number of elements in this list will be equal to that in
input box_center. They will have 'physical' (dircos,
frequency in Hz) or 'index' units as specified in the input
coords. Each element in the list is a one- or two-element
list, tuple or numpy array. The first element is size of the
box in the celestial plane (size of square if box_type is set
to 'square', diameter of circle if box_type is set to
'circle'). The second element (optional) is size along
frequency axis. If second element is not specified, it will
be assumed to be the entire band.
rms_box_scale_factor
[scalar] Size scale on celestial plane used to determine
the box to determine the rms statistic. Must be positive.
For instance, the box size used to find the rms will use a
box that is rms_box_scale_factor times the box size on each
side used for determining the peak. Default = 10.0
coords [string] String specifying coordinates of box_center and
box_size. If set to 'physical' (default) the box_center
will have units of [dircos, dircos, frequency in Hz
(optional)] and box_size will have units of [dircos,
frequency in Hz (optional)]. If set to 'index', box_center
will have units of [index, index, index (optional)] and
box_size will have units of [number of pixels, number of
frequency channels].
datapool [string] String specifying type of image on which the
statistics will be estimated. Accepted values are 'avg'
(default), 'stack' and 'current'. These represent
time-averaged, stacked and recent images respectively
Outputs:
outstats [list] List of dictionaries one for each element in input
box_center. Each dictionary consists of the following keys
'P1' and 'P2' for the two polarizations. Under each of
these keys is another dictionary with the following keys and
values:
'peak-spectrum'
[list of numpy arrays] List of Numpy arrays
with peak value in each frequency channel. This array
is of size nchan. Length of the list is equal to the
number of timestamps as determined by input datapool.
If input datapool is set to 'current', the list will
contain one numpy array of size nchan. If datapool is
set to 'avg' or 'stack', the list will contain n_t
number of numpy arrays one for each processed
timestamp
'peak-avg'
[list] Average of each numpy array in the list under
key 'peak-spectrum'. It will have n_t elements where
n_t is the number of timestamps as determined by
input datapool
'nn-spectrum'
[list] Frequency spectrum of the nearest neighbour
pixel relative to the box center.
'nn-avg'
[list] Average of each spectrum in the list under
key 'nn-spectrum', one for each timestamp as
determined by input datapool
'mad' [list] Median Absolute Deviation(s) in
the box determined by input rms_box_scale_factor.
If input datapool is set to 'current', it will be a
one-element list, but if set to 'avg' or 'stack', it
will be a list one for each timestamp in the image
------------------------------------------------------------------------
"""
if box_type not in ['square', 'circle']:
raise ValueError('Input box_type must be specified as "square" or "circle"')
if box_center is None:
raise ValueError('Input box_center must be specified')
if box_size is None:
raise ValueError('Input box_size must be specified')
if coords not in ['physical', 'index']:
raise ValueError('Input coords must be specified as "physical" or "index"')
if datapool not in ['avg', 'current', 'stack']:
raise ValueError('Input datapool must be specified as "avg", "current" or "stack"')
if not isinstance(box_center, list):
raise TypeError('Input box_center must be a list')
if not isinstance(box_size, list):
raise TypeError('Input box_size must be a list')
if len(box_center) != len(box_size):
raise ValueError('Lengths of box_center and box_size must be equal')
if isinstance(rms_box_scale_factor, (int,float)):
rms_box_scale_factor = float(rms_box_scale_factor)
if rms_box_scale_factor <= 0.0:
raise ValueError('Input rms_box_scale_factor must be positive')
else:
raise TypeError('Input rms_box_scale_factor must be a scalar')
bandwidth = (self.f[1] - self.f[0]) * self.f.size
lfgrid = self.gridl[:,:,NP.newaxis] * NP.ones(self.f.size).reshape(1,1,-1) # nm x nl x nchan
mfgrid = self.gridm[:,:,NP.newaxis] * NP.ones(self.f.size).reshape(1,1,-1) # nm x nl x nchan
fgrid = NP.ones_like(self.gridl)[:,:,NP.newaxis] * self.f.reshape(1,1,-1) # nm x nl x nchan
outstats = []
for i in xrange(len(box_center)):
stats = {}
bc = NP.asarray(box_center[i]).reshape(-1)
bs = NP.asarray(box_size[i]).reshape(-1)
if (bc.size < 2) or (bc.size > 3):
raise ValueError('Each box center must have two or three elements')
if (bs.size < 1) or (bs.size > 2):
raise ValueError('Each box size must have one or two elements')
if bc.size == 2:
if coords == 'physical':
bc = NP.hstack((bc, NP.mean(self.f)))
else:
bc = NP.hstack((bc, self.f.size/2))
if bs.size == 1:
if coords == 'physical':
bs = NP.hstack((bs, bandwidth))
else:
bs = NP.hstack((bs, self.f.size))
if coords == 'physical':
if NP.sum(bc[:2]**2) > 1.0:
raise ValueError('Invalid direction cosines specified')
if (bc[2] < self.f.min()) or (bc[2] > self.f.max()):
raise ValueError('Invalid frequency specified in input box_center')
else:
if (bc[0] < 0) or (bc[1] < 0) or (bc[0] > self.gridl.shape[1]) or (bc[1] > self.gridl.shape[0]):
raise ValueError('Invalid box center specified')
if bc[2] > self.f.size:
bc[2] = self.f.size
if coords == 'physical':
nn_ind2d = NP.argmin(NP.abs((lfgrid[:,:,0] - bc[0])**2 + (mfgrid[:,:,0] - bc[1])**2))
unraveled_nn_ind2d = NP.unravel_index(nn_ind2d, self.gridl.shape)
unraveled_nn_ind3d = (NP.asarray([unraveled_nn_ind2d[0]]*self.f.size), NP.asarray([unraveled_nn_ind2d[1]]*self.f.size), NP.arange(self.f.size))
if box_type == 'square':
ind3d = (NP.abs(lfgrid - bc[0]) <= 0.5*bs[0]) & (NP.abs(mfgrid - bc[1]) <= 0.5*bs[0]) & (NP.abs(fgrid - bc[2]) <= 0.5*bs[1])
ind3d_rmsbox = (NP.abs(lfgrid - bc[0]) <= 0.5*rms_box_scale_factor*bs[0]) & (NP.abs(mfgrid - bc[1]) <= 0.5*rms_box_scale_factor*bs[0]) & (NP.abs(fgrid - bc[2]) <= 0.5*bs[1])
else:
ind3d = (NP.sqrt((lfgrid - bc[0])**2 + (mfgrid - bc[1])**2) <= 0.5*bs[0]) & (NP.abs(fgrid - bc[2]) <= 0.5*bs[1])
ind3d_rmsbox = (NP.sqrt((lfgrid - bc[0])**2 + (mfgrid - bc[1])**2) <= 0.5*rms_box_scale_factor*bs[0]) & (NP.abs(fgrid - bc[2]) <= 0.5*bs[1])
msk = NP.logical_not(ind3d)
msk_rms = NP.logical_not(ind3d_rmsbox)
for apol in ['P1', 'P2']:
stats[apol] = {'peak-spectrum': [], 'peak-avg': [], 'mad': [], 'nn-spectrum': [], 'nn-avg': []}
if datapool == 'current':
if self.nzsp_img[apol] is not None:
img_masked = MA.array(self.nzsp_img[apol], mask=msk)
stats[apol]['peak-spectrum'] += [NP.amax(NP.abs(img_masked), axis=(0,1))]
stats[apol]['peak-avg'] += [NP.mean(stats[apol]['peak-spectrum'])]
stats[apol]['nn-spectrum'] += [NP.abs(img_masked[unraveled_nn_ind3d])]
stats[apol]['nn-avg'] += [NP.mean(stats[apol]['nn-spectrum'])]
img_masked = MA.array(self.nzsp_img[apol], mask=msk_rms)
mdn = NP.median(img_masked[~img_masked.mask])
absdev = NP.abs(img_masked - mdn)
stats[apol]['mad'] += [NP.median(absdev[~absdev.mask])]
else:
if datapool == 'avg':
if self.nzsp_img_avg[apol] is not None:
for ti in range(self.nzsp_img_avg[apol].shape[0]):
img_masked = MA.array(self.nzsp_img_avg[apol][ti,...], mask=msk)
stats[apol]['peak-spectrum'] += [NP.amax(NP.abs(img_masked), axis=(0,1))]
stats[apol]['peak-avg'] += [NP.mean(stats[apol]['peak-spectrum'][ti])]
stats[apol]['nn-spectrum'] += [NP.abs(img_masked[unraveled_nn_ind3d])]
stats[apol]['nn-avg'] += [NP.mean(stats[apol]['nn-spectrum'][ti])]
img_masked = MA.array(self.nzsp_img_avg[apol][ti,...], mask=msk_rms)
mdn = NP.median(img_masked[~img_masked.mask])
absdev = NP.abs(img_masked - mdn)
stats[apol]['mad'] += [NP.median(absdev[~absdev.mask])]
else:
if self.img_stack[apol] is not None:
for ti in range(self.img_stack[apol].shape[0]):
img_masked = MA.array(self.img_stack[apol][ti,...], mask=msk)
stats[apol]['peak-spectrum'] += [NP.amax(NP.abs(img_masked), axis=(0,1))]
stats[apol]['peak-avg'] += [NP.mean(stats[apol]['peak-spectrum'][ti])]
stats[apol]['nn-spectrum'] += [NP.abs(img_masked[unraveled_nn_ind3d])]
stats[apol]['nn-avg'] += [NP.mean(stats[apol]['nn-spectrum'][ti])]
img_masked = MA.array(self.img_stack[apol][ti,...], mask=msk_rms)
mdn = NP.median(img_masked[~img_masked.mask])
absdev = NP.abs(img_masked - mdn)
stats[apol]['mad'] += [NP.median(absdev[~absdev.mask])]
outstats += [stats]
else:
pass
return outstats
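# Illustrative usage sketch (not executed; `img` denotes a hypothetical
# instance of this class with images already computed):
#
#     boxstats = img.getStats(box_type='square',
#                             box_center=[[0.0, 0.0]], box_size=[[0.1]],
#                             coords='physical', datapool='current')
#     peak_spectrum = boxstats[0]['P1']['peak-spectrum'][0]  # size nchan
#     mad = boxstats[0]['P1']['mad'][0]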
############################################################################
def save(self, imgfile, pol=None, overwrite=False, verbose=True):
"""
------------------------------------------------------------------------
Saves the image information to disk.
Input:
imgfile [string] Image filename with full path. Will be appended
with '.fits' extension
Keyword Input(s):
pol [string] indicates which polarization information to be
saved. Allowed values are 'P1', 'P2' or None (default). If
None, information on both polarizations are saved.
overwrite [boolean] True indicates overwrite even if a file already
exists. Default = False (does not overwrite)
verbose [boolean] If True (default), prints diagnostic and progress
messages. If False, suppress printing such messages.
------------------------------------------------------------------------
"""
try:
imgfile
except NameError:
raise NameError('No filename provided. Aborting Image.save()')
filename = imgfile + '.fits'
if verbose:
print '\nSaving image information...'
hdulst = []
hdulst += [fits.PrimaryHDU()]
hdulst[0].header['f0'] = (self.f0, 'Center frequency (Hz)')
hdulst[0].header['tobs'] = (self.timestamp, 'Timestamp associated with observation.')
hdulst[0].header['EXTNAME'] = 'PRIMARY'
if verbose:
print '\tCreated a primary HDU.'
hdulst += [fits.ImageHDU(self.f, name='FREQ')]
if verbose:
print '\t\tCreated an extension HDU of {0:0d} frequency channels'.format(len(self.f))
if (pol is None) or (pol == 'P1'):
if verbose:
print '\tWorking on polarization P1...'
if self.lf_P1 is not None:
hdulst += [fits.ImageHDU(self.lf_P1, name='grid_lf_P1')]
if verbose:
print '\t\tCreated an extension HDU of l-coordinates of grid of size: {0[0]} \n\t\t\tfor each of the {0[1]} frequency channels'.format(self.lf_P1.shape)
if self.mf_P1 is not None:
hdulst += [fits.ImageHDU(self.mf_P1, name='grid_mf_P1')]
if verbose:
print '\t\tCreated an extension HDU of m-coordinates of grid of size: {0[0]} \n\t\t\tfor each of the {0[1]} frequency channels'.format(self.mf_P1.shape)
if self.holograph_PB_P1 is not None:
hdulst += [fits.ImageHDU(self.holograph_PB_P1.real, name='holograph_PB_P1_real')]
hdulst += [fits.ImageHDU(self.holograph_PB_P1.imag, name='holograph_PB_P1_imag')]
if verbose:
print "\t\tCreated separate extension HDUs of grid's voltage reception pattern spectra\n\t\t\twith size {0[0]}x{0[1]}x{0[2]} for real and imaginary parts.".format(self.holograph_PB_P1.shape)
if self.holograph_P1 is not None:
hdulst += [fits.ImageHDU(self.holograph_P1.real, name='holograph_P1_real')]
hdulst += [fits.ImageHDU(self.holograph_P1.imag, name='holograph_P1_imag')]
if verbose:
print "\t\tCreated separate extension HDUs of grid's voltage holograph spectra of \n\t\t\tsize {0[0]}x{0[1]}x{0[2]} for real and imaginary parts.".format(self.holograph_P1.shape)
if (pol is None) or (pol == 'P2'):
if verbose:
print '\tWorking on polarization P2...'
if self.lf_P2 is not None:
hdulst += [fits.ImageHDU(self.lf_P2, name='grid_lf_P2')]
if verbose:
print '\t\tCreated an extension HDU of l-coordinates of grid of size: {0[0]} \n\t\t\tfor each of the {0[1]} frequency channels'.format(self.lf_P2.shape)
if self.mf_P2 is not None:
hdulst += [fits.ImageHDU(self.mf_P2, name='grid_mf_P2')]
if verbose:
print '\t\tCreated an extension HDU of m-coordinates of grid of size: {0[0]} \n\t\t\tfor each of the {0[1]} frequency channels'.format(self.mf_P2.shape)
if self.holograph_PB_P2 is not None:
hdulst += [fits.ImageHDU(self.holograph_PB_P2.real, name='holograph_PB_P2_real')]
hdulst += [fits.ImageHDU(self.holograph_PB_P2.imag, name='holograph_PB_P2_imag')]
if verbose:
print "\t\tCreated separate extension HDUs of grid's voltage reception pattern spectra\n\t\t\twith size {0[0]}x{0[1]}x{0[2]} for real and imaginary parts.".format(self.holograph_PB_P2.shape)
if self.holograph_P2 is not None:
hdulst += [fits.ImageHDU(self.holograph_P2.real, name='holograph_P2_real')]
hdulst += [fits.ImageHDU(self.holograph_P2.imag, name='holograph_P2_imag')]
if verbose:
print "\t\tCreated separate extension HDUs of grid's voltage holograph spectra of \n\t\t\tsize {0[0]}x{0[1]}x{0[2]} for real and imaginary parts.".format(self.holograph_P2.shape)
if verbose:
print '\tNow writing FITS file to disk:\n\t\t{0}'.format(filename)
hdu = fits.HDUList(hdulst)
hdu.writeto(filename, clobber=overwrite)
if verbose:
print '\tImage information written successfully to FITS file on disk:\n\t\t{0}\n'.format(filename)
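# Illustrative usage sketch (not executed; `img` is a hypothetical instance
# of this class; '.fits' is appended to the given filename):
#
#     img.save('/tmp/dirty_image', pol=None, overwrite=True, verbose=True)
#     # writes /tmp/dirty_image.fits with one HDU per stored attribute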
################################################################################
class PolInfo(object):
"""
----------------------------------------------------------------------------
Class to manage polarization information of an antenna.
Attributes:
Et [dictionary] holds measured complex electric field time series
under 2 polarizations which are stored under keys 'P1', and 'P2'
Ef [dictionary] holds complex electric field spectra under 2
polarizations which are stored under keys 'P1', and 'P2'. The
length of the spectra is twice that of the time series.
flag [dictionary] holds boolean flags for each of the 2 polarizations
which are stored under keys 'P1', and 'P2'. Default=True means
that polarization is flagged.
Member functions:
__init__(): Initializes an instance of class PolInfo
__str__(): Prints a summary of current attributes.
FT(): Perform a Fourier transform of an Electric field time series
after doubling the length of the sequence with zero padding
(in order to be identical to what would be obtained from an
XF operation)
update_flags() Updates the flags based on current inputs and verifies and
updates flags based on current values of the electric field.
update(): Updates the electric field time series and spectra, and
flags for different polarizations
delay_compensation():
Routine to apply delay compensation to Electric field
spectra through additional phase. This assumes that the
spectra have already been made
Read the member function docstrings for details.
----------------------------------------------------------------------------
"""
def __init__(self, nsamples=1):
"""
------------------------------------------------------------------------
Initialize the PolInfo Class which manages polarization information of
an antenna.
Class attributes initialized are:
Et, Ef, flag
Read docstring of class PolInfo for details on these attributes.
------------------------------------------------------------------------
"""
self.Et = {}
self.Ef = {}
self.flag = {}
if not isinstance(nsamples, int):
raise TypeError('nsamples must be an integer')
elif nsamples <= 0:
nsamples = 1
for pol in ['P1', 'P2']:
self.Et[pol] = NP.empty(nsamples, dtype=NP.complex64)
self.Ef[pol] = NP.empty(2*nsamples, dtype=NP.complex64)
self.Et[pol].fill(NP.nan)
self.Ef[pol].fill(NP.nan)
self.flag[pol] = True
############################################################################
def __str__(self):
return ' Instance of class "{0}" in module "{1}" \n flag (P1): {2} \n flag (P2): {3} '.format(self.__class__.__name__, self.__module__, self.flag['P1'], self.flag['P2'])
############################################################################
def FT(self, pol=None):
"""
------------------------------------------------------------------------
Perform a Fourier transform of an Electric field time series after
doubling the length of the sequence with zero padding (in order to be
identical to what would be obtained from an XF operation)
Keyword Input(s):
pol [scalar or list] polarization to be Fourier transformed. Set
to 'P1' and/or 'P2'. If None (default) provided, time series
of both polarizations are Fourier transformed.
------------------------------------------------------------------------
"""
if pol is None:
pol = ['P1', 'P2']
for p in pol:
if p in ['P1', 'P2']:
Et = NP.pad(self.Et[p], (0, self.Et[p].size), 'constant', constant_values=(0,0)) # zero-pad the 1D time series to double its length
self.Ef[p] = DSP.FT1D(Et, ax=0, use_real=False, inverse=False, shift=True)
else:
raise ValueError('polarization string "{0}" unrecognized. Verify inputs. Aborting {1}.{2}()'.format(p, self.__class__.__name__, 'FT'))
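# Standalone sketch (not executed) of the doubling-by-zero-padding spectral
# transform performed here, using plain numpy in place of DSP.FT1D under the
# assumption that DSP.FT1D(..., shift=True) behaves like an fft-shifted FFT:
#
#     import numpy as NP
#     Et = NP.random.randn(16) + 1j*NP.random.randn(16)
#     padded = NP.pad(Et, (0, Et.size), 'constant', constant_values=(0,0))
#     Ef = NP.fft.fftshift(NP.fft.fft(padded))   # length 2*nsamples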
############################################################################
def delay_compensation(self, delaydict):
"""
------------------------------------------------------------------------
Routine to apply delay compensation to Electric field spectra through
additional phase. This assumes that the spectra have already been made
Keyword input(s):
delaydict [dictionary] contains one or both polarization keys, namely,
'P1' and 'P2'. The value under each of these keys is another
dictionary with the following keys and values:
'frequencies': scalar, list or numpy vector specifying the
frequency value(s) (in Hz) for which delays are specified.
If a scalar is specified, the delays are assumed to
be frequency independent and the delays are assumed
to be valid for all frequencies. If a vector is
specified, it must be of same size as the delays and
as the number of samples in the electric field
timeseries. These frequencies are assumed to match
those of the electric field spectrum. No default.
'delays': list or numpy vector specifying the delays (in
seconds) at the respective frequencies which are to
be compensated through additional phase in the
electric field spectrum. Must be of same size as
frequencies and the size of the electric field
timeseries. No default.
'fftshifted': boolean scalar indicating if the frequencies
provided have already been fft-shifted. If True
(default) or this key is absent, the frequencies are
assumed to have been fft-shifted. If False, they have
to be fft-shifted before applying the delay
compensation to rightly align with the fft-shifted
electric field spectrum computed in member function
FT().
------------------------------------------------------------------------
"""
try:
delaydict
except NameError:
raise NameError('Delay information must be supplied for delay correction in the dictionary delaydict.')
if not isinstance(delaydict, dict):
raise TypeError('delaydict must be a dictionary')
for pol in delaydict:
if pol not in ['P1','P2']:
raise ValueError('Invalid specification for polarization')
if 'delays' in delaydict[pol]:
if NP.asarray(delaydict[pol]['delays']).size == 1:
delays = delaydict[pol]['delays'] + NP.zeros(self.Et[pol].size)
else:
if (NP.asarray(delaydict[pol]['delays']).size == self.Et[pol].size):
delays = NP.asarray(delaydict[pol]['delays']).ravel()
else:
raise IndexError('Size of delays in delaydict must be equal to 1 or match that of the timeseries.')
if 'frequencies' in delaydict[pol]:
frequencies = NP.asarray(delaydict[pol]['frequencies']).ravel()
if frequencies.size != self.Et[pol].size:
raise IndexError('Size of frequencies must match that of the Electric field time series.')
else:
raise KeyError('Key "frequencies" not found in dictionary delaydict[{0}] holding delay information.'.format(pol))
temp_phases = 2 * NP.pi * delays * frequencies
# Convert phases to fft-shifted arrangement based on key "fftshifted" in delaydict
if 'fftshifted' in delaydict[pol]:
if not isinstance(delaydict[pol]['fftshifted'], bool):
raise TypeError('Value under key "fftshifted" must be boolean')
if not delaydict[pol]['fftshifted']:
temp_phases = NP.fft.fftshift(temp_phases)
# Expand the size to account for the fact that the Fourier transform of the timeseries is obtained after zero padding
phases = NP.empty(2*frequencies.size)
phases[0::2] = temp_phases
phases[1::2] = temp_phases
self.Ef[pol] *= NP.exp(1j * phases.reshape(1,-1))
## INSERT FEATURE: yet to modify the timeseries after application of delay compensation ##
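# Standalone sketch (not executed) of the phase computation above; the
# delays and frequencies are illustrative values, not taken from this module:
#
#     import numpy as NP
#     freqs = 100e6 + 1e5*NP.arange(16)   # Hz
#     delays = 2e-9 * NP.ones(16)         # seconds
#     temp_phases = 2 * NP.pi * delays * freqs
#     phases = NP.empty(2*freqs.size)     # interleave to match the length of
#     phases[0::2] = temp_phases          # the zero-padded spectrum
#     phases[1::2] = temp_phases
#     # Ef *= NP.exp(1j * phases)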
############################################################################
def update_flags(self, flags=None, verify=False):
"""
------------------------------------------------------------------------
Updates the flags based on current inputs and verifies and updates flags
based on current values of the electric field.
Inputs:
flags [dictionary] holds boolean flags for each of the 2
polarizations which are stored under keys 'P1', and 'P2'.
Default=None means no new flagging to be applied. If
the value under the polarization key is True, it is to be
flagged and if False, it is to be unflagged.
verify [boolean] If True, verify and update the flags, if necessary.
Electric fields are checked for NaN values and if found, the
flag in the corresponding polarization is set to True.
Default=False.
Flag verification and re-updating happens if flags is set to None or if
verify is set to True.
------------------------------------------------------------------------
"""
# if not isinstance(stack, bool):
# raise TypeError('Input keyword stack must be of boolean type')
if not isinstance(verify, bool):
raise TypeError('Input keyword verify must be of boolean type')
if flags is not None:
if not isinstance(flags, dict):
raise TypeError('Input parameter flags must be a dictionary')
for pol in ['P1', 'P2']:
if pol in flags:
if isinstance(flags[pol], bool):
self.flag[pol] = flags[pol]
else:
raise TypeError('flag values must be boolean')
# Perform flag verification and re-update current flags
if verify or (flags is None):
for pol in ['P1', 'P2']:
if NP.any(NP.isnan(self.Et[pol])) and NP.any(NP.isnan(self.Ef[pol])):
self.flag[pol] = True
############################################################################
def update(self, Et=None, Ef=None, flags=None, delaydict=None,
verify=False):
"""
------------------------------------------------------------------------
Updates the electric field time series and spectra, and flags for
different polarizations
Inputs:
Et [dictionary] holds time series under 2 polarizations which are
stored under keys 'P1', and 'P2'. Default=None implies no updates
for Et.
Ef [dictionary] holds spectra under 2 polarizations which are
stored under keys 'P1', and 'P2'. Default=None implies no updates
for Ef.
flag [dictionary] holds boolean flags for each of the 2 polarizations
which are stored under keys 'P1', and 'P2'. Default=None means
no updates for flags.
delaydict
[dictionary] contains one or both polarization keys, namely,
'P1' and 'P2'. The value under each of these keys is another
dictionary with the following keys and values:
'frequencies': scalar, list or numpy vector specifying the
frequency value(s) (in Hz) for which delays are specified.
If a scalar is specified, the delays are assumed to be
frequency independent and the delays are assumed to be
valid for all frequencies. If a vector is specified,
it must be of same size as the delays and as the
number of samples in the electric field timeseries.
These frequencies are assumed to match those of the
electric field spectrum. No default.
'delays': list or numpy vector specifying the delays (in
seconds) at the respective frequencies which are to be
compensated through additional phase in the electric
field spectrum. Must be of same size as frequencies
and the size of the electric field timeseries. No
default.
'fftshifted': boolean scalar indicating if the frequencies
provided have already been fft-shifted. If True
(default) or this key is absent, the frequencies are
assumed to have been fft-shifted. If False, they have
to be fft-shifted before applying the delay
compensation to rightly align with the fft-shifted
electric field spectrum computed in member function
FT().
verify [boolean] If True, verify and update the flags, if necessary.
Electric fields are checked for NaN values and if found, the
flag in the corresponding polarization is set to True.
Default=False.
------------------------------------------------------------------------
"""
current_flags = copy.deepcopy(self.flag)
if flags is None:
flags = copy.deepcopy(current_flags)
# if flags is not None:
# self.update_flags(flags)
if Et is not None:
if isinstance(Et, dict):
for pol in ['P1', 'P2']:
if pol in Et:
self.Et[pol] = Et[pol]
if NP.any(NP.isnan(Et[pol])):
# self.Et[pol] = NP.nan
flags[pol] = True
# self.flag[pol] = True
self.FT() # Update the spectrum
else:
raise TypeError('Input parameter Et must be a dictionary')
if Ef is not None:
if isinstance(Ef, dict):
for pol in ['P1', 'P2']:
if pol in Ef:
self.Ef[pol] = Ef[pol]
if NP.any(NP.isnan(Ef[pol])):
# self.Ef[pol] = NP.nan
flags[pol] = True
# self.flag[pol] = True
else:
raise TypeError('Input parameter Ef must be a dictionary')
if delaydict is not None:
self.delay_compensation(delaydict)
# Verify and update flags
self.update_flags(flags=flags, verify=verify)
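# Illustrative usage sketch (not executed):
#
#     import numpy as NP
#     pinfo = PolInfo(nsamples=16)
#     Et = {'P1': NP.random.randn(16) + 1j*NP.random.randn(16)}
#     pinfo.update(Et=Et, verify=True)   # FT() is invoked internally
#     spectrum = pinfo.Ef['P1']          # length 32 (zero-padded FT)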
################################################################################
class Antenna(object):
"""
----------------------------------------------------------------------------
Class to manage individual antenna information.
Attributes:
label: [Scalar] A unique identifier (preferably a string) for the
antenna.
typetag [scalar or string] Tag (integer or string) to identify antenna
type. Will be used in determining if the antenna array is made
of identical antennas or not
latitude: [Scalar] Latitude of the antenna's location.
longitude: [Scalar] Longitude of the antenna's location.
location: [Instance of GEOM.Point class] The location of the antenna in
local East, North, Up coordinate system.
timestamp: [Scalar] String or float representing the timestamp for the
current attributes
timestamps [list] list of all timestamps to be held in the stack
t: [vector] The time axis for the time series of electric fields
f: [vector] Frequency axis obtained by a Fourier Transform of
the electric field time series. Same length as attribute t
f0: [Scalar] Center frequency in Hz.
antpol: [Instance of class PolInfo] polarization information for the
antenna. Read docstring of class PolInfo for details
aperture [Instance of class APR.Aperture] aperture information
for the antenna. Read docstring of class Aperture for
details
Et_stack [dictionary] holds a stack of complex electric field time series
measured at various time stamps under 2 polarizations which are
stored under keys 'P1' and 'P2'
Ef_stack [dictionary] holds a stack of complex electric field spectra
measured at various time stamps under 2 polarizations which are
stored under keys 'P1' and 'P2'
flag_stack
[dictionary] holds a stack of flags appropriate for different
time stamps as a numpy array under 2 polarizations which are
stored under keys 'P1' and 'P2'
wts: [dictionary] The gridding weights for antenna. Different
polarizations 'P1' and 'P2' form the keys
of this dictionary. These values are in general complex. Under
each key, the values are maintained as a list of numpy vectors,
where each vector corresponds to a frequency channel. See
wtspos_scale for more requirements.
wtspos [dictionary] two-dimensional locations of the gridding weights
in wts for each polarization under keys 'P1' and 'P2'. The
locations are in ENU coordinate system as a list of 2-column
numpy arrays. Each 2-column array in the list is the position
of the gridding weights for a corresponding frequency
channel. The size of the list must be the same as wts and the
number of channels. Units are in number of wavelengths. See
wtspos_scale for more requirements.
wtspos_scale [dictionary] The scaling of weights is specified for each
polarization under one of the keys 'P1' and 'P2'.
The values under these keys can be either None (default) or
'scale'. If None, numpy vectors in wts and wtspos under
corresponding keys are provided for each frequency channel. If
set to 'scale' wts and wtspos contain a list of only one
numpy array corresponding to a reference frequency. This is
scaled internally to correspond to the first channel.
The gridding positions are correspondingly scaled to all the
frequency channels.
blc [2-element numpy array] Bottom Left corner where the
antenna contributes non-zero weight to the grid. Same
for all polarizations
trc [2-element numpy array] Top right corner where the
antenna contributes non-zero weight to the grid. Same
for all polarizations
Member Functions:
__init__(): Initializes an instance of class Antenna
__str__(): Prints a summary of current attributes
channels(): Computes the frequency channels from a temporal Fourier
Transform
FT() Computes the Fourier transform of the time series of the
antennas in the antenna array to compute the visibility
spectra. Read docstring of member function FT() of class
PolInfo
FT_pp() Computes the Fourier transform of the time series of the
antennas in the antenna array to compute the visibility
spectra. Read docstring of member function FT() of class
PolInfo. Differs from FT() member function in that here
an instance of class Antenna is returned and is mainly used
in case of parallel processing and is not meant to be
accessed directly by the user. Use FT() for all other purposes.
update_flags()
Updates flags for polarizations provided as input parameters
update(): Updates the antenna instance with newer attribute values
Updates the electric field spectrum and timeseries. It also
applies Fourier transform if timeseries is updated
update_pp() Wrapper for member function update() and returns the updated
instance of this class. Mostly intended to be used when
parallel processing is applicable and not to be used directly.
Use update() instead when updates are to be applied directly.
get_E_fields()
Returns the electric fields based on selection criteria on
timestamp flags, timestamps and frequency channel indices and
the type of data (most recent or stacked electric fields)
evalGridIllumination()
Evaluate antenna illumination function on a specified grid
save(): Saves the antenna information to disk. Needs serious
development.
Read the member function docstrings for details.
----------------------------------------------------------------------------
"""
def __init__(self, label, typetag, latitude, longitude, location,
center_freq, nsamples=1, aperture=None):
"""
------------------------------------------------------------------------
Initialize the Antenna Class which manages an antenna's information
Class attributes initialized are:
label, latitude, longitude, location, pol, t, timestamp, f0, f, wts,
wtspos, wtspos_scale, blc, trc, timestamps, antpol, Et_stack, Ef_stack,
flag_stack, aperture, typetag
Read docstring of class Antenna for details on these attributes.
------------------------------------------------------------------------
"""
try:
label
except NameError:
raise NameError('Antenna label must be provided.')
try:
typetag
except NameError:
raise NameError('Antenna type tag must be provided.')
if not isinstance(typetag, (int,str)):
raise TypeError('Antenna type tag must be an integer or string')
try:
latitude
except NameError:
latitude = 0.0
try:
longitude
except NameError:
longitude = 0.0
try:
location
except NameError:
self.location = GEOM.Point()
try:
center_freq
except NameError:
raise NameError('Center frequency must be provided.')
self.label = label
self.typetag = typetag
self.latitude = latitude
self.longitude = longitude
if isinstance(location, GEOM.Point):
self.location = location
elif isinstance(location, (list, tuple, NP.ndarray)):
self.location = GEOM.Point(location)
else:
raise TypeError('Antenna position must be a 3-element tuple or an instance of GEOM.Point')
if aperture is not None:
if isinstance(aperture, APR.Aperture):
if len(aperture.pol) != 2:
raise ValueError('Antenna aperture must contain dual polarization types')
self.aperture = aperture
else:
raise TypeError('aperture must be an instance of class Aperture found in module {0}'.format(APR.__name__))
else:
self.aperture = APR.Aperture(pol_type='dual')
self.antpol = PolInfo(nsamples=nsamples)
self.t = 0.0
self.timestamp = 0.0
self.timestamps = []
self.f0 = center_freq
self.f = self.f0
self.Et_stack = {}
self.Ef_stack = {}
self.flag_stack = {}
self.wts = {}
self.wtspos = {}
self.wtspos_scale = {}
self._gridinfo = {}
for pol in ['P1', 'P2']:
self.Et_stack[pol] = None
self.Ef_stack[pol] = None
self.flag_stack[pol] = NP.asarray([])
self.wtspos[pol] = []
self.wts[pol] = []
self.wtspos_scale[pol] = None
self._gridinfo[pol] = {}
self.blc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1)
self.trc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1)
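# Illustrative construction sketch (not executed; the label, typetag and
# coordinates are hypothetical; a 3-element tuple is accepted for location
# and converted to a GEOM.Point internally):
#
#     ant = Antenna(label='A0', typetag='dipole', latitude=-26.7,
#                   longitude=116.7, location=(10.0, -5.0, 0.0),
#                   center_freq=150e6, nsamples=16)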
############################################################################
def __str__(self):
return ' Instance of class "{0}" in module "{1}" \n label: {2} \n typetag: {3} \n location: {4}'.format(self.__class__.__name__, self.__module__, self.label, self.typetag, self.location.__str__())
############################################################################
def channels(self):
"""
------------------------------------------------------------------------
Computes the frequency channels from a temporal Fourier Transform
Output(s):
Frequencies corresponding to channels obtained by a Fourier Transform
of the time series.
------------------------------------------------------------------------
"""
return DSP.spectax(2*self.t.size, self.t[1]-self.t[0], shift=True)
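# Standalone sketch (not executed) of what the channel axis amounts to,
# assuming DSP.spectax() behaves like an fft-shifted numpy fftfreq:
#
#     import numpy as NP
#     nsamples, dt = 16, 1e-6
#     chans = NP.fft.fftshift(NP.fft.fftfreq(2*nsamples, d=dt))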
############################################################################
def FT(self, pol=None):
"""
------------------------------------------------------------------------
Computes the Fourier transform of the time series of the antennas in the
antenna array to compute the visibility spectra. Read docstring of
member function FT() of class PolInfo
Inputs:
pol [scalar or list] Scalar string or list of strings specifying
polarization. Accepted values are 'P1' and/or 'P2'. Default=None
means both time series of electric fields of both polarizations
are Fourier transformed
# stack [boolean] If set to True, perform Fourier transform on the
# timestamp-stacked electric field time series. Default = False
------------------------------------------------------------------------
"""
self.antpol.FT(pol=pol)
############################################################################
def FT_pp(self, pol=None):
"""
------------------------------------------------------------------------
Computes the Fourier transform of the time series of the antennas in the
antenna array to compute the visibility spectra. Read docstring of
member function FT() of class PolInfo. Differs from FT() member function
in that here an instance of class Antenna is returned and is mainly used
in case of parallel processing and is not meant to be accessed directly
by the user. Use FT() for all other purposes.
Inputs:
pol [scalar or list] Scalar string or list of strings specifying
polarization. Accepted values are 'P1' and/or 'P2'. Default=None
means both time series of electric fields of both polarizations
are Fourier transformed
# stack [boolean] If set to True, perform Fourier transform on the
# timestamp-stacked electric field time series. Default = False
Outputs:
Instance of class Antenna
------------------------------------------------------------------------
"""
self.antpol.FT(pol=pol)
return self
############################################################################
def update_flags(self, flags=None, stack=False, verify=True):
"""
------------------------------------------------------------------------
Updates flags for antenna polarizations. Invokes member function
update_flags() of class PolInfo
Inputs:
flags [dictionary] boolean flags for each of the 2 polarizations
of the antenna which are stored under keys 'P1' and 'P2',
Default=None means no updates for flags.
stack [boolean] If True (default), appends the updated flag to the
end of the stack of flags as a function of timestamp. If False,
updates the last flag in the stack with the updated flag and
does not append
verify [boolean] If True, verify and update the flags, if necessary.
Electric fields are checked for NaN values and if found, the
flag in the corresponding polarization is set to True.
Default=True
------------------------------------------------------------------------
"""
# By default carry over the flags from previous timestamp
if flags is None:
flags = copy.deepcopy(self.antpol.flag)
self.antpol.update_flags(flags=flags, verify=verify)
# Stack on to last value or update last value in stack
for pol in ['P1', 'P2']:
if stack is True:
self.flag_stack[pol] = NP.append(self.flag_stack[pol], self.antpol.flag[pol])
else:
if self.flag_stack[pol].size == 0:
self.flag_stack[pol] = NP.asarray(self.antpol.flag[pol]).reshape(-1)
else:
self.flag_stack[pol][-1] = self.antpol.flag[pol]
self.flag_stack[pol] = self.flag_stack[pol].astype(NP.bool)
############################################################################
def update(self, update_dict=None, verbose=True):
"""
------------------------------------------------------------------------
Updates the antenna instance with newer attribute values. Updates
the electric field spectrum and timeseries. It also applies Fourier
transform if timeseries is updated
Inputs:
update_dict [dictionary] contains the following keys and values:
label [Scalar] A unique identifier (preferably a string) for
the antenna. Default=None means no update to apply
typetag [scalar or string] Antenna type identifier (integer or
preferably string) which will be used in determining if
all antennas in the antenna array are identical
latitude [Scalar] Latitude of the antenna's location. Default=None
means no update to apply
location [Instance of GEOM.Point class] The location of the
antenna in local East, North, Up (ENU) coordinate system.
Default=None means no update to apply
timestamp [Scalar] String or float representing the timestamp for
the current attributes. Default=None means no update to
apply
t [vector] The time axis for the electric field time
series. Default=None means no update to apply
flags [dictionary] holds boolean flags for each of the 2
polarizations which are stored under keys 'P1' and 'P2'.
Default=None means no updates for flags.
Et [dictionary] holds time series under 2 polarizations
which are stored under keys 'P1' and 'P2'. Default=None
implies no updates for Et.
Ef [dictionary] holds spectrum under 2 polarizations
which are stored under keys 'P1' and 'P2'. Default=None
implies no updates for Ef.
aperture [instance of class APR.Aperture] aperture
information for the antenna. Read docstring of class
Aperture for details
wtsinfo [dictionary] consists of weights information for each of
the two polarizations under keys 'P1' and 'P2'. Each of
the values under the keys is a list of dictionaries.
Length of list is equal to the number of frequency
channels or one (equivalent to setting wtspos_scale to
'scale'.). The list is indexed by the frequency channel
number. Each element in the list consists of a dictionary
corresponding to that frequency channel. Each dictionary
consists of these items with the following keys:
wtspos [2-column Numpy array, optional] u- and v-
positions for the gridding weights. Units
are in number of wavelengths.
wts [Numpy array] Complex gridding weights. Size
is equal to the number of rows in wtspos
above
orientation [scalar] Orientation (in radians) of the
wtspos coordinate system relative to the
local ENU coordinate system. It is measured
North of East.
lookup [string] If set, refers to a file location
containing the wtspos and wts information
above as columns (x-loc [float], y-loc
[float], wts[real], wts[imag if any]). If
set, wtspos and wts information are obtained
from this lookup table and the wtspos and wts
keywords in the dictionary are ignored. Note
that wtspos values are obtained after
dividing x- and y-loc lookup values by the
wavelength
gridfunc_freq
[String scalar] If set to None (not provided) or to
'scale' assumes that wtspos in wtsinfo are given for a
reference frequency which need to be scaled for the
frequency channels. Will be ignored if the list of
dictionaries under the polarization keys in wtsinfo have
number of elements equal to the number of frequency
channels.
ref_freq [Scalar] Positive value (in Hz) of reference frequency
(used if gridfunc_freq is set to None or 'scale') at
which wtspos is provided. If set to None, ref_freq is
assumed to be equal to the center frequency in the class
Antenna's attribute.
delaydict [Dictionary] contains information on delay compensation
to be applied to the fourier transformed electric fields
under each polarization which are stored under keys 'P1'
and 'P2'. Default is None (no delay compensation to be
applied). Refer to the docstring of member function
delay_compensation() of class PolInfo for more details.
stack [boolean] If True (default), appends the updated flag
and data to the end of the stack as a function of
timestamp. If False, updates the last flag and data in
the stack and does not append
verify [boolean] If True, verify and update the flags, if
necessary. Electric fields are checked for NaN values and
if found, the flag in the corresponding polarization is
set to True. Default=True
verbose [boolean] If True, prints diagnostic and progress messages.
If False (default), suppress printing such messages.
------------------------------------------------------------------------
"""
label = None
typetag = None
location = None
timestamp = None
t = None
flags = None
stack = False
verify_flags = True
Et = None
Ef = None
wtsinfo = None
gridfunc_freq = None
ref_freq = None
delaydict = None
aperture = None
if update_dict is not None:
if not isinstance(update_dict, dict):
raise TypeError('Input parameter containing updates must be a dictionary')
if 'label' in update_dict: label = update_dict['label']
if 'typetag' in update_dict: typetag = update_dict['typetag']
if 'location' in update_dict: location = update_dict['location']
if 'timestamp' in update_dict: timestamp = update_dict['timestamp']
if 't' in update_dict: t = update_dict['t']
if 'Et' in update_dict: Et = update_dict['Et']
if 'Ef' in update_dict: Ef = update_dict['Ef']
if 'flags' in update_dict: flags = update_dict['flags']
if 'stack' in update_dict: stack = update_dict['stack']
if 'verify_flags' in update_dict: verify_flags = update_dict['verify_flags']
if 'wtsinfo' in update_dict: wtsinfo = update_dict['wtsinfo']
if 'gridfunc_freq' in update_dict: gridfunc_freq = update_dict['gridfunc_freq']
if 'ref_freq' in update_dict: ref_freq = update_dict['ref_freq']
if 'delaydict' in update_dict: delaydict = update_dict['delaydict']
if 'aperture' in update_dict: aperture = update_dict['aperture']
if label is not None: self.label = label
if typetag is not None: self.typetag = typetag
if location is not None: self.location = location
if timestamp is not None:
self.timestamp = timestamp
self.timestamps += [copy.deepcopy(timestamp)]
if t is not None:
self.t = t
self.f = self.f0 + self.channels()
# Updates, Et, Ef, delays, flags and verifies flags
if (Et is not None) or (Ef is not None) or (delaydict is not None) or (flags is not None):
self.antpol.update(Et=Et, Ef=Ef, delaydict=delaydict, flags=flags, verify=verify_flags)
# Stack flags and data
self.update_flags(flags=None, stack=stack, verify=True)
for pol in ['P1', 'P2']:
if self.Et_stack[pol] is None:
self.Et_stack[pol] = copy.deepcopy(self.antpol.Et[pol].reshape(1,-1))
self.Ef_stack[pol] = copy.deepcopy(self.antpol.Ef[pol].reshape(1,-1))
else:
if stack:
self.Et_stack[pol] = NP.vstack((self.Et_stack[pol], self.antpol.Et[pol].reshape(1,-1)))
self.Ef_stack[pol] = NP.vstack((self.Ef_stack[pol], self.antpol.Ef[pol].reshape(1,-1)))
else:
self.Et_stack[pol][-1,:] = copy.deepcopy(self.antpol.Et[pol].reshape(1,-1))
self.Ef_stack[pol][-1,:] = copy.deepcopy(self.antpol.Ef[pol].reshape(1,-1))
blc_orig = NP.copy(self.blc)
trc_orig = NP.copy(self.trc)
eps = 1e-6
if aperture is not None:
if isinstance(aperture, APR.Aperture):
self.aperture = copy.deepcopy(aperture)
else:
raise TypeError('Update for aperture must be an instance of class Aperture.')
if wtsinfo is not None:
if not isinstance(wtsinfo, dict):
raise TypeError('Input parameter wtsinfo must be a dictionary.')
self.wtspos = {}
self.wts = {}
self.wtspos_scale = {}
angles = []
max_wtspos = []
for pol in ['P1', 'P2']:
self.wts[pol] = []
self.wtspos[pol] = []
self.wtspos_scale[pol] = None
if pol in wtsinfo:
if len(wtsinfo[pol]) == len(self.f):
angles += [elem['orientation'] for elem in wtsinfo[pol]]
for i in xrange(len(self.f)):
rotation_matrix = NP.asarray([[NP.cos(-angles[i]), NP.sin(-angles[i])],
[-NP.sin(-angles[i]), NP.cos(-angles[i])]])
if ('lookup' not in wtsinfo[pol][i]) or (wtsinfo[pol][i]['lookup'] is None):
self.wts[pol] += [wtsinfo[pol][i]['wts']]
wtspos = wtsinfo[pol][i]['wtspos']
else:
lookupdata = LKP.read_lookup(wtsinfo[pol][i]['lookup'])
wtspos = NP.hstack((lookupdata[0].reshape(-1,1),lookupdata[1].reshape(-1,1))) * (self.f[i]/FCNST.c)
self.wts[pol] += [lookupdata[2]]
self.wtspos[pol] += [ NP.dot(NP.asarray(wtspos), rotation_matrix.T) ]
max_wtspos += [NP.amax(NP.abs(self.wtspos[pol][-1]), axis=0)]
elif len(wtsinfo[pol]) == 1:
if (gridfunc_freq is None) or (gridfunc_freq == 'scale'):
self.wtspos_scale[pol] = 'scale'
if ref_freq is None:
ref_freq = self.f0
angles = wtsinfo[pol][0]['orientation']
rotation_matrix = NP.asarray([[NP.cos(-angles), NP.sin(-angles)],
[-NP.sin(-angles), NP.cos(-angles)]])
if ('lookup' not in wtsinfo[pol][0]) or (wtsinfo[pol][0]['lookup'] is None):
self.wts[pol] += [ wtsinfo[pol][0]['wts'] ]
wtspos = wtsinfo[pol][0]['wtspos']
else:
lookupdata = LKP.read_lookup(wtsinfo[pol][0]['lookup'])
wtspos = NP.hstack((lookupdata[0].reshape(-1,1),lookupdata[1].reshape(-1,1))) * (ref_freq/FCNST.c)
self.wts[pol] += [lookupdata[2]]
self.wtspos[pol] += [ (self.f[0]/ref_freq) * NP.dot(NP.asarray(wtspos), rotation_matrix.T) ]
max_wtspos += [NP.amax(NP.abs(self.wtspos[pol][-1]), axis=0)]
else:
raise ValueError('gridfunc_freq must be set to None or "scale" when wtsinfo contains a single entry.')
self.blc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1) - FCNST.c/self.f.min() * NP.amin(NP.abs(self.wtspos[pol][0]), 0)
self.trc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1) + FCNST.c/self.f.min() * NP.amax(NP.abs(self.wtspos[pol][0]), 0)
else:
raise ValueError('Number of elements in wtsinfo for {0} is incompatible with the number of channels.'.format(pol))
max_wtspos = NP.amax(NP.asarray(max_wtspos).reshape(-1,blc_orig.size), axis=0)
self.blc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1) - FCNST.c/self.f.min() * max_wtspos
self.trc = NP.asarray([self.location.x, self.location.y]).reshape(1,-1) + FCNST.c/self.f.min() * max_wtspos
if (NP.abs(NP.linalg.norm(blc_orig)-NP.linalg.norm(self.blc)) > eps) or (NP.abs(NP.linalg.norm(trc_orig)-NP.linalg.norm(self.trc)) > eps):
if verbose:
print 'Grid corner(s) of antenna {0} have changed. Should re-grid the antenna array.'.format(self.label)
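# Illustrative usage sketch (not executed; the timestamp, time axis and
# electric fields are hypothetical):
#
#     import numpy as NP
#     update_dict = {'timestamp': 1.0,
#                    't': 1e-6 * NP.arange(16),
#                    'Et': {'P1': NP.random.randn(16) + 1j*NP.random.randn(16)},
#                    'stack': True, 'verify_flags': True}
#     ant.update(update_dict=update_dict, verbose=False)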
############################################################################
def update_pp(self, update_dict=None, verbose=True):
"""
------------------------------------------------------------------------
Wrapper for member function update() and returns the updated instance
of this class. Mostly intended to be used when parallel processing is
applicable and not to be used directly. Use update() instead when
updates are to be applied directly.
See member function update() for details on inputs.
------------------------------------------------------------------------
"""
self.update(update_dict=update_dict, verbose=verbose)
return self
############################################################################
def get_E_fields(self, pol, flag=None, tselect=None, fselect=None,
datapool=None):
"""
------------------------------------------------------------------------
Returns the electric fields based on selection criteria on timestamp
flags, timestamps and frequency channel indices and the type of data
(most recent or stacked electric fields)
Inputs:
pol [string] select baselines of this polarization that are either
flagged or unflagged as specified by input parameter flag.
Allowed values are 'P1' and 'P2'. Only one of these values
must be specified.
flag [boolean] If False, return electric fields of unflagged
timestamps, or if True return flagged ones. Default=None means
all electric fields independent of flagging are returned. This
flagging refers to that along the timestamp axis under each
polarization
tselect [scalar, list, numpy array] timestamp index for electric
fields selection. For most recent electric fields, it must be
set to -1. For all other selections, indices in tselect must
be in the valid range of indices along time axis for stacked
electric fields. Default=None means most recent data is
selected.
fselect [scalar, list, numpy array] frequency channel index for
electric field spectrum selection. Indices must be in the
valid range of indices along the frequency axis for
electric fields. Default=None selects all frequency channels
datapool [string] denotes the data pool from which electric fields are
to be selected. Accepted values are 'current', 'stack', and
None (default, same as 'current'). If set to None or
'current', the value in tselect is ignored and only
electric fields of the most recent timestamp are selected. If
set to None or 'current', the attribute Ef_stack is checked
first and if unavailable, attribute antpol.Ef is used. If set
to 'stack', the attribute Ef_stack is used.
Output:
outdict [dictionary] consists of electric fields information under the
following keys:
'label' [string] antenna label
'pol' [string] polarization string, one of 'P1' or
'P2'
'E-fields' [numpy array] selected electric fields spectra
with dimensions n_ts x nchan which
are in time-frequency order. If no electric
fields are found satisfying the selection
criteria, the value under this key is set to
None.
'twts' [numpy array of boolean] weights corresponding
to the time axis in the selected electric
fields. A zero weight indicates unflagged
electric fields were not found for that
timestamp. A non-zero weight indicates how
many unflagged electric fields were found for
that timestamp. If no electric fields are found
satisfying the selection criteria, the value
under this key is set to None.
------------------------------------------------------------------------
"""
try:
pol
except NameError:
raise NameError('Input parameter pol must be specified.')
if not isinstance(pol, str):
raise TypeError('Input parameter pol must be a string')
if pol not in ['P1', 'P2']:
raise ValueError('Invalid specification for input parameter pol')
if datapool is None:
n_timestamps = 1
datapool = 'current'
elif datapool == 'stack':
n_timestamps = len(self.timestamps)
elif datapool == 'current':
n_timestamps = 1
else:
raise ValueError('Invalid datapool specified')
if tselect is None:
tsind = NP.asarray(-1).reshape(-1) # Selects most recent data
elif isinstance(tselect, (int, float, list, NP.ndarray)):
tsind = NP.asarray(tselect).ravel()
tsind = tsind.astype(NP.int)
if tsind.size == 1:
if (tsind < -1) or (tsind >= n_timestamps):
tsind = NP.asarray(-1).reshape(-1)
else:
if NP.any(tsind < 0) or NP.any(tsind >= n_timestamps):
raise IndexError('Timestamp indices outside available range for the specified datapool')
else:
raise TypeError('tselect must be None, integer, float, list or numpy array for electric field selection')
if fselect is None:
chans = NP.arange(self.f.size) # Selects all channels
elif isinstance(fselect, (int, float, list, NP.ndarray)):
chans = NP.asarray(fselect).ravel()
chans = chans.astype(NP.int)
if NP.any(chans < 0) or NP.any(chans >= self.f.size):
raise IndexError('Channel indices outside available range')
else:
raise TypeError('fselect must be None, integer, float, list or numpy array for electric field selection')
select_ind = NP.ix_(tsind, chans)
outdict = {}
outdict['pol'] = pol
outdict['twts'] = None
outdict['label'] = self.label
outdict['E-fields'] = None
if datapool == 'current':
if self.Ef_stack[pol] is not None:
outdict['E-fields'] = self.Ef_stack[pol][-1,chans].reshape(1,chans.size)
outdict['twts'] = NP.logical_not(NP.asarray(self.flag_stack[pol][-1]).astype(NP.bool).reshape(-1)).astype(NP.float)
else:
outdict['E-fields'] = self.antpol.Ef[pol][chans].reshape(1,chans.size)
outdict['twts'] = NP.logical_not(NP.asarray(self.antpol.flag[pol]).astype(NP.bool).reshape(-1)).astype(NP.float)
else:
if self.Ef_stack[pol] is not None:
outdict['E-fields'] = self.Ef_stack[pol][select_ind].reshape(tsind.size,chans.size)
outdict['twts'] = NP.logical_not(NP.asarray(self.flag_stack[pol][tsind]).astype(NP.bool).reshape(-1)).astype(NP.float)
else:
raise ValueError('Attribute Ef_stack has not been initialized to obtain electric fields from. Consider running method stack()')
return outdict
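# Illustrative usage sketch (not executed):
#
#     efinfo = ant.get_E_fields('P1', flag=None, tselect=-1, fselect=None,
#                               datapool='current')
#     Ef = efinfo['E-fields']   # shape (1, nchan) for the current datapool
#     twts = efinfo['twts']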
############################################################################
def evalGridIllumination(self, uvlocs=None, xy_center=None):
"""
------------------------------------------------------------------------
Evaluate antenna illumination function on a specified grid
Inputs:
uvlocs [tuple] 2-element tuple where first and second elements
are numpy arrays that contain u- and v-locations
respectively. Default=None means determine u- and v-
locations from attributes blc and trc
xy_center [tuple, list or numpy array] 2-element list, tuple or numpy
array denoting x- and y-locations of center of antenna.
Default=None means use the x- and y-locations of the
antenna
Outputs:
antenna_grid_wts_vuf
[scipy sparse array] Complex antenna illumination weights
placed on the specified grid. When expanded it will be of
size nv x nu x nchan
------------------------------------------------------------------------
"""
if xy_center is None:
xy_center = NP.asarray([self.location.x, self.location.y])
elif isinstance(xy_center, (list,tuple,NP.ndarray)):
xy_center = NP.asarray(xy_center)
if xy_center.size != 2:
raise ValueError('Input xy_center must be a two-element numpy array')
xy_center = xy_center.ravel()
else:
raise TypeError('Input xy_center must be a numpy array')
wavelength = FCNST.c / self.f
min_wl = NP.abs(wavelength).min()
uvspacing = 0.5
if uvlocs is None:
blc = self.blc - xy_center
trc = self.trc - xy_center
trc = NP.amax(NP.abs(NP.vstack((blc, trc))), axis=0).ravel() / min_wl
blc = -1 * trc
gridu, gridv = GRD.grid_2d([(blc[0], trc[0]), (blc[1], trc[1])], pad=0.0, spacing=uvspacing, pow2=True)
du = gridu[0,1] - gridu[0,0]
dv = gridv[1,0] - gridv[0,0]
elif isinstance(uvlocs, tuple):
if len(uvlocs) != 2:
raise ValueError('Input uvlocs must be a two-element tuple')
ulocs, vlocs = uvlocs
if not isinstance(ulocs, NP.ndarray):
raise TypeError('Elements in input tuple uvlocs must be a numpy array')
if not isinstance(vlocs, NP.ndarray):
raise TypeError('Elements in input tuple uvlocs must be a numpy array')
ulocs = ulocs.ravel()
vlocs = vlocs.ravel()
du = ulocs[1] - ulocs[0]
dv = vlocs[1] - vlocs[0]
gridu, gridv = NP.meshgrid(ulocs, vlocs)
else:
raise TypeError('Input uvlocs must be a two-element tuple')
rmaxNN = 0.5 * NP.sqrt(du**2 + dv**2) * min_wl
gridx = gridu[:,:,NP.newaxis] * wavelength.reshape(1,1,-1)
gridy = gridv[:,:,NP.newaxis] * wavelength.reshape(1,1,-1)
gridxy = NP.hstack((gridx.reshape(-1,1), gridy.reshape(-1,1)))
wl = NP.ones(gridu.shape)[:,:,NP.newaxis] * wavelength.reshape(1,1,-1)
max_aprtr_size = max([NP.sqrt(self.aperture.xmax['P1']**2 + self.aperture.ymax['P1']**2), NP.sqrt(self.aperture.xmax['P2']**2 + self.aperture.ymax['P2']**2), self.aperture.rmax['P1'], self.aperture.rmax['P2']])
distNN = 2.0 * max_aprtr_size
indNN_list, blind, vuf_gridind = LKP.find_NN(xy_center.reshape(1,-1), gridxy, distance_ULIM=distNN, flatten=True, parallel=False)
dxy = gridxy[vuf_gridind,:]
unraveled_vuf_ind = NP.unravel_index(vuf_gridind, gridu.shape+(self.f.size,))
unraveled_vu_ind = (unraveled_vuf_ind[0], unraveled_vuf_ind[1])
raveled_vu_ind = NP.ravel_multi_index(unraveled_vu_ind, (gridu.shape[0], gridu.shape[1]))
antenna_grid_wts_vuf = {}
pol = ['P1', 'P2']
for p in pol:
krn = self.aperture.compute(dxy, wavelength=wl.ravel()[vuf_gridind], pol=p, rmaxNN=rmaxNN, load_lookup=False)
krn_sparse = SpM.csr_matrix((krn[p], (raveled_vu_ind,)+(unraveled_vuf_ind[2],)), shape=(gridu.size,)+(self.f.size,), dtype=NP.complex64)
krn_sparse_sumuv = krn_sparse.sum(axis=0)
krn_sparse_norm = krn_sparse.A / krn_sparse_sumuv.A
sprow = raveled_vu_ind
spcol = unraveled_vuf_ind[2]
spval = krn_sparse_norm[(sprow,)+(spcol,)]
antenna_grid_wts_vuf[p] = SpM.csr_matrix((spval, (sprow,)+(spcol,)), shape=(gridu.size,)+(self.f.size,), dtype=NP.complex64)
return antenna_grid_wts_vuf
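# Illustrative usage sketch (not executed): the returned sparse weights can
# be expanded into a (nv, nu, nchan) cube; `gridu` below stands for the 2D
# u-grid corresponding to the returned weights (a hypothetical name in this
# sketch, since the grid is constructed internally when uvlocs is None):
#
#     wts_vuf = ant.evalGridIllumination()
#     cube = wts_vuf['P1'].toarray().reshape(gridu.shape + (ant.f.size,))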
################################################################################
class AntennaArray(object):
"""
----------------------------------------------------------------------------
Class to manage collective information on a group of antennas.
Attributes:
antennas: [Dictionary] Dictionary consisting of keys which hold instances
of class Antenna. The keys themselves are identical to the
label attributes of the antenna instances they hold.
latitude [Scalar] Latitude of the antenna array location.
longitude [Scalar] Longitude of the antenna array location.
blc [2-element Numpy array] The coordinates of the bottom left
corner of the array of antennas
trc [2-element Numpy array] The coordinates of the top right
corner of the array of antennas
grid_blc [2-element Numpy array] The coordinates of the bottom left
corner of the grid constructed for the array of antennas.
This may differ from blc due to any extra padding during the
gridding process.
grid_trc [2-element Numpy array] The coordinates of the top right
corner of the grid constructed for the array of antennas
This may differ from trc due to any extra padding during the
gridding process.
grid_ready [boolean] True if the grid has been created, False otherwise
gridu [Numpy array] u-locations of the grid lattice stored as 2D
array. It is the same for all frequencies and hence no third
dimension for the spectral axis.
gridv [Numpy array] v-locations of the grid lattice stored as 2D
array. It is the same for all frequencies and hence no third
dimension for the spectral axis.
antenna_autowts_set
[boolean] Indicates if auto-correlation of antenna-wise weights
have been determined (True) or not (False).
antenna_crosswts_set
[boolean] Indicates if zero-centered cross-correlation of
antenna pair weights have been determined (True) or not (False)
auto_corr_data
[dictionary] holds antenna auto-correlation of complex electric
field spectra. It is under keys 'current', 'stack' and 'avg'
for the current, stacked and time-averaged auto-correlations.
Under each of these keys is another dictionary with two keys
'P1' and 'P2' for the two polarizations. Under each of these
polarization keys is a dictionary with the following keys
and values:
'labels' [list of strings] Contains a list of antenna
labels
'E-fields' [numpy array] Contains time-averaged
auto-correlation of antenna electric fields. It is
of size n_tavg x nant x nchan
'twts' [numpy array] Contains number of unflagged electric
field spectra used in the averaging of antenna
auto-correlation spectra. It is of size
n_tavg x nant x 1
pairwise_typetag_crosswts_vuf
[dictionary] holds grid illumination wts (centered on grid
origin) obtained from cross-correlation of antenna pairs that
belong to their respective typetags. Tuples of typetag pairs
form the keys. Under each key is another dictionary with
keys 'last_updated', and 'P1' and 'P2' for each polarization.
Under 'last_updated' it stores the timestamp when the last
update took place for this typetag pair. Under each of the
polarization keys is a complex numpy array of size
nv x nu x nchan. It is obtained by correlating the aperture
illumination weights of one antenna type with the complex
conjugate of another.
antennas_center
[Numpy array] geometrical center of the antenna array locations
as a 2-element array of x- and y-values of the center. This is
not the center of mass of the antenna locations but simply the
mid-point between the extreme x- and y- coordinates of the
antennas
grid_illumination
[dictionary] Electric field illumination of antenna aperture
for each polarization held under keys 'P1' and 'P2'. Could be
complex. Stored as numpy arrays in the form of cubes with
same dimensions as gridu or gridv in the transverse (first two
dimensions) and the depth along the third dimension (spectral
axis) is equal to number of frequency channels
grid_Ef [dictionary] Complex Electric field projected on the grid
for each polarization under the keys 'P1' and P2'. Stored as
numpy arrays in the form of cubes with same dimensions as gridu
or gridv in the transverse (first two dimensions) and the depth
along the third dimension (spectral axis) is equal to number of
frequency channels.
f [Numpy array] Frequency channels (in Hz)
f0 [Scalar] Center frequency of the observing band (in Hz)
typetags [dictionary] Dictionary containing keys which are unique
antenna type tags. Under each of these type tag keys is a
set of antenna labels denoting antennas that are of that type
pairwise_typetags
[dictionary] Dictionary containing keys which are unique
pairwise combination (tuples) of antenna type tags. Under each
of these pairwise type tag keys is a dictionary with two keys
'auto' and 'cross' each of which contains a set of pairwise
(tuple) antenna labels denoting the antenna pairs that are of
that type. Under 'auto' are tuples with same antennas while
under 'cross' it contains antenna pairs in which the antennas
are not the same. The 'auto' key exists only when antenna
type tag tuple contains both antennas of same type.
antenna_pair_to_typetag
[dictionary] Dictionary containing antenna pair keys and the
corresponding values are typetag pairs.
timestamp: [Scalar] String or float representing the timestamp for the
current attributes
timestamps [list] list of all timestamps to be held in the stack
tbinsize [scalar or dictionary] Contains bin size of timestamps while
averaging after stacking. Default = None means all antenna
E-field auto-correlation spectra over all timestamps are
averaged. If scalar, the same (positive) value applies to all
polarizations. If dictionary, timestamp bin size (positive) in
seconds is provided under each key 'P1' and 'P2'. If any of
the keys is missing the auto-correlated antenna E-field spectra
for that polarization are averaged over all timestamps.
grid_mapper [dictionary] antenna-to-grid mapping information for each of
the two polarizations under keys 'P1' and 'P2'. Under each
polarization, it is a dictionary with values under the following
keys:
'refind' [list] each element in the list corresponds to a
sequential frequency channel and is another list
with indices to the lookup locations that map to
the grid locations (indices in 'gridind') for this
frequency channel. These indices index the array
in 'refwts'
'gridind' [list] each element in the list corresponds to a
sequential frequency channel and is another list
with indices to the grid locations that map to
the lookup locations (indices in 'refind') for
this frequency channel.
'refwts' [numpy array] antenna weights of size
n_ant x n_wts flattened to be a vector. Indices in
'refind' index to this array. Currently only valid
when lookup weights scale with frequency.
'labels' [dictionary] contains mapping information from
antenna (specified by key which is the
antenna label). The value under each label
key is another dictionary with the following keys
and information:
'twts' [scalar] if positive, indicates
the number of timestamps that
have gone into the measurement of Ef
made by the antenna under the
specific polarization. If zero, it
indicates no unflagged timestamp data
was found for the antenna and will
not contribute to the complex grid
illumination and electric fields
'gridind' [numpy vector] one-dimensional index
into the three-dimensional grid
locations where the antenna
contributes illumination and
electric fields. The one-dimensional
indices are obtained using numpy's
ravel_multi_index() using the grid
shape, n_u x n_v x nchan
'illumination' [numpy vector] complex grid
illumination contributed by the
antenna to different grid
locations in 'gridind'. It is
mapped to the grid as specified by
indices in key 'gridind'
'Ef' [numpy vector] complex grid
electric fields contributed by the
antenna. It is mapped to the
grid as specified by indices in
key 'gridind'
'ant' [dictionary] dictionary with information on
contribution of all antenna lookup weights. This
contains another dictionary with the following
keys:
'ind_freq' [list] each element in the list is
for a frequency channel and
consists of a numpy vector which
consists of indices of the
contributing antennas
'ind_all' [numpy vector] consists of numpy
vector which consists of indices
of the contributing antennas
for all frequencies appended
together. Effectively, this is just
values in 'ind_freq' of all
frequencies appended together.
'uniq_ind_all' [numpy vector] consists of numpy
vector which consists of unique
indices of contributing antennas
for all frequencies.
'rev_ind_all' [numpy vector] reverse indices of
'ind_all' with reference to bins of
'uniq_ind_all'
'illumination' [numpy vector] complex grid
illumination weights contributed by
each antenna (including associated
kernel weight locations) and has a
size equal to that in 'ind_all'
'grid' [dictionary] contains information about populated
portions of the grid. It consists of values in the
following keys:
'ind_all' [numpy vector] indices of all grid
locations raveled to one dimension
from three dimensions of size
n_u x n_v x nchan
'per_ant2grid'
[list] each element in the list is a dictionary
corresponding to an antenna with information on
its mapping and contribution to the grid. Each
dictionary has the following keys and values:
'label' [string] antenna label
'f_gridind' [numpy array] mapping information
with indices to the frequency axis
of the grid
'u_gridind' [numpy array] mapping information
with indices to the u-axis
of the grid. Must be of same size
as array under 'f_gridind'
'v_gridind' [numpy array] mapping information
with indices to the v-axis
of the grid. Must be of same size
as array under 'f_gridind'
'per_ant_per_freq_norm_wts'
[numpy array] mapping information
on the (complex) normalizing
multiplicative factor required to
make the sum of illumination/weights
per antenna per frequency on the
grid equal to unity. Must be of same
size as array under 'f_gridind'
'illumination' [numpy array] Complex aperture
illumination/weights contributed
by the antenna onto the grid. The
grid pixels to which it contributes
is given by 'f_gridind', 'u_gridind',
'v_gridind'. Must be of same size
as array under 'f_gridind'
'Ef' [numpy array] Complex electric fields
contributed by the antenna onto the
grid. The grid pixels to which it
contributes is given by 'f_gridind',
'u_gridind', 'v_gridind'. Must be of
same size as array under 'f_gridind'
'all_ant2grid'
[dictionary] contains the combined information of
mapping of all antennas to the grid. It consists of
the following keys and values:
'antind' [numpy array] all antenna indices (to
attribute ordered labels) that map to
the uvf-grid
'u_gridind' [numpy array] all indices to the
u-axis of the uvf-grid mapped to by
all antennas whose indices are given
in key 'antind'. Must be of same size
as the array under key 'antind'
'v_gridind' [numpy array] all indices to the
v-axis of the uvf-grid mapped to by
all antennas whose indices are given
in key 'antind'. Must be of same size
as the array under key 'antind'
'f_gridind' [numpy array] all indices to the
f-axis of the uvf-grid mapped to by
all antennas whose indices are given
in key 'antind'. Must be of same size
as the array under key 'antind'
'indNN_list' [list of lists] Each item in the top
level list corresponds to an antenna
in the same order as in the attribute
ordered_labels. Each of these items
is another list consisting of the
unraveled grid indices it contributes
to. The unraveled indices are what
are used to obtain the u-, v- and f-
indices in the grid using a
conversion assuming f is the
first axis, v is the second and u is
the third
'illumination' [numpy array] complex values of
aperture illumination contributed by
all antennas to the grid. The antenna
indices are in 'antind' and the grid
indices are in 'u_gridind',
'v_gridind' and 'f_gridind'. Must be
of same size as these indices
'per_ant_per_freq_norm_wts'
[numpy array] mapping information
on the (complex) normalizing
multiplicative factor required to
make the sum of illumination/weights
per antenna per frequency on the
grid equal to unity. This is appended
for all antennas together. Must be of
same size as array under
'illumination'
'Ef' [numpy array] Complex electric fields
contributed by all antennas onto the
grid. The grid pixels to which it
contributes is given by 'f_gridind',
'u_gridind', 'v_gridind'. Must be of
same size as array under 'f_gridind'
and 'illumination'
ant2grid_mapper
[sparse matrix] contains the antenna array to grid mapping
information in sparse matrix format. When converted to a dense
array, it will have dimensions nrows equal to size of the 3D
cube and ncols equal to number of electric field spectra of all
antennas over all channels. In other words,
nrows = nu x nv x nchan and ncols = n_ant x nchan. Dot product
of this matrix with flattened electric field spectra or antenna
weights will give the 3D cubes of gridded electric fields and
antenna array illumination respectively (see the illustrative
sketch following this docstring)
Member Functions:
__init__() Initializes an instance of class AntennaArray which
manages information about an array of antennas.
__str__() Prints a summary of current attributes
__add__() Operator overloading for adding antenna(s)
__radd__() Operator overloading for adding antenna(s)
__sub__() Operator overloading for removing antenna(s)
pairTypetags() Combine antenna typetags to create pairwise typetags for
antenna pairs and update attribute pairwise_typetags
add_antennas() Routine to add antenna(s) to the antenna array instance.
A wrapper for operator overloading __add__() and
__radd__()
remove_antennas() Routine to remove antenna(s) from the antenna array
instance. A wrapper for operator overloading __sub__()
grid() Routine to produce a grid based on the antenna array
grid_convolve() Routine to project the electric field illumination pattern
and the electric fields on the grid. It can operate on the
entire antenna array or incrementally project the electric
fields and illumination patterns from specific antennas on
to an already existing grid.
grid_convolve_new()
Routine to project the electric field illumination pattern
and the electric fields on the grid.
genMappingMatrix()
Routine to construct sparse antenna-to-grid mapping matrix
that will be used in projecting illumination and electric
fields from the array of antennas onto the grid. It has
elements very common to grid_convolve_new()
applyMappingMatrix()
Constructs the grid of complex field illumination and
electric fields using the sparse antenna-to-grid mapping
matrix. Intended to serve as a "matrix" alternative to
make_grid_cube_new()
grid_unconvolve() Routine to de-project the electric field illumination
pattern and the electric fields on the grid. It can
operate on the entire antenna array or incrementally
de-project the electric fields and illumination patterns
from specific antennas from an already existing grid.
get_E_fields() Routine to return the antenna labels, time-based weight
flags and electric fields (sorted by antenna label if
specified) based on selection criteria specified by flags,
timestamps, frequency channels, labels and data pool (most
recent or stack)
make_grid_cube() Constructs the grid of complex field illumination and
electric fields using the gridding information determined
for every antenna. Flags are taken into account while
constructing this grid.
make_grid_cube_new()
Constructs the grid of complex field illumination and
electric fields using the gridding information determined
for every antenna. Flags are taken into account while
constructing this grid.
evalAntennaPairCorrWts()
Evaluate correlation of pair of antenna illumination
weights on grid. It will be computed only if it was not
computed or stored in attribute
pairwise_typetag_crosswts_vuf earlier
evalAntennaPairPBeam()
Evaluate power pattern response on sky of an antenna pair
avgAutoCorr() Accumulates and averages auto-correlation of electric
fields of individual antennas under each polarization
evalAutoCorr() Estimates antenna-wise E-field auto-correlations under
both polarizations. It can be for the most recent
timestamp, stacked or averaged along timestamps.
evalAntennaAutoCorrWts()
Evaluate auto-correlation of aperture illumination of
each antenna on the UVF-plane
evalAllAntennaPairCorrWts()
Evaluate zero-centered cross-correlation of aperture
illumination of each antenna pair on the UVF-plane
makeAutoCorrCube()
Constructs the grid of antenna aperture illumination
auto-correlation using the gridding information
determined for every antenna. Flags are taken into
account while constructing this grid
makeCrossCorrWtsCube()
Constructs the grid of zero-centered cross-correlation
of antenna aperture pairs using the gridding information
determined for every antenna. Flags are taken into account
while constructing this grid
quick_beam_synthesis()
A quick generator of synthesized beam using antenna array
field illumination pattern using the center frequency. Not
intended to be used rigorously but rather for comparison
purposes and making quick plots
update(): Updates the antenna array instance with newer attribute
values
save(): Saves the antenna array information to disk.
Read the member function docstrings for details.
----------------------------------------------------------------------------
"""
def __init__(self, antenna_array=None):
"""
------------------------------------------------------------------------
Initialize the AntennaArray Class which manages information about an
array of antennas.
Class attributes initialized are:
antennas, blc, trc, gridu, gridv, grid_ready, timestamp,
grid_illumination, grid_Ef, f, f0, t, ordered_labels, grid_mapper,
antennas_center, latitude, longitude, tbinsize, auto_corr_data,
antenna_autowts_set, typetags, pairwise_typetags, antenna_crosswts_set,
pairwise_typetag_crosswts_vuf, antenna_pair_to_typetag
Read docstring of class AntennaArray for details on these attributes.
Inputs:
antenna_array
[Instance of class AntennaArray, dictionary holding
instance(s) of class Antenna, list of instances
of class Antenna, or a single instance of class Antenna]
Read docstring of member function __add__() for more details
on this input. If provided, this will be used to initialize
the instance.
------------------------------------------------------------------------
"""
self.antennas = {}
self.blc = NP.zeros(2)
self.trc = NP.zeros(2)
self.grid_blc = NP.zeros(2)
self.grid_trc = NP.zeros(2)
self.gridu, self.gridv = None, None
self.antennas_center = NP.zeros(2, dtype=NP.float).reshape(1,-1)
self.grid_ready = False
self.grid_illumination = {}
self.grid_Ef = {}
self.caldata = {}
self.latitude = None
self.longitude = None
self.f = None
self.f0 = None
self.t = None
self.timestamp = None
self.timestamps = []
self.typetags = {}
self.pairwise_typetags = {}
self.antenna_pair_to_typetag = {}
self.auto_corr_data = {}
self.pairwise_typetag_crosswts_vuf = {}
self.antenna_autowts_set = False
self.antenna_crosswts_set = False
self._ant_contribution = {}
self.ordered_labels = [] # Usually output from member function antenna_positions() or get_E_fields()
self.grid_mapper = {}
self.ant2grid_mapper = {} # contains the sparse mapping matrix
for pol in ['P1', 'P2']:
self.grid_mapper[pol] = {}
self.grid_mapper[pol]['labels'] = {}
self.grid_mapper[pol]['refind'] = []
# self.grid_mapper[pol]['ant_ind'] = []
self.grid_mapper[pol]['gridind'] = []
self.grid_mapper[pol]['refwts'] = None
self.grid_mapper[pol]['ant'] = {}
self.grid_mapper[pol]['ant']['ind_freq'] = []
self.grid_mapper[pol]['ant']['ind_all'] = None
self.grid_mapper[pol]['ant']['uniq_ind_all'] = None
self.grid_mapper[pol]['ant']['rev_ind_all'] = None
self.grid_mapper[pol]['ant']['illumination'] = None
self.grid_mapper[pol]['grid'] = {}
self.grid_mapper[pol]['grid']['ind_all'] = None
self.grid_mapper[pol]['per_ant2grid'] = []
self.grid_mapper[pol]['all_ant2grid'] = {}
self.grid_illumination[pol] = None
self.grid_Ef[pol] = None
self._ant_contribution[pol] = {}
self.caldata[pol] = None
self.ant2grid_mapper[pol] = None
if antenna_array is not None:
self += antenna_array
self.f = NP.copy(self.antennas.itervalues().next().f)
self.f0 = NP.copy(self.antennas.itervalues().next().f0)
self.t = NP.copy(self.antennas.itervalues().next().t)
if self.latitude is None:
self.latitude = NP.copy(self.antennas.itervalues().next().latitude)
self.longitude = NP.copy(self.antennas.itervalues().next().longitude)
self.timestamp = copy.deepcopy(self.antennas.itervalues().next().timestamp)
self.timestamps += [copy.deepcopy(self.timestamp)]
############################################################################
def __add__(self, others):
"""
------------------------------------------------------------------------
Operator overloading for adding antenna(s)
Inputs:
others [Instance of class AntennaArray, dictionary holding
instance(s) of class Antenna, list of instances of class
Antenna, or a single instance of class Antenna] If a
dictionary is provided, the keys should be the antenna
labels and the values should be instances of class Antenna.
If a list is provided, it should be a list of valid instances
of class Antenna. These instance(s) of class Antenna will be
added to the existing instance of AntennaArray class.
------------------------------------------------------------------------
"""
retval = self
if isinstance(others, AntennaArray):
# for k,v in others.antennas.items():
for k,v in others.antennas.iteritems():
if k in retval.antennas:
print "Antenna {0} already included in the list of antennas.".format(k)
print "For updating, use the update() method. Ignoring antenna {0}".format(k)
else:
retval.antennas[k] = v
if v.typetag not in retval.typetags:
retval.typetags[v.typetag] = {v.label}
else:
retval.typetags[v.typetag].add(v.label)
print 'Antenna "{0}" added to the list of antennas.'.format(k)
if retval.latitude is None:
retval.latitude = others.latitude
retval.longitude = others.longitude
elif isinstance(others, dict):
# for item in others.values():
for item in others.itervalues():
if isinstance(item, Antenna):
if item.label in retval.antennas:
print "Antenna {0} already included in the list of antennas.".format(item.label)
print "For updating, use the update() method. Ignoring antenna {0}".format(item.label)
else:
retval.antennas[item.label] = item
if item.typetag not in retval.typetags:
retval.typetags[item.typetag] = {item.label}
else:
retval.typetags[item.typetag].add(item.label)
print 'Antenna "{0}" added to the list of antennas.'.format(item.label)
if retval.latitude is None:
retval.latitude = item.latitude
retval.longitude = item.longitude
elif isinstance(others, list):
for i in range(len(others)):
if isinstance(others[i], Antenna):
if others[i].label in retval.antennas:
print "Antenna {0} already included in the list of antennas.".format(others[i].label)
print "For updating, use the update() method. Ignoring antenna {0}".format(others[i].label)
else:
retval.antennas[others[i].label] = others[i]
if others[i].typetag not in retval.typetags:
retval.typetags[others[i].typetag] = {others[i].label}
else:
retval.typetags[others[i].typetag].add(others[i].label)
print 'Antenna "{0}" added to the list of antennas.'.format(others[i].label)
else:
print 'Element # {0} is not an instance of class Antenna.'.format(i)
if retval.latitude is None:
retval.latitude = others[i].latitude
retval.longitude = others[i].longitude
elif isinstance(others, Antenna):
if others.label in retval.antennas:
print "Antenna {0} already included in the list of antennas.".format(others.label)
print "For updating, use the update() method. Ignoring antenna {0}".format(others.label)
else:
retval.antennas[others.label] = others
if others.typetag not in retval.typetags:
retval.typetags[others.typetag] = {others.label}
else:
retval.typetags[others.typetag].add(others.label)
print 'Antenna "{0}" added to the list of antennas.'.format(others.label)
if retval.latitude is None:
retval.latitude = others.latitude
retval.longitude = others.longitude
else:
print 'Input(s) is/are not instance(s) of class Antenna.'
return retval
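# Illustrative usage sketch (hypothetical labels and objects): antennas can be
# accumulated with the overloaded + operator or via add_antennas().
#
# aar = AntennaArray()
# aar = aar + ant1              # ant1 is an instance of class Antenna
# aar = aar + [ant2, ant3]      # list of Antenna instances
# aar = aar + {'A2': ant2}      # dictionary keyed by antenna labels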
############################################################################
def __radd__(self, others):
"""
------------------------------------------------------------------------
Operator overloading for adding antenna(s)
Inputs:
others [Instance of class AntennaArray, dictionary holding
instance(s) of class Antenna, list of instances of class
Antenna, or a single instance of class Antenna] If a
dictionary is provided, the keys should be the antenna
labels and the values should be instances of class Antenna.
If a list is provided, it should be a list of valid
instances of class Antenna. These instance(s) of class
Antenna will be added to the existing instance of
AntennaArray class.
------------------------------------------------------------------------
"""
return self.__add__(others)
############################################################################
def __sub__(self, others):
"""
------------------------------------------------------------------------
Operator overloading for removing antenna(s)
Inputs:
others [Instance of class AntennaArray, dictionary holding
instance(s) of class Antenna, list of instances of class
Antenna, list of strings containing antenna labels or a
single instance of class Antenna] If a dictionary is
provided, the keys should be the antenna labels and the
values should be instances of class Antenna. If a list is
provided, it should be a list of valid instances of class
Antenna. These instance(s) of class Antenna will be removed
from the existing instance of AntennaArray class.
------------------------------------------------------------------------
"""
retval = self
if isinstance(others, dict):
for item in others.values():
if isinstance(item, Antenna):
if item.label not in retval.antennas:
print "Antenna {0} does not exist in the list of antennas.".format(item.label)
else:
del retval.antennas[item.label]
retval.typetags[item.typetag].remove(item.label)
print 'Antenna "{0}" removed from the list of antennas.'.format(item.label)
elif isinstance(others, list):
for i in range(0,len(others)):
if isinstance(others[i], str):
if others[i] in retval.antennas:
retval.typetags[retval.antennas[others[i]].typetag].remove(others[i])
del retval.antennas[others[i]]
print 'Antenna {0} removed from the list of antennas.'.format(others[i])
elif isinstance(others[i], Antenna):
if others[i].label in retval.antennas:
retval.typetags[others[i].typetag].remove(others[i].label)
del retval.antennas[others[i].label]
print 'Antenna {0} removed from the list of antennas.'.format(others[i].label)
else:
print "Antenna {0} does not exist in the list of antennas.".format(others[i].label)
else:
print 'Element # {0} has no matches in the list of antennas.'.format(i)
elif others in retval.antennas:
retval.typetags[retval.antennas[others].typetag].remove(others)
del retval.antennas[others]
print 'Antenna "{0}" removed from the list of antennas.'.format(others)
elif isinstance(others, Antenna):
if others.label in retval.antennas:
retval.typetags[others.typetag].remove(others.label)
del retval.antennas[others.label]
print 'Antenna "{0}" removed from the list of antennas.'.format(others.label)
else:
print "Antenna {0} does not exist in the list of antennas.".format(others.label)
else:
print 'No matches found in existing list of antennas.'
return retval
############################################################################
def add_antennas(self, A=None):
"""
------------------------------------------------------------------------
Routine to add antenna(s) to the antenna array instance. A wrapper for
operator overloading __add__() and __radd__()
Inputs:
A [Instance of class AntennaArray, dictionary holding
instance(s) of class Antenna, list of instances of class
Antenna, or a single instance of class Antenna] If a
dictionary is provided, the keys should be the antenna
labels and the values should be instances of class Antenna.
If a list is provided, it should be a list of valid
instances of class Antenna. These instance(s) of class
Antenna will be added to the existing instance of
AntennaArray class.
------------------------------------------------------------------------
"""
if A is None:
print 'No antenna(s) supplied.'
elif isinstance(A, (list, Antenna, dict, AntennaArray)):
self = self.__add__(A)
else:
print 'Input(s) is/are not instance(s) of class Antenna.'
############################################################################
def remove_antennas(self, A=None):
"""
------------------------------------------------------------------------
Routine to remove antenna(s) from the antenna array instance. A wrapper
for operator overloading __sub__()
Inputs:
A [Instance of class AntennaArray, dictionary holding
instance(s) of class Antenna, list of instances of class
Antenna, or a single instance of class Antenna] If a
dictionary is provided, the keys should be the antenna
labels and the values should be instances of class Antenna.
If a list is provided, it should be a list of
valid instances of class Antenna. These instance(s) of class
Antenna will be removed from the existing instance of
AntennaArray class.
------------------------------------------------------------------------
"""
if A is None:
print 'No antenna specified for removal.'
else:
self = self.__sub__(A)
############################################################################
def pairTypetags(self):
"""
------------------------------------------------------------------------
Combine antenna typetags to create pairwise typetags for antenna pairs
and update attribute pairwise_typetags
------------------------------------------------------------------------
"""
typekeys = self.typetags.keys()
pairwise_typetags = {}
for i in range(len(typekeys)):
labels1 = list(self.typetags[typekeys[i]])
for j in range(i,len(typekeys)):
labels2 = list(self.typetags[typekeys[j]])
pairwise_typetags[(typekeys[i],typekeys[j])] = {}
if i == j:
pairwise_typetags[(typekeys[i],typekeys[j])]['auto'] = set([(l1,l1) for l1 in labels1])
pairwise_typetags[(typekeys[i],typekeys[j])]['cross'] = set([(l1,l2) for i1,l1 in enumerate(labels1) for i2,l2 in enumerate(labels2) if i1 < i2])
else:
pairwise_typetags[(typekeys[i],typekeys[j])]['cross'] = set([(l1,l2) for l1 in labels1 for l2 in labels2])
self.pairwise_typetags = pairwise_typetags
self.antenna_pair_to_typetag = {}
for k,val in pairwise_typetags.iteritems():
for subkey in val:
for v in list(val[subkey]):
self.antenna_pair_to_typetag[v] = k
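# Illustrative sketch (hypothetical typetags) of the structure produced by
# pairTypetags(): for typetag 'T1' (antennas 'A0', 'A1') and typetag 'T2'
# (antenna 'B0'), the attribute pairwise_typetags would resemble
#
# {('T1','T1'): {'auto': {('A0','A0'), ('A1','A1')},
#                'cross': {('A0','A1')}},
#  ('T1','T2'): {'cross': {('A0','B0'), ('A1','B0')}}}
#
# and antenna_pair_to_typetag would map, e.g., ('A0','B0') -> ('T1','T2')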
############################################################################
def antenna_positions(self, pol=None, flag=None, sort=True,
centering=False):
"""
------------------------------------------------------------------------
Routine to return the antenna label and position vectors (sorted by
antenna label if specified)
Keyword Inputs:
pol [string] select positions of this polarization that are either
flagged or unflagged as specified by input parameter flag.
Allowed values are 'P1' and 'P2'. Default=None.
This means all positions are returned irrespective of the flags
flag [boolean] If False, return unflagged positions, otherwise
return flagged ones. Default=None means return all positions
independent of flagging or polarization
sort [boolean] If True, returned antenna information is sorted
by antenna label. Default = True.
centering
[boolean] If False (default), does not subtract the mid-point
between the bottom left corner and the top right corner. If
True, subtracts the mid-point and makes it the origin
Output:
outdict [dictionary] Output consists of a dictionary with the following
keys and information:
'labels': list of strings of antenna labels
'positions': position vectors of antennas (3-column
array)
------------------------------------------------------------------------
"""
if not isinstance(sort, bool):
raise TypeError('sort keyword has to be a Boolean value.')
if flag is not None:
if not isinstance(flag, bool):
raise TypeError('flag keyword has to be a Boolean value.')
if pol is None:
if sort: # sort by antenna label
xyz = NP.asarray([[self.antennas[label].location.x, self.antennas[label].location.y, self.antennas[label].location.z] for label in sorted(self.antennas.keys())])
labels = sorted(self.antennas.keys())
else:
xyz = NP.asarray([[self.antennas[label].location.x, self.antennas[label].location.y, self.antennas[label].location.z] for label in self.antennas.keys()])
labels = self.antennas.keys()
else:
if not isinstance(pol, str):
raise TypeError('Input parameter must be a string')
if pol not in ['P1', 'P2']:
raise ValueError('Invalid specification for input parameter pol')
if sort: # sort by antenna label
if flag is None: # get all positions
xyz = NP.asarray([[self.antennas[label].location.x, self.antennas[label].location.y, self.antennas[label].location.z] for label in sorted(self.antennas.keys())])
labels = sorted(self.antennas.keys())
else:
if flag: # get flagged positions
xyz = NP.asarray([[self.antennas[label].location.x, self.antennas[label].location.y, self.antennas[label].location.z] for label in sorted(self.antennas.keys()) if self.antennas[label].antpol.flag[pol]])
labels = [label for label in sorted(self.antennas.keys()) if self.antennas[label].antpol.flag[pol]]
else: # get unflagged positions
xyz = NP.asarray([[self.antennas[label].location.x, self.antennas[label].location.y, self.antennas[label].location.z] for label in sorted(self.antennas.keys()) if not self.antennas[label].antpol.flag[pol]])
labels = [label for label in sorted(self.antennas.keys()) if not self.antennas[label].antpol.flag[pol]]
else: # no sorting
if flag is None: # get all positions
xyz = NP.asarray([[self.antennas[label].location.x, self.antennas[label].location.y, self.antennas[label].location.z] for label in self.antennas.keys()])
labels = [label for label in self.antennas.keys()]
else:
if flag: # get flagged positions
xyz = NP.asarray([[self.antennas[label].location.x, self.antennas[label].location.y, self.antennas[label].location.z] for label in self.antennas.keys() if self.antennas[label].antpol.flag[pol]])
labels = [label for label in self.antennas.keys() if self.antennas[label].antpol.flag[pol]]
else: # get unflagged positions
xyz = NP.asarray([[self.antennas[label].location.x, self.antennas[label].location.y, self.antennas[label].location.z] for label in self.antennas.keys() if not self.antennas[label].antpol.flag[pol]])
labels = [label for label in self.antennas.keys() if not self.antennas[label].antpol.flag[pol]]
if centering:
xyzcenter = 0.5 * (NP.amin(xyz, axis=0, keepdims=True) + NP.amax(xyz, axis=0, keepdims=True))
xyz = xyz - xyzcenter
self.antennas_center = xyzcenter[0,:2].reshape(1,-1)
outdict = {}
outdict['labels'] = labels
outdict['positions'] = xyz
return outdict
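# Illustrative usage sketch (hypothetical antenna array instance aar):
#
# posinfo = aar.antenna_positions(pol='P1', flag=False, sort=True,
#                                 centering=True)
# labels = posinfo['labels']     # antenna labels sorted by label
# xyz = posinfo['positions']     # (n_ant x 3) positions relative to mid-point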
############################################################################
def get_E_fields_old(self, pol, flag=None, sort=True):
"""
------------------------------------------------------------------------
Routine to return the antenna label and Electric fields (sorted by
antenna label if specified)
Keyword Inputs:
pol [string] select antenna positions of this polarization that are
either flagged or unflagged as specified by input parameter
flag. Allowed values are 'P1' and 'P2'. Only one of these
values must be specified.
flag [boolean] If False, return electric fields of unflagged
antennas, otherwise return flagged ones. Default=None means
all electric fields independent of flagging are returned.
sort [boolean] If True, returned antenna information is sorted
by antenna label. Default = True.
Output:
outdict [dictionary] Output consists of a dictionary with the following
keys and information:
'labels': Contains a numpy array of strings of antenna
labels
'E-fields': measured electric fields (n_ant x nchan array)
------------------------------------------------------------------------
"""
try:
pol
except NameError:
raise NameError('Input parameter pol must be specified.')
if not isinstance(pol, str):
raise TypeError('Input parameter must be a string')
if not pol in ['P1', 'P2']:
raise ValueError('Invalid specification for input parameter pol')
if not isinstance(sort, bool):
raise TypeError('sort keyword has to be a Boolean value.')
if flag is not None:
if not isinstance(flag, bool):
raise TypeError('flag keyword has to be a Boolean value.')
if sort: # sort by first antenna label
if flag is None: # get all antenna positions
efields = NP.asarray([self.antennas[label].antpol.Ef[pol] for label in sorted(self.antennas.keys(), key=lambda tup: tup[0])])
labels = [label for label in sorted(self.antennas.keys(), key=lambda tup: tup[0])]
else:
if flag: # get flagged antenna positions
efields = NP.asarray([self.antennas[label].antpol.Ef[pol] for label in sorted(self.antennas.keys(), key=lambda tup: tup[0]) if self.antennas[label].antpol.flag[pol]])
labels = [label for label in sorted(self.antennas.keys(), key=lambda tup: tup[0]) if self.antennas[label].antpol.flag[pol]]
else: # get unflagged antenna positions
efields = NP.asarray([self.antennas[label].antpol.Ef[pol] for label in sorted(self.antennas.keys(), key=lambda tup: tup[0]) if not self.antennas[label].antpol.flag[pol]])
labels = [label for label in sorted(self.antennas.keys(), key=lambda tup: tup[0]) if not self.antennas[label].antpol.flag[pol]]
else: # no sorting
if flag is None:
efields = NP.asarray([self.antennas[label].antpol.Ef[pol] for label in self.antennas.keys()])
labels = [label for label in self.antennas.keys()]
else:
if flag: # get flagged antenna positions
efields = NP.asarray([self.antennas[label].antpol.Ef[pol] for label in self.antennas.keys() if self.antennas[label].antpol.flag[pol]])
labels = [label for label in self.antennas.keys() if self.antennas[label].antpol.flag[pol]]
else: # get unflagged antenna positions
efields = NP.asarray([self.antennas[label].antpol.Ef[pol] for label in self.antennas.keys() if not self.antennas[label].antpol.flag[pol]])
labels = [label for label in self.antennas.keys() if not self.antennas[label].antpol.flag[pol]]
outdict = {}
outdict['labels'] = labels
outdict['E-fields'] = efields
return outdict
############################################################################
def get_E_fields(self, pol, flag=None, tselect=None, fselect=None,
aselect=None, datapool=None, sort=True):
"""
------------------------------------------------------------------------
Routine to return the antenna labels, time-based weight flags and
electric fields (sorted by antenna label if specified) based on
selection criteria specified by flags, timestamps, frequency channels,
labels and data pool (most recent or stack)
Keyword Inputs:
pol [string] select baselines of this polarization that are either
flagged or unflagged as specified by input parameter flag.
Allowed values are 'P1' and 'P2'. Only one of these values
must be specified.
flag [boolean] If False, return electric fields of unflagged
antennas, otherwise return flagged ones. Default=None means
all electric fields independent of flagging are returned.
tselect [scalar, list, numpy array] timestamp index for electric
fields selection. For most recent electric fields, it must
be set to -1. For all other selections, indices in tselect
must be in the valid range of indices along time axis for
stacked electric fields. Default=None means most recent data
is selected.
fselect [scalar, list, numpy array] frequency channel index for
electric fields selection. Indices must be in the valid range
of indices along the frequency axis for electric fields.
Default=None selects all frequency channels
aselect [list of strings] labels of antennas to select. If set
to None (default) all antennas are selected.
datapool [string] denotes the data pool from which electric fields are
to be selected. Accepted values are 'current', 'stack' and
None (default, same as 'current'). If set to None or
'current', the value in tselect is ignored and only
electric fields of the most recent timestamp are selected. If
set to None or 'current' the attribute Ef_stack is checked
first and if unavailable, attribute antpol.Ef is used. For
'stack' attribute Ef_stack is used
sort [boolean] If True, returned antenna information is sorted
by antenna label. Default = True.
Output:
outdict [dictionary] Output consists of a dictionary with the following
keys and information:
'labels' [list of strings] Contains a list of antenna
labels
'E-fields' [list or numpy array] antenna electric fields
under the specified polarization. In general,
it is a list of numpy arrays where each
array in the list corresponds to
an individual antenna and the size of
each numpy array is n_ts x nchan. If input
keyword flag is set to None, the electric
fields are rearranged into a numpy array of
size n_ts x n_ant x nchan.
'twts' [list or numpy array] weights along time axis
under the specified polarization. In general
it is a list of numpy arrays where each array
in the list corresponds to an individual
antenna and the size of each array is n_ts x 1.
If input keyword flag is set to None, the
time weights are rearranged into a numpy array
of size n_ts x n_ant x 1
------------------------------------------------------------------------
"""
if not isinstance(sort, bool):
raise TypeError('sort keyword has to be a Boolean value.')
if aselect is None:
labels = self.antennas.keys()
elif isinstance(aselect, list):
labels = [label for label in aselect if label in self.antennas]
else:
raise TypeError('aselect must be None or a list of antenna labels')
if sort:
labels = sorted(labels)
efinfo = [self.antennas[label].get_E_fields(pol, flag=flag, tselect=tselect, fselect=fselect, datapool=datapool) for label in labels]
outdict = {}
outdict['labels'] = labels
outdict['twts'] = [einfo['twts'] for einfo in efinfo]
outdict['E-fields'] = [einfo['E-fields'] for einfo in efinfo]
if flag is None:
outdict['E-fields'] = NP.swapaxes(NP.asarray(outdict['E-fields']), 0, 1)
outdict['twts'] = NP.swapaxes(NP.asarray(outdict['twts']), 0, 1)
outdict['twts'] = outdict['twts'][:,:,NP.newaxis]
return outdict
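# Illustrative usage sketch (hypothetical antenna array instance aar): select
# the stacked E-field spectra of two antennas over all timestamps and channels.
#
# efinfo = aar.get_E_fields('P1', flag=None,
#                           tselect=NP.arange(len(aar.timestamps)),
#                           fselect=None, aselect=['A0', 'A1'],
#                           datapool='stack', sort=True)
# Ef = efinfo['E-fields']        # n_ts x n_ant x nchan since flag is None
# twts = efinfo['twts']          # n_ts x n_ant x 1 time-based weights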
############################################################################
def avgAutoCorr(self, pol=None, tbinsize=None):
"""
------------------------------------------------------------------------
Accumulates and averages auto-correlation of electric fields of
individual antennas under each polarization
Inputs:
pol [String] The polarization to be averaged. Can be set to 'P1' or
'P2'. If set to None, averaging for all the polarizations is
performed. Default=None
tbinsize [scalar or dictionary] Contains bin size of timestamps while
averaging. Default = None means all antenna E-field
auto-correlation spectra over all timestamps are averaged. If
scalar, the same (positive) value applies to all polarizations.
If dictionary, timestamp bin size (positive) in seconds is
provided under each key 'P1' and 'P2'. If any of the keys is
missing the auto-correlated antenna E-field spectra for that
polarization are averaged over all timestamps.
------------------------------------------------------------------------
"""
timestamps = NP.asarray(self.timestamps).astype(NP.float)
twts = {}
auto_corr_data = {}
tbsize = {} # accumulates per-polarization time bin sizes when tbinsize is a dictionary
if pol is None:
pol = ['P1', 'P2']
pol = NP.unique(NP.asarray(pol))
for p in pol:
Ef_info = self.get_E_fields(p, flag=None, tselect=NP.arange(len(self.timestamps)), fselect=None, aselect=None, datapool='stack', sort=True)
twts[p] = []
auto_corr_data[p] = {}
if tbinsize is None: # Average across all timestamps
auto_corr_data[p]['E-fields'] = NP.nansum(NP.abs(Ef_info['E-fields'])**2, axis=0, keepdims=True)
auto_corr_data[p]['twts'] = NP.sum(Ef_info['twts'], axis=0, keepdims=True).astype(NP.float)
auto_corr_data[p]['labels'] = Ef_info['labels']
self.tbinsize = tbinsize
elif isinstance(tbinsize, (int,float)): # Apply same time bin size to all polarizations
split_ind = NP.searchsorted(timestamps, NP.arange(timestamps.min()+tbinsize, timestamps.max(), tbinsize)) # indices where timestamps cross bin boundaries
twts_split = NP.array_split(Ef_info['twts'], split_ind, axis=0)
Ef_split = NP.array_split(Ef_info['E-fields'], split_ind, axis=0)
for i in xrange(len(Ef_split)):
if 'E-fields' not in auto_corr_data[p]:
auto_corr_data[p]['E-fields'] = NP.nansum(NP.abs(Ef_split[i])**2, axis=0, keepdims=True)
auto_corr_data[p]['twts'] = NP.sum(twts_split[i], axis=0, keepdims=True).astype(NP.float)
else:
auto_corr_data[p]['E-fields'] = NP.vstack((auto_corr_data[p]['E-fields'], NP.nansum(NP.abs(Ef_split[i])**2, axis=0, keepdims=True)))
auto_corr_data[p]['twts'] = NP.vstack((auto_corr_data[p]['twts'], NP.sum(twts_split[i], axis=0, keepdims=True))).astype(NP.float)
auto_corr_data[p]['labels'] = Ef_info['labels']
self.tbinsize = tbinsize
elif isinstance(tbinsize, dict):
if p not in tbinsize:
auto_corr_data[p]['E-fields'] = NP.nansum(NP.abs(Ef_info['E-fields'])**2, axis=0, keepdims=True)
auto_corr_data[p]['twts'] = NP.sum(Ef_info['twts'], axis=0, keepdims=True).astype(NP.float)
tbsize[p] = None
elif tbinsize[p] is None:
auto_corr_data[p]['E-fields'] = NP.nansum(NP.abs(Ef_info['E-fields'])**2, axis=0, keepdims=True)
auto_corr_data[p]['twts'] = NP.sum(Ef_info['twts'], axis=0, keepdims=True).astype(NP.float)
tbsize[p] = None
elif isinstance(tbinsize[p], (int,float)):
split_ind = NP.searchsorted(timestamps, NP.arange(timestamps.min()+tbinsize[p], timestamps.max(), tbinsize[p])) # indices where timestamps cross bin boundaries
twts_split = NP.array_split(Ef_info['twts'], split_ind, axis=0)
Ef_split = NP.array_split(Ef_info['E-fields'], split_ind, axis=0)
for i in xrange(len(Ef_split)):
if 'E-fields' not in auto_corr_data[p]:
auto_corr_data[p]['E-fields'] = NP.nansum(NP.abs(Ef_split[i])**2, axis=0, keepdims=True)
auto_corr_data[p]['twts'] = NP.sum(twts_split[i], axis=0, keepdims=True).astype(NP.float)
else:
auto_corr_data[p]['E-fields'] = NP.vstack((auto_corr_data[p]['E-fields'], NP.nansum(NP.abs(Ef_split[i])**2, axis=0, keepdims=True)))
auto_corr_data[p]['twts'] = NP.vstack((auto_corr_data[p]['twts'], NP.sum(twts_split[i], axis=0, keepdims=True))).astype(NP.float)
tbsize[p] = tbinsize[p]
else:
raise ValueError('Input tbinsize is invalid')
auto_corr_data[p]['labels'] = Ef_info['labels']
self.tbinsize = tbsize
else:
raise ValueError('Input tbinsize is invalid')
auto_corr_data[p]['E-fields'] = NP.nan_to_num(auto_corr_data[p]['E-fields'] / auto_corr_data[p]['twts']) # nan_to_num() just in case there are NaN
self.auto_corr_data['avg'] = auto_corr_data
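# Illustrative sketch (hypothetical timestamps) of the time-binned averaging
# used above: timestamps are mapped to split indices with NP.searchsorted(),
# the stacked spectra are partitioned with NP.array_split(), and the
# squared-magnitude average is then taken inside each bin.
#
# timestamps = NP.asarray([0.0, 2.0, 4.0, 6.0, 8.0])
# tbinsize = 5.0
# split_ind = NP.searchsorted(timestamps, NP.arange(timestamps.min()+tbinsize, timestamps.max(), tbinsize))
# # split_ind -> [3], so the bins are timestamps[:3] and timestamps[3:]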
############################################################################
def evalAutoCorr(self, pol=None, datapool=None, tbinsize=None):
"""
------------------------------------------------------------------------
Estimates antenna-wise E-field auto-correlations under both
polarizations. It can be for the most recent timestamp, stacked or
averaged along timestamps.
Inputs:
pol [String] The polarization for which auto-correlation is to be
estimated. Can be set to 'P1' or 'P2'. If set to None,
auto-correlation is estimated for all the polarizations.
Default=None
datapool [string] denotes the data pool from which electric fields are
to be selected. Accepted values are 'current', 'stack', 'avg' or
None (default, same as 'current'). If set to None or
'current', the value in tselect is ignored and only
electric fields of the most recent timestamp are selected. If
set to 'avg', the auto-correlations from the stack are
averaged along the timestamps using time bin size specified
in tbinsize
tbinsize [scalar or dictionary] Contains bin size of timestamps while
averaging. Will be used only if datapool is set to 'avg'.
Default = None means all antenna E-field
auto-correlation spectra over all timestamps are averaged. If
scalar, the same (positive) value applies to all polarizations.
If dictionary, timestamp bin size (positive) in seconds is
provided under each key 'P1' and 'P2'. If any of the keys is
missing the auto-correlated antenna E-field spectra for that
polarization are averaged over all timestamps.
------------------------------------------------------------------------
"""
if datapool not in [None, 'current', 'stack', 'avg']:
raise ValueError('Input datapool must be set to None, "current", "stack" or "avg"')
if pol is None:
pol = ['P1', 'P2']
pol = NP.unique(NP.asarray(pol))
if datapool in [None, 'current']:
self.auto_corr_data['current'] = {}
for p in pol:
Ef_info = self.get_E_fields(p, flag=None, tselect=-1, fselect=None, aselect=None, datapool='current', sort=True)
Ef_info['E-fields'] = NP.abs(Ef_info['E-fields'])**2
self.auto_corr_data['current'][p] = Ef_info
if datapool in [None, 'stack']:
self.auto_corr_data['stack'] = {}
for p in pol:
Ef_info = self.get_E_fields(p, flag=None, tselect=NP.arange(len(self.timestamps)), fselect=None, aselect=None, datapool='stack', sort=True)
Ef_info['E-fields'] = NP.abs(Ef_info['E-fields'])**2
self.auto_corr_data['stack'][p] = Ef_info
if datapool in [None, 'avg']:
self.avgAutoCorr(pol=pol, tbinsize=tbinsize)
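# Illustrative usage sketch (hypothetical antenna array instance aar):
#
# aar.evalAutoCorr(pol='P1', datapool='avg', tbinsize=10.0)
# avg_ac = aar.auto_corr_data['avg']['P1']['E-fields'] # n_tavg x n_ant x nchan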
############################################################################
def FT(self, pol=None, parallel=False, nproc=None):
"""
------------------------------------------------------------------------
Computes the Fourier transform of the time series of the antennas in the
antenna array to compute the electric field spectra
------------------------------------------------------------------------
"""
if not parallel:
for label in self.antennas:
self.antennas[label].FX()
elif parallel or (nproc is not None):
if nproc is None:
nproc = max(MP.cpu_count()-1, 1)
else:
nproc = min(nproc, max(MP.cpu_count()-1, 1))
pool = MP.Pool(processes=nproc)
updated_antennas = pool.map(unwrap_antenna_FT, IT.izip(self.antennas.values()))
pool.close()
pool.join()
for antenna in updated_antennas:
self.antennas[antenna.label] = antenna
del updated_antennas
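# Illustrative sketch (hypothetical worker function) of the multiprocessing
# pattern used in FT() above: a module-level unwrapper lets Pool.map() invoke
# work on each item, and the results are collected back in order.
#
# import multiprocessing as MP
# def unwrap_square(arg):        # module-level so it can be pickled
#     return arg ** 2
# pool = MP.Pool(processes=2)
# results = pool.map(unwrap_square, [1, 2, 3])
# pool.close()
# pool.join()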
############################################################################
def grid(self, uvspacing=0.5, xypad=None, pow2=True):
"""
------------------------------------------------------------------------
Routine to produce a grid based on the antenna array
Inputs:
uvspacing [Scalar] Positive value indicating the maximum uv-spacing
desirable at the lowest wavelength (max frequency).
Default = 0.5
xypad [List] Padding to be applied around the antenna locations
before forming a grid. Units in meters. List elements should
be positive. If it is a one-element list, the element is
applicable to both x and y axes. If list contains three or
more elements, only the first two elements are considered
one for each axis. Default = None.
pow2 [Boolean] If set to True, the grid is forced to have a size
equal to the next power of 2 relative to the actual size required. If
False, gridding is done with the appropriate size as
determined by uvspacing. Default = True.
------------------------------------------------------------------------
"""
if self.f is None:
self.f = self.antennas.itervalues().next().f
if self.f0 is None:
self.f0 = self.antennas.itervalues().next().f0
wavelength = FCNST.c / self.f
min_lambda = NP.abs(wavelength).min()
# Change itervalues() to values() when porting to Python 3.x
# May have to change *blc and *trc with zip(*blc) and zip(*trc) when using Python 3.x
blc = NP.asarray([[self.antennas[label].blc[0,0], self.antennas[label].blc[0,1]] for label in self.antennas]).reshape(-1,2)
trc = NP.asarray([[self.antennas[label].trc[0,0], self.antennas[label].trc[0,1]] for label in self.antennas]).reshape(-1,2)
xycenter = 0.5 * (NP.amin(blc, axis=0, keepdims=True) + NP.amax(trc, axis=0, keepdims=True))
blc = blc - xycenter
trc = trc - xycenter
self.trc = NP.amax(NP.abs(NP.vstack((blc, trc))), axis=0).ravel() / min_lambda
self.blc = -1 * self.trc
self.antennas_center = xycenter
if xypad is None:
xypad = 0.0
self.gridu, self.gridv = GRD.grid_2d([(self.blc[0], self.trc[0]), (self.blc[1], self.trc[1])], pad=xypad/min_lambda, spacing=uvspacing, pow2=pow2)
self.grid_blc = NP.asarray([self.gridu.min(), self.gridv.min()])
self.grid_trc = NP.asarray([self.gridu.max(), self.gridv.max()])
self.grid_ready = True
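# Illustrative worked sketch (hypothetical numbers) of the grid extents set in
# grid() above: with antennas spanning +/- 50 m in x and y, a minimum
# wavelength of 2 m, and uvspacing of 0.5, the uv half-extent is 50/2 = 25
# wavelengths, requiring at least 2*25/0.5 = 100 grid cells per axis
# (rounded up to 128 when pow2 is True).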
############################################################################
def grid_convolve(self, pol=None, ants=None, unconvolve_existing=False,
normalize=False, method='NN', distNN=NP.inf, tol=None,
maxmatch=None, identical_antennas=True, cal_loop=False,
gridfunc_freq=None, mapping='weighted', wts_change=False,
parallel=False, nproc=None, pp_method='pool',
verbose=True):
"""
------------------------------------------------------------------------
Routine to project the complex illumination field pattern and the
electric fields on the grid. It can operate on the entire antenna array
or incrementally project the electric fields and complex illumination
field patterns from specific antennas on to an already existing grid.
(The latter is not implemented yet)
Inputs:
pol [String] The polarization to be gridded. Can be set to 'P1'
or 'P2'. If set to None, gridding for all the polarizations
is performed. Default = None
ants [instance of class AntennaArray, single instance or list
of instances of class Antenna, or a dictionary holding
instances of class Antenna] If a dictionary is provided,
the keys should be the antenna labels and the values
should be instances of class Antenna. If a list is
provided, it should be a list of valid instances of class
Antenna. These instance(s) of class Antenna will
be merged to the existing grid contained in the instance of
AntennaArray class. If ants is not provided (set to
None), the gridding operations will be performed on the
entire set of antennas contained in the instance of class
AntennaArray. Default = None.
unconvolve_existing
[Boolean] Default = False. If set to True, the effects of
gridding convolution contributed by the antenna(s)
specified will be undone before updating the antenna
measurements on the grid, if the antenna(s) is/are
already found to be in the set of antennas held by the
instance of AntennaArray. If False and if one or more
antenna instances specified are already found to be held
in the instance of class AntennaArray, the code will stop
raising an error indicating the gridding operation cannot
proceed.
normalize [Boolean] Default = False. If set to True, the gridded
weights are divided by the sum of weights so that the gridded
weights add up to unity. (Need to work on normalization)
method [string] The gridding method to be used in applying the
antenna weights on to the antenna array grid.
Accepted values are 'NN' (nearest neighbour - default), 'CS'
(cubic spline), or 'BL' (Bi-linear). In case of applying grid
weights by 'NN' method, an optional distance upper bound for
the nearest neighbour can be provided in the parameter distNN
to prune the search and make it efficient. Currently, only
the nearest neighbour method is operational.
distNN [scalar] A positive value indicating the upper bound on
distance to the nearest neighbour in the gridding process. It
has units of distance, the same units as the antenna
attribute location and antenna array attribute gridx
and gridy. Default is NP.inf (infinite distance). It will be
internally converted to have same units as antenna
attributes wtspos (units in number of wavelengths)
maxmatch [scalar] A positive value indicating maximum number of input
locations in the antenna grid to be assigned. Default = None.
If set to None, all the antenna array grid elements specified
are assigned values for each antenna. For instance, to have
only one antenna array grid element to be populated per
antenna, use maxmatch=1.
tol [scalar] If set, only lookup data with abs(val) > tol will be
considered for nearest neighbour lookup. Default=None implies
all lookup values will be considered for nearest neighbour
determination. tol is to be interpreted as a minimum value
considered as significant in the lookup table.
identical_antennas
[boolean] indicates if all antenna elements are to be
treated as identical. If True (default), they are identical
and their gridding kernels are identical. If False, they are
not identical and each one has its own gridding kernel.
cal_loop [boolean] If True, the calibration loop is assumed to be ON
and hence the calibrated electric fields are set in the
calibration loop. If False (default), the calibration loop is
assumed to be OFF and the current electric fields are assumed
to be the calibrated data to be mapped to the grid
via gridding convolution.
gridfunc_freq
[String scalar] If set to None (not provided) or to 'scale'
assumes that attribute wtspos is given for a
reference frequency which need to be scaled for the frequency
channels. Will be ignored if the number of elements of list
in this attribute under the specific polarization are the
same as the number of frequency channels.
mapping [string] indicates the type of mapping between antenna
locations and the grid locations. Allowed values are
'sampled' and 'weighted' (default). 'sampled' means only the
antenna measurement closest to a grid location contributes to
that grid location, whereas, 'weighted' means that all the
antennas contribute in a weighted fashion to their nearest
grid location. The former is faster but possibly discards
antenna data whereas the latter is slower but includes all
data along with their weights.
wts_change [boolean] indicates if weights and/or their locations have
changed from the previous integration or snapshot.
Default=False means they have not changed. In such a case the
antenna-to-grid mapping and grid illumination pattern do not
have to be determined, and mapping and values from the
previous snapshot can be used. If True, a new mapping has to
be determined.
parallel [boolean] specifies if parallelization is to be invoked.
False (default) means only serial processing
nproc [integer] specifies number of independent processes to spawn.
Default = None, means automatically determines the number of
process cores in the system and use one less than that to
avoid locking the system for other processes. Applies only
if input parameter 'parallel' (see above) is set to True.
If nproc is set to a value more than the number of process
cores in the system, it will be reset to number of process
cores in the system minus one to avoid locking the system out
for other processes
pp_method [string] specifies if the parallelization method is handled
automatically using multiprocessing pool or managed manually
by individual processes and collecting results in a queue.
The former is specified by 'pool' (default) and the latter
by 'queue'. These are the two allowed values. The pool method
has easier bookkeeping and can be fast if the computations are
not expected to be memory bound. The queue method is more
suited for memory bound processes but can be slower or
inefficient in terms of CPU management.
verbose [boolean] If True, prints diagnostic and progress messages.
If False (default), suppress printing such messages.
------------------------------------------------------------------------
"""
eps = 1.0e-10
if pol is None:
pol = ['P1', 'P2']
elif not isinstance(pol, list):
pol = [pol]
if not self.grid_ready:
self.grid()
antpol = ['P1', 'P2']
for apol in antpol:
if apol in pol:
if ants is not None:
if isinstance(ants, Antenna):
ants = [ants]
if isinstance(ants, (dict, AntennaArray)):
# Check if these antennas are new or old and compatible
for key in ants:
if isinstance(ants[key], Antenna): # required if ants is a dictionary and not instance of AntennaArray
if key in self.antennas:
if unconvolve_existing: # Effects on the grid of antennas already existing must be removed
if self.antennas[key]._gridinfo[apol]: # if gridding info is not empty
for i in range(len(self.f)):
self.grid_unconvolve(ants[key].label)
else:
raise KeyError('Antenna {0} already exists in the dictionary of antennas; cannot proceed with grid_convolve() without unconvolving first.'.format(ants[key].label))
else:
del ants[key] # remove the dictionary element since it is not an Antenna instance
if identical_antennas and (gridfunc_freq == 'scale'):
ant_dict = self.antenna_positions(pol=apol, flag=False, sort=True, centering=True)
ant_xy = ant_dict['positions'][:,:2]
self.ordered_labels = ant_dict['labels']
n_ant = ant_xy.shape[0]
Ef_dict = self.get_E_fields_old(apol, flag=False, sort=True)
Ef = Ef_dict['E-fields'].astype(NP.complex64)
# Since antennas are identical, read wtspos from the first antenna; since wtspos scales with frequency, use the first frequency channel as the reference
wtspos_xy = ants[0].wtspos[apol][0] * FCNST.c/self.f[0]
wts = ants[0].wts[apol][0]
n_wts = wts.size
reflocs_xy = ant_xy[:,NP.newaxis,:] + wtspos_xy[NP.newaxis,:,:]
refwts_xy = wts.reshape(1,-1) * NP.ones((n_ant,1))
reflocs_xy = reflocs_xy.reshape(-1,ant_xy.shape[1])
refwts_xy = refwts_xy.reshape(-1,1).astype(NP.complex64)
reflocs_uv = reflocs_xy[:,NP.newaxis,:] * self.f.reshape(1,-1,1) / FCNST.c
refwts_uv = refwts_xy * NP.ones((1,self.f.size))
reflocs_uv = reflocs_uv.reshape(-1,ant_xy.shape[1])
refwts_uv = refwts_uv.reshape(-1,1).ravel()
inplocs = NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
ibind, nnval = LKP.lookup_1NN(reflocs_uv, refwts_uv, inplocs,
distance_ULIM=distNN*self.f.max()/FCNST.c,
remove_oob=True, tol=tol, maxmatch=maxmatch)[:2]
else:
ant_dict = self.antenna_positions(pol=apol, flag=None, sort=True, centering=True)
self.ordered_labels = ant_dict['labels']
ant_xy = ant_dict['positions'][:,:2] # n_ant x 2
n_ant = ant_xy.shape[0]
# Ef_dict = self.get_E_fields(apol, flag=None, sort=True)
# Ef = Ef_dict['E-fields'].astype(NP.complex64) # n_ant x nchan
if not cal_loop:
self.caldata[apol] = self.get_E_fields(apol, flag=None, tselect=-1, fselect=None, aselect=None, datapool='current', sort=True)
else:
if self.caldata[apol] is None:
self.caldata[apol] = self.get_E_fields(apol, flag=None, tselect=-1, fselect=None, aselect=None, datapool='current', sort=True)
Ef = self.caldata[apol]['E-fields'].astype(NP.complex64) # (n_ts=1) x n_ant x nchan
Ef = NP.squeeze(Ef, axis=0) # n_ant x nchan
if Ef.shape[0] != n_ant:
raise ValueError('Encountered unexpected behavior. Need to debug.')
ant_labels = self.caldata[apol]['labels']
twts = self.caldata[apol]['twts'] # (n_ts=1) x n_ant x (nchan=1)
twts = NP.squeeze(twts, axis=(0,2)) # n_ant
if verbose:
print 'Gathered antenna data for gridding convolution for timestamp {0}'.format(self.timestamp)
if wts_change or (not self.grid_mapper[apol]['labels']):
if gridfunc_freq == 'scale':
if identical_antennas:
wts_tol = 1e-6
# Since antennas are identical, read wtspos from the first antenna; since wtspos scales with frequency, use the first frequency channel as the reference
wtspos_xy = self.antennas.itervalues().next().wtspos[apol][0] * FCNST.c/self.f[0]
wts = self.antennas.itervalues().next().wts[apol][0].astype(NP.complex64)
wtspos_xy = wtspos_xy[NP.abs(wts) >= wts_tol, :]
wts = wts[NP.abs(wts) >= wts_tol]
n_wts = wts.size
reflocs_xy = ant_xy[:,NP.newaxis,:] + wtspos_xy[NP.newaxis,:,:] # n_ant x n_wts x 2
refwts = wts.reshape(1,-1) * NP.ones((n_ant,1)) # n_ant x n_wts
else:
for i,label in enumerate(self.ordered_labels):
ant_wtspos = self.antennas[label].wtspos[apol][0]
ant_wts = self.antennas[label].wts[apol][0].astype(NP.complex64)
if i == 0:
wtspos = ant_wtspos[NP.newaxis,:,:] # 1 x n_wts x 2
refwts = ant_wts.reshape(1,-1) # 1 x n_wts
else:
wtspos = NP.vstack((wtspos, ant_wtspos[NP.newaxis,:,:])) # n_ant x n_wts x 2
refwts = NP.vstack((refwts, ant_wts.reshape(1,-1))) # n_ant x n_wts
reflocs_xy = ant_xy[:,NP.newaxis,:] + wtspos * FCNST.c/self.f[0] # n_ant x n_wts x 2
reflocs_xy = reflocs_xy.reshape(-1,ant_xy.shape[1]) # (n_ant x n_wts) x 2
refwts = refwts.ravel()
self.grid_mapper[apol]['refwts'] = NP.copy(refwts.ravel()) # (n_ant x n_wts)
else: # Weights do not scale with frequency (needs serious development)
pass
gridlocs = NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
contributed_ant_grid_Ef = None
if parallel: # Use parallelization over frequency to determine gridding convolution
if nproc is None:
nproc = max(MP.cpu_count()-1, 1)
else:
nproc = min(nproc, max(MP.cpu_count()-1, 1))
if pp_method == 'queue': ## Use MP.Queue(): useful for memory intensive parallelizing but can be slow
job_chunk_begin = range(0,self.f.size,nproc)
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} job chunks '.format(len(job_chunk_begin)), PGB.ETA()], maxval=len(job_chunk_begin)).start()
for ijob, job_start in enumerate(job_chunk_begin):
pjobs = []
out_q = MP.Queue()
for job_ind in xrange(job_start, min(job_start+nproc, self.f.size)): # Start the processes and store outputs in the queue
if mapping == 'weighted':
pjob = MP.Process(target=LKP.find_1NN_pp, args=(gridlocs, reflocs_xy * self.f[job_ind]/FCNST.c, job_ind, out_q, distNN*self.f.max()/FCNST.c, True), name='process-{0:0d}-channel-{1:0d}'.format(job_ind-job_start, job_ind))
else:
pjob = MP.Process(target=LKP.find_1NN_pp, args=(reflocs_xy * self.f[job_ind]/FCNST.c, gridlocs, job_ind, out_q, distNN*self.f.max()/FCNST.c, True), name='process-{0:0d}-channel-{1:0d}'.format(job_ind-job_start, job_ind))
pjob.start()
pjobs.append(pjob)
for p in xrange(len(pjobs)): # Unpack the queue output
outdict = out_q.get()
chan = outdict.keys()[0]
if mapping == 'weighted':
refind, gridind = outdict[chan]['inpind'], outdict[chan]['refind']
else:
gridind, refind = outdict[chan]['inpind'], outdict[chan]['refind']
self.grid_mapper[apol]['refind'] += [refind]
self.grid_mapper[apol]['gridind'] += [gridind]
ant_ind, lkp_ind = NP.unravel_index(refind, (n_ant, n_wts))
self.grid_mapper[apol]['ant']['ind_freq'] += [ant_ind]
gridind_unraveled = NP.unravel_index(gridind, self.gridu.shape) + (chan+NP.zeros(gridind.size,dtype=int),)
gridind_raveled = NP.ravel_multi_index(gridind_unraveled, self.gridu.shape+(self.f.size,))
if self.grid_mapper[apol]['ant']['ind_all'] is None:
self.grid_mapper[apol]['ant']['ind_all'] = NP.copy(ant_ind)
self.grid_mapper[apol]['ant']['illumination'] = refwts[refind]
contributed_ant_grid_Ef = refwts[refind] * Ef[ant_ind,chan]
self.grid_mapper[apol]['grid']['ind_all'] = NP.copy(gridind_raveled)
else:
self.grid_mapper[apol]['ant']['ind_all'] = NP.append(self.grid_mapper[apol]['ant']['ind_all'], ant_ind)
self.grid_mapper[apol]['ant']['illumination'] = NP.append(self.grid_mapper[apol]['ant']['illumination'], refwts[refind])
contributed_ant_grid_Ef = NP.append(contributed_ant_grid_Ef, refwts[refind] * Ef[ant_ind,chan])
self.grid_mapper[apol]['grid']['ind_all'] = NP.append(self.grid_mapper[apol]['grid']['ind_all'], gridind_raveled)
for pjob in pjobs:
pjob.join()
del out_q
if verbose:
progress.update(ijob+1)
if verbose:
progress.finish()
elif pp_method == 'pool': ## Using MP.Pool.map(): Can be faster if parallelizing is not memory intensive
list_of_gridlocs = [gridlocs] * self.f.size
list_of_reflocs = [reflocs_xy * f/FCNST.c for f in self.f]
list_of_dist_NN = [distNN*self.f.max()/FCNST.c] * self.f.size
list_of_remove_oob = [True] * self.f.size
pool = MP.Pool(processes=nproc)
if mapping == 'weighted':
list_of_NNout = pool.map(find_1NN_arg_splitter, IT.izip(list_of_gridlocs, list_of_reflocs, list_of_dist_NN, list_of_remove_oob))
else:
list_of_NNout = pool.map(find_1NN_arg_splitter, IT.izip(list_of_reflocs, list_of_gridlocs, list_of_dist_NN, list_of_remove_oob))
pool.close()
pool.join()
for chan, NNout in enumerate(list_of_NNout): # Unpack the pool output
if mapping == 'weighted':
refind, gridind = NNout[0], NNout[1]
else:
gridind, refind = NNout[0], NNout[1]
self.grid_mapper[apol]['refind'] += [refind]
self.grid_mapper[apol]['gridind'] += [gridind]
ant_ind, lkp_ind = NP.unravel_index(refind, (n_ant, n_wts))
self.grid_mapper[apol]['ant']['ind_freq'] += [ant_ind]
gridind_unraveled = NP.unravel_index(gridind, self.gridu.shape) + (chan+NP.zeros(gridind.size,dtype=int),)
gridind_raveled = NP.ravel_multi_index(gridind_unraveled, self.gridu.shape+(self.f.size,))
if chan == 0:
self.grid_mapper[apol]['ant']['ind_all'] = NP.copy(ant_ind)
self.grid_mapper[apol]['ant']['illumination'] = refwts[refind]
contributed_ant_grid_Ef = refwts[refind] * Ef[ant_ind,chan]
self.grid_mapper[apol]['grid']['ind_all'] = NP.copy(gridind_raveled)
else:
self.grid_mapper[apol]['ant']['ind_all'] = NP.append(self.grid_mapper[apol]['ant']['ind_all'], ant_ind)
self.grid_mapper[apol]['ant']['illumination'] = NP.append(self.grid_mapper[apol]['ant']['illumination'], refwts[refind])
contributed_ant_grid_Ef = NP.append(contributed_ant_grid_Ef, refwts[refind] * Ef[ant_ind,chan])
self.grid_mapper[apol]['grid']['ind_all'] = NP.append(self.grid_mapper[apol]['grid']['ind_all'], gridind_raveled)
else:
raise ValueError('Parallel processing method specified by input parameter pp_method has to be "pool" or "queue"')
else: # Use serial processing over frequency to determine gridding convolution
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} Frequency channels '.format(self.f.size), PGB.ETA()], maxval=self.f.size).start()
for i in xrange(self.f.size):
if mapping == 'weighted':
refind, gridind = LKP.find_1NN(gridlocs, reflocs_xy * self.f[i]/FCNST.c,
distance_ULIM=distNN*self.f.max()/FCNST.c,
remove_oob=True)[:2]
else:
gridind, refind = LKP.find_1NN(reflocs_xy * self.f[i]/FCNST.c, gridlocs,
distance_ULIM=distNN*self.f.max()/FCNST.c,
remove_oob=True)[:2]
self.grid_mapper[apol]['refind'] += [refind]
self.grid_mapper[apol]['gridind'] += [gridind]
ant_ind, lkp_ind = NP.unravel_index(refind, (n_ant, n_wts))
self.grid_mapper[apol]['ant']['ind_freq'] += [ant_ind]
gridind_unraveled = NP.unravel_index(gridind, self.gridu.shape) + (i+NP.zeros(gridind.size,dtype=int),)
gridind_raveled = NP.ravel_multi_index(gridind_unraveled, self.gridu.shape+(self.f.size,))
if i == 0:
self.grid_mapper[apol]['ant']['ind_all'] = NP.copy(ant_ind)
self.grid_mapper[apol]['ant']['illumination'] = refwts[refind]
contributed_ant_grid_Ef = refwts[refind] * Ef[ant_ind,i]
self.grid_mapper[apol]['grid']['ind_all'] = NP.copy(gridind_raveled)
else:
self.grid_mapper[apol]['ant']['ind_all'] = NP.append(self.grid_mapper[apol]['ant']['ind_all'], ant_ind)
self.grid_mapper[apol]['ant']['illumination'] = NP.append(self.grid_mapper[apol]['ant']['illumination'], refwts[refind])
contributed_ant_grid_Ef = NP.append(contributed_ant_grid_Ef, refwts[refind] * Ef[ant_ind,i])
self.grid_mapper[apol]['grid']['ind_all'] = NP.append(self.grid_mapper[apol]['grid']['ind_all'], gridind_raveled)
if verbose:
progress.update(i+1)
if verbose:
progress.finish()
self.grid_mapper[apol]['ant']['uniq_ind_all'] = NP.unique(self.grid_mapper[apol]['ant']['ind_all'])
self.grid_mapper[apol]['ant']['rev_ind_all'] = OPS.binned_statistic(self.grid_mapper[apol]['ant']['ind_all'], statistic='count', bins=NP.append(self.grid_mapper[apol]['ant']['uniq_ind_all'], self.grid_mapper[apol]['ant']['uniq_ind_all'].max()+1))[3]
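# Explanatory note (added): 'uniq_ind_all' holds the unique antenna indices
# that contributed to any grid pixel, while 'rev_ind_all' holds IDL-style
# reverse indices, so that the slice
# rev_ind_all[rev_ind_all[j]:rev_ind_all[j+1]] picks out all entries of
# 'ind_all' (and hence the matching entries of 'grid'['ind_all'] and
# 'illumination') that belong to the j-th unique antenna. For instance, if
# ind_all = [2, 0, 2, 2, 0], then uniq_ind_all = [0, 2] and the slice for
# j=1 selects the three positions where antenna index 2 occurs. This assumes
# OPS.binned_statistic() returns reverse indices as its fourth output, as
# relied upon throughout this method.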
if parallel and (mapping == 'weighted'): # Use parallel processing over antennas to determine antenna-grid mapping of gridded aperture illumination and electric fields
if nproc is None:
nproc = max(MP.cpu_count()-1, 1)
else:
nproc = min(nproc, max(MP.cpu_count()-1, 1))
if pp_method == 'queue': ## Use MP.Queue(): useful for memory intensive parallelizing but can be slow
num_ant = self.grid_mapper[apol]['ant']['uniq_ind_all'].size
job_chunk_begin = range(0,num_ant,nproc)
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} job chunks '.format(len(job_chunk_begin)), PGB.ETA()], maxval=len(job_chunk_begin)).start()
for ijob, job_start in enumerate(job_chunk_begin):
pjobs1 = []
pjobs2 = []
out_q1 = MP.Queue()
out_q2 = MP.Queue()
for job_ind in xrange(job_start, min(job_start+nproc, num_ant)): # Start the parallel processes and store the output in the queue
label = self.ordered_labels[self.grid_mapper[apol]['ant']['uniq_ind_all'][job_ind]]
if self.grid_mapper[apol]['ant']['rev_ind_all'][job_ind] < self.grid_mapper[apol]['ant']['rev_ind_all'][job_ind+1]:
self.grid_mapper[apol]['labels'][label] = {}
self.grid_mapper[apol]['labels'][label]['twts'] = twts[ant_labels.index(label)]
# self.grid_mapper[apol]['labels'][label]['flag'] = self.antennas[label].antpol.flag[apol]
select_ant_ind = self.grid_mapper[apol]['ant']['rev_ind_all'][self.grid_mapper[apol]['ant']['rev_ind_all'][job_ind]:self.grid_mapper[apol]['ant']['rev_ind_all'][job_ind+1]]
gridind_raveled_around_ant = self.grid_mapper[apol]['grid']['ind_all'][select_ant_ind]
uniq_gridind_raveled_around_ant = NP.unique(gridind_raveled_around_ant)
self.grid_mapper[apol]['labels'][label]['gridind'] = uniq_gridind_raveled_around_ant
pjob1 = MP.Process(target=antenna_grid_mapper, args=(gridind_raveled_around_ant, contributed_ant_grid_Ef[select_ant_ind], NP.append(uniq_gridind_raveled_around_ant, uniq_gridind_raveled_around_ant.max()+1), label, out_q1), name='process-{0:0d}-{1}-E-field'.format(job_ind, label))
pjob2 = MP.Process(target=antenna_grid_mapper, args=(gridind_raveled_around_ant, self.grid_mapper[apol]['ant']['illumination'][select_ant_ind], NP.append(uniq_gridind_raveled_around_ant, uniq_gridind_raveled_around_ant.max()+1), label, out_q2), name='process-{0:0d}-{1}-illumination'.format(job_ind, label))
pjob1.start()
pjob2.start()
pjobs1.append(pjob1)
pjobs2.append(pjob2)
for p in xrange(len(pjobs1)): # Unpack the E-fields and aperture illumination information from the pool output
outdict = out_q1.get()
label = outdict.keys()[0]
self.grid_mapper[apol]['labels'][label]['Ef'] = outdict[label]
outdict = out_q2.get()
label = outdict.keys()[0]
self.grid_mapper[apol]['labels'][label]['illumination'] = outdict[label]
for pjob in pjobs1:
pjob.join()
for pjob in pjobs2:
pjob.join()
del out_q1, out_q2
if verbose:
progress.update(ijob+1)
if verbose:
progress.finish()
elif pp_method == 'pool': ## Using MP.Pool.map(): Can be faster if parallelizing is not memory intensive
list_of_gridind_raveled_around_ant = []
list_of_ant_grid_values = []
list_of_ant_Ef_contribution = []
list_of_ant_illumination = []
list_of_uniq_gridind_raveled_around_ant = []
list_of_ant_labels = []
for j in xrange(self.grid_mapper[apol]['ant']['uniq_ind_all'].size): # re-determine gridded electric fields due to each antenna
label = self.ordered_labels[self.grid_mapper[apol]['ant']['uniq_ind_all'][j]]
if self.grid_mapper[apol]['ant']['rev_ind_all'][j] < self.grid_mapper[apol]['ant']['rev_ind_all'][j+1]:
self.grid_mapper[apol]['labels'][label] = {}
self.grid_mapper[apol]['labels'][label]['twts'] = twts[ant_labels.index(label)]
# self.grid_mapper[apol]['labels'][label]['flag'] = self.antennas[label].antpol.flag[apol]
select_ant_ind = self.grid_mapper[apol]['ant']['rev_ind_all'][self.grid_mapper[apol]['ant']['rev_ind_all'][j]:self.grid_mapper[apol]['ant']['rev_ind_all'][j+1]]
gridind_raveled_around_ant = self.grid_mapper[apol]['grid']['ind_all'][select_ant_ind]
uniq_gridind_raveled_around_ant = NP.unique(gridind_raveled_around_ant)
self.grid_mapper[apol]['labels'][label]['gridind'] = uniq_gridind_raveled_around_ant
list_of_ant_labels += [label]
list_of_gridind_raveled_around_ant += [gridind_raveled_around_ant]
list_of_uniq_gridind_raveled_around_ant += [NP.append(uniq_gridind_raveled_around_ant, uniq_gridind_raveled_around_ant.max()+1)]
list_of_ant_Ef_contribution += [contributed_ant_grid_Ef[select_ant_ind]]
list_of_ant_illumination += [self.grid_mapper[apol]['ant']['illumination'][select_ant_ind]]
pool = MP.Pool(processes=nproc)
list_of_ant_grid_values = pool.map(antenna_grid_mapping_arg_splitter, IT.izip(list_of_gridind_raveled_around_ant, list_of_ant_Ef_contribution, list_of_uniq_gridind_raveled_around_ant))
pool.close()
pool.join()
for label,grid_values in IT.izip(list_of_ant_labels, list_of_ant_grid_values): # Unpack the gridded visibility information from the pool output
self.grid_mapper[apol]['labels'][label]['Ef'] = grid_values
if nproc is not None:
pool = MP.Pool(processes=nproc)
else:
pool = MP.Pool()
list_of_ant_grid_values = pool.map(antenna_grid_mapping_arg_splitter, IT.izip(list_of_gridind_raveled_around_ant, list_of_ant_illumination, list_of_uniq_gridind_raveled_around_ant))
pool.close()
pool.join()
for label,grid_values in IT.izip(list_of_ant_labels, list_of_ant_grid_values): # Unpack the gridded visibility and aperture illumination information from the pool output
self.grid_mapper[apol]['labels'][label]['illumination'] = grid_values
del list_of_ant_grid_values, list_of_gridind_raveled_around_ant, list_of_ant_Ef_contribution, list_of_ant_illumination, list_of_uniq_gridind_raveled_around_ant, list_of_ant_labels
else:
raise ValueError('Parallel processing method specified by input parameter pp_method has to be "pool" or "queue"')
else: # Use serial processing over antennas to determine antenna-grid mapping of gridded aperture illumination and electric fields
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} Antennas '.format(self.grid_mapper[apol]['ant']['uniq_ind_all'].size), PGB.ETA()], maxval=self.grid_mapper[apol]['ant']['uniq_ind_all'].size).start()
for j in xrange(self.grid_mapper[apol]['ant']['uniq_ind_all'].size):
label = self.ordered_labels[self.grid_mapper[apol]['ant']['uniq_ind_all'][j]]
if self.grid_mapper[apol]['ant']['rev_ind_all'][j] < self.grid_mapper[apol]['ant']['rev_ind_all'][j+1]:
select_ant_ind = self.grid_mapper[apol]['ant']['rev_ind_all'][self.grid_mapper[apol]['ant']['rev_ind_all'][j]:self.grid_mapper[apol]['ant']['rev_ind_all'][j+1]]
self.grid_mapper[apol]['labels'][label] = {}
self.grid_mapper[apol]['labels'][label]['twts'] = twts[ant_labels.index(label)]
# self.grid_mapper[apol]['labels'][label]['flag'] = self.antennas[label].antpol.flag[apol]
if mapping == 'weighted':
gridind_raveled_around_ant = self.grid_mapper[apol]['grid']['ind_all'][select_ant_ind]
uniq_gridind_raveled_around_ant = NP.unique(gridind_raveled_around_ant)
self.grid_mapper[apol]['labels'][label]['gridind'] = uniq_gridind_raveled_around_ant
self.grid_mapper[apol]['labels'][label]['Ef'] = OPS.binned_statistic(gridind_raveled_around_ant, contributed_ant_grid_Ef[select_ant_ind].real, statistic='sum', bins=NP.append(uniq_gridind_raveled_around_ant, uniq_gridind_raveled_around_ant.max()+1))[0]
self.grid_mapper[apol]['labels'][label]['Ef'] = self.grid_mapper[apol]['labels'][label]['Ef'].astype(NP.complex64)
self.grid_mapper[apol]['labels'][label]['Ef'] += 1j * OPS.binned_statistic(gridind_raveled_around_ant, contributed_ant_grid_Ef[select_ant_ind].imag, statistic='sum', bins=NP.append(uniq_gridind_raveled_around_ant, uniq_gridind_raveled_around_ant.max()+1))[0]
self.grid_mapper[apol]['labels'][label]['illumination'] = OPS.binned_statistic(gridind_raveled_around_ant, self.grid_mapper[apol]['ant']['illumination'][select_ant_ind].real, statistic='sum', bins=NP.append(uniq_gridind_raveled_around_ant, uniq_gridind_raveled_around_ant.max()+1))[0]
self.grid_mapper[apol]['labels'][label]['illumination'] = self.grid_mapper[apol]['labels'][label]['illumination'].astype(NP.complex64)
self.grid_mapper[apol]['labels'][label]['illumination'] += 1j * OPS.binned_statistic(gridind_raveled_around_ant, self.grid_mapper[apol]['ant']['illumination'][select_ant_ind].imag, statistic='sum', bins=NP.append(uniq_gridind_raveled_around_ant, uniq_gridind_raveled_around_ant.max()+1))[0]
else:
self.grid_mapper[apol]['labels'][label]['gridind'] = self.grid_mapper[apol]['grid']['ind_all'][select_ant_ind]
self.grid_mapper[apol]['labels'][label]['Ef'] = contributed_ant_grid_Ef[select_ant_ind]
self.grid_mapper[apol]['labels'][label]['illumination'] = self.grid_mapper[apol]['ant']['illumination'][select_ant_ind]
if verbose:
progress.update(j+1)
if verbose:
progress.finish()
else: # Only re-determine gridded electric fields
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} Frequency channels '.format(self.f.size), PGB.ETA()], maxval=self.f.size).start()
for i in xrange(self.f.size): # Only re-estimate electric fields contributed by antennas
ant_refwts = self.grid_mapper[apol]['refwts'][self.grid_mapper[apol]['refind'][i]]
ant_Ef = Ef[self.grid_mapper[apol]['ant']['ind_freq'][i],i]
if i == 0:
contributed_ant_grid_Ef = ant_refwts * ant_Ef
else:
contributed_ant_grid_Ef = NP.append(contributed_ant_grid_Ef, ant_refwts * ant_Ef)
if verbose:
progress.update(i+1)
if verbose:
progress.finish()
if parallel and (mapping == 'weighted'): # Use parallel processing
if nproc is None:
nproc = max(MP.cpu_count()-1, 1)
else:
nproc = min(nproc, max(MP.cpu_count()-1, 1))
if pp_method == 'queue': ## Use MP.Queue(): useful for memory intensive parallelizing but can be slow
num_ant = self.grid_mapper[apol]['ant']['uniq_ind_all'].size
job_chunk_begin = range(0,num_ant,nproc)
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} job chunks '.format(len(job_chunk_begin)), PGB.ETA()], maxval=len(job_chunk_begin)).start()
for ijob, job_start in enumerate(job_chunk_begin):
pjobs = []
out_q = MP.Queue()
for job_ind in xrange(job_start, min(job_start+nproc, num_ant)): # Start the parallel processes and store the outputs in a queue
label = self.ordered_labels[self.grid_mapper[apol]['ant']['uniq_ind_all'][job_ind]]
self.grid_mapper[apol]['labels'][label]['twts'] = twts[ant_labels.index(label)]
if self.grid_mapper[apol]['ant']['rev_ind_all'][job_ind] < self.grid_mapper[apol]['ant']['rev_ind_all'][job_ind+1]:
select_ant_ind = self.grid_mapper[apol]['ant']['rev_ind_all'][self.grid_mapper[apol]['ant']['rev_ind_all'][job_ind]:self.grid_mapper[apol]['ant']['rev_ind_all'][job_ind+1]]
gridind_raveled_around_ant = self.grid_mapper[apol]['grid']['ind_all'][select_ant_ind]
uniq_gridind_raveled_around_ant = self.grid_mapper[apol]['labels'][label]['gridind']
pjob = MP.Process(target=antenna_grid_mapper, args=(gridind_raveled_around_ant, contributed_ant_grid_Ef[select_ant_ind], NP.append(uniq_gridind_raveled_around_ant, uniq_gridind_raveled_around_ant.max()+1), label, out_q), name='process-{0:0d}-{1}-E-field'.format(job_ind, label))
pjob.start()
pjobs.append(pjob)
for p in xrange(len(pjobs)): # Unpack the gridded visibility information from the queue
outdict = out_q.get()
label = outdict.keys()[0]
self.grid_mapper[apol]['labels'][label]['Ef'] = outdict[label]
for pjob in pjobs:
pjob.join()
del out_q
if verbose:
progress.update(ijob+1)
if verbose:
progress.finish()
else: ## Use MP.Pool.map(): Can be faster if parallelizing is not memory intensive
list_of_gridind_raveled_around_ant = []
list_of_ant_Ef_contribution = []
list_of_uniq_gridind_raveled_around_ant = []
list_of_ant_labels = []
for j in xrange(self.grid_mapper[apol]['ant']['uniq_ind_all'].size): # re-determine gridded electric fields due to each antenna
if self.grid_mapper[apol]['ant']['rev_ind_all'][j] < self.grid_mapper[apol]['ant']['rev_ind_all'][j+1]:
select_ant_ind = self.grid_mapper[apol]['ant']['rev_ind_all'][self.grid_mapper[apol]['ant']['rev_ind_all'][j]:self.grid_mapper[apol]['ant']['rev_ind_all'][j+1]]
label = self.ordered_labels[self.grid_mapper[apol]['ant']['uniq_ind_all'][j]]
self.grid_mapper[apol]['labels'][label]['twts'] = twts[ant_labels.index(label)]
gridind_raveled_around_ant = self.grid_mapper[apol]['grid']['ind_all'][select_ant_ind]
uniq_gridind_raveled_around_ant = NP.unique(gridind_raveled_around_ant)
list_of_ant_labels += [label]
list_of_gridind_raveled_around_ant += [gridind_raveled_around_ant]
list_of_uniq_gridind_raveled_around_ant += [NP.append(uniq_gridind_raveled_around_ant, uniq_gridind_raveled_around_ant.max()+1)]
list_of_ant_Ef_contribution += [contributed_ant_grid_Ef[select_ant_ind]]
if nproc is None:
nproc = max(MP.cpu_count()-1, 1)
else:
nproc = min(nproc, max(MP.cpu_count()-1, 1))
pool = MP.Pool(processes=nproc)
list_of_grid_Ef = pool.map(antenna_grid_mapping_arg_splitter, IT.izip(list_of_gridind_raveled_around_ant, list_of_ant_Ef_contribution, list_of_uniq_gridind_raveled_around_ant))
pool.close()
pool.join()
for label,grid_Ef in IT.izip(list_of_ant_labels, list_of_grid_Ef): # Unpack the gridded visibility information from the pool output
self.grid_mapper[apol]['labels'][label]['Ef'] = grid_Ef
del list_of_gridind_raveled_around_ant, list_of_grid_Ef, list_of_ant_Ef_contribution, list_of_uniq_gridind_raveled_around_ant, list_of_ant_labels
else: # use serial processing
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} Antennas '.format(self.grid_mapper[apol]['ant']['uniq_ind_all'].size), PGB.ETA()], maxval=self.grid_mapper[apol]['ant']['uniq_ind_all'].size).start()
for j in xrange(self.grid_mapper[apol]['ant']['uniq_ind_all'].size): # re-determine gridded electric fields due to each antenna
if self.grid_mapper[apol]['ant']['rev_ind_all'][j] < self.grid_mapper[apol]['ant']['rev_ind_all'][j+1]:
select_ant_ind = self.grid_mapper[apol]['ant']['rev_ind_all'][self.grid_mapper[apol]['ant']['rev_ind_all'][j]:self.grid_mapper[apol]['ant']['rev_ind_all'][j+1]]
label = self.ordered_labels[self.grid_mapper[apol]['ant']['uniq_ind_all'][j]]
self.grid_mapper[apol]['labels'][label]['twts'] = twts[ant_labels.index(label)]
self.grid_mapper[apol]['labels'][label]['Ef'] = {}
if mapping == 'weighted':
gridind_raveled_around_ant = self.grid_mapper[apol]['grid']['ind_all'][select_ant_ind]
uniq_gridind_raveled_around_ant = self.grid_mapper[apol]['labels'][label]['gridind']
# uniq_gridind_raveled_around_ant = NP.unique(gridind_raveled_around_ant)
self.grid_mapper[apol]['labels'][label]['Ef'] = OPS.binned_statistic(gridind_raveled_around_ant, contributed_ant_grid_Ef[select_ant_ind].real, statistic='sum', bins=NP.append(uniq_gridind_raveled_around_ant, uniq_gridind_raveled_around_ant.max()+1))[0]
self.grid_mapper[apol]['labels'][label]['Ef'] = self.grid_mapper[apol]['labels'][label]['Ef'].astype(NP.complex64)
self.grid_mapper[apol]['labels'][label]['Ef'] += 1j * OPS.binned_statistic(gridind_raveled_around_ant, contributed_ant_grid_Ef[select_ant_ind].imag, statistic='sum', bins=NP.append(uniq_gridind_raveled_around_ant, uniq_gridind_raveled_around_ant.max()+1))[0]
else:
self.grid_mapper[apol]['labels'][label]['Ef'] = contributed_ant_grid_Ef[select_ant_ind]
if verbose:
progress.update(j+1)
if verbose:
progress.finish()
############################################################################
def grid_convolve_new(self, pol=None, normalize=False, method='NN',
distNN=NP.inf, identical_antennas=True,
cal_loop=False, gridfunc_freq=None, wts_change=False,
parallel=False, nproc=None, pp_method='pool',
verbose=True):
"""
------------------------------------------------------------------------
Routine to project the complex illumination field pattern and the
electric fields on the grid from the antenna array
Inputs:
pol [String] The polarization to be gridded. Can be set to 'P1'
or 'P2'. If set to None, gridding for all the polarizations
is performed. Default = None
normalize [Boolean] Default = False. If set to True, the gridded
weights are divided by the sum of weights so that the gridded
weights add up to unity. (Need to work on normalization)
method [string] The gridding method to be used in applying the
antenna weights on to the antenna array grid.
Accepted values are 'NN' (nearest neighbour - default), 'CS'
(cubic spline), or 'BL' (Bi-linear). In case of applying grid
weights by 'NN' method, an optional distance upper bound for
the nearest neighbour can be provided in the parameter distNN
to prune the search and make it efficient. Currently, only
the nearest neighbour method is operational.
distNN [scalar] A positive value indicating the upper bound on
distance to the nearest neighbour in the gridding process. It
has units of distance, the same units as the antenna
attribute location and antenna array attribute gridx
and gridy. Default is NP.inf (infinite distance). It will be
internally converted to have same units as antenna
attributes wtspos (units in number of wavelengths). To ensure
all relevant pixels in the grid, the search distance used
internally will be a fraction more than distNN
identical_antennas
[boolean] indicates if all antenna elements are to be
treated as identical. If True (default), they are identical
and their gridding kernels are identical. If False, they are
not identical and each one has its own gridding kernel.
cal_loop [boolean] If True, the calibration loop is assumed to be ON
and hence the calibrated electric fields are set in the
calibration loop. If False (default), the calibration loop is
assumed to be OFF and the current electric fields are assumed
to be the calibrated data to be mapped to the grid
via gridding convolution.
gridfunc_freq
[String scalar] If set to None (not provided) or to 'scale',
assumes that attribute wtspos is given for a
reference frequency which needs to be scaled for the frequency
channels. Will be ignored if the number of elements in the list
in this attribute under the specific polarization is the
same as the number of frequency channels.
wts_change [boolean] indicates if weights and/or their locations have
changed from the previous integration or snapshot.
Default=False means they have not changed. In such a case the
antenna-to-grid mapping and grid illumination pattern do not
have to be determined, and mapping and values from the
previous snapshot can be used. If True, a new mapping has to
be determined.
parallel [boolean] specifies if parallelization is to be invoked.
False (default) means only serial processing
nproc [integer] specifies number of independent processes to spawn.
Default = None means the number of processor cores in the
system is determined automatically and one less than that is
used, to avoid locking up the system for other processes.
Applies only if input parameter 'parallel' (see above) is set
to True. If nproc is set to a value more than the number of
processor cores in the system, it will be reset to the number
of processor cores in the system minus one, to avoid locking
the system out for other processes
pp_method [string] specifies if the parallelization method is handled
automatically using a multiprocessing pool or managed manually
by individual processes and collecting results in a queue.
The former is specified by 'pool' (default) and the latter
by 'queue'. These are the two allowed values. The pool method
has easier bookkeeping and can be fast if the computations are
not expected to be memory bound. The queue method is more
suited for memory bound processes but can be slower or
inefficient in terms of CPU management.
verbose [boolean] If True, prints diagnostic and progress messages.
If False (default), suppress printing such messages.
------------------------------------------------------------------------
"""
if pol is None:
pol = ['P1', 'P2']
elif not isinstance(pol, list):
pol = [pol]
if not self.grid_ready:
self.grid()
du = self.gridu[0,1] - self.gridu[0,0]
dv = self.gridv[1,0] - self.gridv[0,0]
wavelength = FCNST.c / self.f
min_lambda = NP.abs(wavelength).min()
rmaxNN = 0.5 * NP.sqrt(du**2 + dv**2) * min_lambda
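# Explanatory note (added): du and dv are the grid spacings in wavelength
# units, so rmaxNN is half the diagonal of a grid cell expressed in the same
# physical length units as the antenna positions (using the shortest
# wavelength); it is passed to the aperture kernel computation as the
# nearest-neighbour matching radius.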
krn = {}
antpol = ['P1', 'P2']
for apol in antpol:
krn[apol] = None
if apol in pol:
ant_dict = self.antenna_positions(pol=apol, flag=None, sort=True, centering=True)
self.ordered_labels = ant_dict['labels']
ant_xy = ant_dict['positions'][:,:2] # n_ant x 2
n_ant = ant_xy.shape[0]
if not cal_loop:
self.caldata[apol] = self.get_E_fields(apol, flag=None, tselect=-1, fselect=None, aselect=None, datapool='current', sort=True)
else:
if self.caldata[apol] is None:
self.caldata[apol] = self.get_E_fields(apol, flag=None, tselect=-1, fselect=None, aselect=None, datapool='current', sort=True)
Ef = self.caldata[apol]['E-fields'].astype(NP.complex64) # (n_ts=1) x n_ant x nchan
Ef = NP.squeeze(Ef, axis=0) # n_ant x nchan
if Ef.shape[0] != n_ant:
raise ValueError('Encountered unexpected behavior. Need to debug.')
ant_labels = self.caldata[apol]['labels']
twts = self.caldata[apol]['twts'] # (n_ts=1) x n_ant x (nchan=1)
twts = NP.squeeze(twts, axis=(0,2)) # n_ant
if verbose:
print 'Gathered antenna data for gridding convolution for timestamp {0}'.format(self.timestamp)
if wts_change or (not self.grid_mapper[apol]['all_ant2grid']):
self.grid_mapper[apol]['per_ant2grid'] = []
self.grid_mapper[apol]['all_ant2grid'] = {}
gridlocs = NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
if gridfunc_freq == 'scale':
grid_xy = gridlocs[NP.newaxis,:,:] * wavelength.reshape(-1,1,1) # nchan x nv x nu
wl = NP.ones(gridlocs.shape[0])[NP.newaxis,:] * wavelength.reshape(-1,1)
grid_xy = grid_xy.reshape(-1,2)
wl = wl.reshape(-1)
indNN_list, antind, fvu_gridind = LKP.find_NN(ant_xy, grid_xy, distance_ULIM=2.0*distNN, flatten=True, parallel=False)
dxy = grid_xy[fvu_gridind,:] - ant_xy[antind,:]
fvu_gridind_unraveled = NP.unravel_index(fvu_gridind, (self.f.size,)+self.gridu.shape) # f-v-u order since temporary grid was created as nchan x nv x nu
self.grid_mapper[apol]['all_ant2grid']['antind'] = NP.copy(antind)
self.grid_mapper[apol]['all_ant2grid']['u_gridind'] = NP.copy(fvu_gridind_unraveled[2])
self.grid_mapper[apol]['all_ant2grid']['v_gridind'] = NP.copy(fvu_gridind_unraveled[1])
self.grid_mapper[apol]['all_ant2grid']['f_gridind'] = NP.copy(fvu_gridind_unraveled[0])
self.grid_mapper[apol]['all_ant2grid']['indNN_list'] = copy.deepcopy(indNN_list)
if identical_antennas:
arbitrary_antenna_aperture = self.antennas.itervalues().next().aperture
krn = arbitrary_antenna_aperture.compute(dxy, wavelength=wl[fvu_gridind], pol=apol, rmaxNN=rmaxNN, load_lookup=False)
else:
# This block #1 is one way to go about per antenna
for ai,gi in enumerate(indNN_list):
if len(gi) > 0:
label = self.ordered_labels[ai]
ind = NP.asarray(gi)
diffxy = grid_xy[ind,:].reshape(-1,2) - ant_xy[ai,:].reshape(-1,2)
krndict = self.antennas[label].aperture.compute(diffxy, wavelength=wl[ind], pol=apol, rmaxNN=rmaxNN, load_lookup=False)
if krn[apol] is None:
krn[apol] = NP.copy(krndict[apol])
else:
krn[apol] = NP.append(krn[apol], krndict[apol])
# # This block #2 is another way equivalent to above block #1
# uniq_antind = NP.unique(antind)
# anthist, antbe, antbn, antri = OPS.binned_statistic(antind, statistic='count', bins=NP.append(uniq_antind, uniq_antind.max()+1))
# for i,uantind in enumerate(uniq_antind):
# label = self.ordered_labels[uantind]
# ind = antri[antri[i]:antri[i+1]]
# krndict = self.antennas[label].aperture.compute(dxy[ind,:], wavelength=wl[ind], pol=apol, rmaxNN=rmaxNN, load_lookup=False)
# if krn[apol] is None:
# krn[apol] = NP.copy(krndict[apol])
# else:
# krn[apol] = NP.append(krn[apol], krndict[apol])
self.grid_mapper[apol]['all_ant2grid']['illumination'] = NP.copy(krn[apol])
else: # Weights do not scale with frequency (needs serious development)
pass
# Determine weights that can normalize sum of kernel per antenna per frequency to unity
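# Explanatory note (added): for every antenna and every frequency channel,
# the kernel values that the antenna deposits on the grid are summed, and
# the reciprocal of that sum is stored for those same entries, so that
# (normalization weight x illumination) sums to unity per antenna per
# channel. For example, kernel values [2.0, 1.0, 1.0] in one channel would
# all receive a normalization weight of 1/4.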
per_ant_per_freq_norm_wts = NP.zeros(antind.size, dtype=NP.complex64)
# per_ant_per_freq_norm_wts = NP.ones(antind.size, dtype=NP.complex64)
runsum = 0
for ai,gi in enumerate(indNN_list):
if len(gi) > 0:
fvu_ind = NP.asarray(gi)
unraveled_fvu_ind = NP.unravel_index(fvu_ind, (self.f.size,)+self.gridu.shape)
f_ind = unraveled_fvu_ind[0]
v_ind = unraveled_fvu_ind[1]
u_ind = unraveled_fvu_ind[2]
chanhist, chanbe, chanbn, chanri = OPS.binned_statistic(f_ind, statistic='count', bins=NP.arange(self.f.size+1))
for ci in xrange(self.f.size):
if chanhist[ci] > 0.0:
select_chan_ind = chanri[chanri[ci]:chanri[ci+1]]
per_ant_per_freq_kernel_sum = NP.sum(krn[apol][runsum:runsum+len(gi)][select_chan_ind])
per_ant_per_freq_norm_wts[runsum:runsum+len(gi)][select_chan_ind] = 1.0 / per_ant_per_freq_kernel_sum
per_ant2grid_info = {}
per_ant2grid_info['label'] = self.ordered_labels[ai]
per_ant2grid_info['f_gridind'] = NP.copy(f_ind)
per_ant2grid_info['u_gridind'] = NP.copy(u_ind)
per_ant2grid_info['v_gridind'] = NP.copy(v_ind)
# per_ant2grid_info['fvu_gridind'] = NP.copy(gi)
per_ant2grid_info['per_ant_per_freq_norm_wts'] = per_ant_per_freq_norm_wts[runsum:runsum+len(gi)]
per_ant2grid_info['illumination'] = krn[apol][runsum:runsum+len(gi)]
self.grid_mapper[apol]['per_ant2grid'] += [copy.deepcopy(per_ant2grid_info)]
runsum += len(gi)
self.grid_mapper[apol]['all_ant2grid']['per_ant_per_freq_norm_wts'] = NP.copy(per_ant_per_freq_norm_wts)
# Determine the gridded electric fields
Ef_on_grid = Ef[(self.grid_mapper[apol]['all_ant2grid']['antind'], self.grid_mapper[apol]['all_ant2grid']['f_gridind'])]
self.grid_mapper[apol]['all_ant2grid']['Ef'] = copy.deepcopy(Ef_on_grid)
runsum = 0
for ai,gi in enumerate(self.grid_mapper[apol]['all_ant2grid']['indNN_list']):
if len(gi) > 0:
self.grid_mapper[apol]['per_ant2grid'][ai]['Ef'] = Ef_on_grid[runsum:runsum+len(gi)]
runsum += len(gi)
############################################################################
def genMappingMatrix(self, pol=None, normalize=True, method='NN',
distNN=NP.inf, identical_antennas=True,
gridfunc_freq=None, wts_change=False, parallel=False,
nproc=None, verbose=True):
"""
------------------------------------------------------------------------
Routine to construct sparse antenna-to-grid mapping matrix that will be
used in projecting illumination and electric fields from the array of
antennas onto the grid. It has elements very common to
grid_convolve_new()
Inputs:
pol [String] The polarization to be gridded. Can be set to 'P1'
or 'P2'. If set to None, gridding for all the polarizations
is performed. Default = None
normalize [Boolean] Default = True. If set to True, the gridded
weights are divided by the sum of weights so that the gridded
weights add up to unity. (Need to work on normalization)
method [string] The gridding method to be used in applying the
antenna weights on to the antenna array grid.
Accepted values are 'NN' (nearest neighbour - default), 'CS'
(cubic spline), or 'BL' (Bi-linear). In case of applying grid
weights by 'NN' method, an optional distance upper bound for
the nearest neighbour can be provided in the parameter distNN
to prune the search and make it efficient. Currently, only
the nearest neighbour method is operational.
distNN [scalar] A positive value indicating the upper bound on
distance to the nearest neighbour in the gridding process. It
has units of distance, the same units as the antenna
attribute location and antenna array attribute gridx
and gridy. Default is NP.inf (infinite distance). It will be
internally converted to have same units as antenna
attributes wtspos (units in number of wavelengths). To ensure
all relevant pixels in the grid, the search distance used
internally will be a fraction more than distNN
identical_antennas
[boolean] indicates if all antenna elements are to be
treated as identical. If True (default), they are identical
and their gridding kernels are identical. If False, they are
not identical and each one has its own gridding kernel.
gridfunc_freq
[String scalar] If set to None (not provided) or to 'scale',
assumes that attribute wtspos is given for a
reference frequency which needs to be scaled for the frequency
channels. Will be ignored if the number of elements in the list
in this attribute under the specific polarization is the
same as the number of frequency channels.
wts_change [boolean] indicates if weights and/or their locations have
changed from the previous integration or snapshot.
Default=False means they have not changed. In such a case the
antenna-to-grid mapping and grid illumination pattern do not
have to be determined, and mapping and values from the
previous snapshot can be used. If True, a new mapping has to
be determined.
parallel [boolean] specifies if parallelization is to be invoked.
False (default) means only serial processing
nproc [integer] specifies number of independent processes to spawn.
Default = None means the number of processor cores in the
system is determined automatically and one less than that is
used, to avoid locking up the system for other processes.
Applies only if input parameter 'parallel' (see above) is set
to True. If nproc is set to a value more than the number of
processor cores in the system, it will be reset to the number
of processor cores in the system minus one, to avoid locking
the system out for other processes
verbose [boolean] If True, prints diagnostic and progress messages.
If False (default), suppress printing such messages.
NOTE: Although certain portions are parallelizable, the overheads in
these processes seem to make it worse than serial processing. It is
advisable to stick to the serialized version unless testing with larger
data sets clearly indicates otherwise.
------------------------------------------------------------------------
"""
if pol is None:
pol = ['P1', 'P2']
elif not isinstance(pol, list):
pol = [pol]
if not self.grid_ready:
self.grid()
du = self.gridu[0,1] - self.gridu[0,0]
dv = self.gridv[1,0] - self.gridv[0,0]
wavelength = FCNST.c / self.f
min_lambda = NP.abs(wavelength).min()
rmaxNN = 0.5 * NP.sqrt(du**2 + dv**2) * min_lambda
krn = {}
# self.ant2grid_mapper = {}
antpol = ['P1', 'P2']
for apol in antpol:
krn[apol] = None
# self.ant2grid_mapper[apol] = None
if apol in pol:
ant_dict = self.antenna_positions(pol=apol, flag=None, sort=True, centering=True)
self.ordered_labels = ant_dict['labels']
ant_xy = ant_dict['positions'][:,:2] # n_ant x 2
n_ant = ant_xy.shape[0]
if verbose:
print 'Gathered antenna data for gridding convolution for timestamp {0}'.format(self.timestamp)
if wts_change or (not self.grid_mapper[apol]['all_ant2grid']):
self.ant2grid_mapper[apol] = None
self.grid_mapper[apol]['per_ant2grid'] = []
self.grid_mapper[apol]['all_ant2grid'] = {}
gridlocs = NP.hstack((self.gridu.reshape(-1,1), self.gridv.reshape(-1,1)))
if gridfunc_freq == 'scale':
grid_xy = gridlocs[NP.newaxis,:,:] * wavelength.reshape(-1,1,1) # nchan x nv x nu
wl = NP.ones(gridlocs.shape[0])[NP.newaxis,:] * wavelength.reshape(-1,1)
grid_xy = grid_xy.reshape(-1,2)
wl = wl.reshape(-1)
indNN_list, antind, fvu_gridind = LKP.find_NN(ant_xy, grid_xy, distance_ULIM=2.0*distNN, flatten=True, parallel=False)
dxy = grid_xy[fvu_gridind,:] - ant_xy[antind,:]
fvu_gridind_unraveled = NP.unravel_index(fvu_gridind, (self.f.size,)+self.gridu.shape) # f-v-u order since temporary grid was created as nchan x nv x nu
self.grid_mapper[apol]['all_ant2grid']['antind'] = NP.copy(antind)
self.grid_mapper[apol]['all_ant2grid']['u_gridind'] = NP.copy(fvu_gridind_unraveled[2])
self.grid_mapper[apol]['all_ant2grid']['v_gridind'] = NP.copy(fvu_gridind_unraveled[1])
self.grid_mapper[apol]['all_ant2grid']['f_gridind'] = NP.copy(fvu_gridind_unraveled[0])
# self.grid_mapper[apol]['all_ant2grid']['indNN_list'] = copy.deepcopy(indNN_list)
if identical_antennas:
arbitrary_antenna_aperture = self.antennas.itervalues().next().aperture
krn = arbitrary_antenna_aperture.compute(dxy, wavelength=wl[fvu_gridind], pol=apol, rmaxNN=rmaxNN, load_lookup=False)
else:
# This block #1 is one way to go about per antenna
for ai,gi in enumerate(indNN_list):
if len(gi) > 0:
label = self.ordered_labels[ai]
ind = NP.asarray(gi)
diffxy = grid_xy[ind,:].reshape(-1,2) - ant_xy[ai,:].reshape(-1,2)
krndict = self.antennas[label].aperture.compute(diffxy, wavelength=wl[ind], pol=apol, rmaxNN=rmaxNN, load_lookup=False)
if krn[apol] is None:
krn[apol] = NP.copy(krndict[apol])
else:
krn[apol] = NP.append(krn[apol], krndict[apol])
# # This block #2 is another way equivalent to above block #1
# uniq_antind = NP.unique(antind)
# anthist, antbe, antbn, antri = OPS.binned_statistic(antind, statistic='count', bins=NP.append(uniq_antind, uniq_antind.max()+1))
# for i,uantind in enumerate(uniq_antind):
# label = self.ordered_labels[uantind]
# ind = antri[antri[i]:antri[i+1]]
# krndict = self.antennas[label].aperture.compute(dxy[ind,:], wavelength=wl[ind], pol=apol, rmaxNN=rmaxNN, load_lookup=False)
# if krn[apol] is None:
# krn[apol] = NP.copy(krndict[apol])
# else:
# krn[apol] = NP.append(krn[apol], krndict[apol])
self.grid_mapper[apol]['all_ant2grid']['illumination'] = NP.copy(krn[apol])
else: # Weights do not scale with frequency (needs serious development)
pass
# Determine weights that can normalize sum of kernel per antenna per frequency to unity
per_ant_per_freq_norm_wts = NP.zeros(antind.size, dtype=NP.complex64)
# per_ant_per_freq_norm_wts = NP.ones(antind.size, dtype=NP.complex64)
if parallel or (nproc is not None):
list_of_val = []
list_of_rowcol_tuple = []
else:
spval = []
sprow = []
spcol = []
runsum = 0
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} Antennas '.format(n_ant), PGB.ETA()], maxval=n_ant).start()
for ai,gi in enumerate(indNN_list):
if len(gi) > 0:
fvu_ind = NP.asarray(gi)
unraveled_fvu_ind = NP.unravel_index(fvu_ind, (self.f.size,)+self.gridu.shape)
f_ind = unraveled_fvu_ind[0]
v_ind = unraveled_fvu_ind[1]
u_ind = unraveled_fvu_ind[2]
chanhist, chanbe, chanbn, chanri = OPS.binned_statistic(f_ind, statistic='count', bins=NP.arange(self.f.size+1))
for ci in xrange(self.f.size):
if chanhist[ci] > 0.0:
select_chan_ind = chanri[chanri[ci]:chanri[ci+1]]
per_ant_per_freq_kernel_sum = NP.sum(krn[apol][runsum:runsum+len(gi)][select_chan_ind])
per_ant_per_freq_norm_wts[runsum:runsum+len(gi)][select_chan_ind] = 1.0 / per_ant_per_freq_kernel_sum
per_ant2grid_info = {}
per_ant2grid_info['label'] = self.ordered_labels[ai]
per_ant2grid_info['f_gridind'] = NP.copy(f_ind)
per_ant2grid_info['u_gridind'] = NP.copy(u_ind)
per_ant2grid_info['v_gridind'] = NP.copy(v_ind)
# per_ant2grid_info['fvu_gridind'] = NP.copy(gi)
per_ant2grid_info['per_ant_per_freq_norm_wts'] = per_ant_per_freq_norm_wts[runsum:runsum+len(gi)]
per_ant2grid_info['illumination'] = krn[apol][runsum:runsum+len(gi)]
self.grid_mapper[apol]['per_ant2grid'] += [copy.deepcopy(per_ant2grid_info)]
runsum += len(gi)
# determine the sparse antenna-to-grid mapping matrix pre-requisites
val = per_ant2grid_info['per_ant_per_freq_norm_wts']*per_ant2grid_info['illumination']
vuf_gridind_unraveled = (per_ant2grid_info['v_gridind'],per_ant2grid_info['u_gridind'],per_ant2grid_info['f_gridind'])
vuf_gridind_raveled = NP.ravel_multi_index(vuf_gridind_unraveled, (self.gridu.shape+(self.f.size,)))
if (not parallel) and (nproc is None):
spval += val.tolist()
sprow += vuf_gridind_raveled.tolist()
spcol += (per_ant2grid_info['f_gridind'] + ai*self.f.size).tolist()
else:
list_of_val += [per_ant2grid_info['per_ant_per_freq_norm_wts']*per_ant2grid_info['illumination']]
list_of_rowcol_tuple += [(vuf_gridind_raveled, per_ant2grid_info['f_gridind'])]
if verbose:
progress.update(ai+1)
if verbose:
progress.finish()
# determine the sparse antenna-to-grid mapping matrix
if parallel or (nproc is not None):
list_of_shapes = [(self.gridu.size*self.f.size, self.f.size)] * n_ant
if nproc is None:
nproc = max(MP.cpu_count()-1, 1)
else:
nproc = min(nproc, max(MP.cpu_count()-1, 1))
pool = MP.Pool(processes=nproc)
list_of_spmat = pool.map(genMatrixMapper_arg_splitter, IT.izip(list_of_val, list_of_rowcol_tuple, list_of_shapes))
self.ant2grid_mapper[apol] = SpM.hstack(list_of_spmat, format='csr')
else:
spval = NP.asarray(spval)
sprowcol = (NP.asarray(sprow), NP.asarray(spcol))
self.ant2grid_mapper[apol] = SpM.csr_matrix((spval, sprowcol), shape=(self.gridu.size*self.f.size, n_ant*self.f.size))
self.grid_mapper[apol]['all_ant2grid']['per_ant_per_freq_norm_wts'] = NP.copy(per_ant_per_freq_norm_wts)
############################################################################
def applyMappingMatrix(self, pol=None, cal_loop=False, verbose=True):
"""
------------------------------------------------------------------------
Constructs the grid of complex field illumination and electric fields
using the sparse antenna-to-grid mapping matrix. Intended to serve as a
"matrix" alternative to make_grid_cube_new()
Inputs:
pol [String] The polarization to be gridded. Can be set to 'P1' or
'P2'. If set to None, gridding for all the polarizations is
performed. Default=None
cal_loop
[boolean] If True, the calibration loop is assumed to be ON
and hence the calibrated electric fields are set in the
calibration loop. If False (default), the calibration loop is
assumed to be OFF and the current electric fields are assumed
to be the calibrated data to be mapped to the grid
via gridding convolution.
verbose [boolean] If True, prints diagnostic and progress messages.
If False (default), suppress printing such messages.
------------------------------------------------------------------------
"""
if pol is None:
pol = ['P1', 'P2']
pol = NP.unique(NP.asarray(pol))
for apol in pol:
if verbose:
print 'Gridding aperture illumination and electric fields for polarization {0} ...'.format(apol)
if apol not in ['P1', 'P2']:
raise ValueError('Invalid specification for input parameter pol')
if not cal_loop:
self.caldata[apol] = self.get_E_fields(apol, flag=None, tselect=-1, fselect=None, aselect=None, datapool='current', sort=True)
else:
if self.caldata[apol] is None:
self.caldata[apol] = self.get_E_fields(apol, flag=None, tselect=-1, fselect=None, aselect=None, datapool='current', sort=True)
Ef = self.caldata[apol]['E-fields'].astype(NP.complex64) # (n_ts=1) x n_ant x nchan
Ef = NP.squeeze(Ef, axis=0) # n_ant x nchan
twts = self.caldata[apol]['twts'] # (n_ts=1) x n_ant x 1
twts = NP.squeeze(twts, axis=0) # n_ant x 1
Ef = Ef * twts # applies antenna flagging, n_ant x nchan
wts = twts * NP.ones(self.f.size).reshape(1,-1) # n_ant x nchan
wts[NP.isnan(Ef)] = 0.0
Ef[NP.isnan(Ef)] = 0.0
Ef = Ef.ravel()
wts = wts.ravel()
sparse_Ef = SpM.csr_matrix(Ef)
sparse_wts = SpM.csr_matrix(wts)
# Store as sparse matrices
self.grid_illumination[apol] = self.ant2grid_mapper[apol].dot(sparse_wts.T)
self.grid_Ef[apol] = self.ant2grid_mapper[apol].dot(sparse_Ef.T)
# # Store as dense matrices
# self.grid_illumination[apol] = self.ant2grid_mapper[apol].dot(wts).reshape(self.gridu.shape+(self.f.size,))
# self.grid_Ef[apol] = self.ant2grid_mapper[apol].dot(Ef).reshape(self.gridu.shape+(self.f.size,))
if verbose:
print 'Gridded aperture illumination and electric fields for polarization {0} from {1:0d} unflagged contributing antennas'.format(apol, NP.sum(twts).astype(int))
############################################################################
def make_grid_cube(self, pol=None, verbose=True):
"""
------------------------------------------------------------------------
Constructs the grid of complex field illumination and electric fields
using the gridding information determined for every antenna. Flags are
taken into account while constructing this grid.
Inputs:
pol [String] The polarization to be gridded. Can be set to 'P1' or
'P2'. If set to None, gridding for all the polarizations is
performed. Default=None
verbose [boolean] If True, prints diagnostic and progress messages.
If False (default), suppress printing such messages.
------------------------------------------------------------------------
"""
if pol is None:
pol = ['P1', 'P2']
pol = NP.unique(NP.asarray(pol))
for apol in pol:
if verbose:
print 'Gridding aperture illumination and electric fields for polarization {0} ...'.format(apol)
if apol not in ['P1', 'P2']:
raise ValueError('Invalid specification for input parameter pol')
if apol not in self._ant_contribution:
raise KeyError('Key {0} not found in attribute _ant_contribution'.format(apol))
self.grid_illumination[apol] = NP.zeros((self.gridu.shape + (self.f.size,)), dtype=NP.complex_)
self.grid_Ef[apol] = NP.zeros((self.gridu.shape + (self.f.size,)), dtype=NP.complex_)
labels = self.grid_mapper[apol]['labels'].keys()
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} Antennas '.format(len(labels)), PGB.ETA()], maxval=len(labels)).start()
loopcount = 0
num_unflagged = 0
# while loopcount < len(labels):
# antinfo = self.grid_mapper[apol]['labels'].itervalues().next()
for antlabel, antinfo in self.grid_mapper[apol]['labels'].iteritems():
if not self.antennas[antlabel].antpol.flag[apol]:
num_unflagged += 1
gridind_unraveled = NP.unravel_index(antinfo['gridind'], self.gridu.shape+(self.f.size,))
self.grid_illumination[apol][gridind_unraveled] += antinfo['illumination']
self.grid_Ef[apol][gridind_unraveled] += antinfo['Ef']
if verbose:
progress.update(loopcount+1)
loopcount += 1
if verbose:
progress.finish()
if verbose:
print 'Gridded aperture illumination and electric fields for polarization {0} from {1:0d} unflagged contributing antennas'.format(apol, num_unflagged)
############################################################################
def make_grid_cube_new(self, pol=None, verbose=True):
"""
------------------------------------------------------------------------
Constructs the grid of complex field illumination and electric fields
using the gridding information determined for every antenna. Flags are
taken into account while constructing this grid.
Inputs:
pol [String] The polarization to be gridded. Can be set to 'P1' or
'P2'. If set to None, gridding for all the polarizations is
performed. Default=None
verbose [boolean] If True, prints diagnostic and progress messages.
If False (default), suppress printing such messages.
------------------------------------------------------------------------
"""
if pol is None:
pol = ['P1', 'P2']
pol = NP.unique(NP.asarray(pol))
for apol in pol:
if verbose:
print 'Gridding aperture illumination and electric fields for polarization {0} ...'.format(apol)
if apol not in ['P1', 'P2']:
raise ValueError('Invalid specification for input parameter pol')
if apol not in self._ant_contribution:
raise KeyError('Key {0} not found in attribute _ant_contribution'.format(apol))
self.grid_illumination[apol] = NP.zeros((self.gridu.shape + (self.f.size,)), dtype=NP.complex_)
self.grid_Ef[apol] = NP.zeros((self.gridu.shape + (self.f.size,)), dtype=NP.complex_)
nlabels = len(self.grid_mapper[apol]['per_ant2grid'])
loopcount = 0
num_unflagged = 0
if verbose:
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} Antennas '.format(nlabels), PGB.ETA()], maxval=nlabels).start()
for ai,per_ant2grid_info in enumerate(self.grid_mapper[apol]['per_ant2grid']):
antlabel = per_ant2grid_info['label']
if not self.antennas[antlabel].antpol.flag[apol]:
num_unflagged += 1
vuf_gridind_unraveled = (per_ant2grid_info['v_gridind'],per_ant2grid_info['u_gridind'],per_ant2grid_info['f_gridind'])
self.grid_illumination[apol][vuf_gridind_unraveled] += per_ant2grid_info['per_ant_per_freq_norm_wts'] * per_ant2grid_info['illumination']
self.grid_Ef[apol][vuf_gridind_unraveled] += per_ant2grid_info['per_ant_per_freq_norm_wts'] * per_ant2grid_info['Ef'] * per_ant2grid_info['illumination']
if verbose:
progress.update(loopcount+1)
loopcount += 1
if verbose:
progress.finish()
if verbose:
print 'Gridded aperture illumination and electric fields for polarization {0} from {1:0d} unflagged contributing antennas'.format(apol, num_unflagged)
############################################################################
def evalAntennaPairCorrWts(self, label1, label2=None, forceeval=False):
"""
------------------------------------------------------------------------
Evaluate correlation of pair of antenna illumination weights on grid.
It will be computed only if it was not computed or stored in attribute
pairwise_typetag_crosswts_vuf earlier
Inputs:
label1 [string] Label of first antenna. Must be specified (no default)
label2 [string] Label of second antenna. If specified as None
(default), it will be set equal to label1 in which case the
auto-correlation of antenna weights is evaluated
forceeval
[boolean] When set to False (default) the correlation in
the UV plane is not evaluated if it was already evaluated
earlier. If set to True, it will be forcibly evaluated
independent of whether they were already evaluated or not
------------------------------------------------------------------------
"""
try:
label1
except NameError:
raise NameError('Input label1 must be specified')
if label1 not in self.antennas:
raise KeyError('Input label1 not found in current instance of class AntennaArray')
if label2 is None:
label2 = label1
if label2 not in self.antennas:
raise KeyError('Input label2 not found in current instance of class AntennaArray')
if (label1, label2) in self.antenna_pair_to_typetag:
typetag_pair = self.antenna_pair_to_typetag[(label1,label2)]
elif (label2, label1) in self.antenna_pair_to_typetag:
typetag_pair = self.antenna_pair_to_typetag[(label2,label1)]
else:
raise KeyError('Antenna pair not found in attribute antenna_pair_to_typetag. Needs debugging')
do_update = False
typetag1, typetag2 = typetag_pair
if forceeval or (typetag_pair not in self.pairwise_typetag_crosswts_vuf):
if forceeval:
if typetag_pair not in self.pairwise_typetag_crosswts_vuf:
do_update = True
else:
if 'last_updated' not in self.pairwise_typetag_crosswts_vuf[typetag_pair]:
do_update = True
else:
if self.timestamp - self.pairwise_typetag_crosswts_vuf[typetag_pair]['last_updated'] > 1e-10:
do_update = True
if typetag_pair not in self.pairwise_typetag_crosswts_vuf:
do_update = True
if do_update:
pol = ['P1', 'P2']
self.pairwise_typetag_crosswts_vuf[typetag_pair] = {}
self.pairwise_typetag_crosswts_vuf[typetag_pair]['last_updated'] = self.timestamp
du = self.gridu[0,1] - self.gridu[0,0]
dv = self.gridv[1,0] - self.gridv[0,0]
if (typetag1 == typetag2) and (self.antennas[label1].aperture.kernel_type['P1'] == 'func') and (self.antennas[label1].aperture.kernel_type['P2'] == 'func'):
gridu, gridv = NP.meshgrid(du*(NP.arange(2*self.gridu.shape[1])-self.gridu.shape[1]), dv*(NP.arange(2*self.gridu.shape[0])-self.gridu.shape[0]))
wavelength = FCNST.c / self.f
min_lambda = NP.abs(wavelength).min()
rmaxNN = 0.5 * NP.sqrt(du**2 + dv**2) * min_lambda
gridx = gridu[:,:,NP.newaxis] * wavelength.reshape(1,1,-1)
gridy = gridv[:,:,NP.newaxis] * wavelength.reshape(1,1,-1)
gridxy = NP.hstack((gridx.reshape(-1,1), gridy.reshape(-1,1)))
wl = NP.ones(gridu.shape)[:,:,NP.newaxis] * wavelength.reshape(1,1,-1)
ant_aprtr = copy.deepcopy(self.antennas[label1].aperture)
pol_type = 'dual'
kerntype = ant_aprtr.kernel_type
shape = ant_aprtr.shape
kernshapeparms = {p: {'xmax': ant_aprtr.xmax[p], 'ymax': ant_aprtr.ymax[p], 'rmax': ant_aprtr.rmax[p], 'rmin': ant_aprtr.rmin[p], 'rotangle': ant_aprtr.rotangle[p]} for p in pol}
for p in pol:
if shape[p] == 'rect':
shape[p] = 'auto_convolved_rect'
elif shape[p] == 'square':
shape[p] = 'auto_convolved_square'
elif shape[p] == 'circular':
shape[p] = 'auto_convolved_circular'
else:
raise ValueError('Aperture kernel footprint shape - {0} - currently unsupported'.format(shape[p]))
aprtr = APR.Aperture(pol_type=pol_type, kernel_type=kerntype,
shape=shape, parms=kernshapeparms,
lkpinfo=None, load_lookup=True)
max_aprtr_size = max([NP.sqrt(aprtr.xmax['P1']**2 + aprtr.ymax['P1']**2), NP.sqrt(aprtr.xmax['P2']**2 + aprtr.ymax['P2']**2), aprtr.rmax['P1'], aprtr.rmax['P2']])
distNN = 2.0 * max_aprtr_size
indNN_list, blind, vuf_gridind = LKP.find_NN(NP.zeros(2).reshape(1,-1), gridxy, distance_ULIM=distNN, flatten=True, parallel=False)
dxy = gridxy[vuf_gridind,:]
unraveled_vuf_ind = NP.unravel_index(vuf_gridind, gridu.shape+(self.f.size,))
unraveled_vu_ind = (unraveled_vuf_ind[0], unraveled_vuf_ind[1])
raveled_vu_ind = NP.ravel_multi_index(unraveled_vu_ind, (gridu.shape[0], gridu.shape[1]))
for p in pol:
krn = aprtr.compute(dxy, wavelength=wl.ravel()[vuf_gridind], pol=p, rmaxNN=rmaxNN, load_lookup=False)
krn_sparse = SpM.csr_matrix((krn[p], (raveled_vu_ind,)+(unraveled_vuf_ind[2],)), shape=(gridu.size,)+(self.f.size,), dtype=NP.complex64)
krn_sparse_sumuv = krn_sparse.sum(axis=0)
krn_sparse_norm = krn_sparse.A / krn_sparse_sumuv.A
sprow = raveled_vu_ind
spcol = unraveled_vuf_ind[2]
spval = krn_sparse_norm[(sprow,)+(spcol,)]
self.pairwise_typetag_crosswts_vuf[typetag_pair][p] = SpM.csr_matrix((spval, (sprow,)+(spcol,)), shape=(gridu.size,)+(self.f.size,), dtype=NP.complex64)
else:
ulocs = du*(NP.arange(2*self.gridu.shape[1])-self.gridu.shape[1])
vlocs = dv*(NP.arange(2*self.gridu.shape[0])-self.gridu.shape[0])
antenna_grid_wts_vuf_1 = self.antennas[label1].evalGridIllumination(uvlocs=(ulocs, vlocs), xy_center=NP.zeros(2))
shape_tuple = (vlocs.size, ulocs.size) + (self.f.size,)
eps = 1e-10
if label1 == label2:
for p in pol:
sum_wts1 = antenna_grid_wts_vuf_1[p].sum(axis=0).A
sum_wts = NP.abs(sum_wts1)**2
antpair_beam = NP.abs(NP.fft.fft2(antenna_grid_wts_vuf_1[p].toarray().reshape(shape_tuple), axes=(0,1)))**2
antpair_grid_wts_vuf = NP.fft.ifft2(antpair_beam/sum_wts[NP.newaxis,:,:], axes=(0,1)) # Inverse FFT
antpair_grid_wts_vuf = NP.fft.ifftshift(antpair_grid_wts_vuf, axes=(0,1))
antpair_grid_wts_vuf[NP.abs(antpair_grid_wts_vuf) < eps] = 0.0
self.pairwise_typetag_crosswts_vuf[typetag_pair][p] = SpM.csr_matrix(antpair_grid_wts_vuf.reshape(-1,self.f.size))
else:
antenna_grid_wts_vuf_2 = self.antennas[label2].evalGridIllumination(uvlocs=(ulocs, vlocs), xy_center=NP.zeros(2))
for p in pol:
sum_wts1 = antenna_grid_wts_vuf_1[p].sum(axis=0).A
sum_wts2 = antenna_grid_wts_vuf_2[p].sum(axis=0).A
sum_wts = sum_wts1 * sum_wts2.conj()
antpair_beam = NP.fft.fft2(antenna_grid_wts_vuf_1[p].toarray().reshape(shape_tuple), axes=(0,1)) * NP.fft.fft2(antenna_grid_wts_vuf_2[p].toarray().reshape(shape_tuple), axes=(0,1)).conj() # Cross-power: FFT(wts1) x conj(FFT(wts2))
antpair_grid_wts_vuf = NP.fft.ifft2(antpair_beam/sum_wts[NP.newaxis,:,:], axes=(0,1)) # Inverse FFT
antpair_grid_wts_vuf = NP.fft.ifftshift(antpair_grid_wts_vuf, axes=(0,1))
antpair_grid_wts_vuf[NP.abs(antpair_grid_wts_vuf) < eps] = 0.0
self.pairwise_typetag_crosswts_vuf[typetag_pair][p] = SpM.csr_matrix(antpair_grid_wts_vuf.reshape(-1,self.f.size))
else:
print 'Specified antenna pair correlation weights have already been evaluated'
############################################################################
def evalAntennaAutoCorrWts(self, forceeval=False):
"""
------------------------------------------------------------------------
Evaluate auto-correlation of aperture illumination of each antenna on
the UVF-plane
Inputs:
forceeval [boolean] When set to False (default) the auto-correlation in
the UV plane is not evaluated if it was already evaluated
earlier. If set to True, it will be forcibly evaluated
independent of whether they were already evaluated or not
------------------------------------------------------------------------
"""
if forceeval or (not self.antenna_autowts_set):
self.antenna_autowts_set = False
for antkey in self.antennas:
self.evalAntennaPairCorrWts(antkey, label2=None, forceeval=forceeval)
self.antenna_autowts_set = True
############################################################################
def evalAllAntennaPairCorrWts(self, forceeval=False):
"""
------------------------------------------------------------------------
Evaluate zero-centered cross-correlation of aperture illumination of
each antenna pair on the UVF-plane
Inputs:
forceeval [boolean] When set to False (default) the zero-centered
cross-correlation of antenna illumination weights on
the UV plane is not evaluated if it was already evaluated
earlier. If set to True, it will be forcibly evaluated
independent of whether they were already evaluated or not
------------------------------------------------------------------------
"""
if forceeval or (not self.antenna_crosswts_set):
for label_pair in self.antenna_pair_to_typetag:
label1, label2 = label_pair
self.evalAntennaPairCorrWts(label1, label2=label2, forceeval=forceeval)
self.antenna_crosswts_set = True
############################################################################
def makeAutoCorrCube(self, pol=None, data=None, datapool='stack',
tbinsize=None, forceeval_autowts=False,
forceeval_autocorr=False, nproc=None, verbose=True):
"""
------------------------------------------------------------------------
Constructs the grid of antenna aperture illumination auto-correlation
using the gridding information determined for every antenna. Flags are
taken into account while constructing this grid
Inputs:
pol [String] The polarization to be gridded. Can be set to 'P1' or
'P2'. If set to None, gridding for all the polarizations is
performed. Default=None
data [dictionary] dictionary containing data that will be used to
determine the auto-correlations of antennas. This will be used
only if input datapool is set to 'custom'. It consists of the
following keys and information:
'labels' Contains a numpy array of strings of antenna
labels
'data' auto-correlated electric fields
(n_ant x nchan array)
datapool
[string] Specifies where the E-fields used in determining the
auto-correlation come from, namely
'stack' (default), 'current', 'avg' or 'custom'. If set to
'custom', the data provided in input data will be used.
Otherwise squared electric fields will be used if set to
'current' or 'stack', and averaged squared electric fields if
set to 'avg'
tbinsize
[scalar or dictionary] Contains bin size of timestamps while
averaging. Only used when datapool is set to 'avg' and if the
attribute auto_corr_data does not contain the key 'avg'. In
that case, default = None means all antenna E-field
auto-correlation spectra over all timestamps are averaged. If
scalar, the same (positive) value applies to all polarizations.
If dictionary, timestamp bin size (positive) in seconds is
provided under each key 'P1' and 'P2'. If any of the keys is
missing the auto-correlated antenna E-field spectra for that
polarization are averaged over all timestamps.
forceeval_autowts
[boolean] When set to False (default) the auto-correlation
weights in the UV plane is not evaluated if it was already
evaluated earlier. If set to True, it will be forcibly evaluated
independent of whether they were already evaluated or not
forceeval_autocorr
[boolean] When set to False (default) the auto-correlation
data in the UV plane is not evaluated if it was already
evaluated earlier. If set to True, it will be forcibly
evaluated independent of whether they were already evaluated
or not
nproc [integer] specifies number of independent processes to spawn.
Default = None, means automatically determines the number of
process cores in the system and use one less than that to
avoid locking the system for other processes. Parallel
processing is invoked only when the resulting nproc exceeds 1.
If nproc is set to a value more than the number of process
cores in the system, it will be reset to number of process
cores in the system minus one to avoid locking the system out
for other processes
verbose [boolean] If True, prints diagnostic and progress messages.
If False (default), suppress printing such messages.
Outputs:
Tuple (autocorr_wts_cube, autocorr_data_cube). autocorr_wts_cube is a
dictionary with polarization keys 'P1' and 'P2'. Under each key is a
matrix of size nt x nv x nu x nchan. autocorr_data_cube is also a
dictionary with polarization keys 'P1' and 'P2'. Under each key is a
matrix of size nt x nv x nu x nchan where nt=1, nt=n_timestamps,
or nt=n_tavg if datapool is set to 'current', 'stack' or 'avg'
respectively
------------------------------------------------------------------------
"""
if pol is None:
pol = ['P1', 'P2']
pol = NP.unique(NP.asarray(pol))
if datapool not in ['stack', 'current', 'avg', 'custom']:
raise ValueError('Input datapool must be set to "stack", "current", "avg" or "custom"')
if not isinstance(forceeval_autowts, bool):
raise TypeError('Input forceeval_autowts must be boolean')
if not isinstance(forceeval_autocorr, bool):
raise TypeError('Input forceeval_autocorr must be boolean')
self.evalAntennaAutoCorrWts(forceeval=forceeval_autowts)
data_info = {}
if datapool in ['current', 'stack', 'avg']:
if datapool not in self.auto_corr_data:
self.evalAutoCorr(pol=pol, datapool=datapool, tbinsize=tbinsize)
for apol in pol:
data_info[apol] = {'labels': self.auto_corr_data[datapool][apol]['labels'], 'twts': self.auto_corr_data[datapool][apol]['twts'], 'data': NP.nan_to_num(self.auto_corr_data[datapool][apol]['E-fields'])}
else:
if not isinstance(data, dict):
raise TypeError('Input data must be a dictionary')
for apol in pol:
if apol not in data:
raise KeyError('Key {0} not found in input data'.format(apol))
if not isinstance(data[apol], dict):
raise TypeError('Value under polarization key "{0}" under input data must be a dictionary'.format(apol))
if ('labels' not in data[apol]) or ('data' not in data[apol]):
raise KeyError('Keys "labels" and "data" not found under input data[{0}]'.format(apol))
autocorr_wts_cube = {p: None for p in ['P1', 'P2']}
autocorr_data_cube = {p: None for p in ['P1', 'P2']}
for apol in pol:
if verbose:
print 'Gridding auto-correlation of aperture illumination and electric fields for polarization {0} ...'.format(apol)
if apol not in ['P1', 'P2']:
raise ValueError('Invalid specification for input parameter pol')
if nproc is None:
nproc = max(MP.cpu_count()-1, 1)
else:
nproc = min(nproc, max(MP.cpu_count()-1, 1))
if nproc > 1:
list_antind = []
list_antkey = []
list_typetag_pair = []
list_shape_tuple = []
list_sparse_crosswts_vuf = []
list_twts = []
list_acorr_data = []
for antind, antkey in enumerate(data_info[apol]['labels']):
typetag_pair = self.antenna_pair_to_typetag[(antkey,antkey)]
list_shape_tuple += [tuple(2*NP.asarray(self.gridu.shape))+(self.f.size,)]
list_sparse_crosswts_vuf += [self.pairwise_typetag_crosswts_vuf[typetag_pair][apol]]
list_twts += [data_info[apol]['twts'][:,antind,:]]
list_acorr_data += [data_info[apol]['data'][:,antind,:]]
for qty in ['wts', 'data']:
pool = MP.Pool(processes=nproc)
if qty == 'wts':
outqtylist = pool.map(unwrap_multidim_product, IT.izip(list_sparse_crosswts_vuf, list_twts, [1.0]*len(data_info[apol]['labels']), list_shape_tuple))
else:
outqtylist = pool.map(unwrap_multidim_product, IT.izip(list_sparse_crosswts_vuf, list_twts, list_acorr_data, list_shape_tuple))
pool.close()
pool.join()
if qty == 'wts':
autocorr_wts_cube[apol] = NP.sum(NP.array(outqtylist), axis=0)
else:
autocorr_data_cube[apol] = NP.sum(NP.array(outqtylist), axis=0)
del outqtylist
else: # Serial processing of autocorr accumulation into cube
progress = PGB.ProgressBar(widgets=[PGB.Percentage(), PGB.Bar(marker='-', left=' |', right='| '), PGB.Counter(), '/{0:0d} Antennas '.format(len(data_info[apol]['labels'])), PGB.ETA()], maxval=len(data_info[apol]['labels'])).start()
for antind, antkey in enumerate(data_info[apol]['labels']):
typetag_pair = self.antenna_pair_to_typetag[(antkey,antkey)] # auto pair
shape_tuple = tuple(2*NP.asarray(self.gridu.shape))+(self.f.size,)
if autocorr_wts_cube[apol] is None:
autocorr_wts_cube[apol] = self.pairwise_typetag_crosswts_vuf[typetag_pair][apol].toarray().reshape(shape_tuple)[NP.newaxis,:,:,:] * data_info[apol]['twts'][:,antind,:][:,NP.newaxis,NP.newaxis,:] # nt x nv x nu x nchan
autocorr_data_cube[apol] = self.pairwise_typetag_crosswts_vuf[typetag_pair][apol].toarray().reshape(shape_tuple)[NP.newaxis,:,:,:] * data_info[apol]['twts'][:,antind,:][:,NP.newaxis,NP.newaxis,:] * data_info[apol]['data'][:,antind,:][:,NP.newaxis,NP.newaxis,:] # nt x nv x nu x nchan
else:
autocorr_wts_cube[apol] += self.pairwise_typetag_crosswts_vuf[typetag_pair][apol].toarray().reshape(shape_tuple)[NP.newaxis,:,:,:] * data_info[apol]['twts'][:,antind,:][:,NP.newaxis,NP.newaxis,:] # nt x nv x nu x nchan
autocorr_data_cube[apol] += self.pairwise_typetag_crosswts_vuf[typetag_pair][apol].toarray().reshape(shape_tuple)[NP.newaxis,:,:,:] * data_info[apol]['twts'][:,antind,:][:,NP.newaxis,NP.newaxis,:] * data_info[apol]['data'][:,antind,:][:,NP.newaxis,NP.newaxis,:] # nt x nv x nu x nchan
progress.update(antind+1)
progress.finish()
sum_wts = NP.sum(data_info[apol]['twts'], axis=1) # nt x 1
autocorr_wts_cube[apol] = NP.nan_to_num(autocorr_wts_cube[apol] / sum_wts[:,NP.newaxis,NP.newaxis,:]) # nt x nv x nu x nchan, nan_to_num() just in case there are NaN
autocorr_data_cube[apol] = NP.nan_to_num(autocorr_data_cube[apol] / sum_wts[:,NP.newaxis,NP.newaxis,:]) # nt x nv x nu x nchan, nan_to_num() just in case there are NaN
return (autocorr_wts_cube, autocorr_data_cube)
############################################################################
def makeCrossCorrWtsCube(self, pol=None, data=None, datapool='stack',
verbose=True):
"""
------------------------------------------------------------------------
Constructs the grid of zero-centered cross-correlation of antenna
aperture pairs using the gridding information determined for every
antenna. Flags are taken into account while constructing this grid
Inputs:
pol [String] The polarization to be gridded. Can be set to 'P1' or
'P2'. If set to None, gridding for all the polarizations is
performed. Default=None
datapool
[string] Specifies which data pool supplies the flags used in
determining the zero-centered cross-correlation, namely
'stack' (default), 'current', or 'avg'.
verbose [boolean] If True, prints diagnostic and progress messages.
If False (default), suppress printing such messages.
Outputs:
centered_crosscorr_wts_vuf is a dictionary with polarization keys
'P1' and 'P2'. Under each key is a sparse matrix of size
(nv x nu) x nchan.
------------------------------------------------------------------------
"""
if pol is None:
pol = ['P1', 'P2']
pol = NP.unique(NP.asarray(pol))
if datapool not in ['stack', 'current', 'avg', 'custom']:
raise ValueError('Input datapool must be set to "stack", "current", "avg" or "custom"')
if not self.antenna_crosswts_set:
self.evalAllAntennaPairCorrWts()
centered_crosscorr_wts_cube = {p: None for p in ['P1', 'P2']}
for apol in pol:
if verbose:
print 'Gridding centered cross-correlation of aperture illumination for polarization {0} ...'.format(apol)
if apol not in ['P1', 'P2']:
raise ValueError('Invalid specification for input parameter pol')
for typetag_pair in self.pairwise_typetags:
if 'cross' in self.pairwise_typetags[typetag_pair]:
n_bl = len(self.pairwise_typetags[typetag_pair]['cross'])
if centered_crosscorr_wts_cube[apol] is None:
centered_crosscorr_wts_cube[apol] = n_bl * self.pairwise_typetag_crosswts_vuf[typetag_pair][apol]
else:
centered_crosscorr_wts_cube[apol] += n_bl * self.pairwise_typetag_crosswts_vuf[typetag_pair][apol]
return centered_crosscorr_wts_cube
############################################################################
def evalAntennaPairPBeam(self, typetag_pair=None, label_pair=None,
pad=0, skypos=None):
"""
------------------------------------------------------------------------
Evaluate power pattern response on sky of an antenna pair
Inputs:
typetag_pair
[dictionary] dictionary with two keys '1' and '2' denoting
the antenna typetag. At least one of them must be specified.
If one of them is not specified, it is assumed to be the
same as the other. Only one of the inputs typetag_pair or
label_pair must be set
label_pair [dictionary] dictionary with two keys '1' and '2' denoting
the antenna label. At least one of them must be specified.
If one of them is not specified, it is assumed to be the
same as the other. Only one of the inputs typetag_pair or
label_pair must be set
pad [integer] indicates the amount of padding before estimating
power pattern. Applicable only when skypos is set to None.
The output power pattern will be of size 2**pad-1 times the
size of the UV-grid along l- and m-axes. Value must
not be negative. Default=0 (implies no padding). pad=1
implies padding by factor 2 along u- and v-axes
skypos [numpy array] Positions on sky at which power pattern is
to be estimated. It is a 2- or 3-column numpy array in
direction cosine coordinates. It must be of size nsrc x 2
or nsrc x 3. If set to None (default), the power pattern is
estimated over a grid on the sky. If a numpy array is
specified, then power pattern at the given locations is
estimated.
Outputs:
pbinfo is a dictionary with the following keys and values:
'pb' [dictionary] Dictionary with keys 'P1' and 'P2' for
polarization. Under each key is a numpy array of estimated
power patterns. If skypos was set to None, the numpy array is
3D masked array of size nm x nl x nchan. The mask is based on
which parts of the grid are valid direction cosine coordinates
on the sky. If skypos was a numpy array denoting specific sky
locations, the value in this key is a 2D numpy array of size
nsrc x nchan
'llocs' [None or numpy array] If the power pattern estimated is a grid
(if input skypos was set to None), it contains the l-locations
of the grid on the sky. If input skypos was not set to None,
the value under this key is set to None
'mlocs' [None or numpy array] If the power pattern estimated is a grid
(if input skypos was set to None), it contains the m-locations
of the grid on the sky. If input skypos was not set to None,
the value under this key is set to None
------------------------------------------------------------------------
"""
if (typetag_pair is None) and (label_pair is None):
raise ValueError('One of the inputs typetag_pair or label_pair must be specified')
elif (typetag_pair is not None) and (label_pair is not None):
raise ValueError('Only one of the inputs typetag_pair or label_pair must be specified')
if typetag_pair is not None:
if ('1' not in typetag_pair) and ('2' not in typetag_pair):
raise KeyError('Required keys not found in input typetag_pair')
elif ('1' not in typetag_pair) and ('2' in typetag_pair):
typetag_pair['1'] = typetag_pair['2']
elif ('1' in typetag_pair) and ('2' not in typetag_pair):
typetag_pair['2'] = typetag_pair['1']
typetag_tuple = (typetag_pair['1'], typetag_pair['2'])
if typetag_tuple not in self.pairwise_typetags:
if typetag_tuple[::-1] not in self.pairwise_typetags:
raise KeyError('typetag pair not found in antenna cross weights')
else:
typetag_tuple = typetag_tuple[::-1]
if 'auto' in self.pairwise_typetags[typetag_tuple]:
label1, label2 = list(self.pairwise_typetags[typetag_tuple]['auto'])[0]
else:
label1, label2 = list(self.pairwise_typetags[typetag_tuple]['cross'])[0]
else:
if ('1' not in label_pair) and ('2' not in label_pair):
raise KeyError('Required keys not found in input label_pair')
elif ('1' not in label_pair) and ('2' in label_pair):
label_pair['1'] = label_pair['2']
elif ('1' in label_pair) and ('2' not in label_pair):
label_pair['2'] = label_pair['1']
label1 = label_pair['1']
label2 = label_pair['2']
label_tuple = (label1, label2)
if label_tuple not in self.antenna_pair_to_typetag:
if label_tuple[::-1] not in self.antenna_pair_to_typetag:
raise KeyError('label pair not found in antenna pairs')
else:
label_tuple = label_tuple[::-1]
label1, label2 = label_tuple
typetag_tuple = self.antenna_pair_to_typetag[label_tuple]
if typetag_tuple not in self.pairwise_typetag_crosswts_vuf:
self.evalAntennaPairCorrWts(label1, label2=label2)
centered_crosscorr_wts_vuf = self.pairwise_typetag_crosswts_vuf[typetag_tuple]
du = self.gridu[0,1] - self.gridu[0,0]
dv = self.gridv[1,0] - self.gridv[0,0]
ulocs = du*(NP.arange(2*self.gridu.shape[1])-self.gridu.shape[1])
vlocs = dv*(NP.arange(2*self.gridv.shape[0])-self.gridv.shape[0])
pol = ['P1', 'P2']
pbinfo = {'pb': {}}
for p in pol:
pb = evalApertureResponse(centered_crosscorr_wts_vuf[p], ulocs, vlocs, pad=pad, skypos=skypos)
pbinfo['pb'][p] = pb['pb']
pbinfo['llocs'] = pb['llocs']
pbinfo['mlocs'] = pb['mlocs']
return pbinfo
############################################################################
def quick_beam_synthesis(self, pol=None, keep_zero_spacing=True):
"""
------------------------------------------------------------------------
A quick generator of synthesized beam using antenna array field
illumination pattern using the center frequency. Not intended to be used
rigorously but rather for comparison purposes and making quick plots
Inputs:
pol [String] The polarization of the synthesized beam. Can be set
to 'P1' or 'P2'. If set to None, synthesized beam for all the
polarizations are generated. Default=None
keep_zero_spacing
[boolean] If set to True (default), keep the zero spacing in
uv-plane grid illumination and as a result the average value
of the synthesized beam could be non-zero. If False, the zero
spacing is forced to zero by removing the average value of the
synthesized beam
Outputs:
Dictionary with the following keys and information:
'syn_beam' [numpy array] synthesized beam of size twice as that of the
antenna array grid. It is FFT-shifted to place the
origin at the center of the array. The peak value of the
synthesized beam is fixed at unity
'grid_power_illumination'
[numpy array] complex grid illumination obtained from
inverse fourier transform of the synthesized beam in
'syn_beam' and has size twice as that of the antenna
array grid. It is FFT-shifted to have the origin at the
center. The sum of this array is set to unity to match the
peak of the synthesized beam
'l' [numpy vector] x-values of the direction cosine grid
corresponding to x-axis (axis=1) of the synthesized beam
'm' [numpy vector] y-values of the direction cosine grid
corresponding to y-axis (axis=0) of the synthesized beam
------------------------------------------------------------------------
"""
if not self.grid_ready:
raise ValueError('Need to perform gridding of the antenna array before an equivalent UV grid can be simulated')
if pol is None:
pol = ['P1', 'P2']
elif isinstance(pol, str):
if pol in ['P1', 'P2']:
pol = [pol]
else:
raise ValueError('Invalid polarization specified')
elif isinstance(pol, list):
p = [apol for apol in pol if apol in ['P1', 'P2']]
if len(p) == 0:
raise ValueError('Invalid polarization specified')
pol = p
else:
raise TypeError('Input keyword pol must be string, list or set to None')
pol = sorted(pol)
for apol in pol:
if self.grid_illumination[apol] is None:
raise ValueError('Grid illumination for the specified polarization is not determined yet. Must use make_grid_cube()')
chan = NP.argmin(NP.abs(self.f - self.f0))
grid_field_illumination = NP.empty(self.gridu.shape+(len(pol),), dtype=NP.complex)
for pind, apol in enumerate(pol):
grid_field_illumination[:,:,pind] = self.grid_illumination[apol][:,:,chan]
syn_beam = NP.fft.fft2(grid_field_illumination, s=[4*self.gridu.shape[0], 4*self.gridv.shape[1]], axes=(0,1))
syn_beam = NP.abs(syn_beam)**2
if not keep_zero_spacing:
dclevel = NP.sum(syn_beam, axis=(0,1), keepdims=True) / (1.0*syn_beam.size/len(pol))
syn_beam = syn_beam - dclevel
syn_beam /= syn_beam.max() # Normalize to get unit peak for PSF
syn_beam_in_uv = NP.fft.ifft2(syn_beam, axes=(0,1)) # Inverse FT
du = self.gridu[0,1] - self.gridu[0,0]
dv = self.gridv[1,0] - self.gridv[0,0]
# if not keep_zero_spacing: # Filter out the interferometer aperture kernel footprint centered on zero
# l4 = DSP.spectax(4*self.gridu.shape[1], resolution=du, shift=False)
# m4 = DSP.spectax(4*self.gridv.shape[0], resolution=dv, shift=False)
# u4 = DSP.spectax(l4.size, resolution=l4[1]-l4[0], shift=False)
# v4 = DSP.spectax(m4.size, resolution=m4[1]-m4[0], shift=False)
# gridu4, gridv4 = NP.meshgrid(u4,v4)
# gridxy4 = NP.hstack((gridu4.reshape(-1,1), gridv4.reshape(-1,1))) * FCNST.c/self.f[chan]
# # assume identical antennas
# aperture = self.antennas.itervalues().next().aperture
# zero_vind = []
# zero_uind = []
# zero_pind = []
# for pi,apol in enumerate(pol):
# if aperture.kernel_type[apol] == 'func':
# if aperture.shape[apol] == 'circular':
# z_ind = NP.where(NP.sqrt(NP.sum(gridxy4**2, axis=1)) <= 2*aperture.rmax[apol])[0]
# else:
# rotang = aperture.rotangle[apol]
# rotmat = NP.asarray([[NP.cos(-rotang), -NP.sin(-rotang)],
# [NP.sin(-rotang), NP.cos(-rotang)]])
# gridxy4 = NP.dot(gridxy4, rotmat.T)
# if aperture.shape[apol] == 'square':
# z_ind = NP.where(NP.logical_and(NP.abs(gridxy4[:,0]) <= 2*aperture.xmax[apol], NP.abs(gridxy4[:,1]) <= 2*aperture.xmax[apol]))[0]
# else:
# z_ind = NP.where(NP.logical_and(NP.abs(gridxy4[:,0]) <= 2*aperture.xmax[apol], NP.abs(gridxy4[:,1]) <= 2*aperture.ymax[apol]))[0]
# z_vind, z_uind = NP.unravel_index(z_ind, gridu4.shape)
# zero_vind += z_vind.tolist()
# zero_uind += z_uind.tolist()
# zero_pind += [pi]*z_vind.size
# zero_vind = NP.asarray(zero_vind).ravel()
# zero_uind = NP.asarray(zero_uind).ravel()
# zero_pind = NP.asarray(zero_pind).ravel()
# syn_beam_in_uv[(zero_vind, zero_uind, zero_pind)] = 0.0
# syn_beam = NP.fft.fft2(syn_beam_in_uv, axes=(0,1)) # FT
# if NP.abs(syn_beam.imag).max() > 1e-10:
# raise ValueError('Synthesized beam after zero spacing aperture removal has significant imaginary component')
# else:
# syn_beam = syn_beam.real
# norm_factor = 1.0 / syn_beam.max()
# syn_beam *= norm_factor # Normalize to get unit peak for PSF
# syn_beam_in_uv *= norm_factor # Normalize to get unit peak for PSF
# shift the array to be centered
syn_beam_in_uv = NP.fft.ifftshift(syn_beam_in_uv, axes=(0,1)) # Shift array to be centered
# Discard pads at either end and select only the central values of twice the original size
syn_beam_in_uv = syn_beam_in_uv[grid_field_illumination.shape[0]:3*grid_field_illumination.shape[0],grid_field_illumination.shape[1]:3*grid_field_illumination.shape[1],:]
syn_beam = NP.fft.fftshift(syn_beam[::2,::2,:], axes=(0,1)) # Downsample by factor 2 to get native resolution and shift to be centered
l = DSP.spectax(2*self.gridu.shape[1], resolution=du, shift=True)
m = DSP.spectax(2*self.gridv.shape[0], resolution=dv, shift=True)
return {'syn_beam': syn_beam, 'grid_power_illumination': syn_beam_in_uv, 'l': l, 'm': m}
############################################################################
def quick_beam_synthesis_new(self, pol=None, keep_zero_spacing=True):
"""
------------------------------------------------------------------------
A quick generator of synthesized beam using antenna array field
illumination pattern using the center frequency. Not intended to be used
rigorously but rather for comparison purposes and making quick plots
Inputs:
pol [String] The polarization of the synthesized beam. Can be set
to 'P1' or 'P2'. If set to None, synthesized beam for all the
polarizations are generated. Default=None
keep_zero_spacing
[boolean] If set to True (default), keep the zero spacing in
uv-plane grid illumination and as a result the average value
of the synthesized beam could be non-zero. If False, the zero
spacing is forced to zero by removing the average value of the
synthesized beam
Outputs:
Dictionary with the following keys and information:
'syn_beam' [numpy array] synthesized beam of size twice as that of the
antenna array grid. It is FFT-shifted to place the
origin at the center of the array. The peak value of the
synthesized beam is fixed at unity
'grid_power_illumination'
[numpy array] complex grid illumination obtained from
inverse fourier transform of the synthesized beam in
'syn_beam' and has size twice as that of the antenna
array grid. It is FFT-shifted to have the origin at the
center. The sum of this array is set to unity to match the
peak of the synthesized beam
'l' [numpy vector] x-values of the direction cosine grid
corresponding to x-axis (axis=1) of the synthesized beam
'm' [numpy vector] y-values of the direction cosine grid
corresponding to y-axis (axis=0) of the synthesized beam
------------------------------------------------------------------------
"""
if not self.grid_ready:
raise ValueError('Need to perform gridding of the antenna array before an equivalent UV grid can be simulated')
if pol is None:
pol = ['P1', 'P2']
elif isinstance(pol, str):
if pol in ['P1', 'P2']:
pol = [pol]
else:
raise ValueError('Invalid polarization specified')
elif isinstance(pol, list):
p = [apol for apol in pol if apol in ['P1', 'P2']]
if len(p) == 0:
raise ValueError('Invalid polarization specified')
pol = p
else:
raise TypeError('Input keyword pol must be string, list or set to None')
pol = sorted(pol)
for apol in pol:
if self.grid_illumination[apol] is None:
raise ValueError('Grid illumination for the specified polarization is not determined yet. Must use make_grid_cube()')
chan = NP.argmin(NP.abs(self.f - self.f0))
grid_field_illumination = NP.empty(self.gridu.shape+(len(pol),), dtype=NP.complex)
for pind, apol in enumerate(pol):
grid_field_illumination[:,:,pind] = self.grid_illumination[apol][:,:,chan]
syn_beam = NP.fft.fft2(grid_field_illumination, s=[4*self.gridu.shape[0], 4*self.gridv.shape[1]], axes=(0,1))
syn_beam = NP.abs(syn_beam)**2
# if not keep_zero_spacing:
# dclevel = NP.sum(syn_beam, axis=(0,1), keepdims=True) / (1.0*syn_beam.size/len(pol))
# syn_beam = syn_beam - dclevel
syn_beam /= syn_beam.max() # Normalize to get unit peak for PSF
syn_beam_in_uv = NP.fft.ifft2(syn_beam, axes=(0,1)) # Inverse FT
norm_factor = 1.0
du = self.gridu[0,1] - self.gridu[0,0]
dv = self.gridv[1,0] - self.gridv[0,0]
if not keep_zero_spacing: # Filter out the interferometer aperture kernel footprint centered on zero
l4 = DSP.spectax(4*self.gridu.shape[1], resolution=du, shift=False)
m4 = DSP.spectax(4*self.gridv.shape[0], resolution=dv, shift=False)
u4 = DSP.spectax(l4.size, resolution=l4[1]-l4[0], shift=False)
v4 = DSP.spectax(m4.size, resolution=m4[1]-m4[0], shift=False)
gridu4, gridv4 = NP.meshgrid(u4,v4)
gridxy4 = NP.hstack((gridu4.reshape(-1,1), gridv4.reshape(-1,1))) * FCNST.c/self.f[chan]
# assume identical antennas
aperture = self.antennas.itervalues().next().aperture
zero_vind = []
zero_uind = []
zero_pind = []
for pi,apol in enumerate(pol):
if aperture.kernel_type[apol] == 'func':
if aperture.shape[apol] == 'circular':
z_ind = NP.where(NP.sqrt(NP.sum(gridxy4**2, axis=1)) <= 2*aperture.rmax[apol])[0]
else:
rotang = aperture.rotangle[apol]
rotmat = NP.asarray([[NP.cos(-rotang), -NP.sin(-rotang)],
[NP.sin(-rotang), NP.cos(-rotang)]])
gridxy4 = NP.dot(gridxy4, rotmat.T)
if aperture.shape[apol] == 'square':
z_ind = NP.where(NP.logical_and(NP.abs(gridxy4[:,0]) <= 2*aperture.xmax[apol], NP.abs(gridxy4[:,1]) <= 2*aperture.xmax[apol]))[0]
else:
z_ind = NP.where(NP.logical_and(NP.abs(gridxy4[:,0]) <= 2*aperture.xmax[apol], NP.abs(gridxy4[:,1]) <= 2*aperture.ymax[apol]))[0]
z_vind, z_uind = NP.unravel_index(z_ind, gridu4.shape)
zero_vind += z_vind.tolist()
zero_uind += z_uind.tolist()
zero_pind += [pi]*z_vind.size
zero_vind = NP.asarray(zero_vind).ravel()
zero_uind = NP.asarray(zero_uind).ravel()
zero_pind = NP.asarray(zero_pind).ravel()
syn_beam_in_uv[(zero_vind, zero_uind, zero_pind)] = 0.0
syn_beam = NP.fft.fft2(syn_beam_in_uv, axes=(0,1)) # FT
if NP.abs(syn_beam.imag).max() > 1e-10:
raise ValueError('Synthesized beam after zero spacing aperture removal has significant imaginary component')
else:
syn_beam = syn_beam.real
norm_factor = 1.0 / syn_beam.max()
syn_beam *= norm_factor # Normalize to get unit peak for PSF
syn_beam_in_uv *= norm_factor # Normalize to get unit peak for PSF
# shift the array to be centered
syn_beam_in_uv = NP.fft.ifftshift(syn_beam_in_uv, axes=(0,1)) # Shift array to be centered
# Discard pads at either end and select only the central values of twice the original size
syn_beam_in_uv = syn_beam_in_uv[grid_field_illumination.shape[0]:3*grid_field_illumination.shape[0],grid_field_illumination.shape[1]:3*grid_field_illumination.shape[1],:]
syn_beam = NP.fft.fftshift(syn_beam[::2,::2,:], axes=(0,1)) # Downsample by factor 2 to get native resolution and shift to be centered
l = DSP.spectax(2*self.gridu.shape[1], resolution=du, shift=True)
m = DSP.spectax(2*self.gridv.shape[0], resolution=dv, shift=True)
return {'syn_beam': syn_beam, 'grid_power_illumination': syn_beam_in_uv, 'l': l, 'm': m}
############################################################################
def update_flags(self, dictflags=None, stack=True, verify=False):
"""
------------------------------------------------------------------------
Updates all flags in the antenna array followed by any flags that
need overriding through inputs of specific flag information
Inputs:
dictflags [dictionary] contains flag information overriding after
default flag updates are determined. Antenna based flags are
given as further dictionaries, each stored under a key
which is the same as the antenna label. Flags for each
antenna are specified as a dictionary holding boolean flags
for each of the two polarizations which are stored under
keys 'P1' and 'P2'. An absent key just means it is not a
part of the update. Flag information under each antenna must
be of same type as input parameter flags in member function
update_flags() of class PolInfo
stack [boolean] If True (default), appends the updated flag to the
end of the stack of flags as a function of timestamp. If
False, updates the last flag in the stack with the updated
flag and does not append
verify [boolean] If True, verify and update the flags, if necessary.
Electric fields are checked for NaN values and if found, the
flag in the corresponding polarization is set to True.
Default=False.
------------------------------------------------------------------------
"""
for label in self.antennas:
self.antennas[label].update_flags(stack=stack, verify=verify)
if dictflags is not None: # Performs flag overriding. Use stack=False
if not isinstance(dictflags, dict):
raise TypeError('Input parameter dictflags must be a dictionary')
for label in dictflags:
if label in self.antennas:
self.antennas[label].update_flags(flags=dictflags[label], stack=False, verify=True)
############################################################################
def update(self, updates=None, parallel=False, nproc=None, verbose=False):
"""
------------------------------------------------------------------------
Updates the antenna array instance with newer attribute values. Can also
be used to add and/or remove antennas with/without affecting the
existing grid.
Inputs:
updates [Dictionary] Consists of information updates under the
following principal keys:
'antenna_array': Consists of updates for the AntennaArray
instance. This is a dictionary which consists of
the following keys:
'timestamp' Unique identifier of the time
series. It is optional to set this
to a scalar. If not given, no
change is made to the existing
timestamp attribute
'do_grid' [boolean] If set to True, create
or recreate a grid. To be
specified when the antenna
locations are updated.
'antennas': Holds a list of dictionaries consisting of
updates for individual antennas. Each element
in the list contains update for one antenna.
For each of these dictionaries, one of the keys
is 'label' which indicates an antenna label. If
absent, the code execution stops by throwing an
exception. The other optional keys and the
information they hold are listed below:
'action' [String scalar] Indicates the type
of update operation. 'add' adds
the Antenna instance to the
AntennaArray instance. 'remove'
removes the antenna from the
antenna array instance. 'modify'
modifies the antenna attributes in
the antenna array instance. This
key has to be set. No default.
'grid_action' [Boolean] If set to True, will
apply the gridding operations
(grid(), grid_convolve(), and
grid_unconvolve()) appropriately
according to the value of the
'action' key. If set to None or
False, gridding effects will
remain unchanged. Default=None
(=False).
'antenna' [instance of class Antenna]
Updated Antenna class instance.
Can work for action key 'remove'
even if not set (=None) or set to
an empty string '' as long as
'label' key is specified.
'gridpol' [Optional. String scalar]
Initiates the specified action on
polarization 'P1' or 'P2'. Can be
set to 'P1' or 'P2'. If not
provided (=None), then the
specified action applies to both
polarizations. Default = None.
'Et' [Optional. Dictionary] Complex
Electric field time series under
two polarizations which are under
keys 'P1' and 'P2'. Is used only
if set and if 'action' key value
is set to 'modify'.
Default = None.
'Ef' [Optional. Dictionary] Complex
Electric field spectra under
two polarizations which are under
keys 'P1' and 'P2'. Is used only
if set and if 'action' key value
is set to 'modify'.
Default = None.
'stack' [boolean] If True (default),
appends the updated flag and data
to the end of the stack as a
function of timestamp. If False,
updates the last flag and data in
the stack and does not append
't' [Optional. Numpy array] Time axis
of the time series. Is used only
if set and if 'action' key value
is set to 'modify'. Default=None.
'timestamp' [Optional. Scalar] Unique
identifier of the time series. Is
used only if set and if 'action'
key value is set to 'modify'.
Default = None.
'location' [Optional. instance of GEOM.Point
class]
Antenna location in the local ENU
coordinate system. Used only if
set and if 'action' key value is
set to 'modify'. Default = None.
'aperture' [instance of class
APR.Aperture] aperture
information for the antenna. Read
docstring of class
Aperture for details
'wtsinfo' [Optional. Dictionary]
See description in Antenna class
member function update(). Is used
only if set and if 'action' key
value is set to 'modify'.
Default = None.
'flags' [Optional. Dictionary] holds
boolean flags for each of the 2
polarizations which are stored
under keys 'P1' and 'P2'.
Default=None means no updates for
flags. If True, that polarization
will be flagged. If not set
(=None), the previous or default
flag status will continue to
apply. If set to False, the
antenna status will be updated to
become unflagged.
'gridfunc_freq'
[Optional. String scalar] Read the
description of inputs to Antenna
class member function update(). If
set to None (not provided), this
attribute is determined based on
the size of wtspos under each
polarization. It is applicable
only when 'action' key is set to
'modify'. Default = None.
'delaydict' [Dictionary] contains information
on delay compensation to be
applied to the fourier transformed
electric fields under each
polarization which are stored
under keys 'P1' and 'P2'.
Default=None (no delay
compensation to be applied). Refer
to the docstring of member
function delay_compensation() of
class PolInfo for more details.
'ref_freq' [Optional. Scalar] Positive value
(in Hz) of reference frequency
(used if gridfunc_freq is set to
'scale') at which wtspos in
wtsinfo are provided. If set to
None, the reference frequency
already set in antenna array
instance remains unchanged.
Default = None.
'pol_type' [Optional. String scalar] 'Linear'
or 'Circular'. Used only when
action key is set to 'modify'. If
not provided, then the previous
value remains in effect.
Default = None.
'norm_wts' [Optional. Boolean] Default=False.
If set to True, the gridded
weights are divided by the sum of
weights so that the gridded
weights add up to unity. This is
used only when grid_action keyword
is set when action keyword is set
to 'add' or 'modify'
'gridmethod' [Optional. String] Indicates
gridding method. It accepts the
following values 'NN' (nearest
neighbour), 'BL' (Bi-linear
interpolation), and'CS' (Cubic
Spline interpolation).
Default='NN'
'distNN' [Optional. Scalar] Indicates the
upper bound on distance for a
nearest neighbour search if the
value of 'gridmethod' is set to
'NN'. The units are of physical
distance, the same as what is
used for antenna locations.
Default = NP.inf
'maxmatch' [scalar] A positive value
indicating maximum number of input
locations in the antenna grid to
be assigned. Default = None. If
set to None, all the antenna array
grid elements specified are
assigned values for each antenna.
For instance, to have only one
antenna array grid element to be
populated per antenna, use
maxmatch=1.
'tol' [scalar] If set, only lookup data
with abs(val) > tol will be
considered for nearest neighbour
lookup. Default = None implies
all lookup values will be
considered for nearest neighbour
determination. tol is to be
interpreted as a minimum value
considered as significant in the
lookup table.
parallel [boolean] specifies if parallelization is to be invoked.
False (default) means only serial processing
nproc [integer] specifies number of independent processes to spawn.
Default = None, means automatically determines the number of
process cores in the system and use one less than that to
avoid locking the system for other processes. Applies only
if input parameter 'parallel' (see above) is set to True.
If nproc is set to a value more than the number of process
cores in the system, it will be reset to number of process
cores in the system minus one to avoid locking the system out
for other processes
verbose [Boolean] Default = False. If set to True, prints some
diagnostic or progress messages.
------------------------------------------------------------------------
"""
if updates is not None:
if not isinstance(updates, dict):
raise TypeError('Input parameter updates must be a dictionary')
if 'antennas' in updates: # contains updates at level of individual antennas
if not isinstance(updates['antennas'], list):
updates['antennas'] = [updates['antennas']]
if parallel:
list_of_antenna_updates = []
list_of_antennas = []
for dictitem in updates['antennas']:
if not isinstance(dictitem, dict):
raise TypeError('Updates to {0} instance should be provided in the form of a list of dictionaries.'.format(self.__class__.__name__))
elif 'label' not in dictitem:
raise KeyError('No antenna label specified in the dictionary item to be updated.')
if 'action' not in dictitem:
raise KeyError('No action specified for update. Action key should be set to "add", "remove" or "modify".')
elif dictitem['action'] == 'add':
if dictitem['label'] in self.antennas:
if verbose:
print 'Antenna {0} for adding already exists in current instance of {1}. Skipping over to the next item to be updated.'.format(dictitem['label'], self.__class__.__name__)
else:
if verbose:
print 'Adding antenna {0}...'.format(dictitem['label'])
self.add_antennas(dictitem['antenna'])
if 'grid_action' in dictitem:
self.grid_convolve(pol=dictitem['gridpol'], ants=dictitem['antenna'], unconvolve_existing=False)
elif dictitem['action'] == 'remove':
if dictitem['label'] not in self.antennas:
if verbose:
print 'Antenna {0} for removal not found in current instance of {1}. Skipping over to the next item to be updated.'.format(dictitem['label'], self.__class__.__name__)
else:
if verbose:
print 'Removing antenna {0}...'.format(dictitem['label'])
if 'grid_action' in dictitem:
self.grid_unconvolve(dictitem['label'], dictitem['gridpol'])
self.remove_antennas(dictitem['label'])
elif dictitem['action'] == 'modify':
if dictitem['label'] not in self.antennas:
if verbose:
print 'Antenna {0} for modification not found in current instance of {1}. Skipping over to the next item to be updated.'.format(dictitem['label'], self.__class__.__name__)
else:
if verbose:
print 'Modifying antenna {0}...'.format(dictitem['label'])
if 'Ef' not in dictitem: dictitem['Ef']=None
if 'Et' not in dictitem: dictitem['Et']=None
if 't' not in dictitem: dictitem['t']=None
if 'timestamp' not in dictitem: dictitem['timestamp']=None
if 'location' not in dictitem: dictitem['location']=None
if 'wtsinfo' not in dictitem: dictitem['wtsinfo']=None
if 'flags' not in dictitem: dictitem['flags']=None
if 'stack' not in dictitem: dictitem['stack']=True
if 'gridfunc_freq' not in dictitem: dictitem['gridfunc_freq']=None
if 'ref_freq' not in dictitem: dictitem['ref_freq']=None
if 'pol_type' not in dictitem: dictitem['pol_type']=None
if 'norm_wts' not in dictitem: dictitem['norm_wts']=False
if 'gridmethod' not in dictitem: dictitem['gridmethod']='NN'
if 'distNN' not in dictitem: dictitem['distNN']=NP.inf
if 'maxmatch' not in dictitem: dictitem['maxmatch']=None
if 'tol' not in dictitem: dictitem['tol']=None
if 'delaydict' not in dictitem: dictitem['delaydict']=None
if 'aperture' not in dictitem: dictitem['aperture']=None
if not parallel:
self.antennas[dictitem['label']].update(dictitem, verbose)
else:
list_of_antennas += [self.antennas[dictitem['label']]]
list_of_antenna_updates += [dictitem]
if 'grid_action' in dictitem:
self.grid_convolve(pol=dictitem['gridpol'], ants=dictitem['antenna'], unconvolve_existing=True, normalize=dictitem['norm_wts'], method=dictitem['gridmethod'], distNN=dictitem['distNN'], tol=dictitem['tol'], maxmatch=dictitem['maxmatch'])
else:
raise ValueError('Update action should be set to "add", "remove" or "modify".')
if parallel:
if nproc is None:
nproc = max(MP.cpu_count()-1, 1)
else:
nproc = min(nproc, max(MP.cpu_count()-1, 1))
pool = MP.Pool(processes=nproc)
updated_antennas = pool.map(unwrap_antenna_update, IT.izip(list_of_antennas, list_of_antenna_updates))
pool.close()
pool.join()
# Necessary to make the returned and updated antennas current, otherwise they stay unrelated
for antenna in updated_antennas:
self.antennas[antenna.label] = antenna
del updated_antennas
if 'antenna_array' in updates: # contains updates at 'antenna array' level
if not isinstance(updates['antenna_array'], dict):
raise TypeError('Input parameter in updates for antenna array must be a dictionary with key "antenna_array"')
if 'timestamp' in updates['antenna_array']:
self.timestamp = updates['antenna_array']['timestamp']
self.timestamps += [copy.deepcopy(self.timestamp)] # Stacks new timestamp
if 'do_grid' in updates['antenna_array']:
if isinstance(updates['antenna_array']['do_grid'], bool):
self.grid()
else:
raise TypeError('Value in key "do_grid" inside key "antenna_array" of input dictionary updates must be boolean.')
self.t = self.antennas.itervalues().next().t # Update time axis
self.f = self.antennas.itervalues().next().f # Update frequency axis
self.update_flags(stack=False, verify=True) # Refreshes current flags, no stacking
################################################################################
| 59.563252 | 351 | 0.501518 | 87,703 | 822,092 | 4.59803 | 0.018893 | 0.012577 | 0.01354 | 0.007945 | 0.884318 | 0.855746 | 0.831469 | 0.814163 | 0.794183 | 0.775604 | 0 | 0.013238 | 0.38656 | 822,092 | 13,801 | 352 | 59.567568 | 0.786401 | 0.034053 | 0 | 0.679858 | 0 | 0.008348 | 0.099691 | 0.004868 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.0017 | 0.002628 | null | null | 0.026588 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d2c9e4c89b84d72886ac0c5b57f16bc3def4a3da | 6,613 | py | Python | tests/unit/cli/test_cli_arg_parser.py | lochsh/strictdoc | 25580d0ee9bbecc63df74016f071116b4b8cda5c | [
"Apache-2.0"
] | null | null | null | tests/unit/cli/test_cli_arg_parser.py | lochsh/strictdoc | 25580d0ee9bbecc63df74016f071116b4b8cda5c | [
"Apache-2.0"
] | null | null | null | tests/unit/cli/test_cli_arg_parser.py | lochsh/strictdoc | 25580d0ee9bbecc63df74016f071116b4b8cda5c | [
"Apache-2.0"
] | null | null | null | from strictdoc.cli.cli_arg_parser import (
cli_args_parser,
create_sdoc_args_parser,
)
FAKE_STRICTDOC_ROOT_PATH = "/tmp/strictdoc-123"
TOTAL_EXPORT_ARGS = 7
def test_export_00_strictdoc_root_path():
parser = cli_args_parser()
args = parser.parse_args(["export", "docs"])
assert len(args._get_kwargs()) == TOTAL_EXPORT_ARGS
assert args.command == "export"
assert args.fields == ["uid", "statement", "parent"]
assert args.formats == ["html"]
assert args.input_paths == ["docs"]
assert args.no_parallelization is False
assert args.output_dir is None
config_parser = create_sdoc_args_parser(args)
export_config = config_parser.get_export_config(FAKE_STRICTDOC_ROOT_PATH)
assert export_config.strictdoc_root_path == FAKE_STRICTDOC_ROOT_PATH
def test_export_01_minimal():
parser = cli_args_parser()
args = parser.parse_args(["export", "docs"])
assert len(args._get_kwargs()) == TOTAL_EXPORT_ARGS
assert args.command == "export"
assert args.fields == ["uid", "statement", "parent"]
assert args.formats == ["html"]
assert args.input_paths == ["docs"]
assert args.no_parallelization is False
assert args.output_dir is None
config_parser = create_sdoc_args_parser(args)
export_config = config_parser.get_export_config(FAKE_STRICTDOC_ROOT_PATH)
assert export_config.fields == args.fields
assert export_config.formats == args.formats
assert export_config.input_paths == args.input_paths
assert export_config.no_parallelization == args.no_parallelization
assert export_config.output_dir == args.output_dir
def test_export_02_output_dir():
parser = cli_args_parser()
args = parser.parse_args(
["export", "docs", "--output-dir", "custom-output-dir"]
)
assert len(args._get_kwargs()) == TOTAL_EXPORT_ARGS
assert args.command == "export"
assert args.input_paths == ["docs"]
assert args.fields == ["uid", "statement", "parent"]
assert args.formats == ["html"]
assert args.no_parallelization is False
assert args.output_dir == "custom-output-dir"
config_parser = create_sdoc_args_parser(args)
export_config = config_parser.get_export_config(FAKE_STRICTDOC_ROOT_PATH)
assert export_config.fields == args.fields
assert export_config.formats == args.formats
assert export_config.input_paths == args.input_paths
assert export_config.no_parallelization == args.no_parallelization
assert export_config.output_dir == args.output_dir
def test_export_03_parallelization():
parser = cli_args_parser()
args = parser.parse_args(["export", "docs", "--no-parallelization"])
assert len(args._get_kwargs()) == TOTAL_EXPORT_ARGS
assert args.command == "export"
assert args.fields == ["uid", "statement", "parent"]
assert args.formats == ["html"]
assert args.input_paths == ["docs"]
assert args.no_parallelization is True
assert args.output_dir is None
config_parser = create_sdoc_args_parser(args)
export_config = config_parser.get_export_config(FAKE_STRICTDOC_ROOT_PATH)
assert export_config.fields == args.fields
assert export_config.formats == args.formats
assert export_config.input_paths == args.input_paths
assert export_config.no_parallelization == args.no_parallelization
assert export_config.output_dir == args.output_dir
def test_export_04_export_format_rst():
parser = cli_args_parser()
args = parser.parse_args(["export", "--formats=rst", "docs"])
assert len(args._get_kwargs()) == TOTAL_EXPORT_ARGS
assert args.command == "export"
assert args.fields == ["uid", "statement", "parent"]
assert args.formats == ["rst"]
assert args.input_paths == ["docs"]
assert args.no_parallelization is False
assert args.output_dir is None
config_parser = create_sdoc_args_parser(args)
export_config = config_parser.get_export_config(FAKE_STRICTDOC_ROOT_PATH)
assert export_config.fields == args.fields
assert export_config.formats == args.formats
assert export_config.input_paths == args.input_paths
assert export_config.no_parallelization == args.no_parallelization
assert export_config.output_dir == args.output_dir
def test_export_05_export_format_multiple():
parser = cli_args_parser()
args = parser.parse_args(["export", "--formats=html,rst", "docs"])
assert args.command == "export"
assert args.input_paths == ["docs"]
assert len(args._get_kwargs()) == TOTAL_EXPORT_ARGS
assert args.command == "export"
assert args.fields == ["uid", "statement", "parent"]
assert args.formats == ["html", "rst"]
assert args.input_paths == ["docs"]
assert args.no_parallelization is False
assert args.output_dir is None
config_parser = create_sdoc_args_parser(args)
export_config = config_parser.get_export_config(FAKE_STRICTDOC_ROOT_PATH)
assert export_config.fields == args.fields
assert export_config.formats == args.formats
assert export_config.input_paths == args.input_paths
assert export_config.no_parallelization == args.no_parallelization
assert export_config.output_dir == args.output_dir
def test_export_06_export_format_multiple():
parser = cli_args_parser()
args = parser.parse_args(
["export", "--experimental-enable-file-traceability", "docs"]
)
assert args.command == "export"
assert args.input_paths == ["docs"]
assert len(args._get_kwargs()) == TOTAL_EXPORT_ARGS
assert args.command == "export"
assert args.experimental_enable_file_traceability is True
config_parser = create_sdoc_args_parser(args)
export_config = config_parser.get_export_config(FAKE_STRICTDOC_ROOT_PATH)
assert export_config.fields == args.fields
assert export_config.formats == args.formats
assert export_config.input_paths == args.input_paths
assert export_config.no_parallelization == args.no_parallelization
assert export_config.output_dir == args.output_dir
def test_passthrough_01_minimal():
parser = cli_args_parser()
args = parser.parse_args(["passthrough", "input.sdoc"])
assert args._get_kwargs() == [
("command", "passthrough"),
("input_file", "input.sdoc"),
("output_file", None),
]
def test_passthrough_02_minimal():
parser = cli_args_parser()
args = parser.parse_args(
["passthrough", "input.sdoc", "--output-file", "output.sdoc"]
)
assert args._get_kwargs() == [
("command", "passthrough"),
("input_file", "input.sdoc"),
("output_file", "output.sdoc"),
]
| 33.739796 | 77 | 0.719341 | 844 | 6,613 | 5.300948 | 0.07109 | 0.120697 | 0.124721 | 0.042244 | 0.918418 | 0.903889 | 0.898078 | 0.898078 | 0.898078 | 0.898078 | 0 | 0.004 | 0.168305 | 6,613 | 195 | 78 | 33.912821 | 0.809455 | 0 | 0 | 0.755245 | 0 | 0 | 0.093604 | 0.005897 | 0 | 0 | 0 | 0 | 0.573427 | 1 | 0.062937 | false | 0.041958 | 0.006993 | 0 | 0.06993 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
960278999c737b4cfcdd81d16a3a3068cbffe078 | 9,805 | py | Python | tests/test_packet.py | iepngs/python-socketio | 1ec2e10efcefce0d23040b7ee36e3167f758054b | [
"MIT"
] | 1 | 2018-06-07T20:38:36.000Z | 2018-06-07T20:38:36.000Z | tests/test_packet.py | iepngs/python-socketio | 1ec2e10efcefce0d23040b7ee36e3167f758054b | [
"MIT"
] | null | null | null | tests/test_packet.py | iepngs/python-socketio | 1ec2e10efcefce0d23040b7ee36e3167f758054b | [
"MIT"
] | null | null | null | import unittest
import six
from socketio import packet
class TestPacket(unittest.TestCase):
def test_encode_default_packet(self):
pkt = packet.Packet()
self.assertEqual(pkt.packet_type, packet.EVENT)
self.assertIsNone(pkt.data)
self.assertIsNone(pkt.namespace)
self.assertIsNone(pkt.id)
self.assertEqual(pkt.attachment_count, 0)
self.assertEqual(pkt.encode(), '2')
def test_decode_default_packet(self):
pkt = packet.Packet(encoded_packet='2')
self.assertTrue(pkt.encode(), '2')
def test_encode_text_event_packet(self):
pkt = packet.Packet(packet_type=packet.EVENT,
data=[six.text_type('foo')])
self.assertEqual(pkt.packet_type, packet.EVENT)
self.assertEqual(pkt.data, ['foo'])
self.assertEqual(pkt.encode(), '2["foo"]')
def test_decode_text_event_packet(self):
pkt = packet.Packet(encoded_packet='2["foo"]')
self.assertEqual(pkt.packet_type, packet.EVENT)
self.assertEqual(pkt.data, ['foo'])
self.assertEqual(pkt.encode(), '2["foo"]')
def test_decode_empty_event_packet(self):
pkt = packet.Packet(encoded_packet='1')
self.assertEqual(pkt.packet_type, packet.DISCONNECT)
# same thing, but with a numeric payload
pkt = packet.Packet(encoded_packet=1)
self.assertEqual(pkt.packet_type, packet.DISCONNECT)
def test_encode_binary_event_packet(self):
pkt = packet.Packet(packet_type=packet.EVENT, data=b'1234')
self.assertEqual(pkt.packet_type, packet.BINARY_EVENT)
self.assertEqual(pkt.data, b'1234')
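        # The JSON key order of the placeholder object is not guaranteed, so either
        # of the two serializations below is accepted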
a = ['51-{"_placeholder":true,"num":0}', b'1234']
b = ['51-{"num":0,"_placeholder":true}', b'1234']
encoded_packet = pkt.encode()
self.assertTrue(encoded_packet == a or encoded_packet == b)
def test_decode_binary_event_packet(self):
pkt = packet.Packet(encoded_packet='51-{"_placeholder":true,"num":0}')
self.assertTrue(pkt.add_attachment(b'1234'))
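        # add_attachment() returns True once every expected attachment has been received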
self.assertEqual(pkt.packet_type, packet.BINARY_EVENT)
self.assertEqual(pkt.data, b'1234')
def test_encode_text_ack_packet(self):
pkt = packet.Packet(packet_type=packet.ACK,
data=[six.text_type('foo')])
self.assertEqual(pkt.packet_type, packet.ACK)
self.assertEqual(pkt.data, ['foo'])
self.assertEqual(pkt.encode(), '3["foo"]')
def test_decode_text_ack_packet(self):
pkt = packet.Packet(encoded_packet='3["foo"]')
self.assertEqual(pkt.packet_type, packet.ACK)
self.assertEqual(pkt.data, ['foo'])
self.assertEqual(pkt.encode(), '3["foo"]')
def test_encode_binary_ack_packet(self):
pkt = packet.Packet(packet_type=packet.ACK, data=b'1234')
self.assertEqual(pkt.packet_type, packet.BINARY_ACK)
self.assertEqual(pkt.data, b'1234')
a = ['61-{"_placeholder":true,"num":0}', b'1234']
b = ['61-{"num":0,"_placeholder":true}', b'1234']
encoded_packet = pkt.encode()
self.assertTrue(encoded_packet == a or encoded_packet == b)
def test_decode_binary_ack_packet(self):
pkt = packet.Packet(encoded_packet='61-{"_placeholder":true,"num":0}')
self.assertTrue(pkt.add_attachment(b'1234'))
self.assertEqual(pkt.packet_type, packet.BINARY_ACK)
self.assertEqual(pkt.data, b'1234')
def test_invalid_binary_packet(self):
self.assertRaises(ValueError, packet.Packet, packet_type=packet.ERROR,
data=b'123')
def test_encode_namespace(self):
pkt = packet.Packet(packet_type=packet.EVENT,
data=[six.text_type('foo')], namespace='/bar')
self.assertEqual(pkt.namespace, '/bar')
self.assertEqual(pkt.encode(), '2/bar,["foo"]')
def test_decode_namespace(self):
pkt = packet.Packet(encoded_packet='2/bar,["foo"]')
self.assertEqual(pkt.namespace, '/bar')
self.assertEqual(pkt.encode(), '2/bar,["foo"]')
def test_decode_namespace_with_query_string(self):
# some Socket.IO clients mistakenly attach the query string to the
# namespace
pkt = packet.Packet(encoded_packet='2/bar?a=b,["foo"]')
self.assertEqual(pkt.namespace, '/bar')
self.assertEqual(pkt.encode(), '2/bar,["foo"]')
def test_encode_namespace_no_data(self):
pkt = packet.Packet(packet_type=packet.EVENT, namespace='/bar')
self.assertEqual(pkt.encode(), '2/bar')
def test_decode_namespace_no_data(self):
pkt = packet.Packet(encoded_packet='2/bar')
self.assertEqual(pkt.namespace, '/bar')
self.assertEqual(pkt.encode(), '2/bar')
def test_encode_namespace_with_hyphens(self):
pkt = packet.Packet(packet_type=packet.EVENT,
data=[six.text_type('foo')], namespace='/b-a-r')
self.assertEqual(pkt.namespace, '/b-a-r')
self.assertEqual(pkt.encode(), '2/b-a-r,["foo"]')
def test_decode_namespace_with_hyphens(self):
pkt = packet.Packet(encoded_packet='2/b-a-r,["foo"]')
self.assertEqual(pkt.namespace, '/b-a-r')
self.assertEqual(pkt.encode(), '2/b-a-r,["foo"]')
def test_encode_event_with_hyphens(self):
pkt = packet.Packet(packet_type=packet.EVENT,
data=[six.text_type('f-o-o')])
self.assertEqual(pkt.namespace, None)
self.assertEqual(pkt.encode(), '2["f-o-o"]')
def test_decode_event_with_hyphens(self):
pkt = packet.Packet(encoded_packet='2["f-o-o"]')
self.assertEqual(pkt.namespace, None)
self.assertEqual(pkt.encode(), '2["f-o-o"]')
def test_encode_id(self):
pkt = packet.Packet(packet_type=packet.EVENT,
data=[six.text_type('foo')], id=123)
self.assertEqual(pkt.id, 123)
self.assertEqual(pkt.encode(), '2123["foo"]')
def test_decode_id(self):
pkt = packet.Packet(encoded_packet='2123["foo"]')
self.assertEqual(pkt.id, 123)
self.assertEqual(pkt.encode(), '2123["foo"]')
def test_encode_namespace_and_id(self):
pkt = packet.Packet(packet_type=packet.EVENT,
data=[six.text_type('foo')], namespace='/bar',
id=123)
self.assertEqual(pkt.namespace, '/bar')
self.assertEqual(pkt.id, 123)
self.assertEqual(pkt.encode(), '2/bar,123["foo"]')
def test_decode_namespace_and_id(self):
pkt = packet.Packet(encoded_packet='2/bar,123["foo"]')
self.assertEqual(pkt.namespace, '/bar')
self.assertEqual(pkt.id, 123)
self.assertEqual(pkt.encode(), '2/bar,123["foo"]')
def test_encode_many_binary(self):
pkt = packet.Packet(packet_type=packet.EVENT,
data={'a': six.text_type('123'),
'b': b'456',
'c': [b'789', 123]})
self.assertEqual(pkt.packet_type, packet.BINARY_EVENT)
ep = pkt.encode()
self.assertEqual(len(ep), 3)
self.assertIn(b'456', ep)
self.assertIn(b'789', ep)
def test_encode_many_binary_ack(self):
pkt = packet.Packet(packet_type=packet.ACK,
data={'a': six.text_type('123'),
'b': b'456',
'c': [b'789', 123]})
self.assertEqual(pkt.packet_type, packet.BINARY_ACK)
ep = pkt.encode()
self.assertEqual(len(ep), 3)
self.assertIn(b'456', ep)
self.assertIn(b'789', ep)
def test_decode_many_binary(self):
pkt = packet.Packet(encoded_packet=(
'52-{"a":"123","b":{"_placeholder":true,"num":0},'
'"c":[{"_placeholder":true,"num":1},123]}'))
self.assertFalse(pkt.add_attachment(b'456'))
self.assertTrue(pkt.add_attachment(b'789'))
self.assertEqual(pkt.packet_type, packet.BINARY_EVENT)
self.assertEqual(pkt.data['a'], '123')
self.assertEqual(pkt.data['b'], b'456')
self.assertEqual(pkt.data['c'], [b'789', 123])
def test_decode_many_binary_ack(self):
pkt = packet.Packet(encoded_packet=(
'62-{"a":"123","b":{"_placeholder":true,"num":0},'
'"c":[{"_placeholder":true,"num":1},123]}'))
self.assertFalse(pkt.add_attachment(b'456'))
self.assertTrue(pkt.add_attachment(b'789'))
self.assertEqual(pkt.packet_type, packet.BINARY_ACK)
self.assertEqual(pkt.data['a'], '123')
self.assertEqual(pkt.data['b'], b'456')
self.assertEqual(pkt.data['c'], [b'789', 123])
def test_decode_too_many_binary_packets(self):
pkt = packet.Packet(encoded_packet=(
'62-{"a":"123","b":{"_placeholder":true,"num":0},'
'"c":[{"_placeholder":true,"num":1},123]}'))
self.assertFalse(pkt.add_attachment(b'456'))
self.assertTrue(pkt.add_attachment(b'789'))
self.assertRaises(ValueError, pkt.add_attachment, b'123')
def test_data_is_binary_list(self):
pkt = packet.Packet()
self.assertFalse(pkt._data_is_binary([six.text_type('foo')]))
self.assertFalse(pkt._data_is_binary([]))
self.assertTrue(pkt._data_is_binary([b'foo']))
self.assertTrue(pkt._data_is_binary([six.text_type('foo'), b'bar']))
def test_data_is_binary_dict(self):
pkt = packet.Packet()
self.assertFalse(pkt._data_is_binary({'a': six.text_type('foo')}))
self.assertFalse(pkt._data_is_binary({}))
self.assertTrue(pkt._data_is_binary({'a': b'foo'}))
self.assertTrue(pkt._data_is_binary({'a': six.text_type('foo'),
'b': b'bar'}))
| 42.816594 | 78 | 0.610913 | 1,255 | 9,805 | 4.585657 | 0.073307 | 0.166811 | 0.193918 | 0.099044 | 0.901825 | 0.868462 | 0.850912 | 0.805734 | 0.719201 | 0.684448 | 0 | 0.033056 | 0.228659 | 9,805 | 228 | 79 | 43.004386 | 0.727886 | 0.011525 | 0 | 0.568421 | 0 | 0 | 0.102911 | 0.047069 | 0 | 0 | 0 | 0 | 0.484211 | 1 | 0.168421 | false | 0 | 0.015789 | 0 | 0.189474 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
825d94851438ddd73c4a4d8d5115f76d14ceab84 | 16,523 | py | Python | nintendo/nex/utility.py | h1k421/NintendoClients | 970d703215939361df14d14dc5d21b64d3ffbb13 | [
"MIT"
] | null | null | null | nintendo/nex/utility.py | h1k421/NintendoClients | 970d703215939361df14d14dc5d21b64d3ffbb13 | [
"MIT"
] | null | null | null | nintendo/nex/utility.py | h1k421/NintendoClients | 970d703215939361df14d14dc5d21b64d3ffbb13 | [
"MIT"
] | null | null | null |
# This file was generated automatically by generate_protocols.py
from nintendo.nex import notification, rmc, common, streams
import logging
logger = logging.getLogger(__name__)
class UniqueIdInfo(common.Structure):
def __init__(self):
super().__init__()
self.unique_id = 0
self.password = 0
def check_required(self, settings, version):
pass
def load(self, stream, version):
self.unique_id = stream.u64()
self.password = stream.u64()
def save(self, stream, version):
self.check_required(stream.settings, version)
stream.u64(self.unique_id)
stream.u64(self.password)
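# Minimal usage sketch (assumes an already connected, authenticated RMC client object
# from the nintendo.nex backend layer; names outside this module are illustrative):
#   utility = UtilityClient(client)
#   unique_id = await utility.acquire_nex_unique_id()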
class UtilityProtocol:
METHOD_ACQUIRE_NEX_UNIQUE_ID = 1
METHOD_ACQUIRE_NEX_UNIQUE_ID_WITH_PASSWORD = 2
METHOD_ASSOCIATE_NEX_UNIQUE_ID_WITH_MY_PRINCIPAL_ID = 3
METHOD_ASSOCIATE_NEX_UNIQUE_IDS_WITH_MY_PRINCIPAL_ID = 4
METHOD_GET_ASSOCIATED_NEX_UNIQUE_ID_WITH_MY_PRINCIPAL_ID = 5
METHOD_GET_ASSOCIATED_NEX_UNIQUE_IDS_WITH_MY_PRINCIPAL_ID = 6
METHOD_GET_INTEGER_SETTINGS = 7
METHOD_GET_STRING_SETTINGS = 8
PROTOCOL_ID = 0x6E
def __init__(self):
self.request_decodes = {
self.METHOD_ACQUIRE_NEX_UNIQUE_ID: self.request_decode_acquire_nex_unique_id,
self.METHOD_ACQUIRE_NEX_UNIQUE_ID_WITH_PASSWORD: self.request_decode_acquire_nex_unique_id_with_password,
self.METHOD_ASSOCIATE_NEX_UNIQUE_ID_WITH_MY_PRINCIPAL_ID: self.request_decode_associate_nex_unique_id_with_my_principal_id,
self.METHOD_ASSOCIATE_NEX_UNIQUE_IDS_WITH_MY_PRINCIPAL_ID: self.request_decode_associate_nex_unique_ids_with_my_principal_id,
self.METHOD_GET_ASSOCIATED_NEX_UNIQUE_ID_WITH_MY_PRINCIPAL_ID: self.request_decode_get_associated_nex_unique_id_with_my_principal_id,
self.METHOD_GET_ASSOCIATED_NEX_UNIQUE_IDS_WITH_MY_PRINCIPAL_ID: self.request_decode_get_associated_nex_unique_ids_with_my_principal_id,
self.METHOD_GET_INTEGER_SETTINGS: self.request_decode_get_integer_settings,
self.METHOD_GET_STRING_SETTINGS: self.request_decode_get_string_settings,
}
self.response_decodes = {
self.METHOD_ACQUIRE_NEX_UNIQUE_ID: self.response_decode_acquire_nex_unique_id,
self.METHOD_ACQUIRE_NEX_UNIQUE_ID_WITH_PASSWORD: self.response_decode_acquire_nex_unique_id_with_password,
self.METHOD_ASSOCIATE_NEX_UNIQUE_ID_WITH_MY_PRINCIPAL_ID: self.response_decode_associate_nex_unique_id_with_my_principal_id,
self.METHOD_ASSOCIATE_NEX_UNIQUE_IDS_WITH_MY_PRINCIPAL_ID: self.response_decode_associate_nex_unique_ids_with_my_principal_id,
self.METHOD_GET_ASSOCIATED_NEX_UNIQUE_ID_WITH_MY_PRINCIPAL_ID: self.response_decode_get_associated_nex_unique_id_with_my_principal_id,
self.METHOD_GET_ASSOCIATED_NEX_UNIQUE_IDS_WITH_MY_PRINCIPAL_ID: self.response_decode_get_associated_nex_unique_ids_with_my_principal_id,
self.METHOD_GET_INTEGER_SETTINGS: self.response_decode_get_integer_settings,
self.METHOD_GET_STRING_SETTINGS: self.response_decode_get_string_settings,
}
@staticmethod
def request_decode_acquire_nex_unique_id(input):
result = {}
return result
@staticmethod
def response_decode_acquire_nex_unique_id(input):
result = {}
result["unique_id"] = input.u64()
return result
@staticmethod
def request_decode_acquire_nex_unique_id_with_password(input):
result = {}
return result
@staticmethod
def response_decode_acquire_nex_unique_id_with_password(input):
result = {}
result["info"] = input.extract(UniqueIdInfo)
return result
@staticmethod
def request_decode_associate_nex_unique_id_with_my_principal_id(input):
result = {}
result["info"] = input.extract(UniqueIdInfo)
return result
@staticmethod
def response_decode_associate_nex_unique_id_with_my_principal_id(input):
result = {}
return result
@staticmethod
def request_decode_associate_nex_unique_ids_with_my_principal_id(input):
result = {}
result["infos"] = input.list(UniqueIdInfo)
return result
@staticmethod
def response_decode_associate_nex_unique_ids_with_my_principal_id(input):
result = {}
return result
@staticmethod
def request_decode_get_associated_nex_unique_id_with_my_principal_id(input):
result = {}
return result
@staticmethod
def response_decode_get_associated_nex_unique_id_with_my_principal_id(input):
result = {}
result["info"] = input.extract(UniqueIdInfo)
return result
@staticmethod
def request_decode_get_associated_nex_unique_ids_with_my_principal_id(input):
result = {}
return result
@staticmethod
def response_decode_get_associated_nex_unique_ids_with_my_principal_id(input):
result = {}
result["infos"] = input.list(UniqueIdInfo)
return result
@staticmethod
def request_decode_get_integer_settings(input):
result = {}
result["index"] = input.u32()
return result
@staticmethod
def response_decode_get_integer_settings(input):
result = {}
result["settings"] = input.map(input.u16, input.s32)
return result
@staticmethod
def request_decode_get_string_settings(input):
result = {}
result["index"] = input.u32()
return result
@staticmethod
def response_decode_get_string_settings(input):
result = {}
result["settings"] = input.map(input.u16, input.string)
return result
class UtilityClient(UtilityProtocol):
def __init__(self, client):
self.settings = client.settings
self.client = client
async def acquire_nex_unique_id(self):
logger.info("UtilityClient.acquire_nex_unique_id()")
#--- request ---
stream = streams.StreamOut(self.settings)
data = await self.client.request(self.PROTOCOL_ID, self.METHOD_ACQUIRE_NEX_UNIQUE_ID, stream.get())
#--- response ---
stream = streams.StreamIn(data, self.settings)
unique_id = stream.u64()
if not stream.eof():
raise ValueError("Response is bigger than expected (got %i bytes, but only %i were read)" %(stream.size(), stream.tell()))
logger.info("UtilityClient.acquire_nex_unique_id -> done")
return unique_id
async def acquire_nex_unique_id_with_password(self):
logger.info("UtilityClient.acquire_nex_unique_id_with_password()")
#--- request ---
stream = streams.StreamOut(self.settings)
data = await self.client.request(self.PROTOCOL_ID, self.METHOD_ACQUIRE_NEX_UNIQUE_ID_WITH_PASSWORD, stream.get())
#--- response ---
stream = streams.StreamIn(data, self.settings)
info = stream.extract(UniqueIdInfo)
if not stream.eof():
raise ValueError("Response is bigger than expected (got %i bytes, but only %i were read)" %(stream.size(), stream.tell()))
logger.info("UtilityClient.acquire_nex_unique_id_with_password -> done")
return info
async def associate_nex_unique_id_with_my_principal_id(self, info):
logger.info("UtilityClient.associate_nex_unique_id_with_my_principal_id()")
#--- request ---
stream = streams.StreamOut(self.settings)
stream.add(info)
data = await self.client.request(self.PROTOCOL_ID, self.METHOD_ASSOCIATE_NEX_UNIQUE_ID_WITH_MY_PRINCIPAL_ID, stream.get())
#--- response ---
stream = streams.StreamIn(data, self.settings)
if not stream.eof():
raise ValueError("Response is bigger than expected (got %i bytes, but only %i were read)" %(stream.size(), stream.tell()))
logger.info("UtilityClient.associate_nex_unique_id_with_my_principal_id -> done")
async def associate_nex_unique_ids_with_my_principal_id(self, infos):
logger.info("UtilityClient.associate_nex_unique_ids_with_my_principal_id()")
#--- request ---
stream = streams.StreamOut(self.settings)
stream.list(infos, stream.add)
data = await self.client.request(self.PROTOCOL_ID, self.METHOD_ASSOCIATE_NEX_UNIQUE_IDS_WITH_MY_PRINCIPAL_ID, stream.get())
#--- response ---
stream = streams.StreamIn(data, self.settings)
if not stream.eof():
raise ValueError("Response is bigger than expected (got %i bytes, but only %i were read)" %(stream.size(), stream.tell()))
logger.info("UtilityClient.associate_nex_unique_ids_with_my_principal_id -> done")
async def get_associated_nex_unique_id_with_my_principal_id(self):
logger.info("UtilityClient.get_associated_nex_unique_id_with_my_principal_id()")
#--- request ---
stream = streams.StreamOut(self.settings)
data = await self.client.request(self.PROTOCOL_ID, self.METHOD_GET_ASSOCIATED_NEX_UNIQUE_ID_WITH_MY_PRINCIPAL_ID, stream.get())
#--- response ---
stream = streams.StreamIn(data, self.settings)
info = stream.extract(UniqueIdInfo)
if not stream.eof():
raise ValueError("Response is bigger than expected (got %i bytes, but only %i were read)" %(stream.size(), stream.tell()))
logger.info("UtilityClient.get_associated_nex_unique_id_with_my_principal_id -> done")
return info
async def get_associated_nex_unique_ids_with_my_principal_id(self):
logger.info("UtilityClient.get_associated_nex_unique_ids_with_my_principal_id()")
#--- request ---
stream = streams.StreamOut(self.settings)
data = await self.client.request(self.PROTOCOL_ID, self.METHOD_GET_ASSOCIATED_NEX_UNIQUE_IDS_WITH_MY_PRINCIPAL_ID, stream.get())
#--- response ---
stream = streams.StreamIn(data, self.settings)
infos = stream.list(UniqueIdInfo)
if not stream.eof():
raise ValueError("Response is bigger than expected (got %i bytes, but only %i were read)" %(stream.size(), stream.tell()))
logger.info("UtilityClient.get_associated_nex_unique_ids_with_my_principal_id -> done")
return infos
async def get_integer_settings(self, index):
logger.info("UtilityClient.get_integer_settings()")
#--- request ---
stream = streams.StreamOut(self.settings)
stream.u32(index)
data = await self.client.request(self.PROTOCOL_ID, self.METHOD_GET_INTEGER_SETTINGS, stream.get())
#--- response ---
stream = streams.StreamIn(data, self.settings)
settings = stream.map(stream.u16, stream.s32)
if not stream.eof():
raise ValueError("Response is bigger than expected (got %i bytes, but only %i were read)" %(stream.size(), stream.tell()))
logger.info("UtilityClient.get_integer_settings -> done")
return settings
async def get_string_settings(self, index):
logger.info("UtilityClient.get_string_settings()")
#--- request ---
stream = streams.StreamOut(self.settings)
stream.u32(index)
data = await self.client.request(self.PROTOCOL_ID, self.METHOD_GET_STRING_SETTINGS, stream.get())
#--- response ---
stream = streams.StreamIn(data, self.settings)
settings = stream.map(stream.u16, stream.string)
if not stream.eof():
raise ValueError("Response is bigger than expected (got %i bytes, but only %i were read)" %(stream.size(), stream.tell()))
logger.info("UtilityClient.get_string_settings -> done")
return settings
class UtilityServer(UtilityProtocol):
def __init__(self):
self.methods = {
self.METHOD_ACQUIRE_NEX_UNIQUE_ID: self.handle_acquire_nex_unique_id,
self.METHOD_ACQUIRE_NEX_UNIQUE_ID_WITH_PASSWORD: self.handle_acquire_nex_unique_id_with_password,
self.METHOD_ASSOCIATE_NEX_UNIQUE_ID_WITH_MY_PRINCIPAL_ID: self.handle_associate_nex_unique_id_with_my_principal_id,
self.METHOD_ASSOCIATE_NEX_UNIQUE_IDS_WITH_MY_PRINCIPAL_ID: self.handle_associate_nex_unique_ids_with_my_principal_id,
self.METHOD_GET_ASSOCIATED_NEX_UNIQUE_ID_WITH_MY_PRINCIPAL_ID: self.handle_get_associated_nex_unique_id_with_my_principal_id,
self.METHOD_GET_ASSOCIATED_NEX_UNIQUE_IDS_WITH_MY_PRINCIPAL_ID: self.handle_get_associated_nex_unique_ids_with_my_principal_id,
self.METHOD_GET_INTEGER_SETTINGS: self.handle_get_integer_settings,
self.METHOD_GET_STRING_SETTINGS: self.handle_get_string_settings,
}
async def logout(self, client):
pass
async def handle(self, client, method_id, input, output):
if method_id in self.methods:
await self.methods[method_id](client, input, output)
else:
logger.warning("Unknown method called on UtilityServer: %i", method_id)
raise common.RMCError("Core::NotImplemented")
async def handle_acquire_nex_unique_id(self, client, input, output):
logger.info("UtilityServer.acquire_nex_unique_id()")
#--- request ---
response = await self.acquire_nex_unique_id(client)
#--- response ---
if not isinstance(response, int):
raise RuntimeError("Expected int, got %s" %response.__class__.__name__)
output.u64(response)
async def handle_acquire_nex_unique_id_with_password(self, client, input, output):
logger.info("UtilityServer.acquire_nex_unique_id_with_password()")
#--- request ---
response = await self.acquire_nex_unique_id_with_password(client)
#--- response ---
if not isinstance(response, UniqueIdInfo):
raise RuntimeError("Expected UniqueIdInfo, got %s" %response.__class__.__name__)
output.add(response)
async def handle_associate_nex_unique_id_with_my_principal_id(self, client, input, output):
logger.info("UtilityServer.associate_nex_unique_id_with_my_principal_id()")
#--- request ---
info = input.extract(UniqueIdInfo)
await self.associate_nex_unique_id_with_my_principal_id(client, info)
async def handle_associate_nex_unique_ids_with_my_principal_id(self, client, input, output):
logger.info("UtilityServer.associate_nex_unique_ids_with_my_principal_id()")
#--- request ---
infos = input.list(UniqueIdInfo)
await self.associate_nex_unique_ids_with_my_principal_id(client, infos)
async def handle_get_associated_nex_unique_id_with_my_principal_id(self, client, input, output):
logger.info("UtilityServer.get_associated_nex_unique_id_with_my_principal_id()")
#--- request ---
response = await self.get_associated_nex_unique_id_with_my_principal_id(client)
#--- response ---
if not isinstance(response, UniqueIdInfo):
raise RuntimeError("Expected UniqueIdInfo, got %s" %response.__class__.__name__)
output.add(response)
async def handle_get_associated_nex_unique_ids_with_my_principal_id(self, client, input, output):
logger.info("UtilityServer.get_associated_nex_unique_ids_with_my_principal_id()")
#--- request ---
response = await self.get_associated_nex_unique_ids_with_my_principal_id(client)
#--- response ---
if not isinstance(response, list):
raise RuntimeError("Expected list, got %s" %response.__class__.__name__)
output.list(response, output.add)
async def handle_get_integer_settings(self, client, input, output):
logger.info("UtilityServer.get_integer_settings()")
#--- request ---
index = input.u32()
response = await self.get_integer_settings(client, index)
#--- response ---
if not isinstance(response, dict):
raise RuntimeError("Expected dict, got %s" %response.__class__.__name__)
output.map(response, output.u16, output.s32)
async def handle_get_string_settings(self, client, input, output):
logger.info("UtilityServer.get_string_settings()")
#--- request ---
index = input.u32()
response = await self.get_string_settings(client, index)
#--- response ---
if not isinstance(response, dict):
raise RuntimeError("Expected dict, got %s" %response.__class__.__name__)
output.map(response, output.u16, output.string)
async def acquire_nex_unique_id(self, *args):
logger.warning("UtilityServer.acquire_nex_unique_id not implemented")
raise common.RMCError("Core::NotImplemented")
async def acquire_nex_unique_id_with_password(self, *args):
logger.warning("UtilityServer.acquire_nex_unique_id_with_password not implemented")
raise common.RMCError("Core::NotImplemented")
async def associate_nex_unique_id_with_my_principal_id(self, *args):
logger.warning("UtilityServer.associate_nex_unique_id_with_my_principal_id not implemented")
raise common.RMCError("Core::NotImplemented")
async def associate_nex_unique_ids_with_my_principal_id(self, *args):
logger.warning("UtilityServer.associate_nex_unique_ids_with_my_principal_id not implemented")
raise common.RMCError("Core::NotImplemented")
async def get_associated_nex_unique_id_with_my_principal_id(self, *args):
logger.warning("UtilityServer.get_associated_nex_unique_id_with_my_principal_id not implemented")
raise common.RMCError("Core::NotImplemented")
async def get_associated_nex_unique_ids_with_my_principal_id(self, *args):
logger.warning("UtilityServer.get_associated_nex_unique_ids_with_my_principal_id not implemented")
raise common.RMCError("Core::NotImplemented")
async def get_integer_settings(self, *args):
logger.warning("UtilityServer.get_integer_settings not implemented")
raise common.RMCError("Core::NotImplemented")
async def get_string_settings(self, *args):
logger.warning("UtilityServer.get_string_settings not implemented")
raise common.RMCError("Core::NotImplemented")
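# A concrete service implementation would subclass UtilityServer and override the
# coroutines above; a hypothetical sketch (the returned value is illustrative only):
#   class MyUtilityServer(UtilityServer):
#       async def acquire_nex_unique_id(self, client):
#           return 1234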
| 38.159353 | 139 | 0.787448 | 2,262 | 16,523 | 5.322281 | 0.056587 | 0.080738 | 0.065786 | 0.10167 | 0.897832 | 0.884708 | 0.864524 | 0.814603 | 0.771742 | 0.713514 | 0 | 0.003826 | 0.114083 | 16,523 | 432 | 140 | 38.247685 | 0.818623 | 0.031834 | 0 | 0.408784 | 1 | 0 | 0.174403 | 0.100984 | 0 | 0 | 0.000251 | 0 | 0 | 1 | 0.077703 | false | 0.067568 | 0.006757 | 0 | 0.202703 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
82660c1e65049106315a0ade162fb93499ab876f | 13,934 | py | Python | devilry/apps/core/tests/test_subject.py | devilry/devilry-django | 9ae28e462dfa4cfee966ebacbca04ade9627e715 | [
"BSD-3-Clause"
] | 29 | 2015-01-18T22:56:23.000Z | 2020-11-10T21:28:27.000Z | devilry/apps/core/tests/test_subject.py | devilry/devilry-django | 9ae28e462dfa4cfee966ebacbca04ade9627e715 | [
"BSD-3-Clause"
] | 786 | 2015-01-06T16:10:18.000Z | 2022-03-16T11:10:50.000Z | devilry/apps/core/tests/test_subject.py | devilry/devilry-django | 9ae28e462dfa4cfee966ebacbca04ade9627e715 | [
"BSD-3-Clause"
] | 15 | 2015-04-06T06:18:43.000Z | 2021-02-24T12:28:30.000Z | from datetime import timedelta
from django import test
from django.conf import settings
from model_bakery import baker
from devilry.apps.core.models import Subject
from devilry.apps.core.baker_recipes import ACTIVE_PERIOD_START
class TestSubjectQuerySetFilterUserIsAdmin(test.TestCase):
def test_is_not_admin_on_anything(self):
testuser = baker.make(settings.AUTH_USER_MODEL)
baker.make('core.Subject')
self.assertFalse(Subject.objects.filter_user_is_admin(user=testuser).exists())
def test_superuser(self):
testuser = baker.make(settings.AUTH_USER_MODEL, is_superuser=True)
testsubject = baker.make('core.Subject')
self.assertEqual(
{testsubject},
set(Subject.objects.filter_user_is_admin(user=testuser)))
def test_ignore_subjects_where_not_in_group(self):
testuser = baker.make(settings.AUTH_USER_MODEL)
testsubject = baker.make('core.Subject')
baker.make('core.Subject')
baker.make('devilry_account.SubjectPermissionGroup',
subject=testsubject)
self.assertFalse(Subject.objects.filter_user_is_admin(user=testuser).exists())
def test_filter_user_is_admin(self):
testuser = baker.make(settings.AUTH_USER_MODEL)
testsubject = baker.make('core.Subject')
subjectpermissiongroup = baker.make('devilry_account.SubjectPermissionGroup',
subject=testsubject)
baker.make('devilry_account.PermissionGroupUser',
user=testuser, permissiongroup=subjectpermissiongroup.permissiongroup)
self.assertEqual(
{testsubject},
set(Subject.objects.filter_user_is_admin(user=testuser)))
def test_distinct(self):
testuser = baker.make(settings.AUTH_USER_MODEL)
testsubject = baker.make('core.Subject')
subjectpermissiongroup1 = baker.make('devilry_account.SubjectPermissionGroup',
subject=testsubject)
subjectpermissiongroup2 = baker.make('devilry_account.SubjectPermissionGroup',
subject=testsubject)
baker.make('devilry_account.PermissionGroupUser',
user=testuser, permissiongroup=subjectpermissiongroup1.permissiongroup)
baker.make('devilry_account.PermissionGroupUser',
user=testuser, permissiongroup=subjectpermissiongroup2.permissiongroup)
self.assertEqual(
{testsubject},
set(Subject.objects.filter_user_is_admin(user=testuser)))
class TestSubjectQuerySetFilterUserIsAdminForAnyPeriodsWithinSubject(test.TestCase):
def test_is_not_admin_on_anything(self):
testuser = baker.make(settings.AUTH_USER_MODEL)
baker.make('core.Subject')
self.assertEqual(
[],
list(Subject.objects.filter_user_is_admin_for_any_periods_within_subject(user=testuser)))
def test_superuser(self):
testuser = baker.make(settings.AUTH_USER_MODEL, is_superuser=True)
testsubject1 = baker.make('core.Subject')
testsubject2 = baker.make('core.Subject')
self.assertEqual(
{testsubject1, testsubject2},
set(Subject.objects.filter_user_is_admin_for_any_periods_within_subject(user=testuser)))
def test_admin_on_subject(self):
testuser = baker.make(settings.AUTH_USER_MODEL)
testsubject = baker.make('core.Subject')
subjectpermissiongroup = baker.make('devilry_account.SubjectPermissionGroup',
subject=testsubject)
baker.make('devilry_account.PermissionGroupUser',
permissiongroup=subjectpermissiongroup.permissiongroup,
user=testuser)
self.assertEqual(
[testsubject],
list(Subject.objects.filter_user_is_admin_for_any_periods_within_subject(user=testuser)))
def test_admin_on_other_subject(self):
testuser = baker.make(settings.AUTH_USER_MODEL)
testsubject = baker.make('core.Subject')
othersubject = baker.make('core.Subject')
subjectpermissiongroup = baker.make('devilry_account.SubjectPermissionGroup',
subject=othersubject)
baker.make('devilry_account.PermissionGroupUser',
permissiongroup=subjectpermissiongroup.permissiongroup,
user=testuser)
self.assertEqual(
[othersubject],
list(Subject.objects.filter_user_is_admin_for_any_periods_within_subject(user=testuser)))
def test_admin_on_period(self):
testuser = baker.make(settings.AUTH_USER_MODEL)
testsubject = baker.make('core.Subject')
testperiod = baker.make('core.Period', parentnode=testsubject)
periodpermissiongroup = baker.make('devilry_account.PeriodPermissionGroup',
period=testperiod)
baker.make('devilry_account.PermissionGroupUser',
permissiongroup=periodpermissiongroup.permissiongroup,
user=testuser)
self.assertEqual(
[testsubject],
list(Subject.objects.filter_user_is_admin_for_any_periods_within_subject(user=testuser)))
def test_admin_on_other_period(self):
testuser = baker.make(settings.AUTH_USER_MODEL)
testsubject = baker.make('core.Subject')
baker.make('core.Period', parentnode=testsubject)
otherperiod = baker.make('core.Period')
periodpermissiongroup = baker.make('devilry_account.PeriodPermissionGroup',
period=otherperiod)
baker.make('devilry_account.PermissionGroupUser',
permissiongroup=periodpermissiongroup.permissiongroup,
user=testuser)
self.assertEqual(
[otherperiod.subject],
list(Subject.objects.filter_user_is_admin_for_any_periods_within_subject(user=testuser)))
def test_admin_on_multiple_periods(self):
testuser = baker.make(settings.AUTH_USER_MODEL)
testsubject = baker.make('core.Subject')
testperiod1 = baker.make('core.Period', parentnode=testsubject)
periodpermissiongroup1 = baker.make('devilry_account.PeriodPermissionGroup',
period=testperiod1)
baker.make('devilry_account.PermissionGroupUser',
permissiongroup=periodpermissiongroup1.permissiongroup,
user=testuser)
testperiod2 = baker.make('core.Period', parentnode=testsubject)
periodpermissiongroup2 = baker.make('devilry_account.PeriodPermissionGroup',
period=testperiod2)
baker.make('devilry_account.PermissionGroupUser',
permissiongroup=periodpermissiongroup2.permissiongroup,
user=testuser)
self.assertEqual(
[testsubject],
list(Subject.objects.filter_user_is_admin_for_any_periods_within_subject(user=testuser)))
def test_admin_on_subject_and_period_distinct(self):
testuser = baker.make(settings.AUTH_USER_MODEL)
testsubject = baker.make('core.Subject')
testperiod = baker.make('core.Period', parentnode=testsubject)
periodpermissiongroup = baker.make('devilry_account.PeriodPermissionGroup',
period=testperiod)
baker.make('devilry_account.PermissionGroupUser',
permissiongroup=periodpermissiongroup.permissiongroup,
user=testuser)
subjectpermissiongroup = baker.make('devilry_account.SubjectPermissionGroup',
subject=testsubject)
baker.make('devilry_account.PermissionGroupUser',
permissiongroup=subjectpermissiongroup.permissiongroup,
user=testuser)
self.assertEqual(
[testsubject],
list(Subject.objects.filter_user_is_admin_for_any_periods_within_subject(user=testuser)))
class TestSubjectQuerySetAnnotateWithHasActivePeriod(test.TestCase):
def test_no_periods(self):
baker.make('core.Subject')
annotated_subject = Subject.objects.annotate_with_has_active_period().first()
self.assertFalse(annotated_subject.has_active_period)
def test_only_old_periods(self):
testsubject = baker.make('core.Subject')
baker.make_recipe('devilry.apps.core.period_old', parentnode=testsubject)
annotated_subject = Subject.objects.annotate_with_has_active_period().first()
self.assertFalse(annotated_subject.has_active_period)
def test_only_future_periods(self):
testsubject = baker.make('core.Subject')
baker.make_recipe('devilry.apps.core.period_future', parentnode=testsubject)
annotated_subject = Subject.objects.annotate_with_has_active_period().first()
self.assertFalse(annotated_subject.has_active_period)
def test_has_active_period(self):
testsubject = baker.make('core.Subject')
baker.make_recipe('devilry.apps.core.period_active', parentnode=testsubject)
annotated_subject = Subject.objects.annotate_with_has_active_period().first()
self.assertTrue(annotated_subject.has_active_period)
def test_has_multiple_active_period(self):
testsubject = baker.make('core.Subject')
baker.make_recipe('devilry.apps.core.period_active', parentnode=testsubject)
baker.make_recipe('devilry.apps.core.period_active', parentnode=testsubject)
annotated_subject = Subject.objects.annotate_with_has_active_period().first()
self.assertTrue(annotated_subject.has_active_period)
class TestSubjectQuerySetPrefetchActivePeriodobjects(test.TestCase):
def test_no_periods(self):
baker.make('core.Subject')
annotated_subject = Subject.objects.prefetch_active_period_objects().first()
self.assertEqual([], annotated_subject.active_period_objects)
def test_only_old_periods(self):
testsubject = baker.make('core.Subject')
baker.make_recipe('devilry.apps.core.period_old', parentnode=testsubject)
annotated_subject = Subject.objects.prefetch_active_period_objects().first()
self.assertEqual([], annotated_subject.active_period_objects)
def test_only_future_periods(self):
testsubject = baker.make('core.Subject')
baker.make_recipe('devilry.apps.core.period_future', parentnode=testsubject)
annotated_subject = Subject.objects.prefetch_active_period_objects().first()
self.assertEqual([], annotated_subject.active_period_objects)
def test_has_active_period(self):
testsubject = baker.make('core.Subject')
testperiod = baker.make_recipe('devilry.apps.core.period_active', parentnode=testsubject)
annotated_subject = Subject.objects.prefetch_active_period_objects().first()
self.assertEqual([testperiod],
annotated_subject.active_period_objects)
def test_has_multiple_active_periods_ordering(self):
testsubject = baker.make('core.Subject')
testperiod1 = baker.make_recipe('devilry.apps.core.period_active', parentnode=testsubject)
testperiod3 = baker.make_recipe('devilry.apps.core.period_active', parentnode=testsubject,
start_time=ACTIVE_PERIOD_START + timedelta(days=60))
testperiod2 = baker.make_recipe('devilry.apps.core.period_active', parentnode=testsubject,
start_time=ACTIVE_PERIOD_START + timedelta(days=30))
annotated_subject = Subject.objects.prefetch_active_period_objects().first()
self.assertEqual([testperiod1, testperiod2, testperiod3],
annotated_subject.active_period_objects)
def test_querycount(self):
testsubject = baker.make('core.Subject')
baker.make_recipe('devilry.apps.core.period_active', parentnode=testsubject)
baker.make_recipe('devilry.apps.core.period_active', parentnode=testsubject,
start_time=ACTIVE_PERIOD_START + timedelta(days=30))
baker.make_recipe('devilry.apps.core.period_active', parentnode=testsubject,
start_time=ACTIVE_PERIOD_START + timedelta(days=60))
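        # Exactly two queries are expected: one SELECT for the subjects and one
        # prefetch query for their related active periods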
with self.assertNumQueries(2):
annotated_subject = Subject.objects.prefetch_active_period_objects().first()
str(annotated_subject.active_period_objects[0].short_name)
str(annotated_subject.active_period_objects[1].short_name)
str(annotated_subject.active_period_objects[2].short_name)
def test_last_active_period_not_using_prefetch_active_period_objects(self):
testsubject = baker.make('core.Subject')
with self.assertRaisesMessage(AttributeError,
'The last_active_period property requires '
'SubjectQuerySet.prefetch_active_period_objects()'):
str(testsubject.last_active_period)
def test_last_active_period(self):
testsubject = baker.make('core.Subject')
baker.make_recipe('devilry.apps.core.period_active', parentnode=testsubject)
testperiod3 = baker.make_recipe('devilry.apps.core.period_active', parentnode=testsubject,
start_time=ACTIVE_PERIOD_START + timedelta(days=60))
baker.make_recipe('devilry.apps.core.period_active', parentnode=testsubject,
start_time=ACTIVE_PERIOD_START + timedelta(days=30))
annotated_subject = Subject.objects.prefetch_active_period_objects().first()
self.assertEqual(testperiod3, annotated_subject.last_active_period)
| 53.183206 | 105 | 0.684226 | 1,365 | 13,934 | 6.704762 | 0.081319 | 0.086538 | 0.049716 | 0.063374 | 0.862872 | 0.855551 | 0.80507 | 0.778628 | 0.746941 | 0.740166 | 0 | 0.00372 | 0.228219 | 13,934 | 261 | 106 | 53.386973 | 0.847313 | 0 | 0 | 0.720524 | 0 | 0 | 0.133486 | 0.100833 | 0 | 0 | 0 | 0 | 0.113537 | 1 | 0.113537 | false | 0 | 0.026201 | 0 | 0.157205 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8298a03c4e995acc42f5d4c65d74eb947c49b3fc | 6,407 | py | Python | loldib/getratings/models/NA/na_teemo/na_teemo_top.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | [
"Apache-2.0"
] | null | null | null | loldib/getratings/models/NA/na_teemo/na_teemo_top.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | [
"Apache-2.0"
] | null | null | null | loldib/getratings/models/NA/na_teemo/na_teemo_top.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | [
"Apache-2.0"
] | null | null | null | from getratings.models.ratings import Ratings
class NA_Teemo_Top_Aatrox(Ratings):
pass
class NA_Teemo_Top_Ahri(Ratings):
pass
class NA_Teemo_Top_Akali(Ratings):
pass
class NA_Teemo_Top_Alistar(Ratings):
pass
class NA_Teemo_Top_Amumu(Ratings):
pass
class NA_Teemo_Top_Anivia(Ratings):
pass
class NA_Teemo_Top_Annie(Ratings):
pass
class NA_Teemo_Top_Ashe(Ratings):
pass
class NA_Teemo_Top_AurelionSol(Ratings):
pass
class NA_Teemo_Top_Azir(Ratings):
pass
class NA_Teemo_Top_Bard(Ratings):
pass
class NA_Teemo_Top_Blitzcrank(Ratings):
pass
class NA_Teemo_Top_Brand(Ratings):
pass
class NA_Teemo_Top_Braum(Ratings):
pass
class NA_Teemo_Top_Caitlyn(Ratings):
pass
class NA_Teemo_Top_Camille(Ratings):
pass
class NA_Teemo_Top_Cassiopeia(Ratings):
pass
class NA_Teemo_Top_Chogath(Ratings):
pass
class NA_Teemo_Top_Corki(Ratings):
pass
class NA_Teemo_Top_Darius(Ratings):
pass
class NA_Teemo_Top_Diana(Ratings):
pass
class NA_Teemo_Top_Draven(Ratings):
pass
class NA_Teemo_Top_DrMundo(Ratings):
pass
class NA_Teemo_Top_Ekko(Ratings):
pass
class NA_Teemo_Top_Elise(Ratings):
pass
class NA_Teemo_Top_Evelynn(Ratings):
pass
class NA_Teemo_Top_Ezreal(Ratings):
pass
class NA_Teemo_Top_Fiddlesticks(Ratings):
pass
class NA_Teemo_Top_Fiora(Ratings):
pass
class NA_Teemo_Top_Fizz(Ratings):
pass
class NA_Teemo_Top_Galio(Ratings):
pass
class NA_Teemo_Top_Gangplank(Ratings):
pass
class NA_Teemo_Top_Garen(Ratings):
pass
class NA_Teemo_Top_Gnar(Ratings):
pass
class NA_Teemo_Top_Gragas(Ratings):
pass
class NA_Teemo_Top_Graves(Ratings):
pass
class NA_Teemo_Top_Hecarim(Ratings):
pass
class NA_Teemo_Top_Heimerdinger(Ratings):
pass
class NA_Teemo_Top_Illaoi(Ratings):
pass
class NA_Teemo_Top_Irelia(Ratings):
pass
class NA_Teemo_Top_Ivern(Ratings):
pass
class NA_Teemo_Top_Janna(Ratings):
pass
class NA_Teemo_Top_JarvanIV(Ratings):
pass
class NA_Teemo_Top_Jax(Ratings):
pass
class NA_Teemo_Top_Jayce(Ratings):
pass
class NA_Teemo_Top_Jhin(Ratings):
pass
class NA_Teemo_Top_Jinx(Ratings):
pass
class NA_Teemo_Top_Kalista(Ratings):
pass
class NA_Teemo_Top_Karma(Ratings):
pass
class NA_Teemo_Top_Karthus(Ratings):
pass
class NA_Teemo_Top_Kassadin(Ratings):
pass
class NA_Teemo_Top_Katarina(Ratings):
pass
class NA_Teemo_Top_Kayle(Ratings):
pass
class NA_Teemo_Top_Kayn(Ratings):
pass
class NA_Teemo_Top_Kennen(Ratings):
pass
class NA_Teemo_Top_Khazix(Ratings):
pass
class NA_Teemo_Top_Kindred(Ratings):
pass
class NA_Teemo_Top_Kled(Ratings):
pass
class NA_Teemo_Top_KogMaw(Ratings):
pass
class NA_Teemo_Top_Leblanc(Ratings):
pass
class NA_Teemo_Top_LeeSin(Ratings):
pass
class NA_Teemo_Top_Leona(Ratings):
pass
class NA_Teemo_Top_Lissandra(Ratings):
pass
class NA_Teemo_Top_Lucian(Ratings):
pass
class NA_Teemo_Top_Lulu(Ratings):
pass
class NA_Teemo_Top_Lux(Ratings):
pass
class NA_Teemo_Top_Malphite(Ratings):
pass
class NA_Teemo_Top_Malzahar(Ratings):
pass
class NA_Teemo_Top_Maokai(Ratings):
pass
class NA_Teemo_Top_MasterYi(Ratings):
pass
class NA_Teemo_Top_MissFortune(Ratings):
pass
class NA_Teemo_Top_MonkeyKing(Ratings):
pass
class NA_Teemo_Top_Mordekaiser(Ratings):
pass
class NA_Teemo_Top_Morgana(Ratings):
pass
class NA_Teemo_Top_Nami(Ratings):
pass
class NA_Teemo_Top_Nasus(Ratings):
pass
class NA_Teemo_Top_Nautilus(Ratings):
pass
class NA_Teemo_Top_Nidalee(Ratings):
pass
class NA_Teemo_Top_Nocturne(Ratings):
pass
class NA_Teemo_Top_Nunu(Ratings):
pass
class NA_Teemo_Top_Olaf(Ratings):
pass
class NA_Teemo_Top_Orianna(Ratings):
pass
class NA_Teemo_Top_Ornn(Ratings):
pass
class NA_Teemo_Top_Pantheon(Ratings):
pass
class NA_Teemo_Top_Poppy(Ratings):
pass
class NA_Teemo_Top_Quinn(Ratings):
pass
class NA_Teemo_Top_Rakan(Ratings):
pass
class NA_Teemo_Top_Rammus(Ratings):
pass
class NA_Teemo_Top_RekSai(Ratings):
pass
class NA_Teemo_Top_Renekton(Ratings):
pass
class NA_Teemo_Top_Rengar(Ratings):
pass
class NA_Teemo_Top_Riven(Ratings):
pass
class NA_Teemo_Top_Rumble(Ratings):
pass
class NA_Teemo_Top_Ryze(Ratings):
pass
class NA_Teemo_Top_Sejuani(Ratings):
pass
class NA_Teemo_Top_Shaco(Ratings):
pass
class NA_Teemo_Top_Shen(Ratings):
pass
class NA_Teemo_Top_Shyvana(Ratings):
pass
class NA_Teemo_Top_Singed(Ratings):
pass
class NA_Teemo_Top_Sion(Ratings):
pass
class NA_Teemo_Top_Sivir(Ratings):
pass
class NA_Teemo_Top_Skarner(Ratings):
pass
class NA_Teemo_Top_Sona(Ratings):
pass
class NA_Teemo_Top_Soraka(Ratings):
pass
class NA_Teemo_Top_Swain(Ratings):
pass
class NA_Teemo_Top_Syndra(Ratings):
pass
class NA_Teemo_Top_TahmKench(Ratings):
pass
class NA_Teemo_Top_Taliyah(Ratings):
pass
class NA_Teemo_Top_Talon(Ratings):
pass
class NA_Teemo_Top_Taric(Ratings):
pass
class NA_Teemo_Top_Teemo(Ratings):
pass
class NA_Teemo_Top_Thresh(Ratings):
pass
class NA_Teemo_Top_Tristana(Ratings):
pass
class NA_Teemo_Top_Trundle(Ratings):
pass
class NA_Teemo_Top_Tryndamere(Ratings):
pass
class NA_Teemo_Top_TwistedFate(Ratings):
pass
class NA_Teemo_Top_Twitch(Ratings):
pass
class NA_Teemo_Top_Udyr(Ratings):
pass
class NA_Teemo_Top_Urgot(Ratings):
pass
class NA_Teemo_Top_Varus(Ratings):
pass
class NA_Teemo_Top_Vayne(Ratings):
pass
class NA_Teemo_Top_Veigar(Ratings):
pass
class NA_Teemo_Top_Velkoz(Ratings):
pass
class NA_Teemo_Top_Vi(Ratings):
pass
class NA_Teemo_Top_Viktor(Ratings):
pass
class NA_Teemo_Top_Vladimir(Ratings):
pass
class NA_Teemo_Top_Volibear(Ratings):
pass
class NA_Teemo_Top_Warwick(Ratings):
pass
class NA_Teemo_Top_Xayah(Ratings):
pass
class NA_Teemo_Top_Xerath(Ratings):
pass
class NA_Teemo_Top_XinZhao(Ratings):
pass
class NA_Teemo_Top_Yasuo(Ratings):
pass
class NA_Teemo_Top_Yorick(Ratings):
pass
class NA_Teemo_Top_Zac(Ratings):
pass
class NA_Teemo_Top_Zed(Ratings):
pass
class NA_Teemo_Top_Ziggs(Ratings):
pass
class NA_Teemo_Top_Zilean(Ratings):
pass
class NA_Teemo_Top_Zyra(Ratings):
pass
| 15.364508 | 46 | 0.761667 | 972 | 6,407 | 4.59465 | 0.151235 | 0.216301 | 0.370802 | 0.463502 | 0.797582 | 0.797582 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173404 | 6,407 | 416 | 47 | 15.401442 | 0.843278 | 0 | 0 | 0.498195 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.498195 | 0.00361 | 0 | 0.501805 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 7 |
7d55b039ad2fad32063b696ed7421078bf4ace61 | 22,826 | py | Python | main.py | andreiec/sudoku-data-extraction | f3750642edaea5d5a525b7c9cdab79c22a4c9722 | [
"MIT"
] | 1 | 2021-12-04T20:33:30.000Z | 2021-12-04T20:33:30.000Z | main.py | andreiec/sudoku-data-extraction | f3750642edaea5d5a525b7c9cdab79c22a4c9722 | [
"MIT"
] | null | null | null | main.py | andreiec/sudoku-data-extraction | f3750642edaea5d5a525b7c9cdab79c22a4c9722 | [
"MIT"
] | null | null | null | import cv2
import cv2 as cv
import numpy as np
import imutils
import glob
import os
# If true, display images (rescale image down with a factor of 'scale')
image_debug = False
# Scale down image for debug with a factor of 'scale'
scale = 5
# If true, display information and draw contours
draw_debug = False
# If true, write files
write_files = True
# Distinct colors used to mark jigsaw regions in task 2 ('0' is white, i.e. not yet assigned to a region)
colors = {
'0': (255, 255, 255),
'1': (0, 100, 0),
'2': (188, 143, 143),
'3': (255, 0, 0),
'4': (255, 215, 0),
'5': (0, 255, 0),
'6': (65, 105, 225),
'7': (0, 255, 255),
'8': (0, 0, 255),
'9': (255, 20, 147)
}
# Task number 1
def task1():
# Local paths
images_path = ".\\antrenare\\clasic\\"
destination_path = ".\\results\\Constantinescu_Andrei-Eduard_344\\clasic\\"
# Count how many images we processed
images_counter = 1
# Iterate each image
for image_path in glob.glob(images_path + '*.jpg'):
# Debug single image
if image_debug:
debug_image_number = 1
if str(debug_image_number) not in image_path:
continue
# Read image
image = cv.imread(image_path)
file_name = str(images_counter) + "_predicted.txt"
images_counter += 1
# Image padding
image_padding_horizontal = 100
image_padding_vertical = 0
# Expand canvas to add padding to image
old_image_height, old_image_width, channels = image.shape
# New size of padded image
new_image_width = old_image_width + image_padding_horizontal
new_image_height = old_image_height + image_padding_vertical
# Create new array for padded image
padded_image = np.full((new_image_height, new_image_width, channels), (200, 200, 200), dtype=np.uint8)
# Calculate the center of the padded image
x_center = (new_image_width - old_image_width) // 2
y_center = (new_image_height - old_image_height) // 2
# Paste original image into the center
padded_image[y_center:y_center + old_image_height, x_center:x_center + old_image_width] = image
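        # The grey padding presumably keeps the outer grid border away from the image
        # edge so its contour can be detected as a closed shape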
# Gray, blur and threshold padded image
grayed_image = cv.cvtColor(padded_image, cv.COLOR_BGR2GRAY)
blurred_image = cv.GaussianBlur(grayed_image, (15, 15), 6)
thresholded_image = cv.adaptiveThreshold(blurred_image, 255, cv.ADAPTIVE_THRESH_GAUSSIAN_C, cv.THRESH_BINARY, 33, 4)
thresholded_image = cv.bitwise_not(thresholded_image)
# Get contours
contours = cv.findContours(thresholded_image.copy(), cv.RETR_EXTERNAL, cv.CHAIN_APPROX_SIMPLE)
contours = imutils.grab_contours(contours)
contours = sorted(contours, key=cv.contourArea, reverse=True)
# If we find a sudoku square save it in sudoku_contour
sudoku_contour = None
# Iterate through contours
for c in contours:
# Convex Hull
epsilon = 0.02 * cv.arcLength(c, True)
approx = cv.approxPolyDP(c, epsilon, True)
# Find the bounding rectangle of contour to check its size
x, y, w, h = cv.boundingRect(c)
# Draw contour if square and if size of box is higher than threshold (so that text cannot be picked up)
if len(approx) == 4 and w * h > 500000:
# Draw the contour and display bounding square size
if draw_debug:
cv.putText(padded_image, f'Box size: {str(w * h)} pixels', (15, 60), cv.FONT_HERSHEY_SIMPLEX, 2, (0, 255, 0), 4)
cv.drawContours(padded_image, [approx], -1, (0, 255, 0), 4)
# Save the contour as the sudoku_contour
sudoku_contour = approx
break
# If we found a sudoku contour then proceed to wrap image so that it contains only the sudoku contour
if sudoku_contour is not None:
# Order points from contour
rect = np.zeros((4, 2), dtype='float32')
sudoku_contour_reshaped = sudoku_contour.reshape(4, 2)
# Calculate the sum and difference of x and y of each corner
points_sum = sudoku_contour_reshaped.sum(axis=1)
points_diff = np.diff(sudoku_contour_reshaped, axis=1)
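            # np.diff over each (x, y) row yields y - x, so the top-right corner gives the
            # minimum and the bottom-left corner gives the maximum of this difference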
# First element will be top left and third will be bottom right (minimum sum and maximum sum)
rect[0] = sudoku_contour_reshaped[np.argmin(points_sum)]
rect[2] = sudoku_contour_reshaped[np.argmax(points_sum)]
# Second element will be top right and last will be bottom left (minimum and maximum diff)
rect[1] = sudoku_contour_reshaped[np.argmin(points_diff)]
rect[3] = sudoku_contour_reshaped[np.argmax(points_diff)]
# Calculate the width of the new reshaped image
width_bottom = np.sqrt(((rect[2][0] - rect[3][0]) ** 2) + ((rect[2][1] - rect[3][1]) ** 2))
width_top = np.sqrt(((rect[1][0] - rect[0][0]) ** 2) + ((rect[1][1] - rect[0][1]) ** 2))
width_max = max(int(width_top), int(width_bottom))
# Calculate the height of the new reshaped image
height_right = np.sqrt(((rect[1][0] - rect[2][0]) ** 2) + ((rect[1][1] - rect[2][1]) ** 2))
height_left = np.sqrt(((rect[0][0] - rect[3][0]) ** 2) + ((rect[0][1] - rect[3][1]) ** 2))
height_max = max(int(height_left), int(height_right))
# Put text in each corner of the sudoku box
if image_debug:
if draw_debug:
for i, r in enumerate(rect):
cv.putText(padded_image, str(i), (int(r[0]), int(r[1])), cv.FONT_HERSHEY_SIMPLEX, 2, (0, 0, 255), 5)
# Draw image before transformation
dims = (padded_image.shape[1] // scale, padded_image.shape[0] // scale)
cv.imshow('image', cv.resize(padded_image, dims))
# Construct the size of the new image and save it in a matrix
sudoku_matrix_template = np.array([[0, 0], [width_max - 1, 0], [width_max - 1, height_max - 1], [0, height_max - 1]], dtype='float32')
perspective_transform = cv.getPerspectiveTransform(rect, sudoku_matrix_template)
sudoku_contour_warped = cv.warpPerspective(padded_image, perspective_transform, (width_max, height_max))
# Calculate step size for each cell
width_step = sudoku_contour_warped.shape[1] // 9
height_step = sudoku_contour_warped.shape[0] // 9
# Array to hold each cell upper left corner coord
coords = []
# Calculate the upper left coord of each cell
for c in range(0, 81):
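                # c % 9 is the cell's column index and c // 9 its row index in the 9x9 grid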
coord = ((c % 9) * width_step, (c // 9 * height_step))
coords.append(coord)
if draw_debug:
sudoku_contour_warped = cv.circle(sudoku_contour_warped, coord, 12, (0, 0, 255), -1)
# Array to hold indices of cells that contain numbers
cells_with_numbers = []
for i, coord in enumerate(coords):
# Add padding to remove borders
padding = 40
cell_mean_bias = 10
cell = sudoku_contour_warped[coord[1] + padding:coord[1] + height_step - padding, coord[0] + padding:coord[0] + width_step - padding].copy()
cell_grayed = cv.cvtColor(cell, cv.COLOR_BGR2GRAY)
cell_threshold = cv.threshold(cell_grayed, 145, 255, cv.THRESH_BINARY_INV)[1]
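                # THRESH_BINARY_INV makes any digit show up as white pixels on a black background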
# If there is something inside the cell (if the mean of the cell is higher than the cell_mean_bias) append to the final array
if cell_threshold.mean() > cell_mean_bias:
cells_with_numbers.append(i)
# Display some cells if debug
if image_debug:
number_of_cells = 1
if i + 81 - number_of_cells < len(coords):
cv.imshow('cell' + str(i), cell_threshold)
# Generate final answer array
answer = []
for i in range(81):
if i in cells_with_numbers:
answer.append('x')
else:
answer.append('o')
# Create folder if not exists
if write_files:
if not os.path.exists(destination_path):
os.makedirs(destination_path)
# Save answer and create file
with open(destination_path + file_name, 'w+') as file:
for i, val in enumerate(answer):
file.write(val)
if (i + 1) % 9 == 0 and i < len(answer) - 1:
file.write('\n')
# Display the warped image
if image_debug:
sudoku_dims = (sudoku_contour_warped.shape[1] // scale, sudoku_contour_warped.shape[0] // scale)
cv.imshow('warped', cv.resize(sudoku_contour_warped, sudoku_dims))
else:
print(f"Could not find sudoku in image with name {image_path}!")
if image_debug:
cv.waitKey(0)
return # Display only one image
# Task number 2
def task2():
# Local paths
images_path = ".\\antrenare\\jigsaw\\"
destination_path = ".\\results\\Constantinescu_Andrei-Eduard_344\\jigsaw\\"
# Count how many images we processed
images_counter = 1
# Iterate each image
for image_path in glob.glob(images_path + '*.jpg'):
# Debug single image
if image_debug:
debug_image_number = 20
if str(debug_image_number) not in image_path:
continue
# Image padding
image_padding_horizontal = 100
image_padding_vertical = 0
# Read image
image = cv.imread(image_path)
file_name = str(images_counter) + "_predicted.txt"
images_counter += 1
# Expand canvas to add padding to image
old_image_height, old_image_width, channels = image.shape
# New size of padded image
new_image_width = old_image_width + image_padding_horizontal
new_image_height = old_image_height + image_padding_vertical
# Create new array for padded image
padded_image = np.full((new_image_height, new_image_width, channels), (200, 200, 200), dtype=np.uint8)
# Calculate the center of the padded image
x_center = (new_image_width - old_image_width) // 2
y_center = (new_image_height - old_image_height) // 2
# Paste original image into the center
padded_image[y_center:y_center + old_image_height, x_center:x_center + old_image_width] = image
# Gray, blur and threshold padded image
grayed_image = cv.cvtColor(padded_image, cv.COLOR_BGR2GRAY)
blurred_image = cv.GaussianBlur(grayed_image, (15, 15), 6)
thresholded_image = cv.adaptiveThreshold(blurred_image, 255, cv.ADAPTIVE_THRESH_GAUSSIAN_C, cv.THRESH_BINARY, 33, 4)
thresholded_image = cv.bitwise_not(thresholded_image)
# Get contours
contours = cv.findContours(thresholded_image.copy(), cv.RETR_EXTERNAL, cv.CHAIN_APPROX_SIMPLE)
contours = imutils.grab_contours(contours)
contours = sorted(contours, key=cv.contourArea, reverse=True)
# If we find a sudoku square save it in sudoku_contour
sudoku_contour = None
# Iterate through contours
for c in contours:
# Convex Hull
epsilon = 0.02 * cv.arcLength(c, True)
approx = cv.approxPolyDP(c, epsilon, True)
# Find the bounding rectangle of contour to check its size
x, y, w, h = cv.boundingRect(c)
# Draw contour if square and if size of box is higher than threshold (so that text cannot be picked up)
if len(approx) == 4 and 10000000 > w * h > 500000:
# Draw the contour and display bounding square size
if draw_debug:
cv.putText(padded_image, f'Box size: {str(w * h)} pixels', (15, 60), cv.FONT_HERSHEY_SIMPLEX, 2, (0, 255, 0), 4)
cv.drawContours(padded_image, [approx], -1, (0, 255, 0), 4)
# Save the contour as the sudoku_contour
sudoku_contour = approx
break
# If we found a sudoku contour then proceed to wrap image so that it contains only the sudoku contour
if sudoku_contour is not None:
# Order points from contour
rect = np.zeros((4, 2), dtype='float32')
sudoku_contour_reshaped = sudoku_contour.reshape(4, 2)
# Calculate the sum and difference of x and y of each corner
points_sum = sudoku_contour_reshaped.sum(axis=1)
points_diff = np.diff(sudoku_contour_reshaped, axis=1)
# First element will be top left and third will be bottom right (minimum sum and maximum sum)
rect[0] = sudoku_contour_reshaped[np.argmin(points_sum)]
rect[2] = sudoku_contour_reshaped[np.argmax(points_sum)]
# Second element will be top right and last will be bottom left (minimum and maximum diff)
rect[1] = sudoku_contour_reshaped[np.argmin(points_diff)]
rect[3] = sudoku_contour_reshaped[np.argmax(points_diff)]
# Calculate the width of the new reshaped image
width_bottom = np.sqrt(((rect[2][0] - rect[3][0]) ** 2) + ((rect[2][1] - rect[3][1]) ** 2))
width_top = np.sqrt(((rect[1][0] - rect[0][0]) ** 2) + ((rect[1][1] - rect[0][1]) ** 2))
width_max = max(int(width_top), int(width_bottom))
# Calculate the height of the new reshaped image
height_right = np.sqrt(((rect[1][0] - rect[2][0]) ** 2) + ((rect[1][1] - rect[2][1]) ** 2))
height_left = np.sqrt(((rect[0][0] - rect[3][0]) ** 2) + ((rect[0][1] - rect[3][1]) ** 2))
height_max = max(int(height_left), int(height_right))
# Put text in each corner of the sudoku box
if image_debug:
if draw_debug:
for i, r in enumerate(rect):
cv.putText(padded_image, str(i), (int(r[0]), int(r[1])), cv.FONT_HERSHEY_SIMPLEX, 2, (0, 0, 255), 5)
# Draw image before transformation
dims = (padded_image.shape[1] // scale, padded_image.shape[0] // scale)
cv.imshow('image', cv.resize(padded_image, dims))
# Construct the size of the new image and save it in a matrix
sudoku_matrix_template = np.array([[0, 0], [width_max - 1, 0], [width_max - 1, height_max - 1], [0, height_max - 1]], dtype='float32')
perspective_transform = cv.getPerspectiveTransform(rect, sudoku_matrix_template)
sudoku_contour_warped = cv.warpPerspective(padded_image, perspective_transform, (width_max, height_max))
# Gray and blur image
sudoku_grayed_image = cv.cvtColor(sudoku_contour_warped, cv.COLOR_BGR2GRAY)
sudoku_blurred_image = cv.GaussianBlur(sudoku_grayed_image, (5, 5), 3)
# Do opening of image (erode then dilate) to remove thin lines and keep the thick ones
sudoku_kernel_erode = np.ones((19, 19), np.uint8)
T, sudoku_thresholded_image = cv.threshold(sudoku_blurred_image, 80, 255, cv.THRESH_BINARY_INV | cv.THRESH_OTSU)
sudoku_opened = cv.morphologyEx(sudoku_thresholded_image, cv.MORPH_OPEN, sudoku_kernel_erode)
# Invert the opened image (the conversion to RGB happens further below)
sudoku_opened = cv.bitwise_not(sudoku_opened)
# Draw border around sudoku table to prevent small gaps
border_size = 30
top_left = (border_size // 2, border_size // 2)
bottom_right = (sudoku_opened.shape[1] - border_size // 2, sudoku_opened.shape[0] - border_size // 2)
sudoku_opened = cv.rectangle(sudoku_opened, top_left, bottom_right, (0, 0, 0), border_size)
# Get contours
contours = cv.findContours(sudoku_opened.copy(), cv.RETR_EXTERNAL, cv.CHAIN_APPROX_SIMPLE)
contours = imutils.grab_contours(contours)
# Convert image to rgb to color it
sudoku_opened = cv.cvtColor(sudoku_opened, cv.COLOR_GRAY2RGB)
# Iterate through different contours to fill white
for number, c in enumerate(contours):
# Approximate the contour to a polygon
epsilon = 0.00002 * cv.arcLength(c, True)
approx = cv.approxPolyDP(c, epsilon, True)
# Fill inside of contour with white to create a canvas
cv.drawContours(sudoku_opened, [approx], -1, (255, 255, 255), cv.FILLED)
if draw_debug:
# Get center of contour
M = cv.moments(c)
cX = int(M["m10"] / M["m00"])
cY = int(M["m01"] / M["m00"])
# Put contour number
cv.putText(sudoku_opened, str(number + 1), (cX, cY), cv.FONT_HERSHEY_SIMPLEX, 7, (0, 255, 0), 20)
# Display the color-zone image
if image_debug:
sudoku_dims = (sudoku_opened.shape[1] // scale, sudoku_opened.shape[0] // scale)
cv.imshow('zoned', cv.resize(sudoku_opened, sudoku_dims))
# Calculate step size for each cell
width_step = sudoku_contour_warped.shape[1] // 9
height_step = sudoku_contour_warped.shape[0] // 9
# Array to hold each cell upper left corner coord
coords = []
# Calculate the upper left coord of each cell
for c in range(0, 81):
coord = ((c % 9) * width_step, (c // 9) * height_step)
coords.append(coord)
if draw_debug:
sudoku_contour_warped = cv.circle(sudoku_contour_warped, coord, 12, (0, 0, 255), -1)
# Iterate through each cell and check whether it is colored; if not, color the whole contour that contains the cell
current_zone = 1
for coord in coords:
padding = 100
cell = sudoku_opened[coord[1] + padding:coord[1] + height_step - padding, coord[0] + padding:coord[0] + width_step - padding].copy()
average_color = cv.mean(cell)[:3]
# Use epsilon to check for small errors between colors
color_epsilon = (5, 5, 5)
# Check if cell is not colored
if abs(average_color[0] - colors['0'][0]) < color_epsilon[0] and abs(average_color[1] - colors['0'][1]) < color_epsilon[1] and abs(average_color[2] - colors['0'][2]) < color_epsilon[2]:
# Iterate through each contour
for c in contours:
# Check if cell is inside the current contour
if cv.pointPolygonTest(c, (coord[0] + padding, coord[1] + padding), False) > 0:
# Color the zone according to its id
cv.drawContours(sudoku_opened, [c], -1, colors[str(current_zone)], cv.FILLED)
current_zone += 1
break
# Display the true color-zone image
if image_debug:
sudoku_dims = (sudoku_opened.shape[1] // scale, sudoku_opened.shape[0] // scale)
cv.imshow('true zoned', cv.resize(sudoku_opened, sudoku_dims))
# Array to hold each cell color-zone
cells_to_zone = []
# Assign color-zone to each cell based on sudoku_opened colors
for i, coord in enumerate(coords):
# Add padding to remove borders
padding = 100
cell = sudoku_opened[coord[1] + padding:coord[1] + height_step - padding, coord[0] + padding:coord[0] + width_step - padding].copy()
average_color = cv.mean(cell)[:3]
# Use epsilon to check for small errors between colors
color_epsilon = (5, 5, 5)
# Iterate through each color
for color in colors.values():
# If average color is close to a defined color
if abs(average_color[0] - color[0]) < color_epsilon[0] and abs(average_color[1] - color[1]) < color_epsilon[1] and abs(average_color[2] - color[2]) < color_epsilon[2]:
cells_to_zone.append(list(colors.keys())[list(colors.values()).index(color)])
# Array to hold indices of cells that contain numbers
cells_with_numbers = []
# Check if cell contains number
for i, coord in enumerate(coords):
# Add padding to remove borders
padding = 40
cell_mean_bias = 10
cell = sudoku_contour_warped[coord[1] + padding:coord[1] + height_step - padding, coord[0] + padding:coord[0] + width_step - padding].copy()
cell_grayed = cv.cvtColor(cell, cv.COLOR_BGR2GRAY)
cell_threshold = cv.threshold(cell_grayed, 145, 255, cv.THRESH_BINARY_INV)[1]
# If there is something inside the cell (if the mean of the cell is higher than the cell_mean_bias) append to the final array
if cell_threshold.mean() > cell_mean_bias:
cells_with_numbers.append(i)
# Generate final answer array
answer = []
for i in range(81):
answer.append(cells_to_zone[i])
if i in cells_with_numbers:
answer.append('x')
else:
answer.append('o')
if write_files:
# Create folder if not exists
if not os.path.exists(destination_path):
os.makedirs(destination_path)
# Save answer and create file
with open(destination_path + file_name, 'w+') as file:
for i, val in enumerate(answer):
file.write(val)
if (i + 1) % 18 == 0 and i < len(answer) - 1:
file.write('\n')
# Display the warped image
if image_debug:
sudoku_dims = (sudoku_contour_warped.shape[1] // scale, sudoku_contour_warped.shape[0] // scale)
cv.imshow('warped', cv.resize(sudoku_contour_warped, sudoku_dims))
else:
print(f"Could not find sudoku in image with name {image_path}!")
if image_debug:
cv.waitKey(0)
return # Display only one image
if __name__ == "__main__":
# task1()
# task2()
pass
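# A minimal sketch (not part of the original tasks) for reading back the answer
# file written above, assuming the format produced there: for every cell one
# zone character followed by 'x' (cell holds a digit) or 'o' (empty cell),
# 18 characters per line, 9 lines in total. The helper name is illustrative.
def read_answer_file(path):
    cells = []
    with open(path) as file:
        for line in file:
            line = line.strip()
            # Split the row into (zone, has_number) pairs
            for i in range(0, len(line), 2):
                cells.append((line[i], line[i + 1] == 'x'))
    # Reshape the flat list into a 9x9 grid
    return [cells[row * 9:(row + 1) * 9] for row in range(9)]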
| 43.561069 | 201 | 0.585254 | 2,971 | 22,826 | 4.324806 | 0.119825 | 0.049576 | 0.028096 | 0.01432 | 0.820998 | 0.799673 | 0.795937 | 0.782551 | 0.782551 | 0.772278 | 0 | 0.036578 | 0.318496 | 22,826 | 523 | 202 | 43.644359 | 0.789406 | 0.210681 | 0 | 0.748227 | 0 | 0 | 0.02626 | 0.008493 | 0 | 0 | 0 | 0 | 0 | 1 | 0.007092 | false | 0.003546 | 0.021277 | 0 | 0.035461 | 0.007092 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7d93ed61276f1f99621d7f10d8f55543025faf36 | 113,332 | py | Python | boto3_type_annotations_with_docs/boto3_type_annotations/dax/client.py | cowboygneox/boto3_type_annotations | 450dce1de4e066b939de7eac2ec560ed1a7ddaa2 | [
"MIT"
] | 119 | 2018-12-01T18:20:57.000Z | 2022-02-02T10:31:29.000Z | boto3_type_annotations_with_docs/boto3_type_annotations/dax/client.py | cowboygneox/boto3_type_annotations | 450dce1de4e066b939de7eac2ec560ed1a7ddaa2 | [
"MIT"
] | 15 | 2018-11-16T00:16:44.000Z | 2021-11-13T03:44:18.000Z | boto3_type_annotations_with_docs/boto3_type_annotations/dax/client.py | cowboygneox/boto3_type_annotations | 450dce1de4e066b939de7eac2ec560ed1a7ddaa2 | [
"MIT"
] | 11 | 2019-05-06T05:26:51.000Z | 2021-09-28T15:27:59.000Z | from typing import Optional
from botocore.client import BaseClient
from typing import Dict
from botocore.paginate import Paginator
from datetime import datetime
from botocore.waiter import Waiter
from typing import Union
from typing import List
class Client(BaseClient):
def can_paginate(self, operation_name: str = None):
"""
Check if an operation can be paginated.
:type operation_name: string
:param operation_name: The operation name. This is the same name
as the method name on the client. For example, if the
method name is ``create_foo``, and you\'d normally invoke the
operation as ``client.create_foo(**kwargs)``, if the
``create_foo`` operation can be paginated, you can use the
call ``client.get_paginator(\"create_foo\")``.
:return: ``True`` if the operation can be paginated,
``False`` otherwise.
"""
pass
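# Illustrative usage sketch, not part of the stub: check pagination support before
# asking for a paginator. Assumes boto3 is installed and credentials/region are
# configured; the service name 'dax' and the operation 'describe_clusters' follow
# the methods documented in this module.
import boto3

dax = boto3.client('dax')
if dax.can_paginate('describe_clusters'):
    paginator = dax.get_paginator('describe_clusters')
    for page in paginator.paginate():
        for cluster in page.get('Clusters', []):
            print(cluster['ClusterName'])
else:
    print(dax.describe_clusters()['Clusters'])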
def create_cluster(self, ClusterName: str, NodeType: str, ReplicationFactor: int, IamRoleArn: str, Description: str = None, AvailabilityZones: List = None, SubnetGroupName: str = None, SecurityGroupIds: List = None, PreferredMaintenanceWindow: str = None, NotificationTopicArn: str = None, ParameterGroupName: str = None, Tags: List = None, SSESpecification: Dict = None) -> Dict:
"""
Creates a DAX cluster. All nodes in the cluster run the same DAX caching software.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/CreateCluster>`_
**Request Syntax**
::
response = client.create_cluster(
ClusterName='string',
NodeType='string',
Description='string',
ReplicationFactor=123,
AvailabilityZones=[
'string',
],
SubnetGroupName='string',
SecurityGroupIds=[
'string',
],
PreferredMaintenanceWindow='string',
NotificationTopicArn='string',
IamRoleArn='string',
ParameterGroupName='string',
Tags=[
{
'Key': 'string',
'Value': 'string'
},
],
SSESpecification={
'Enabled': True|False
}
)
**Response Syntax**
::
{
'Cluster': {
'ClusterName': 'string',
'Description': 'string',
'ClusterArn': 'string',
'TotalNodes': 123,
'ActiveNodes': 123,
'NodeType': 'string',
'Status': 'string',
'ClusterDiscoveryEndpoint': {
'Address': 'string',
'Port': 123
},
'NodeIdsToRemove': [
'string',
],
'Nodes': [
{
'NodeId': 'string',
'Endpoint': {
'Address': 'string',
'Port': 123
},
'NodeCreateTime': datetime(2015, 1, 1),
'AvailabilityZone': 'string',
'NodeStatus': 'string',
'ParameterGroupStatus': 'string'
},
],
'PreferredMaintenanceWindow': 'string',
'NotificationConfiguration': {
'TopicArn': 'string',
'TopicStatus': 'string'
},
'SubnetGroup': 'string',
'SecurityGroups': [
{
'SecurityGroupIdentifier': 'string',
'Status': 'string'
},
],
'IamRoleArn': 'string',
'ParameterGroup': {
'ParameterGroupName': 'string',
'ParameterApplyStatus': 'string',
'NodeIdsToReboot': [
'string',
]
},
'SSEDescription': {
'Status': 'ENABLING'|'ENABLED'|'DISABLING'|'DISABLED'
}
}
}
**Response Structure**
- *(dict) --*
- **Cluster** *(dict) --*
A description of the DAX cluster that you have created.
- **ClusterName** *(string) --*
The name of the DAX cluster.
- **Description** *(string) --*
The description of the cluster.
- **ClusterArn** *(string) --*
The Amazon Resource Name (ARN) that uniquely identifies the cluster.
- **TotalNodes** *(integer) --*
The total number of nodes in the cluster.
- **ActiveNodes** *(integer) --*
The number of nodes in the cluster that are active (i.e., capable of serving requests).
- **NodeType** *(string) --*
The node type for the nodes in the cluster. (All nodes in a DAX cluster are of the same type.)
- **Status** *(string) --*
The current status of the cluster.
- **ClusterDiscoveryEndpoint** *(dict) --*
The configuration endpoint for this DAX cluster, consisting of a DNS name and a port number. Client applications can specify this endpoint, rather than an individual node endpoint, and allow the DAX client software to intelligently route requests and responses to nodes in the DAX cluster.
- **Address** *(string) --*
The DNS hostname of the endpoint.
- **Port** *(integer) --*
The port number that applications should use to connect to the endpoint.
- **NodeIdsToRemove** *(list) --*
A list of nodes to be removed from the cluster.
- *(string) --*
- **Nodes** *(list) --*
A list of nodes that are currently in the cluster.
- *(dict) --*
Represents an individual node within a DAX cluster.
- **NodeId** *(string) --*
A system-generated identifier for the node.
- **Endpoint** *(dict) --*
The endpoint for the node, consisting of a DNS name and a port number. Client applications can connect directly to a node endpoint, if desired (as an alternative to allowing DAX client software to intelligently route requests and responses to nodes in the DAX cluster).
- **Address** *(string) --*
The DNS hostname of the endpoint.
- **Port** *(integer) --*
The port number that applications should use to connect to the endpoint.
- **NodeCreateTime** *(datetime) --*
The date and time (in UNIX epoch format) when the node was launched.
- **AvailabilityZone** *(string) --*
The Availability Zone (AZ) in which the node has been deployed.
- **NodeStatus** *(string) --*
The current status of the node. For example: ``available`` .
- **ParameterGroupStatus** *(string) --*
The status of the parameter group associated with this node. For example, ``in-sync`` .
- **PreferredMaintenanceWindow** *(string) --*
A range of time when maintenance of DAX cluster software will be performed. For example: ``sun:01:00-sun:09:00`` . Cluster maintenance normally takes less than 30 minutes, and is performed automatically within the maintenance window.
- **NotificationConfiguration** *(dict) --*
Describes a notification topic and its status. Notification topics are used for publishing DAX events to subscribers using Amazon Simple Notification Service (SNS).
- **TopicArn** *(string) --*
The Amazon Resource Name (ARN) that identifies the topic.
- **TopicStatus** *(string) --*
The current state of the topic.
- **SubnetGroup** *(string) --*
The subnet group where the DAX cluster is running.
- **SecurityGroups** *(list) --*
A list of security groups, and the status of each, for the nodes in the cluster.
- *(dict) --*
An individual VPC security group and its status.
- **SecurityGroupIdentifier** *(string) --*
The unique ID for this security group.
- **Status** *(string) --*
The status of this security group.
- **IamRoleArn** *(string) --*
A valid Amazon Resource Name (ARN) that identifies an IAM role. At runtime, DAX will assume this role and use the role's permissions to access DynamoDB on your behalf.
- **ParameterGroup** *(dict) --*
The parameter group being used by nodes in the cluster.
- **ParameterGroupName** *(string) --*
The name of the parameter group.
- **ParameterApplyStatus** *(string) --*
The status of parameter updates.
- **NodeIdsToReboot** *(list) --*
The node IDs of one or more nodes to be rebooted.
- *(string) --*
- **SSEDescription** *(dict) --*
The description of the server-side encryption status on the specified DAX cluster.
- **Status** *(string) --*
The current state of server-side encryption:
* ``ENABLING`` - Server-side encryption is being enabled.
* ``ENABLED`` - Server-side encryption is enabled.
* ``DISABLING`` - Server-side encryption is being disabled.
* ``DISABLED`` - Server-side encryption is disabled.
:type ClusterName: string
:param ClusterName: **[REQUIRED]**
The cluster identifier. This parameter is stored as a lowercase string.
**Constraints:**
* A name must contain from 1 to 20 alphanumeric characters or hyphens.
* The first character must be a letter.
* A name cannot end with a hyphen or contain two consecutive hyphens.
:type NodeType: string
:param NodeType: **[REQUIRED]**
The compute and memory capacity of the nodes in the cluster.
:type Description: string
:param Description:
A description of the cluster.
:type ReplicationFactor: integer
:param ReplicationFactor: **[REQUIRED]**
The number of nodes in the DAX cluster. A replication factor of 1 will create a single-node cluster, without any read replicas. For additional fault tolerance, you can create a multiple node cluster with one or more read replicas. To do this, set *ReplicationFactor* to 2 or more.
.. note::
AWS recommends that you have at least two read replicas per cluster.
:type AvailabilityZones: list
:param AvailabilityZones:
The Availability Zones (AZs) in which the cluster nodes will be created. All nodes belonging to the cluster are placed in these Availability Zones. Use this parameter if you want to distribute the nodes across multiple AZs.
- *(string) --*
:type SubnetGroupName: string
:param SubnetGroupName:
The name of the subnet group to be used for the replication group.
.. warning::
DAX clusters can only run in an Amazon VPC environment. All of the subnets that you specify in a subnet group must exist in the same VPC.
:type SecurityGroupIds: list
:param SecurityGroupIds:
A list of security group IDs to be assigned to each node in the DAX cluster. (Each of the security group IDs is system-generated.)
If this parameter is not specified, DAX assigns the default VPC security group to each node.
- *(string) --*
:type PreferredMaintenanceWindow: string
:param PreferredMaintenanceWindow:
Specifies the weekly time range during which maintenance on the DAX cluster is performed. It is specified as a range in the format ddd:hh24:mi-ddd:hh24:mi (24H Clock UTC). The minimum maintenance window is a 60 minute period. Valid values for ``ddd`` are:
* ``sun``
* ``mon``
* ``tue``
* ``wed``
* ``thu``
* ``fri``
* ``sat``
Example: ``sun:05:00-sun:09:00``
.. note::
If you don\'t specify a preferred maintenance window when you create or modify a cache cluster, DAX assigns a 60-minute maintenance window on a randomly selected day of the week.
:type NotificationTopicArn: string
:param NotificationTopicArn:
The Amazon Resource Name (ARN) of the Amazon SNS topic to which notifications will be sent.
.. note::
The Amazon SNS topic owner must be same as the DAX cluster owner.
:type IamRoleArn: string
:param IamRoleArn: **[REQUIRED]**
A valid Amazon Resource Name (ARN) that identifies an IAM role. At runtime, DAX will assume this role and use the role\'s permissions to access DynamoDB on your behalf.
:type ParameterGroupName: string
:param ParameterGroupName:
The parameter group to be associated with the DAX cluster.
:type Tags: list
:param Tags:
A set of tags to associate with the DAX cluster.
- *(dict) --*
A description of a tag. Every tag is a key-value pair. You can add up to 50 tags to a single DAX cluster.
AWS-assigned tag names and values are automatically assigned the ``aws:`` prefix, which the user cannot assign. AWS-assigned tag names do not count towards the tag limit of 50. User-assigned tag names have the prefix ``user:`` .
You cannot backdate the application of a tag.
- **Key** *(string) --*
The key for the tag. Tag keys are case sensitive. Every DAX cluster can only have one tag with the same key. If you try to add an existing tag (same key), the existing tag value will be updated to the new value.
- **Value** *(string) --*
The value of the tag. Tag values are case-sensitive and can be null.
:type SSESpecification: dict
:param SSESpecification:
Represents the settings used to enable server-side encryption on the cluster.
- **Enabled** *(boolean) --* **[REQUIRED]**
Indicates whether server-side encryption is enabled (true) or disabled (false) on the cluster.
:rtype: dict
:returns:
"""
pass
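# Illustrative sketch: calling create_cluster with only the parameters marked
# [REQUIRED] in the docstring above. The cluster name, node type and role ARN are
# placeholder values, not recommendations.
import boto3

dax = boto3.client('dax')
response = dax.create_cluster(
    ClusterName='my-dax-cluster',
    NodeType='dax.r4.large',
    ReplicationFactor=3,
    IamRoleArn='arn:aws:iam::123456789012:role/DAXServiceRole',
)
print(response['Cluster']['Status'])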
def create_parameter_group(self, ParameterGroupName: str, Description: str = None) -> Dict:
"""
Creates a new parameter group. A parameter group is a collection of parameters that you apply to all of the nodes in a DAX cluster.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/CreateParameterGroup>`_
**Request Syntax**
::
response = client.create_parameter_group(
ParameterGroupName='string',
Description='string'
)
**Response Syntax**
::
{
'ParameterGroup': {
'ParameterGroupName': 'string',
'Description': 'string'
}
}
**Response Structure**
- *(dict) --*
- **ParameterGroup** *(dict) --*
Represents the output of a *CreateParameterGroup* action.
- **ParameterGroupName** *(string) --*
The name of the parameter group.
- **Description** *(string) --*
A description of the parameter group.
:type ParameterGroupName: string
:param ParameterGroupName: **[REQUIRED]**
The name of the parameter group to apply to all of the clusters in this replication group.
:type Description: string
:param Description:
A description of the parameter group.
:rtype: dict
:returns:
"""
pass
def create_subnet_group(self, SubnetGroupName: str, SubnetIds: List, Description: str = None) -> Dict:
"""
Creates a new subnet group.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/CreateSubnetGroup>`_
**Request Syntax**
::
response = client.create_subnet_group(
SubnetGroupName='string',
Description='string',
SubnetIds=[
'string',
]
)
**Response Syntax**
::
{
'SubnetGroup': {
'SubnetGroupName': 'string',
'Description': 'string',
'VpcId': 'string',
'Subnets': [
{
'SubnetIdentifier': 'string',
'SubnetAvailabilityZone': 'string'
},
]
}
}
**Response Structure**
- *(dict) --*
- **SubnetGroup** *(dict) --*
Represents the output of a *CreateSubnetGroup* operation.
- **SubnetGroupName** *(string) --*
The name of the subnet group.
- **Description** *(string) --*
The description of the subnet group.
- **VpcId** *(string) --*
The Amazon Virtual Private Cloud identifier (VPC ID) of the subnet group.
- **Subnets** *(list) --*
A list of subnets associated with the subnet group.
- *(dict) --*
Represents the subnet associated with a DAX cluster. This parameter refers to subnets defined in Amazon Virtual Private Cloud (Amazon VPC) and used with DAX.
- **SubnetIdentifier** *(string) --*
The system-assigned identifier for the subnet.
- **SubnetAvailabilityZone** *(string) --*
The Availability Zone (AZ) for the subnet.
:type SubnetGroupName: string
:param SubnetGroupName: **[REQUIRED]**
A name for the subnet group. This value is stored as a lowercase string.
:type Description: string
:param Description:
A description for the subnet group
:type SubnetIds: list
:param SubnetIds: **[REQUIRED]**
A list of VPC subnet IDs for the subnet group.
- *(string) --*
:rtype: dict
:returns:
"""
pass
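# Illustrative sketch: creating a subnet group from existing VPC subnet IDs, as
# described in the docstring above. The group name and subnet IDs are placeholders;
# all subnets must belong to the same VPC.
import boto3

dax = boto3.client('dax')
dax.create_subnet_group(
    SubnetGroupName='my-dax-subnets',
    Description='Subnets for the DAX cluster',
    SubnetIds=['subnet-0abc1234', 'subnet-0def5678'],
)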
def decrease_replication_factor(self, ClusterName: str, NewReplicationFactor: int, AvailabilityZones: List = None, NodeIdsToRemove: List = None) -> Dict:
"""
Removes one or more nodes from a DAX cluster.
.. note::
You cannot use ``DecreaseReplicationFactor`` to remove the last node in a DAX cluster. If you need to do this, use ``DeleteCluster`` instead.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/DecreaseReplicationFactor>`_
**Request Syntax**
::
response = client.decrease_replication_factor(
ClusterName='string',
NewReplicationFactor=123,
AvailabilityZones=[
'string',
],
NodeIdsToRemove=[
'string',
]
)
**Response Syntax**
::
{
'Cluster': {
'ClusterName': 'string',
'Description': 'string',
'ClusterArn': 'string',
'TotalNodes': 123,
'ActiveNodes': 123,
'NodeType': 'string',
'Status': 'string',
'ClusterDiscoveryEndpoint': {
'Address': 'string',
'Port': 123
},
'NodeIdsToRemove': [
'string',
],
'Nodes': [
{
'NodeId': 'string',
'Endpoint': {
'Address': 'string',
'Port': 123
},
'NodeCreateTime': datetime(2015, 1, 1),
'AvailabilityZone': 'string',
'NodeStatus': 'string',
'ParameterGroupStatus': 'string'
},
],
'PreferredMaintenanceWindow': 'string',
'NotificationConfiguration': {
'TopicArn': 'string',
'TopicStatus': 'string'
},
'SubnetGroup': 'string',
'SecurityGroups': [
{
'SecurityGroupIdentifier': 'string',
'Status': 'string'
},
],
'IamRoleArn': 'string',
'ParameterGroup': {
'ParameterGroupName': 'string',
'ParameterApplyStatus': 'string',
'NodeIdsToReboot': [
'string',
]
},
'SSEDescription': {
'Status': 'ENABLING'|'ENABLED'|'DISABLING'|'DISABLED'
}
}
}
**Response Structure**
- *(dict) --*
- **Cluster** *(dict) --*
A description of the DAX cluster, after you have decreased its replication factor.
- **ClusterName** *(string) --*
The name of the DAX cluster.
- **Description** *(string) --*
The description of the cluster.
- **ClusterArn** *(string) --*
The Amazon Resource Name (ARN) that uniquely identifies the cluster.
- **TotalNodes** *(integer) --*
The total number of nodes in the cluster.
- **ActiveNodes** *(integer) --*
The number of nodes in the cluster that are active (i.e., capable of serving requests).
- **NodeType** *(string) --*
The node type for the nodes in the cluster. (All nodes in a DAX cluster are of the same type.)
- **Status** *(string) --*
The current status of the cluster.
- **ClusterDiscoveryEndpoint** *(dict) --*
The configuration endpoint for this DAX cluster, consisting of a DNS name and a port number. Client applications can specify this endpoint, rather than an individual node endpoint, and allow the DAX client software to intelligently route requests and responses to nodes in the DAX cluster.
- **Address** *(string) --*
The DNS hostname of the endpoint.
- **Port** *(integer) --*
The port number that applications should use to connect to the endpoint.
- **NodeIdsToRemove** *(list) --*
A list of nodes to be removed from the cluster.
- *(string) --*
- **Nodes** *(list) --*
A list of nodes that are currently in the cluster.
- *(dict) --*
Represents an individual node within a DAX cluster.
- **NodeId** *(string) --*
A system-generated identifier for the node.
- **Endpoint** *(dict) --*
The endpoint for the node, consisting of a DNS name and a port number. Client applications can connect directly to a node endpoint, if desired (as an alternative to allowing DAX client software to intelligently route requests and responses to nodes in the DAX cluster).
- **Address** *(string) --*
The DNS hostname of the endpoint.
- **Port** *(integer) --*
The port number that applications should use to connect to the endpoint.
- **NodeCreateTime** *(datetime) --*
The date and time (in UNIX epoch format) when the node was launched.
- **AvailabilityZone** *(string) --*
The Availability Zone (AZ) in which the node has been deployed.
- **NodeStatus** *(string) --*
The current status of the node. For example: ``available`` .
- **ParameterGroupStatus** *(string) --*
The status of the parameter group associated with this node. For example, ``in-sync`` .
- **PreferredMaintenanceWindow** *(string) --*
A range of time when maintenance of DAX cluster software will be performed. For example: ``sun:01:00-sun:09:00`` . Cluster maintenance normally takes less than 30 minutes, and is performed automatically within the maintenance window.
- **NotificationConfiguration** *(dict) --*
Describes a notification topic and its status. Notification topics are used for publishing DAX events to subscribers using Amazon Simple Notification Service (SNS).
- **TopicArn** *(string) --*
The Amazon Resource Name (ARN) that identifies the topic.
- **TopicStatus** *(string) --*
The current state of the topic.
- **SubnetGroup** *(string) --*
The subnet group where the DAX cluster is running.
- **SecurityGroups** *(list) --*
A list of security groups, and the status of each, for the nodes in the cluster.
- *(dict) --*
An individual VPC security group and its status.
- **SecurityGroupIdentifier** *(string) --*
The unique ID for this security group.
- **Status** *(string) --*
The status of this security group.
- **IamRoleArn** *(string) --*
A valid Amazon Resource Name (ARN) that identifies an IAM role. At runtime, DAX will assume this role and use the role's permissions to access DynamoDB on your behalf.
- **ParameterGroup** *(dict) --*
The parameter group being used by nodes in the cluster.
- **ParameterGroupName** *(string) --*
The name of the parameter group.
- **ParameterApplyStatus** *(string) --*
The status of parameter updates.
- **NodeIdsToReboot** *(list) --*
The node IDs of one or more nodes to be rebooted.
- *(string) --*
- **SSEDescription** *(dict) --*
The description of the server-side encryption status on the specified DAX cluster.
- **Status** *(string) --*
The current state of server-side encryption:
* ``ENABLING`` - Server-side encryption is being enabled.
* ``ENABLED`` - Server-side encryption is enabled.
* ``DISABLING`` - Server-side encryption is being disabled.
* ``DISABLED`` - Server-side encryption is disabled.
:type ClusterName: string
:param ClusterName: **[REQUIRED]**
The name of the DAX cluster from which you want to remove nodes.
:type NewReplicationFactor: integer
:param NewReplicationFactor: **[REQUIRED]**
The new number of nodes for the DAX cluster.
:type AvailabilityZones: list
:param AvailabilityZones:
The Availability Zone(s) from which to remove nodes.
- *(string) --*
:type NodeIdsToRemove: list
:param NodeIdsToRemove:
The unique identifiers of the nodes to be removed from the cluster.
- *(string) --*
:rtype: dict
:returns:
"""
pass
def delete_cluster(self, ClusterName: str) -> Dict:
"""
Deletes a previously provisioned DAX cluster. *DeleteCluster* deletes all associated nodes, node endpoints and the DAX cluster itself. When you receive a successful response from this action, DAX immediately begins deleting the cluster; you cannot cancel or revert this action.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/DeleteCluster>`_
**Request Syntax**
::
response = client.delete_cluster(
ClusterName='string'
)
**Response Syntax**
::
{
'Cluster': {
'ClusterName': 'string',
'Description': 'string',
'ClusterArn': 'string',
'TotalNodes': 123,
'ActiveNodes': 123,
'NodeType': 'string',
'Status': 'string',
'ClusterDiscoveryEndpoint': {
'Address': 'string',
'Port': 123
},
'NodeIdsToRemove': [
'string',
],
'Nodes': [
{
'NodeId': 'string',
'Endpoint': {
'Address': 'string',
'Port': 123
},
'NodeCreateTime': datetime(2015, 1, 1),
'AvailabilityZone': 'string',
'NodeStatus': 'string',
'ParameterGroupStatus': 'string'
},
],
'PreferredMaintenanceWindow': 'string',
'NotificationConfiguration': {
'TopicArn': 'string',
'TopicStatus': 'string'
},
'SubnetGroup': 'string',
'SecurityGroups': [
{
'SecurityGroupIdentifier': 'string',
'Status': 'string'
},
],
'IamRoleArn': 'string',
'ParameterGroup': {
'ParameterGroupName': 'string',
'ParameterApplyStatus': 'string',
'NodeIdsToReboot': [
'string',
]
},
'SSEDescription': {
'Status': 'ENABLING'|'ENABLED'|'DISABLING'|'DISABLED'
}
}
}
**Response Structure**
- *(dict) --*
- **Cluster** *(dict) --*
A description of the DAX cluster that is being deleted.
- **ClusterName** *(string) --*
The name of the DAX cluster.
- **Description** *(string) --*
The description of the cluster.
- **ClusterArn** *(string) --*
The Amazon Resource Name (ARN) that uniquely identifies the cluster.
- **TotalNodes** *(integer) --*
The total number of nodes in the cluster.
- **ActiveNodes** *(integer) --*
The number of nodes in the cluster that are active (i.e., capable of serving requests).
- **NodeType** *(string) --*
The node type for the nodes in the cluster. (All nodes in a DAX cluster are of the same type.)
- **Status** *(string) --*
The current status of the cluster.
- **ClusterDiscoveryEndpoint** *(dict) --*
The configuration endpoint for this DAX cluster, consisting of a DNS name and a port number. Client applications can specify this endpoint, rather than an individual node endpoint, and allow the DAX client software to intelligently route requests and responses to nodes in the DAX cluster.
- **Address** *(string) --*
The DNS hostname of the endpoint.
- **Port** *(integer) --*
The port number that applications should use to connect to the endpoint.
- **NodeIdsToRemove** *(list) --*
A list of nodes to be removed from the cluster.
- *(string) --*
- **Nodes** *(list) --*
A list of nodes that are currently in the cluster.
- *(dict) --*
Represents an individual node within a DAX cluster.
- **NodeId** *(string) --*
A system-generated identifier for the node.
- **Endpoint** *(dict) --*
The endpoint for the node, consisting of a DNS name and a port number. Client applications can connect directly to a node endpoint, if desired (as an alternative to allowing DAX client software to intelligently route requests and responses to nodes in the DAX cluster).
- **Address** *(string) --*
The DNS hostname of the endpoint.
- **Port** *(integer) --*
The port number that applications should use to connect to the endpoint.
- **NodeCreateTime** *(datetime) --*
The date and time (in UNIX epoch format) when the node was launched.
- **AvailabilityZone** *(string) --*
The Availability Zone (AZ) in which the node has been deployed.
- **NodeStatus** *(string) --*
The current status of the node. For example: ``available`` .
- **ParameterGroupStatus** *(string) --*
The status of the parameter group associated with this node. For example, ``in-sync`` .
- **PreferredMaintenanceWindow** *(string) --*
A range of time when maintenance of DAX cluster software will be performed. For example: ``sun:01:00-sun:09:00`` . Cluster maintenance normally takes less than 30 minutes, and is performed automatically within the maintenance window.
- **NotificationConfiguration** *(dict) --*
Describes a notification topic and its status. Notification topics are used for publishing DAX events to subscribers using Amazon Simple Notification Service (SNS).
- **TopicArn** *(string) --*
The Amazon Resource Name (ARN) that identifies the topic.
- **TopicStatus** *(string) --*
The current state of the topic.
- **SubnetGroup** *(string) --*
The subnet group where the DAX cluster is running.
- **SecurityGroups** *(list) --*
A list of security groups, and the status of each, for the nodes in the cluster.
- *(dict) --*
An individual VPC security group and its status.
- **SecurityGroupIdentifier** *(string) --*
The unique ID for this security group.
- **Status** *(string) --*
The status of this security group.
- **IamRoleArn** *(string) --*
A valid Amazon Resource Name (ARN) that identifies an IAM role. At runtime, DAX will assume this role and use the role's permissions to access DynamoDB on your behalf.
- **ParameterGroup** *(dict) --*
The parameter group being used by nodes in the cluster.
- **ParameterGroupName** *(string) --*
The name of the parameter group.
- **ParameterApplyStatus** *(string) --*
The status of parameter updates.
- **NodeIdsToReboot** *(list) --*
The node IDs of one or more nodes to be rebooted.
- *(string) --*
- **SSEDescription** *(dict) --*
The description of the server-side encryption status on the specified DAX cluster.
- **Status** *(string) --*
The current state of server-side encryption:
* ``ENABLING`` - Server-side encryption is being enabled.
* ``ENABLED`` - Server-side encryption is enabled.
* ``DISABLING`` - Server-side encryption is being disabled.
* ``DISABLED`` - Server-side encryption is disabled.
:type ClusterName: string
:param ClusterName: **[REQUIRED]**
The name of the cluster to be deleted.
:rtype: dict
:returns:
"""
pass
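# Illustrative sketch: delete a cluster and poll describe_clusters until it is gone.
# This assumes describe_clusters raises a ClientError (ClusterNotFoundFault) once
# deletion completes; the fixed sleep and missing timeout handling are simplifications.
import time
import boto3
from botocore.exceptions import ClientError

dax = boto3.client('dax')
dax.delete_cluster(ClusterName='my-dax-cluster')
while True:
    try:
        status = dax.describe_clusters(ClusterNames=['my-dax-cluster'])['Clusters'][0]['Status']
        print('cluster status:', status)
        time.sleep(30)
    except ClientError as error:
        print('cluster deleted, error code:', error.response['Error']['Code'])
        break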
def delete_parameter_group(self, ParameterGroupName: str) -> Dict:
"""
Deletes the specified parameter group. You cannot delete a parameter group if it is associated with any DAX clusters.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/DeleteParameterGroup>`_
**Request Syntax**
::
response = client.delete_parameter_group(
ParameterGroupName='string'
)
**Response Syntax**
::
{
'DeletionMessage': 'string'
}
**Response Structure**
- *(dict) --*
- **DeletionMessage** *(string) --*
A user-specified message for this action (i.e., a reason for deleting the parameter group).
:type ParameterGroupName: string
:param ParameterGroupName: **[REQUIRED]**
The name of the parameter group to delete.
:rtype: dict
:returns:
"""
pass
def delete_subnet_group(self, SubnetGroupName: str) -> Dict:
"""
Deletes a subnet group.
.. note::
You cannot delete a subnet group if it is associated with any DAX clusters.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/DeleteSubnetGroup>`_
**Request Syntax**
::
response = client.delete_subnet_group(
SubnetGroupName='string'
)
**Response Syntax**
::
{
'DeletionMessage': 'string'
}
**Response Structure**
- *(dict) --*
- **DeletionMessage** *(string) --*
A user-specified message for this action (i.e., a reason for deleting the subnet group).
:type SubnetGroupName: string
:param SubnetGroupName: **[REQUIRED]**
The name of the subnet group to delete.
:rtype: dict
:returns:
"""
pass
def describe_clusters(self, ClusterNames: List = None, MaxResults: int = None, NextToken: str = None) -> Dict:
"""
Returns information about all provisioned DAX clusters if no cluster identifier is specified, or about a specific DAX cluster if a cluster identifier is supplied.
If the cluster is in the CREATING state, only cluster level information will be displayed until all of the nodes are successfully provisioned.
If the cluster is in the DELETING state, only cluster level information will be displayed.
If nodes are currently being added to the DAX cluster, node endpoint information and creation time for the additional nodes will not be displayed until they are completely provisioned. When the DAX cluster state is *available* , the cluster is ready for use.
If nodes are currently being removed from the DAX cluster, no endpoint information for the removed nodes is displayed.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/DescribeClusters>`_
**Request Syntax**
::
response = client.describe_clusters(
ClusterNames=[
'string',
],
MaxResults=123,
NextToken='string'
)
**Response Syntax**
::
{
'NextToken': 'string',
'Clusters': [
{
'ClusterName': 'string',
'Description': 'string',
'ClusterArn': 'string',
'TotalNodes': 123,
'ActiveNodes': 123,
'NodeType': 'string',
'Status': 'string',
'ClusterDiscoveryEndpoint': {
'Address': 'string',
'Port': 123
},
'NodeIdsToRemove': [
'string',
],
'Nodes': [
{
'NodeId': 'string',
'Endpoint': {
'Address': 'string',
'Port': 123
},
'NodeCreateTime': datetime(2015, 1, 1),
'AvailabilityZone': 'string',
'NodeStatus': 'string',
'ParameterGroupStatus': 'string'
},
],
'PreferredMaintenanceWindow': 'string',
'NotificationConfiguration': {
'TopicArn': 'string',
'TopicStatus': 'string'
},
'SubnetGroup': 'string',
'SecurityGroups': [
{
'SecurityGroupIdentifier': 'string',
'Status': 'string'
},
],
'IamRoleArn': 'string',
'ParameterGroup': {
'ParameterGroupName': 'string',
'ParameterApplyStatus': 'string',
'NodeIdsToReboot': [
'string',
]
},
'SSEDescription': {
'Status': 'ENABLING'|'ENABLED'|'DISABLING'|'DISABLED'
}
},
]
}
**Response Structure**
- *(dict) --*
- **NextToken** *(string) --*
Provides an identifier to allow retrieval of paginated results.
- **Clusters** *(list) --*
The descriptions of your DAX clusters, in response to a *DescribeClusters* request.
- *(dict) --*
Contains all of the attributes of a specific DAX cluster.
- **ClusterName** *(string) --*
The name of the DAX cluster.
- **Description** *(string) --*
The description of the cluster.
- **ClusterArn** *(string) --*
The Amazon Resource Name (ARN) that uniquely identifies the cluster.
- **TotalNodes** *(integer) --*
The total number of nodes in the cluster.
- **ActiveNodes** *(integer) --*
The number of nodes in the cluster that are active (i.e., capable of serving requests).
- **NodeType** *(string) --*
The node type for the nodes in the cluster. (All nodes in a DAX cluster are of the same type.)
- **Status** *(string) --*
The current status of the cluster.
- **ClusterDiscoveryEndpoint** *(dict) --*
The configuration endpoint for this DAX cluster, consisting of a DNS name and a port number. Client applications can specify this endpoint, rather than an individual node endpoint, and allow the DAX client software to intelligently route requests and responses to nodes in the DAX cluster.
- **Address** *(string) --*
The DNS hostname of the endpoint.
- **Port** *(integer) --*
The port number that applications should use to connect to the endpoint.
- **NodeIdsToRemove** *(list) --*
A list of nodes to be removed from the cluster.
- *(string) --*
- **Nodes** *(list) --*
A list of nodes that are currently in the cluster.
- *(dict) --*
Represents an individual node within a DAX cluster.
- **NodeId** *(string) --*
A system-generated identifier for the node.
- **Endpoint** *(dict) --*
The endpoint for the node, consisting of a DNS name and a port number. Client applications can connect directly to a node endpoint, if desired (as an alternative to allowing DAX client software to intelligently route requests and responses to nodes in the DAX cluster).
- **Address** *(string) --*
The DNS hostname of the endpoint.
- **Port** *(integer) --*
The port number that applications should use to connect to the endpoint.
- **NodeCreateTime** *(datetime) --*
The date and time (in UNIX epoch format) when the node was launched.
- **AvailabilityZone** *(string) --*
The Availability Zone (AZ) in which the node has been deployed.
- **NodeStatus** *(string) --*
The current status of the node. For example: ``available`` .
- **ParameterGroupStatus** *(string) --*
The status of the parameter group associated with this node. For example, ``in-sync`` .
- **PreferredMaintenanceWindow** *(string) --*
A range of time when maintenance of DAX cluster software will be performed. For example: ``sun:01:00-sun:09:00`` . Cluster maintenance normally takes less than 30 minutes, and is performed automatically within the maintenance window.
- **NotificationConfiguration** *(dict) --*
Describes a notification topic and its status. Notification topics are used for publishing DAX events to subscribers using Amazon Simple Notification Service (SNS).
- **TopicArn** *(string) --*
The Amazon Resource Name (ARN) that identifies the topic.
- **TopicStatus** *(string) --*
The current state of the topic.
- **SubnetGroup** *(string) --*
The subnet group where the DAX cluster is running.
- **SecurityGroups** *(list) --*
A list of security groups, and the status of each, for the nodes in the cluster.
- *(dict) --*
An individual VPC security group and its status.
- **SecurityGroupIdentifier** *(string) --*
The unique ID for this security group.
- **Status** *(string) --*
The status of this security group.
- **IamRoleArn** *(string) --*
A valid Amazon Resource Name (ARN) that identifies an IAM role. At runtime, DAX will assume this role and use the role's permissions to access DynamoDB on your behalf.
- **ParameterGroup** *(dict) --*
The parameter group being used by nodes in the cluster.
- **ParameterGroupName** *(string) --*
The name of the parameter group.
- **ParameterApplyStatus** *(string) --*
The status of parameter updates.
- **NodeIdsToReboot** *(list) --*
The node IDs of one or more nodes to be rebooted.
- *(string) --*
- **SSEDescription** *(dict) --*
The description of the server-side encryption status on the specified DAX cluster.
- **Status** *(string) --*
The current state of server-side encryption:
* ``ENABLING`` - Server-side encryption is being enabled.
* ``ENABLED`` - Server-side encryption is enabled.
* ``DISABLING`` - Server-side encryption is being disabled.
* ``DISABLED`` - Server-side encryption is disabled.
:type ClusterNames: list
:param ClusterNames:
The names of the DAX clusters being described.
- *(string) --*
:type MaxResults: integer
:param MaxResults:
The maximum number of results to include in the response. If more results exist than the specified ``MaxResults`` value, a token is included in the response so that the remaining results can be retrieved.
The value for ``MaxResults`` must be between 20 and 100.
:type NextToken: string
:param NextToken:
An optional token returned from a prior request. Use this token for pagination of results from this action. If this parameter is specified, the response includes only results beyond the token, up to the value specified by ``MaxResults`` .
:rtype: dict
:returns:
"""
pass
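# Illustrative sketch: walking paginated DescribeClusters results by hand with
# MaxResults / NextToken as documented above (MaxResults must be between 20 and 100).
import boto3

dax = boto3.client('dax')
clusters, token = [], None
while True:
    kwargs = {'MaxResults': 20}
    if token:
        kwargs['NextToken'] = token
    page = dax.describe_clusters(**kwargs)
    clusters.extend(page.get('Clusters', []))
    token = page.get('NextToken')
    if not token:
        break
print([c['ClusterName'] for c in clusters])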
def describe_default_parameters(self, MaxResults: int = None, NextToken: str = None) -> Dict:
"""
Returns the default system parameter information for the DAX caching software.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/DescribeDefaultParameters>`_
**Request Syntax**
::
response = client.describe_default_parameters(
MaxResults=123,
NextToken='string'
)
**Response Syntax**
::
{
'NextToken': 'string',
'Parameters': [
{
'ParameterName': 'string',
'ParameterType': 'DEFAULT'|'NODE_TYPE_SPECIFIC',
'ParameterValue': 'string',
'NodeTypeSpecificValues': [
{
'NodeType': 'string',
'Value': 'string'
},
],
'Description': 'string',
'Source': 'string',
'DataType': 'string',
'AllowedValues': 'string',
'IsModifiable': 'TRUE'|'FALSE'|'CONDITIONAL',
'ChangeType': 'IMMEDIATE'|'REQUIRES_REBOOT'
},
]
}
**Response Structure**
- *(dict) --*
- **NextToken** *(string) --*
Provides an identifier to allow retrieval of paginated results.
- **Parameters** *(list) --*
A list of parameters. Each element in the list represents one parameter.
- *(dict) --*
Describes an individual setting that controls some aspect of DAX behavior.
- **ParameterName** *(string) --*
The name of the parameter.
- **ParameterType** *(string) --*
Determines whether the parameter can be applied to any nodes, or only nodes of a particular type.
- **ParameterValue** *(string) --*
The value for the parameter.
- **NodeTypeSpecificValues** *(list) --*
A list of node types, and specific parameter values for each node.
- *(dict) --*
Represents a parameter value that is applicable to a particular node type.
- **NodeType** *(string) --*
A node type to which the parameter value applies.
- **Value** *(string) --*
The parameter value for this node type.
- **Description** *(string) --*
A description of the parameter.
- **Source** *(string) --*
How the parameter is defined. For example, ``system`` denotes a system-defined parameter.
- **DataType** *(string) --*
The data type of the parameter. For example, ``integer`` :
- **AllowedValues** *(string) --*
A range of values within which the parameter can be set.
- **IsModifiable** *(string) --*
Whether the customer is allowed to modify the parameter.
- **ChangeType** *(string) --*
The conditions under which changes to this parameter can be applied. For example, ``requires-reboot`` indicates that a new value for this parameter will only take effect if a node is rebooted.
:type MaxResults: integer
:param MaxResults:
The maximum number of results to include in the response. If more results exist than the specified ``MaxResults`` value, a token is included in the response so that the remaining results can be retrieved.
The value for ``MaxResults`` must be between 20 and 100.
:type NextToken: string
:param NextToken:
An optional token returned from a prior request. Use this token for pagination of results from this action. If this parameter is specified, the response includes only results beyond the token, up to the value specified by ``MaxResults`` .
:rtype: dict
:returns:
"""
pass
def describe_events(self, SourceName: str = None, SourceType: str = None, StartTime: datetime = None, EndTime: datetime = None, Duration: int = None, MaxResults: int = None, NextToken: str = None) -> Dict:
"""
Returns events related to DAX clusters and parameter groups. You can obtain events specific to a particular DAX cluster or parameter group by providing the name as a parameter.
By default, only the events occurring within the last hour are returned; however, you can retrieve up to 14 days' worth of events if necessary.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/DescribeEvents>`_
**Request Syntax**
::
response = client.describe_events(
SourceName='string',
SourceType='CLUSTER'|'PARAMETER_GROUP'|'SUBNET_GROUP',
StartTime=datetime(2015, 1, 1),
EndTime=datetime(2015, 1, 1),
Duration=123,
MaxResults=123,
NextToken='string'
)
**Response Syntax**
::
{
'NextToken': 'string',
'Events': [
{
'SourceName': 'string',
'SourceType': 'CLUSTER'|'PARAMETER_GROUP'|'SUBNET_GROUP',
'Message': 'string',
'Date': datetime(2015, 1, 1)
},
]
}
**Response Structure**
- *(dict) --*
- **NextToken** *(string) --*
Provides an identifier to allow retrieval of paginated results.
- **Events** *(list) --*
An array of events. Each element in the array represents one event.
- *(dict) --*
Represents a single occurrence of something interesting within the system. Some examples of events are creating a DAX cluster, adding or removing a node, or rebooting a node.
- **SourceName** *(string) --*
The source of the event. For example, if the event occurred at the node level, the source would be the node ID.
- **SourceType** *(string) --*
Specifies the origin of this event - a cluster, a parameter group, a node ID, etc.
- **Message** *(string) --*
A user-defined message associated with the event.
- **Date** *(datetime) --*
The date and time when the event occurred.
:type SourceName: string
:param SourceName:
The identifier of the event source for which events will be returned. If not specified, then all sources are included in the response.
:type SourceType: string
:param SourceType:
The event source to retrieve events for. If no value is specified, all events are returned.
:type StartTime: datetime
:param StartTime:
The beginning of the time interval to retrieve events for, specified in ISO 8601 format.
:type EndTime: datetime
:param EndTime:
The end of the time interval for which to retrieve events, specified in ISO 8601 format.
:type Duration: integer
:param Duration:
The number of minutes\' worth of events to retrieve.
:type MaxResults: integer
:param MaxResults:
The maximum number of results to include in the response. If more results exist than the specified ``MaxResults`` value, a token is included in the response so that the remaining results can be retrieved.
The value for ``MaxResults`` must be between 20 and 100.
:type NextToken: string
:param NextToken:
An optional token returned from a prior request. Use this token for pagination of results from this action. If this parameter is specified, the response includes only results beyond the token, up to the value specified by ``MaxResults`` .
:rtype: dict
:returns:
"""
pass
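# Illustrative sketch: retrieve the last 24 hours of events for a single cluster
# using the SourceName / SourceType / Duration parameters documented above.
# The cluster name is a placeholder.
import boto3

dax = boto3.client('dax')
events = dax.describe_events(
    SourceName='my-dax-cluster',
    SourceType='CLUSTER',
    Duration=24 * 60,  # minutes' worth of events to retrieve
)
for event in events['Events']:
    print(event['Date'], event['Message'])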
def describe_parameter_groups(self, ParameterGroupNames: List = None, MaxResults: int = None, NextToken: str = None) -> Dict:
"""
Returns a list of parameter group descriptions. If a parameter group name is specified, the list will contain only the descriptions for that group.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/DescribeParameterGroups>`_
**Request Syntax**
::
response = client.describe_parameter_groups(
ParameterGroupNames=[
'string',
],
MaxResults=123,
NextToken='string'
)
**Response Syntax**
::
{
'NextToken': 'string',
'ParameterGroups': [
{
'ParameterGroupName': 'string',
'Description': 'string'
},
]
}
**Response Structure**
- *(dict) --*
- **NextToken** *(string) --*
Provides an identifier to allow retrieval of paginated results.
- **ParameterGroups** *(list) --*
An array of parameter groups. Each element in the array represents one parameter group.
- *(dict) --*
A named set of parameters that are applied to all of the nodes in a DAX cluster.
- **ParameterGroupName** *(string) --*
The name of the parameter group.
- **Description** *(string) --*
A description of the parameter group.
:type ParameterGroupNames: list
:param ParameterGroupNames:
The names of the parameter groups.
- *(string) --*
:type MaxResults: integer
:param MaxResults:
The maximum number of results to include in the response. If more results exist than the specified ``MaxResults`` value, a token is included in the response so that the remaining results can be retrieved.
The value for ``MaxResults`` must be between 20 and 100.
:type NextToken: string
:param NextToken:
An optional token returned from a prior request. Use this token for pagination of results from this action. If this parameter is specified, the response includes only results beyond the token, up to the value specified by ``MaxResults`` .
:rtype: dict
:returns:
"""
pass
def describe_parameters(self, ParameterGroupName: str, Source: str = None, MaxResults: int = None, NextToken: str = None) -> Dict:
"""
Returns the detailed parameter list for a particular parameter group.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/DescribeParameters>`_
**Request Syntax**
::
response = client.describe_parameters(
ParameterGroupName='string',
Source='string',
MaxResults=123,
NextToken='string'
)
**Response Syntax**
::
{
'NextToken': 'string',
'Parameters': [
{
'ParameterName': 'string',
'ParameterType': 'DEFAULT'|'NODE_TYPE_SPECIFIC',
'ParameterValue': 'string',
'NodeTypeSpecificValues': [
{
'NodeType': 'string',
'Value': 'string'
},
],
'Description': 'string',
'Source': 'string',
'DataType': 'string',
'AllowedValues': 'string',
'IsModifiable': 'TRUE'|'FALSE'|'CONDITIONAL',
'ChangeType': 'IMMEDIATE'|'REQUIRES_REBOOT'
},
]
}
**Response Structure**
- *(dict) --*
- **NextToken** *(string) --*
Provides an identifier to allow retrieval of paginated results.
- **Parameters** *(list) --*
A list of parameters within a parameter group. Each element in the list represents one parameter.
- *(dict) --*
Describes an individual setting that controls some aspect of DAX behavior.
- **ParameterName** *(string) --*
The name of the parameter.
- **ParameterType** *(string) --*
Determines whether the parameter can be applied to any nodes, or only nodes of a particular type.
- **ParameterValue** *(string) --*
The value for the parameter.
- **NodeTypeSpecificValues** *(list) --*
A list of node types, and specific parameter values for each node.
- *(dict) --*
Represents a parameter value that is applicable to a particular node type.
- **NodeType** *(string) --*
A node type to which the parameter value applies.
- **Value** *(string) --*
The parameter value for this node type.
- **Description** *(string) --*
A description of the parameter.
- **Source** *(string) --*
How the parameter is defined. For example, ``system`` denotes a system-defined parameter.
- **DataType** *(string) --*
The data type of the parameter. For example, ``integer`` :
- **AllowedValues** *(string) --*
A range of values within which the parameter can be set.
- **IsModifiable** *(string) --*
Whether the customer is allowed to modify the parameter.
- **ChangeType** *(string) --*
The conditions under which changes to this parameter can be applied. For example, ``requires-reboot`` indicates that a new value for this parameter will only take effect if a node is rebooted.
:type ParameterGroupName: string
:param ParameterGroupName: **[REQUIRED]**
The name of the parameter group.
:type Source: string
:param Source:
How the parameter is defined. For example, ``system`` denotes a system-defined parameter.
:type MaxResults: integer
:param MaxResults:
The maximum number of results to include in the response. If more results exist than the specified ``MaxResults`` value, a token is included in the response so that the remaining results can be retrieved.
The value for ``MaxResults`` must be between 20 and 100.
:type NextToken: string
:param NextToken:
An optional token returned from a prior request. Use this token for pagination of results from this action. If this parameter is specified, the response includes only results beyond the token, up to the value specified by ``MaxResults`` .
:rtype: dict
:returns:
"""
pass
def describe_subnet_groups(self, SubnetGroupNames: List = None, MaxResults: int = None, NextToken: str = None) -> Dict:
"""
Returns a list of subnet group descriptions. If a subnet group name is specified, the list will contain only the description of that group.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/DescribeSubnetGroups>`_
**Request Syntax**
::
response = client.describe_subnet_groups(
SubnetGroupNames=[
'string',
],
MaxResults=123,
NextToken='string'
)
**Response Syntax**
::
{
'NextToken': 'string',
'SubnetGroups': [
{
'SubnetGroupName': 'string',
'Description': 'string',
'VpcId': 'string',
'Subnets': [
{
'SubnetIdentifier': 'string',
'SubnetAvailabilityZone': 'string'
},
]
},
]
}
**Response Structure**
- *(dict) --*
- **NextToken** *(string) --*
Provides an identifier to allow retrieval of paginated results.
- **SubnetGroups** *(list) --*
An array of subnet groups. Each element in the array represents a single subnet group.
- *(dict) --*
Represents the output of one of the following actions:
* *CreateSubnetGroup*
* *ModifySubnetGroup*
- **SubnetGroupName** *(string) --*
The name of the subnet group.
- **Description** *(string) --*
The description of the subnet group.
- **VpcId** *(string) --*
The Amazon Virtual Private Cloud identifier (VPC ID) of the subnet group.
- **Subnets** *(list) --*
A list of subnets associated with the subnet group.
- *(dict) --*
Represents the subnet associated with a DAX cluster. This parameter refers to subnets defined in Amazon Virtual Private Cloud (Amazon VPC) and used with DAX.
- **SubnetIdentifier** *(string) --*
The system-assigned identifier for the subnet.
- **SubnetAvailabilityZone** *(string) --*
The Availability Zone (AZ) for the subnet.
:type SubnetGroupNames: list
:param SubnetGroupNames:
The name of the subnet group.
- *(string) --*
:type MaxResults: integer
:param MaxResults:
The maximum number of results to include in the response. If more results exist than the specified ``MaxResults`` value, a token is included in the response so that the remaining results can be retrieved.
The value for ``MaxResults`` must be between 20 and 100.
:type NextToken: string
:param NextToken:
An optional token returned from a prior request. Use this token for pagination of results from this action. If this parameter is specified, the response includes only results beyond the token, up to the value specified by ``MaxResults`` .
:rtype: dict
:returns:
"""
pass
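# Usage sketch for ``describe_subnet_groups`` (illustrative only): a manual
# pagination loop that follows ``NextToken`` until every page is consumed.
# The client construction assumes standard boto3 configuration.
#
#     import boto3
#     client = boto3.client('dax')
#     kwargs = {'MaxResults': 20}
#     while True:
#         response = client.describe_subnet_groups(**kwargs)
#         for group in response['SubnetGroups']:
#             print(group['SubnetGroupName'], group['VpcId'])
#         if 'NextToken' not in response:
#             break
#         kwargs['NextToken'] = response['NextToken']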
def generate_presigned_url(self, ClientMethod: str = None, Params: Dict = None, ExpiresIn: int = None, HttpMethod: str = None):
"""
Generate a presigned url given a client, its method, and arguments
:type ClientMethod: string
:param ClientMethod: The client method to presign for
:type Params: dict
:param Params: The parameters normally passed to
``ClientMethod``.
:type ExpiresIn: int
:param ExpiresIn: The number of seconds the presigned url is valid
for. By default it expires in an hour (3600 seconds)
:type HttpMethod: string
:param HttpMethod: The http method to use on the generated url. By
default, the http method is whatever is used in the method\'s model.
:returns: The presigned url
"""
pass
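# Usage sketch for ``generate_presigned_url`` (hypothetical values): the
# target method and its parameters are assumptions chosen for illustration,
# and the 300-second expiry overrides the 3600-second default noted above.
#
#     url = client.generate_presigned_url(
#         ClientMethod='describe_clusters',
#         Params={'ClusterNames': ['my-dax-cluster']},
#         ExpiresIn=300)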
def get_paginator(self, operation_name: str = None) -> Paginator:
"""
Create a paginator for an operation.
:type operation_name: string
:param operation_name: The operation name. This is the same name
as the method name on the client. For example, if the
method name is ``create_foo``, and you\'d normally invoke the
operation as ``client.create_foo(**kwargs)``, if the
``create_foo`` operation can be paginated, you can use the
call ``client.get_paginator(\"create_foo\")``.
:raise OperationNotPageableError: Raised if the operation is not
pageable. You can use the ``client.can_paginate`` method to
check if an operation is pageable.
:rtype: L{botocore.paginate.Paginator}
:return: A paginator object.
"""
pass
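# Usage sketch for ``get_paginator`` (operation name chosen for illustration):
# guard with ``can_paginate`` as the docstring suggests, since not every
# operation is pageable.
#
#     if client.can_paginate('describe_parameters'):
#         paginator = client.get_paginator('describe_parameters')
#         for page in paginator.paginate(ParameterGroupName='default.dax1.0'):
#             for parameter in page['Parameters']:
#                 print(parameter['ParameterName'])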
def get_waiter(self, waiter_name: str = None) -> Waiter:
"""
Returns an object that can wait for some condition.
:type waiter_name: str
:param waiter_name: The name of the waiter to get. See the waiters
section of the service docs for a list of available waiters.
:returns: The specified waiter object.
:rtype: botocore.waiter.Waiter
"""
pass
def increase_replication_factor(self, ClusterName: str, NewReplicationFactor: int, AvailabilityZones: List = None) -> Dict:
"""
Adds one or more nodes to a DAX cluster.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/IncreaseReplicationFactor>`_
**Request Syntax**
::
response = client.increase_replication_factor(
ClusterName='string',
NewReplicationFactor=123,
AvailabilityZones=[
'string',
]
)
**Response Syntax**
::
{
'Cluster': {
'ClusterName': 'string',
'Description': 'string',
'ClusterArn': 'string',
'TotalNodes': 123,
'ActiveNodes': 123,
'NodeType': 'string',
'Status': 'string',
'ClusterDiscoveryEndpoint': {
'Address': 'string',
'Port': 123
},
'NodeIdsToRemove': [
'string',
],
'Nodes': [
{
'NodeId': 'string',
'Endpoint': {
'Address': 'string',
'Port': 123
},
'NodeCreateTime': datetime(2015, 1, 1),
'AvailabilityZone': 'string',
'NodeStatus': 'string',
'ParameterGroupStatus': 'string'
},
],
'PreferredMaintenanceWindow': 'string',
'NotificationConfiguration': {
'TopicArn': 'string',
'TopicStatus': 'string'
},
'SubnetGroup': 'string',
'SecurityGroups': [
{
'SecurityGroupIdentifier': 'string',
'Status': 'string'
},
],
'IamRoleArn': 'string',
'ParameterGroup': {
'ParameterGroupName': 'string',
'ParameterApplyStatus': 'string',
'NodeIdsToReboot': [
'string',
]
},
'SSEDescription': {
'Status': 'ENABLING'|'ENABLED'|'DISABLING'|'DISABLED'
}
}
}
**Response Structure**
- *(dict) --*
- **Cluster** *(dict) --*
A description of the DAX cluster, with its new replication factor.
- **ClusterName** *(string) --*
The name of the DAX cluster.
- **Description** *(string) --*
The description of the cluster.
- **ClusterArn** *(string) --*
The Amazon Resource Name (ARN) that uniquely identifies the cluster.
- **TotalNodes** *(integer) --*
The total number of nodes in the cluster.
- **ActiveNodes** *(integer) --*
The number of nodes in the cluster that are active (i.e., capable of serving requests).
- **NodeType** *(string) --*
The node type for the nodes in the cluster. (All nodes in a DAX cluster are of the same type.)
- **Status** *(string) --*
The current status of the cluster.
- **ClusterDiscoveryEndpoint** *(dict) --*
The configuration endpoint for this DAX cluster, consisting of a DNS name and a port number. Client applications can specify this endpoint, rather than an individual node endpoint, and allow the DAX client software to intelligently route requests and responses to nodes in the DAX cluster.
- **Address** *(string) --*
The DNS hostname of the endpoint.
- **Port** *(integer) --*
The port number that applications should use to connect to the endpoint.
- **NodeIdsToRemove** *(list) --*
A list of nodes to be removed from the cluster.
- *(string) --*
- **Nodes** *(list) --*
A list of nodes that are currently in the cluster.
- *(dict) --*
Represents an individual node within a DAX cluster.
- **NodeId** *(string) --*
A system-generated identifier for the node.
- **Endpoint** *(dict) --*
The endpoint for the node, consisting of a DNS name and a port number. Client applications can connect directly to a node endpoint, if desired (as an alternative to allowing DAX client software to intelligently route requests and responses to nodes in the DAX cluster).
- **Address** *(string) --*
The DNS hostname of the endpoint.
- **Port** *(integer) --*
The port number that applications should use to connect to the endpoint.
- **NodeCreateTime** *(datetime) --*
The date and time (in UNIX epoch format) when the node was launched.
- **AvailabilityZone** *(string) --*
The Availability Zone (AZ) in which the node has been deployed.
- **NodeStatus** *(string) --*
The current status of the node. For example: ``available`` .
- **ParameterGroupStatus** *(string) --*
The status of the parameter group associated with this node. For example, ``in-sync`` .
- **PreferredMaintenanceWindow** *(string) --*
A range of time when maintenance of DAX cluster software will be performed. For example: ``sun:01:00-sun:09:00`` . Cluster maintenance normally takes less than 30 minutes, and is performed automatically within the maintenance window.
- **NotificationConfiguration** *(dict) --*
Describes a notification topic and its status. Notification topics are used for publishing DAX events to subscribers using Amazon Simple Notification Service (SNS).
- **TopicArn** *(string) --*
The Amazon Resource Name (ARN) that identifies the topic.
- **TopicStatus** *(string) --*
The current state of the topic.
- **SubnetGroup** *(string) --*
The subnet group where the DAX cluster is running.
- **SecurityGroups** *(list) --*
A list of security groups, and the status of each, for the nodes in the cluster.
- *(dict) --*
An individual VPC security group and its status.
- **SecurityGroupIdentifier** *(string) --*
The unique ID for this security group.
- **Status** *(string) --*
The status of this security group.
- **IamRoleArn** *(string) --*
A valid Amazon Resource Name (ARN) that identifies an IAM role. At runtime, DAX will assume this role and use the role's permissions to access DynamoDB on your behalf.
- **ParameterGroup** *(dict) --*
The parameter group being used by nodes in the cluster.
- **ParameterGroupName** *(string) --*
The name of the parameter group.
- **ParameterApplyStatus** *(string) --*
The status of parameter updates.
- **NodeIdsToReboot** *(list) --*
The node IDs of one or more nodes to be rebooted.
- *(string) --*
- **SSEDescription** *(dict) --*
The description of the server-side encryption status on the specified DAX cluster.
- **Status** *(string) --*
The current state of server-side encryption:
* ``ENABLING`` - Server-side encryption is being enabled.
* ``ENABLED`` - Server-side encryption is enabled.
* ``DISABLING`` - Server-side encryption is being disabled.
* ``DISABLED`` - Server-side encryption is disabled.
:type ClusterName: string
:param ClusterName: **[REQUIRED]**
The name of the DAX cluster that will receive additional nodes.
:type NewReplicationFactor: integer
:param NewReplicationFactor: **[REQUIRED]**
The new number of nodes for the DAX cluster.
:type AvailabilityZones: list
:param AvailabilityZones:
The Availability Zones (AZs) in which the cluster nodes will be created. All nodes belonging to the cluster are placed in these Availability Zones. Use this parameter if you want to distribute the nodes across multiple AZs.
- *(string) --*
:rtype: dict
:returns:
"""
pass
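# Usage sketch for ``increase_replication_factor`` (names are placeholders):
# note that ``NewReplicationFactor`` is the target node count, not a delta.
#
#     response = client.increase_replication_factor(
#         ClusterName='my-dax-cluster',
#         NewReplicationFactor=5,
#         AvailabilityZones=['us-east-1a', 'us-east-1b'])
#     print(response['Cluster']['TotalNodes'])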
def list_tags(self, ResourceName: str, NextToken: str = None) -> Dict:
"""
List all of the tags for a DAX cluster. You can call ``ListTags`` up to 10 times per second, per account.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/ListTags>`_
**Request Syntax**
::
response = client.list_tags(
ResourceName='string',
NextToken='string'
)
**Response Syntax**
::
{
'Tags': [
{
'Key': 'string',
'Value': 'string'
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --*
- **Tags** *(list) --*
A list of tags currently associated with the DAX cluster.
- *(dict) --*
A description of a tag. Every tag is a key-value pair. You can add up to 50 tags to a single DAX cluster.
AWS-assigned tag names and values are automatically assigned the ``aws:`` prefix, which the user cannot assign. AWS-assigned tag names do not count towards the tag limit of 50. User-assigned tag names have the prefix ``user:`` .
You cannot backdate the application of a tag.
- **Key** *(string) --*
The key for the tag. Tag keys are case sensitive. Every DAX cluster can only have one tag with the same key. If you try to add an existing tag (same key), the existing tag value will be updated to the new value.
- **Value** *(string) --*
The value of the tag. Tag values are case-sensitive and can be null.
- **NextToken** *(string) --*
If this value is present, there are additional results to be displayed. To retrieve them, call ``ListTags`` again, with ``NextToken`` set to this value.
:type ResourceName: string
:param ResourceName: **[REQUIRED]**
The name of the DAX resource to which the tags belong.
:type NextToken: string
:param NextToken:
An optional token returned from a prior request. Use this token for pagination of results from this action. If this parameter is specified, the response includes only results beyond the token.
:rtype: dict
:returns:
"""
pass
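# Usage sketch for ``list_tags`` (the ARN is a hypothetical placeholder):
# collect every tag by following ``NextToken`` across pages.
#
#     tags = []
#     kwargs = {'ResourceName': 'arn:aws:dax:us-east-1:123456789012:cache/my-dax-cluster'}
#     while True:
#         response = client.list_tags(**kwargs)
#         tags.extend(response['Tags'])
#         if 'NextToken' not in response:
#             break
#         kwargs['NextToken'] = response['NextToken']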
def reboot_node(self, ClusterName: str, NodeId: str) -> Dict:
"""
Reboots a single node of a DAX cluster. The reboot action takes place as soon as possible. During the reboot, the node status is set to REBOOTING.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/RebootNode>`_
**Request Syntax**
::
response = client.reboot_node(
ClusterName='string',
NodeId='string'
)
**Response Syntax**
::
{
'Cluster': {
'ClusterName': 'string',
'Description': 'string',
'ClusterArn': 'string',
'TotalNodes': 123,
'ActiveNodes': 123,
'NodeType': 'string',
'Status': 'string',
'ClusterDiscoveryEndpoint': {
'Address': 'string',
'Port': 123
},
'NodeIdsToRemove': [
'string',
],
'Nodes': [
{
'NodeId': 'string',
'Endpoint': {
'Address': 'string',
'Port': 123
},
'NodeCreateTime': datetime(2015, 1, 1),
'AvailabilityZone': 'string',
'NodeStatus': 'string',
'ParameterGroupStatus': 'string'
},
],
'PreferredMaintenanceWindow': 'string',
'NotificationConfiguration': {
'TopicArn': 'string',
'TopicStatus': 'string'
},
'SubnetGroup': 'string',
'SecurityGroups': [
{
'SecurityGroupIdentifier': 'string',
'Status': 'string'
},
],
'IamRoleArn': 'string',
'ParameterGroup': {
'ParameterGroupName': 'string',
'ParameterApplyStatus': 'string',
'NodeIdsToReboot': [
'string',
]
},
'SSEDescription': {
'Status': 'ENABLING'|'ENABLED'|'DISABLING'|'DISABLED'
}
}
}
**Response Structure**
- *(dict) --*
- **Cluster** *(dict) --*
A description of the DAX cluster after a node has been rebooted.
- **ClusterName** *(string) --*
The name of the DAX cluster.
- **Description** *(string) --*
The description of the cluster.
- **ClusterArn** *(string) --*
The Amazon Resource Name (ARN) that uniquely identifies the cluster.
- **TotalNodes** *(integer) --*
The total number of nodes in the cluster.
- **ActiveNodes** *(integer) --*
The number of nodes in the cluster that are active (i.e., capable of serving requests).
- **NodeType** *(string) --*
The node type for the nodes in the cluster. (All nodes in a DAX cluster are of the same type.)
- **Status** *(string) --*
The current status of the cluster.
- **ClusterDiscoveryEndpoint** *(dict) --*
The configuration endpoint for this DAX cluster, consisting of a DNS name and a port number. Client applications can specify this endpoint, rather than an individual node endpoint, and allow the DAX client software to intelligently route requests and responses to nodes in the DAX cluster.
- **Address** *(string) --*
The DNS hostname of the endpoint.
- **Port** *(integer) --*
The port number that applications should use to connect to the endpoint.
- **NodeIdsToRemove** *(list) --*
A list of nodes to be removed from the cluster.
- *(string) --*
- **Nodes** *(list) --*
A list of nodes that are currently in the cluster.
- *(dict) --*
Represents an individual node within a DAX cluster.
- **NodeId** *(string) --*
A system-generated identifier for the node.
- **Endpoint** *(dict) --*
The endpoint for the node, consisting of a DNS name and a port number. Client applications can connect directly to a node endpoint, if desired (as an alternative to allowing DAX client software to intelligently route requests and responses to nodes in the DAX cluster).
- **Address** *(string) --*
The DNS hostname of the endpoint.
- **Port** *(integer) --*
The port number that applications should use to connect to the endpoint.
- **NodeCreateTime** *(datetime) --*
The date and time (in UNIX epoch format) when the node was launched.
- **AvailabilityZone** *(string) --*
The Availability Zone (AZ) in which the node has been deployed.
- **NodeStatus** *(string) --*
The current status of the node. For example: ``available`` .
- **ParameterGroupStatus** *(string) --*
The status of the parameter group associated with this node. For example, ``in-sync`` .
- **PreferredMaintenanceWindow** *(string) --*
A range of time when maintenance of DAX cluster software will be performed. For example: ``sun:01:00-sun:09:00`` . Cluster maintenance normally takes less than 30 minutes, and is performed automatically within the maintenance window.
- **NotificationConfiguration** *(dict) --*
Describes a notification topic and its status. Notification topics are used for publishing DAX events to subscribers using Amazon Simple Notification Service (SNS).
- **TopicArn** *(string) --*
The Amazon Resource Name (ARN) that identifies the topic.
- **TopicStatus** *(string) --*
The current state of the topic.
- **SubnetGroup** *(string) --*
The subnet group where the DAX cluster is running.
- **SecurityGroups** *(list) --*
A list of security groups, and the status of each, for the nodes in the cluster.
- *(dict) --*
An individual VPC security group and its status.
- **SecurityGroupIdentifier** *(string) --*
The unique ID for this security group.
- **Status** *(string) --*
The status of this security group.
- **IamRoleArn** *(string) --*
A valid Amazon Resource Name (ARN) that identifies an IAM role. At runtime, DAX will assume this role and use the role's permissions to access DynamoDB on your behalf.
- **ParameterGroup** *(dict) --*
The parameter group being used by nodes in the cluster.
- **ParameterGroupName** *(string) --*
The name of the parameter group.
- **ParameterApplyStatus** *(string) --*
The status of parameter updates.
- **NodeIdsToReboot** *(list) --*
The node IDs of one or more nodes to be rebooted.
- *(string) --*
- **SSEDescription** *(dict) --*
The description of the server-side encryption status on the specified DAX cluster.
- **Status** *(string) --*
The current state of server-side encryption:
* ``ENABLING`` - Server-side encryption is being enabled.
* ``ENABLED`` - Server-side encryption is enabled.
* ``DISABLING`` - Server-side encryption is being disabled.
* ``DISABLED`` - Server-side encryption is disabled.
:type ClusterName: string
:param ClusterName: **[REQUIRED]**
The name of the DAX cluster containing the node to be rebooted.
:type NodeId: string
:param NodeId: **[REQUIRED]**
The system-assigned ID of the node to be rebooted.
:rtype: dict
:returns:
"""
pass
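# Usage sketch for ``reboot_node`` (cluster and node IDs are placeholders):
# the call returns immediately while the node passes through the REBOOTING
# status described above.
#
#     response = client.reboot_node(
#         ClusterName='my-dax-cluster',
#         NodeId='my-dax-cluster-a')
#     print(response['Cluster']['Status'])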
def tag_resource(self, ResourceName: str, Tags: List) -> Dict:
"""
Associates a set of tags with a DAX resource. You can call ``TagResource`` up to 5 times per second, per account.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/TagResource>`_
**Request Syntax**
::
response = client.tag_resource(
ResourceName='string',
Tags=[
{
'Key': 'string',
'Value': 'string'
},
]
)
**Response Syntax**
::
{
'Tags': [
{
'Key': 'string',
'Value': 'string'
},
]
}
**Response Structure**
- *(dict) --*
- **Tags** *(list) --*
The list of tags that are associated with the DAX resource.
- *(dict) --*
A description of a tag. Every tag is a key-value pair. You can add up to 50 tags to a single DAX cluster.
AWS-assigned tag names and values are automatically assigned the ``aws:`` prefix, which the user cannot assign. AWS-assigned tag names do not count towards the tag limit of 50. User-assigned tag names have the prefix ``user:`` .
You cannot backdate the application of a tag.
- **Key** *(string) --*
The key for the tag. Tag keys are case sensitive. Every DAX cluster can only have one tag with the same key. If you try to add an existing tag (same key), the existing tag value will be updated to the new value.
- **Value** *(string) --*
The value of the tag. Tag values are case-sensitive and can be null.
:type ResourceName: string
:param ResourceName: **[REQUIRED]**
The name of the DAX resource to which tags should be added.
:type Tags: list
:param Tags: **[REQUIRED]**
The tags to be assigned to the DAX resource.
- *(dict) --*
A description of a tag. Every tag is a key-value pair. You can add up to 50 tags to a single DAX cluster.
AWS-assigned tag names and values are automatically assigned the ``aws:`` prefix, which the user cannot assign. AWS-assigned tag names do not count towards the tag limit of 50. User-assigned tag names have the prefix ``user:`` .
You cannot backdate the application of a tag.
- **Key** *(string) --*
The key for the tag. Tag keys are case sensitive. Every DAX cluster can only have one tag with the same key. If you try to add an existing tag (same key), the existing tag value will be updated to the new value.
- **Value** *(string) --*
The value of the tag. Tag values are case-sensitive and can be null.
:rtype: dict
:returns:
"""
pass
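# Usage sketch for ``tag_resource`` (ARN and tags are hypothetical): keep the
# 50-tag limit in mind, and note that re-using an existing key overwrites its
# value, as described in the docstring above.
#
#     client.tag_resource(
#         ResourceName='arn:aws:dax:us-east-1:123456789012:cache/my-dax-cluster',
#         Tags=[{'Key': 'environment', 'Value': 'staging'},
#               {'Key': 'owner', 'Value': 'data-platform'}])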
def untag_resource(self, ResourceName: str, TagKeys: List) -> Dict:
"""
Removes the association of tags from a DAX resource. You can call ``UntagResource`` up to 5 times per second, per account.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/UntagResource>`_
**Request Syntax**
::
response = client.untag_resource(
ResourceName='string',
TagKeys=[
'string',
]
)
**Response Syntax**
::
{
'Tags': [
{
'Key': 'string',
'Value': 'string'
},
]
}
**Response Structure**
- *(dict) --*
- **Tags** *(list) --*
The tag keys that have been removed from the cluster.
- *(dict) --*
A description of a tag. Every tag is a key-value pair. You can add up to 50 tags to a single DAX cluster.
AWS-assigned tag names and values are automatically assigned the ``aws:`` prefix, which the user cannot assign. AWS-assigned tag names do not count towards the tag limit of 50. User-assigned tag names have the prefix ``user:`` .
You cannot backdate the application of a tag.
- **Key** *(string) --*
The key for the tag. Tag keys are case sensitive. Every DAX cluster can only have one tag with the same key. If you try to add an existing tag (same key), the existing tag value will be updated to the new value.
- **Value** *(string) --*
The value of the tag. Tag values are case-sensitive and can be null.
:type ResourceName: string
:param ResourceName: **[REQUIRED]**
The name of the DAX resource from which the tags should be removed.
:type TagKeys: list
:param TagKeys: **[REQUIRED]**
A list of tag keys. If the DAX cluster has any tags with these keys, then the tags are removed from the cluster.
- *(string) --*
:rtype: dict
:returns:
"""
pass
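# Usage sketch for ``untag_resource`` (hypothetical ARN): per the docstring,
# only keys actually present on the resource result in removals.
#
#     client.untag_resource(
#         ResourceName='arn:aws:dax:us-east-1:123456789012:cache/my-dax-cluster',
#         TagKeys=['environment', 'owner'])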
def update_cluster(self, ClusterName: str, Description: str = None, PreferredMaintenanceWindow: str = None, NotificationTopicArn: str = None, NotificationTopicStatus: str = None, ParameterGroupName: str = None, SecurityGroupIds: List = None) -> Dict:
"""
Modifies the settings for a DAX cluster. You can use this action to change one or more cluster configuration parameters by specifying the parameters and the new values.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/UpdateCluster>`_
**Request Syntax**
::
response = client.update_cluster(
ClusterName='string',
Description='string',
PreferredMaintenanceWindow='string',
NotificationTopicArn='string',
NotificationTopicStatus='string',
ParameterGroupName='string',
SecurityGroupIds=[
'string',
]
)
**Response Syntax**
::
{
'Cluster': {
'ClusterName': 'string',
'Description': 'string',
'ClusterArn': 'string',
'TotalNodes': 123,
'ActiveNodes': 123,
'NodeType': 'string',
'Status': 'string',
'ClusterDiscoveryEndpoint': {
'Address': 'string',
'Port': 123
},
'NodeIdsToRemove': [
'string',
],
'Nodes': [
{
'NodeId': 'string',
'Endpoint': {
'Address': 'string',
'Port': 123
},
'NodeCreateTime': datetime(2015, 1, 1),
'AvailabilityZone': 'string',
'NodeStatus': 'string',
'ParameterGroupStatus': 'string'
},
],
'PreferredMaintenanceWindow': 'string',
'NotificationConfiguration': {
'TopicArn': 'string',
'TopicStatus': 'string'
},
'SubnetGroup': 'string',
'SecurityGroups': [
{
'SecurityGroupIdentifier': 'string',
'Status': 'string'
},
],
'IamRoleArn': 'string',
'ParameterGroup': {
'ParameterGroupName': 'string',
'ParameterApplyStatus': 'string',
'NodeIdsToReboot': [
'string',
]
},
'SSEDescription': {
'Status': 'ENABLING'|'ENABLED'|'DISABLING'|'DISABLED'
}
}
}
**Response Structure**
- *(dict) --*
- **Cluster** *(dict) --*
A description of the DAX cluster, after it has been modified.
- **ClusterName** *(string) --*
The name of the DAX cluster.
- **Description** *(string) --*
The description of the cluster.
- **ClusterArn** *(string) --*
The Amazon Resource Name (ARN) that uniquely identifies the cluster.
- **TotalNodes** *(integer) --*
The total number of nodes in the cluster.
- **ActiveNodes** *(integer) --*
The number of nodes in the cluster that are active (i.e., capable of serving requests).
- **NodeType** *(string) --*
The node type for the nodes in the cluster. (All nodes in a DAX cluster are of the same type.)
- **Status** *(string) --*
The current status of the cluster.
- **ClusterDiscoveryEndpoint** *(dict) --*
The configuration endpoint for this DAX cluster, consisting of a DNS name and a port number. Client applications can specify this endpoint, rather than an individual node endpoint, and allow the DAX client software to intelligently route requests and responses to nodes in the DAX cluster.
- **Address** *(string) --*
The DNS hostname of the endpoint.
- **Port** *(integer) --*
The port number that applications should use to connect to the endpoint.
- **NodeIdsToRemove** *(list) --*
A list of nodes to be removed from the cluster.
- *(string) --*
- **Nodes** *(list) --*
A list of nodes that are currently in the cluster.
- *(dict) --*
Represents an individual node within a DAX cluster.
- **NodeId** *(string) --*
A system-generated identifier for the node.
- **Endpoint** *(dict) --*
The endpoint for the node, consisting of a DNS name and a port number. Client applications can connect directly to a node endpoint, if desired (as an alternative to allowing DAX client software to intelligently route requests and responses to nodes in the DAX cluster).
- **Address** *(string) --*
The DNS hostname of the endpoint.
- **Port** *(integer) --*
The port number that applications should use to connect to the endpoint.
- **NodeCreateTime** *(datetime) --*
The date and time (in UNIX epoch format) when the node was launched.
- **AvailabilityZone** *(string) --*
The Availability Zone (AZ) in which the node has been deployed.
- **NodeStatus** *(string) --*
The current status of the node. For example: ``available`` .
- **ParameterGroupStatus** *(string) --*
The status of the parameter group associated with this node. For example, ``in-sync`` .
- **PreferredMaintenanceWindow** *(string) --*
A range of time when maintenance of DAX cluster software will be performed. For example: ``sun:01:00-sun:09:00`` . Cluster maintenance normally takes less than 30 minutes, and is performed automatically within the maintenance window.
- **NotificationConfiguration** *(dict) --*
Describes a notification topic and its status. Notification topics are used for publishing DAX events to subscribers using Amazon Simple Notification Service (SNS).
- **TopicArn** *(string) --*
The Amazon Resource Name (ARN) that identifies the topic.
- **TopicStatus** *(string) --*
The current state of the topic.
- **SubnetGroup** *(string) --*
The subnet group where the DAX cluster is running.
- **SecurityGroups** *(list) --*
A list of security groups, and the status of each, for the nodes in the cluster.
- *(dict) --*
An individual VPC security group and its status.
- **SecurityGroupIdentifier** *(string) --*
The unique ID for this security group.
- **Status** *(string) --*
The status of this security group.
- **IamRoleArn** *(string) --*
A valid Amazon Resource Name (ARN) that identifies an IAM role. At runtime, DAX will assume this role and use the role's permissions to access DynamoDB on your behalf.
- **ParameterGroup** *(dict) --*
The parameter group being used by nodes in the cluster.
- **ParameterGroupName** *(string) --*
The name of the parameter group.
- **ParameterApplyStatus** *(string) --*
The status of parameter updates.
- **NodeIdsToReboot** *(list) --*
The node IDs of one or more nodes to be rebooted.
- *(string) --*
- **SSEDescription** *(dict) --*
The description of the server-side encryption status on the specified DAX cluster.
- **Status** *(string) --*
The current state of server-side encryption:
* ``ENABLING`` - Server-side encryption is being enabled.
* ``ENABLED`` - Server-side encryption is enabled.
* ``DISABLING`` - Server-side encryption is being disabled.
* ``DISABLED`` - Server-side encryption is disabled.
:type ClusterName: string
:param ClusterName: **[REQUIRED]**
The name of the DAX cluster to be modified.
:type Description: string
:param Description:
A description of the changes being made to the cluster.
:type PreferredMaintenanceWindow: string
:param PreferredMaintenanceWindow:
A range of time when maintenance of DAX cluster software will be performed. For example: ``sun:01:00-sun:09:00`` . Cluster maintenance normally takes less than 30 minutes, and is performed automatically within the maintenance window.
:type NotificationTopicArn: string
:param NotificationTopicArn:
The Amazon Resource Name (ARN) that identifies the topic.
:type NotificationTopicStatus: string
:param NotificationTopicStatus:
The current state of the topic.
:type ParameterGroupName: string
:param ParameterGroupName:
The name of a parameter group for this cluster.
:type SecurityGroupIds: list
:param SecurityGroupIds:
A list of user-specified security group IDs to be assigned to each node in the DAX cluster. If this parameter is not specified, DAX assigns the default VPC security group to each node.
- *(string) --*
:rtype: dict
:returns:
"""
pass
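# Usage sketch for ``update_cluster`` (values are illustrative): only the
# parameters supplied are changed; omitted settings keep their current values.
#
#     response = client.update_cluster(
#         ClusterName='my-dax-cluster',
#         Description='nightly analytics cache',
#         PreferredMaintenanceWindow='sun:01:00-sun:03:00')
#     print(response['Cluster']['Status'])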
def update_parameter_group(self, ParameterGroupName: str, ParameterNameValues: List) -> Dict:
"""
Modifies the parameters of a parameter group. You can modify up to 20 parameters in a single request by submitting a list of parameter name and value pairs.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/UpdateParameterGroup>`_
**Request Syntax**
::
response = client.update_parameter_group(
ParameterGroupName='string',
ParameterNameValues=[
{
'ParameterName': 'string',
'ParameterValue': 'string'
},
]
)
**Response Syntax**
::
{
'ParameterGroup': {
'ParameterGroupName': 'string',
'Description': 'string'
}
}
**Response Structure**
- *(dict) --*
- **ParameterGroup** *(dict) --*
The parameter group that has been modified.
- **ParameterGroupName** *(string) --*
The name of the parameter group.
- **Description** *(string) --*
A description of the parameter group.
:type ParameterGroupName: string
:param ParameterGroupName: **[REQUIRED]**
The name of the parameter group.
:type ParameterNameValues: list
:param ParameterNameValues: **[REQUIRED]**
An array of name-value pairs for the parameters in the group. Each element in the array represents a single parameter.
- *(dict) --*
An individual DAX parameter.
- **ParameterName** *(string) --*
The name of the parameter.
- **ParameterValue** *(string) --*
The value of the parameter.
:rtype: dict
:returns:
"""
pass
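# Usage sketch for ``update_parameter_group`` (the parameter name and value
# are illustrative): at most 20 name-value pairs per request, per the
# docstring above.
#
#     client.update_parameter_group(
#         ParameterGroupName='my-dax-params',
#         ParameterNameValues=[
#             {'ParameterName': 'query-ttl-millis', 'ParameterValue': '60000'},
#         ])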
def update_subnet_group(self, SubnetGroupName: str, Description: str = None, SubnetIds: List = None) -> Dict:
"""
Modifies an existing subnet group.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/dax-2017-04-19/UpdateSubnetGroup>`_
**Request Syntax**
::
response = client.update_subnet_group(
SubnetGroupName='string',
Description='string',
SubnetIds=[
'string',
]
)
**Response Syntax**
::
{
'SubnetGroup': {
'SubnetGroupName': 'string',
'Description': 'string',
'VpcId': 'string',
'Subnets': [
{
'SubnetIdentifier': 'string',
'SubnetAvailabilityZone': 'string'
},
]
}
}
**Response Structure**
- *(dict) --*
- **SubnetGroup** *(dict) --*
The subnet group that has been modified.
- **SubnetGroupName** *(string) --*
The name of the subnet group.
- **Description** *(string) --*
The description of the subnet group.
- **VpcId** *(string) --*
The Amazon Virtual Private Cloud identifier (VPC ID) of the subnet group.
- **Subnets** *(list) --*
A list of subnets associated with the subnet group.
- *(dict) --*
Represents the subnet associated with a DAX cluster. This parameter refers to subnets defined in Amazon Virtual Private Cloud (Amazon VPC) and used with DAX.
- **SubnetIdentifier** *(string) --*
The system-assigned identifier for the subnet.
- **SubnetAvailabilityZone** *(string) --*
The Availability Zone (AZ) for the subnet.
:type SubnetGroupName: string
:param SubnetGroupName: **[REQUIRED]**
The name of the subnet group.
:type Description: string
:param Description:
A description of the subnet group.
:type SubnetIds: list
:param SubnetIds:
A list of subnet IDs in the subnet group.
- *(string) --*
:rtype: dict
:returns:
"""
pass
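# Usage sketch for ``update_subnet_group`` (group name and subnet IDs are
# hypothetical placeholders):
#
#     client.update_subnet_group(
#         SubnetGroupName='my-dax-subnets',
#         Description='private subnets for the DAX cluster',
#         SubnetIds=['subnet-0a1b2c3d', 'subnet-4e5f6a7b'])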
| 51.891941 | 384 | 0.515124 | 10,588 | 113,332 | 5.502456 | 0.058651 | 0.025798 | 0.011826 | 0.008239 | 0.83287 | 0.803072 | 0.790783 | 0.782921 | 0.769791 | 0.758737 | 0 | 0.007484 | 0.392819 | 113,332 | 2,183 | 385 | 51.915712 | 0.839158 | 0.818577 | 0 | 0.423729 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.423729 | false | 0.423729 | 0.135593 | 0 | 0.576271 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 9 |
7dda080b7f42688af9156b0ea113e8bdbc4ed8d2 | 13,936 | py | Python | tests/file_io/encrypted_stream_io.py | dfjxs/dfvfs | a4154b07bb08c3c86afa2847f3224189dd80c138 | [
"Apache-2.0"
] | 176 | 2015-01-02T13:55:39.000Z | 2022-03-12T11:44:37.000Z | tests/file_io/encrypted_stream_io.py | dfjxs/dfvfs | a4154b07bb08c3c86afa2847f3224189dd80c138 | [
"Apache-2.0"
] | 495 | 2015-01-13T06:47:06.000Z | 2022-03-12T11:07:03.000Z | tests/file_io/encrypted_stream_io.py | dfjxs/dfvfs | a4154b07bb08c3c86afa2847f3224189dd80c138 | [
"Apache-2.0"
] | 62 | 2015-02-23T08:19:38.000Z | 2022-03-18T06:01:22.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""Tests for the encrypted stream file-like object."""
import os
import unittest
from dfvfs.file_io import encrypted_stream_io
from dfvfs.lib import definitions
from dfvfs.path import factory as path_spec_factory
from dfvfs.resolver import context
from dfvfs.resolver import resolver
from tests.file_io import test_lib
class AESEncryptedStreamWithKeyChainTest(test_lib.PaddedSyslogTestCase):
"""Tests the RC4 encrypted stream file-like object.
The credentials are passed via the key chain.
"""
_AES_KEY = b'This is a key123'
_AES_MODE = definitions.ENCRYPTION_MODE_CBC
_AES_IV = b'This is an IV456'
def setUp(self):
"""Sets up the needed objects used throughout the test."""
self._resolver_context = context.Context()
test_path = self._GetTestFilePath(['syslog.aes'])
self._SkipIfPathNotExists(test_path)
test_os_path_spec = path_spec_factory.Factory.NewPathSpec(
definitions.TYPE_INDICATOR_OS, location=test_path)
self._encrypted_stream_path_spec = path_spec_factory.Factory.NewPathSpec(
definitions.TYPE_INDICATOR_ENCRYPTED_STREAM,
encryption_method=definitions.ENCRYPTION_METHOD_AES,
parent=test_os_path_spec)
resolver.Resolver.key_chain.SetCredential(
self._encrypted_stream_path_spec, 'key', self._AES_KEY)
resolver.Resolver.key_chain.SetCredential(
self._encrypted_stream_path_spec, 'initialization_vector',
self._AES_IV)
resolver.Resolver.key_chain.SetCredential(
self._encrypted_stream_path_spec, 'cipher_mode', self._AES_MODE)
self.padding_size = 1
def tearDown(self):
"""Cleans up the needed objects used throughout the test."""
self._resolver_context.Empty()
def testOpenCloseFileObject(self):
"""Test the open and close functionality using a file-like object."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestGetSizeFileObject(file_object)
def testOpenClosePathSpec(self):
"""Test the open and close functionality using a path specification."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestGetSizeFileObject(file_object)
def testSeek(self):
"""Test the seek functionality."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestSeekFileObject(file_object)
# TODO: Test SEEK_CUR after open.
# Test SEEK_END after open.
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
file_object.seek(-10 - self.padding_size, os.SEEK_END)
self.assertEqual(file_object.read(5), b'times')
def testRead(self):
"""Test the read functionality."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestReadFileObject(file_object)
class AESEncryptedStreamTest(test_lib.PaddedSyslogTestCase):
"""The unit test for a AES encrypted stream file-like object.
The credentials are passed via the path specification.
"""
_AES_CIPHER_MODE = definitions.ENCRYPTION_MODE_CBC
_AES_INITIALIZATION_VECTOR = b'This is an IV456'
_AES_KEY = b'This is a key123'
def setUp(self):
"""Sets up the needed objects used throughout the test."""
self._resolver_context = context.Context()
test_path = self._GetTestFilePath(['syslog.aes'])
self._SkipIfPathNotExists(test_path)
test_os_path_spec = path_spec_factory.Factory.NewPathSpec(
definitions.TYPE_INDICATOR_OS, location=test_path)
self._encrypted_stream_path_spec = path_spec_factory.Factory.NewPathSpec(
definitions.TYPE_INDICATOR_ENCRYPTED_STREAM,
cipher_mode=self._AES_CIPHER_MODE,
encryption_method=definitions.ENCRYPTION_METHOD_AES,
initialization_vector=self._AES_INITIALIZATION_VECTOR,
key=self._AES_KEY, parent=test_os_path_spec)
self.padding_size = 1
def tearDown(self):
"""Cleans up the needed objects used throughout the test."""
self._resolver_context.Empty()
def testOpenCloseFileObject(self):
"""Test the open and close functionality using a file-like object."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestGetSizeFileObject(file_object)
def testOpenClosePathSpec(self):
"""Test the open and close functionality using a path specification."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestGetSizeFileObject(file_object)
def testSeek(self):
"""Test the seek functionality."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestSeekFileObject(file_object)
# TODO: Test SEEK_CUR after open.
# Test SEEK_END after open.
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
file_object.seek(-10 - self.padding_size, os.SEEK_END)
self.assertEqual(file_object.read(5), b'times')
def testRead(self):
"""Test the read functionality."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestReadFileObject(file_object)
class BlowfishEncryptedStreamWithKeyChainTest(test_lib.PaddedSyslogTestCase):
"""Tests the Blowfish encrypted stream file-like object.
The credentials are passed via the key chain.
"""
_BLOWFISH_KEY = b'This is a key123'
_BLOWFISH_MODE = definitions.ENCRYPTION_MODE_CBC
_BLOWFISH_IV = b'This IV!'
def setUp(self):
"""Sets up the needed objects used throughout the test."""
self._resolver_context = context.Context()
test_path = self._GetTestFilePath(['syslog.blowfish'])
self._SkipIfPathNotExists(test_path)
test_os_path_spec = path_spec_factory.Factory.NewPathSpec(
definitions.TYPE_INDICATOR_OS, location=test_path)
self._encrypted_stream_path_spec = path_spec_factory.Factory.NewPathSpec(
definitions.TYPE_INDICATOR_ENCRYPTED_STREAM,
encryption_method=definitions.ENCRYPTION_METHOD_BLOWFISH,
parent=test_os_path_spec)
resolver.Resolver.key_chain.SetCredential(
self._encrypted_stream_path_spec, 'key', self._BLOWFISH_KEY)
resolver.Resolver.key_chain.SetCredential(
self._encrypted_stream_path_spec, 'initialization_vector',
self._BLOWFISH_IV)
resolver.Resolver.key_chain.SetCredential(
self._encrypted_stream_path_spec, 'cipher_mode', self._BLOWFISH_MODE)
self.padding_size = 1
def tearDown(self):
"""Cleans up the needed objects used throughout the test."""
self._resolver_context.Empty()
def testOpenCloseFileObject(self):
"""Test the open and close functionality using a file-like object."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestGetSizeFileObject(file_object)
def testOpenClosePathSpec(self):
"""Test the open and close functionality using a path specification."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestGetSizeFileObject(file_object)
def testSeek(self):
"""Test the seek functionality."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestSeekFileObject(file_object)
# TODO: Test SEEK_CUR after open.
# Test SEEK_END after open.
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
file_object.seek(-10 - self.padding_size, os.SEEK_END)
self.assertEqual(file_object.read(5), b'times')
def testRead(self):
"""Test the read functionality."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestReadFileObject(file_object)
class DES3EncryptedStreamWithKeyChainTest(test_lib.PaddedSyslogTestCase):
"""Tests the Triple DES encrypted stream file-like object.
The credentials are passed via the key chain.
"""
_DES3_KEY = b'This is a key123'
_DES3_MODE = definitions.ENCRYPTION_MODE_CBC
_DES3_IV = b'This IV!'
def setUp(self):
"""Sets up the needed objects used throughout the test."""
self._resolver_context = context.Context()
test_path = self._GetTestFilePath(['syslog.des3'])
self._SkipIfPathNotExists(test_path)
test_os_path_spec = path_spec_factory.Factory.NewPathSpec(
definitions.TYPE_INDICATOR_OS, location=test_path)
self._encrypted_stream_path_spec = path_spec_factory.Factory.NewPathSpec(
definitions.TYPE_INDICATOR_ENCRYPTED_STREAM,
encryption_method=definitions.ENCRYPTION_METHOD_DES3,
parent=test_os_path_spec)
resolver.Resolver.key_chain.SetCredential(
self._encrypted_stream_path_spec, 'key', self._DES3_KEY)
resolver.Resolver.key_chain.SetCredential(
self._encrypted_stream_path_spec, 'initialization_vector',
self._DES3_IV)
resolver.Resolver.key_chain.SetCredential(
self._encrypted_stream_path_spec, 'cipher_mode', self._DES3_MODE)
self.padding_size = 1
def tearDown(self):
"""Cleans up the needed objects used throughout the test."""
self._resolver_context.Empty()
def testOpenCloseFileObject(self):
"""Test the open and close functionality using a file-like object."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestGetSizeFileObject(file_object)
def testOpenClosePathSpec(self):
"""Test the open and close functionality using a path specification."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestGetSizeFileObject(file_object)
def testSeek(self):
"""Test the seek functionality."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestSeekFileObject(file_object)
# TODO: Test SEEK_CUR after open.
# Test SEEK_END after open.
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
file_object.seek(-10 - self.padding_size, os.SEEK_END)
self.assertEqual(file_object.read(5), b'times')
def testRead(self):
"""Test the read functionality."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestReadFileObject(file_object)
class RC4EncryptedStreamWithKeyChainTest(test_lib.SylogTestCase):
"""Tests the RC4 encrypted stream file-like object.
The credentials are passed via the key chain.
"""
_RC4_KEY = b'rc4test'
def setUp(self):
"""Sets up the needed objects used throughout the test."""
self._resolver_context = context.Context()
test_path = self._GetTestFilePath(['syslog.rc4'])
self._SkipIfPathNotExists(test_path)
test_os_path_spec = path_spec_factory.Factory.NewPathSpec(
definitions.TYPE_INDICATOR_OS, location=test_path)
self._encrypted_stream_path_spec = path_spec_factory.Factory.NewPathSpec(
definitions.TYPE_INDICATOR_ENCRYPTED_STREAM,
encryption_method=definitions.ENCRYPTION_METHOD_RC4,
parent=test_os_path_spec)
resolver.Resolver.key_chain.SetCredential(
self._encrypted_stream_path_spec, 'key', self._RC4_KEY)
def tearDown(self):
"""Cleans up the needed objects used throughout the test."""
self._resolver_context.Empty()
def testOpenCloseFileObject(self):
"""Test the open and close functionality using a file-like object."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestGetSizeFileObject(file_object)
def testOpenClosePathSpec(self):
"""Test the open and close functionality using a path specification."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestGetSizeFileObject(file_object)
def testSeek(self):
"""Test the seek functionality."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestSeekFileObject(file_object)
# TODO: Test SEEK_CUR after open.
# Test SEEK_END after open.
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
file_object.seek(-10, os.SEEK_END)
self.assertEqual(file_object.read(5), b'times')
def testRead(self):
"""Test the read functionality."""
file_object = encrypted_stream_io.EncryptedStream(
self._resolver_context, self._encrypted_stream_path_spec)
file_object.Open()
self._TestReadFileObject(file_object)
if __name__ == '__main__':
unittest.main()
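# To run just this module's tests from a dfvfs source checkout (standard
# unittest invocation; the dotted path matches this file's repository
# location and assumes the test data files are present):
#
#     python -m unittest tests.file_io.encrypted_stream_io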
| 35.55102 | 77 | 0.75366 | 1,709 | 13,936 | 5.782914 | 0.069046 | 0.080947 | 0.0769 | 0.093089 | 0.919154 | 0.895072 | 0.879591 | 0.875544 | 0.875544 | 0.875544 | 0 | 0.004702 | 0.160735 | 13,936 | 391 | 78 | 35.641944 | 0.840287 | 0.170207 | 0 | 0.809917 | 0 | 0 | 0.027849 | 0.005552 | 0 | 0 | 0 | 0.012788 | 0.020661 | 1 | 0.123967 | false | 0 | 0.033058 | 0 | 0.231405 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
815bb041a910cd0baea45c6c73499060e3ab662f | 94,520 | py | Python | a4kSubtitles/lib/third_party/chardet/langgreekmodel.py | newt-sc/a4kSubtitles | 8fb9bc1e81fba5fe5743bff0471da21113444d48 | [
"MIT"
] | 2 | 2020-04-20T00:01:21.000Z | 2020-04-21T07:57:11.000Z | a4kSubtitles/lib/third_party/chardet/langgreekmodel.py | newt-sc/a4kSubtitles | 8fb9bc1e81fba5fe5743bff0471da21113444d48 | [
"MIT"
] | null | null | null | a4kSubtitles/lib/third_party/chardet/langgreekmodel.py | newt-sc/a4kSubtitles | 8fb9bc1e81fba5fe5743bff0471da21113444d48 | [
"MIT"
] | 1 | 2020-04-20T12:37:02.000Z | 2020-04-20T12:37:02.000Z | from .sbcharsetprober import SingleByteCharSetModel
# 3: Positive
# 2: Likely
# 1: Unlikely
# 0: Negative
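# GREEK_LANG_MODEL is a nested mapping: the outer key is a character's
# frequency-order ID (the literal character appears in the trailing comment),
# and the inner dict rates how likely each possible following character is,
# using the scale above. The SingleByteCharSetModel imported at the top of
# this module bundles this table with a char-to-order map so the prober can
# score how plausible a byte sequence is as Greek text.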
GREEK_LANG_MODEL = {
60: { # 'e'
60: 2, # 'e'
55: 1, # 'o'
58: 2, # 't'
36: 1, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 1, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 0, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 0, # 'ο'
9: 0, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 0, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
55: { # 'o'
60: 0, # 'e'
55: 2, # 'o'
58: 2, # 't'
36: 1, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 0, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 1, # 'ν'
30: 0, # 'ξ'
4: 0, # 'ο'
9: 0, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 1, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
58: { # 't'
60: 2, # 'e'
55: 1, # 'o'
58: 1, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 2, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 0, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 1, # 'ο'
9: 0, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 0, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
36: { # '·'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 0, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 0, # 'ο'
9: 0, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 0, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
61: { # 'Ά'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 0, # 'β'
20: 1, # 'γ'
21: 2, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 0, # 'ι'
11: 0, # 'κ'
16: 2, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 0, # 'ο'
9: 1, # 'π'
8: 2, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 0, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
46: { # 'Έ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 2, # 'β'
20: 2, # 'γ'
21: 0, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 0, # 'ι'
11: 2, # 'κ'
16: 2, # 'λ'
10: 0, # 'μ'
6: 3, # 'ν'
30: 2, # 'ξ'
4: 0, # 'ο'
9: 2, # 'π'
8: 2, # 'ρ'
14: 0, # 'ς'
7: 1, # 'σ'
2: 2, # 'τ'
12: 0, # 'υ'
28: 2, # 'φ'
23: 3, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
54: { # 'Ό'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 0, # 'ι'
11: 0, # 'κ'
16: 2, # 'λ'
10: 2, # 'μ'
6: 2, # 'ν'
30: 0, # 'ξ'
4: 0, # 'ο'
9: 2, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 2, # 'σ'
2: 3, # 'τ'
12: 0, # 'υ'
28: 0, # 'φ'
23: 2, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
31: { # 'Α'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 2, # 'Β'
43: 2, # 'Γ'
41: 1, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 2, # 'Θ'
47: 2, # 'Ι'
44: 2, # 'Κ'
53: 2, # 'Λ'
38: 2, # 'Μ'
49: 2, # 'Ν'
59: 1, # 'Ξ'
39: 0, # 'Ο'
35: 2, # 'Π'
48: 2, # 'Ρ'
37: 2, # 'Σ'
33: 2, # 'Τ'
45: 2, # 'Υ'
56: 2, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 0, # 'β'
20: 2, # 'γ'
21: 0, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 1, # 'θ'
5: 0, # 'ι'
11: 2, # 'κ'
16: 3, # 'λ'
10: 2, # 'μ'
6: 3, # 'ν'
30: 2, # 'ξ'
4: 0, # 'ο'
9: 3, # 'π'
8: 3, # 'ρ'
14: 2, # 'ς'
7: 2, # 'σ'
2: 0, # 'τ'
12: 3, # 'υ'
28: 2, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 2, # 'ύ'
27: 0, # 'ώ'
},
51: { # 'Β'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 2, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 1, # 'Ε'
40: 1, # 'Η'
52: 0, # 'Θ'
47: 1, # 'Ι'
44: 0, # 'Κ'
53: 1, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 2, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 2, # 'ά'
18: 2, # 'έ'
22: 2, # 'ή'
15: 0, # 'ί'
1: 2, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 2, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 2, # 'ι'
11: 0, # 'κ'
16: 2, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 2, # 'ο'
9: 0, # 'π'
8: 2, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 0, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
43: { # 'Γ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 1, # 'Α'
51: 0, # 'Β'
43: 2, # 'Γ'
41: 0, # 'Δ'
34: 2, # 'Ε'
40: 1, # 'Η'
52: 0, # 'Θ'
47: 2, # 'Ι'
44: 1, # 'Κ'
53: 1, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 1, # 'Ο'
35: 0, # 'Π'
48: 2, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 2, # 'Υ'
56: 0, # 'Φ'
50: 1, # 'Χ'
57: 2, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 2, # 'ί'
1: 2, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 2, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 3, # 'ι'
11: 0, # 'κ'
16: 2, # 'λ'
10: 0, # 'μ'
6: 2, # 'ν'
30: 0, # 'ξ'
4: 0, # 'ο'
9: 0, # 'π'
8: 2, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 0, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
41: { # 'Δ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 2, # 'Ε'
40: 2, # 'Η'
52: 0, # 'Θ'
47: 2, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 2, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 2, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 2, # 'ή'
15: 2, # 'ί'
1: 0, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 2, # 'η'
25: 0, # 'θ'
5: 3, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 2, # 'ο'
9: 0, # 'π'
8: 2, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 2, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 2, # 'ω'
19: 1, # 'ό'
26: 2, # 'ύ'
27: 2, # 'ώ'
},
34: { # 'Ε'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 2, # 'Α'
51: 0, # 'Β'
43: 2, # 'Γ'
41: 2, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 2, # 'Ι'
44: 2, # 'Κ'
53: 2, # 'Λ'
38: 2, # 'Μ'
49: 2, # 'Ν'
59: 1, # 'Ξ'
39: 0, # 'Ο'
35: 2, # 'Π'
48: 2, # 'Ρ'
37: 2, # 'Σ'
33: 2, # 'Τ'
45: 2, # 'Υ'
56: 0, # 'Φ'
50: 2, # 'Χ'
57: 2, # 'Ω'
17: 3, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 3, # 'ί'
1: 0, # 'α'
29: 0, # 'β'
20: 3, # 'γ'
21: 2, # 'δ'
3: 1, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 1, # 'θ'
5: 2, # 'ι'
11: 3, # 'κ'
16: 3, # 'λ'
10: 2, # 'μ'
6: 3, # 'ν'
30: 2, # 'ξ'
4: 0, # 'ο'
9: 3, # 'π'
8: 2, # 'ρ'
14: 0, # 'ς'
7: 2, # 'σ'
2: 2, # 'τ'
12: 2, # 'υ'
28: 2, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 1, # 'ύ'
27: 0, # 'ώ'
},
40: { # 'Η'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 1, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 2, # 'Θ'
47: 0, # 'Ι'
44: 2, # 'Κ'
53: 0, # 'Λ'
38: 2, # 'Μ'
49: 2, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 2, # 'Π'
48: 2, # 'Ρ'
37: 2, # 'Σ'
33: 2, # 'Τ'
45: 1, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 0, # 'ι'
11: 0, # 'κ'
16: 2, # 'λ'
10: 0, # 'μ'
6: 1, # 'ν'
30: 0, # 'ξ'
4: 0, # 'ο'
9: 0, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 0, # 'υ'
28: 0, # 'φ'
23: 1, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
52: { # 'Θ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 2, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 2, # 'Ε'
40: 2, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 2, # 'Ο'
35: 0, # 'Π'
48: 1, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 1, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 2, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 3, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 2, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 0, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 0, # 'ο'
9: 0, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 2, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 2, # 'ύ'
27: 0, # 'ώ'
},
47: { # 'Ι'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 2, # 'Α'
51: 1, # 'Β'
43: 1, # 'Γ'
41: 2, # 'Δ'
34: 2, # 'Ε'
40: 2, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 2, # 'Κ'
53: 2, # 'Λ'
38: 2, # 'Μ'
49: 2, # 'Ν'
59: 0, # 'Ξ'
39: 2, # 'Ο'
35: 0, # 'Π'
48: 2, # 'Ρ'
37: 2, # 'Σ'
33: 2, # 'Τ'
45: 0, # 'Υ'
56: 2, # 'Φ'
50: 0, # 'Χ'
57: 2, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 2, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 2, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 0, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 1, # 'ν'
30: 0, # 'ξ'
4: 2, # 'ο'
9: 0, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 2, # 'σ'
2: 1, # 'τ'
12: 0, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 1, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
44: { # 'Κ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 2, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 1, # 'Δ'
34: 2, # 'Ε'
40: 2, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 1, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 2, # 'Ο'
35: 0, # 'Π'
48: 2, # 'Ρ'
37: 0, # 'Σ'
33: 1, # 'Τ'
45: 2, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 1, # 'Ω'
17: 3, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 3, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 2, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 2, # 'ι'
11: 0, # 'κ'
16: 2, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 2, # 'ο'
9: 0, # 'π'
8: 2, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 2, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 2, # 'ό'
26: 2, # 'ύ'
27: 2, # 'ώ'
},
53: { # 'Λ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 2, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 2, # 'Ε'
40: 2, # 'Η'
52: 0, # 'Θ'
47: 2, # 'Ι'
44: 0, # 'Κ'
53: 2, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 2, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 2, # 'Σ'
33: 0, # 'Τ'
45: 2, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 2, # 'Ω'
17: 2, # 'ά'
18: 2, # 'έ'
22: 0, # 'ή'
15: 2, # 'ί'
1: 2, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 2, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 1, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 2, # 'ο'
9: 0, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 2, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 2, # 'ό'
26: 2, # 'ύ'
27: 0, # 'ώ'
},
38: { # 'Μ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 2, # 'Α'
51: 2, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 2, # 'Ε'
40: 2, # 'Η'
52: 0, # 'Θ'
47: 2, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 2, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 2, # 'Ο'
35: 2, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 2, # 'ά'
18: 2, # 'έ'
22: 2, # 'ή'
15: 2, # 'ί'
1: 2, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 2, # 'η'
25: 0, # 'θ'
5: 3, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 2, # 'ο'
9: 3, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 2, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 2, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
49: { # 'Ν'
60: 2, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 2, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 2, # 'Ε'
40: 2, # 'Η'
52: 0, # 'Θ'
47: 2, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 2, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 2, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 2, # 'Ω'
17: 0, # 'ά'
18: 2, # 'έ'
22: 0, # 'ή'
15: 2, # 'ί'
1: 2, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 1, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 0, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 2, # 'ο'
9: 0, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 0, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 1, # 'ω'
19: 2, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
59: { # 'Ξ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 1, # 'Ε'
40: 1, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 1, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 2, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 2, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 2, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 0, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 0, # 'ο'
9: 0, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 0, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
39: { # 'Ο'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 1, # 'Β'
43: 2, # 'Γ'
41: 2, # 'Δ'
34: 2, # 'Ε'
40: 1, # 'Η'
52: 2, # 'Θ'
47: 2, # 'Ι'
44: 2, # 'Κ'
53: 2, # 'Λ'
38: 2, # 'Μ'
49: 2, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 2, # 'Π'
48: 2, # 'Ρ'
37: 2, # 'Σ'
33: 2, # 'Τ'
45: 2, # 'Υ'
56: 2, # 'Φ'
50: 2, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 2, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 3, # 'ι'
11: 2, # 'κ'
16: 2, # 'λ'
10: 2, # 'μ'
6: 2, # 'ν'
30: 0, # 'ξ'
4: 0, # 'ο'
9: 2, # 'π'
8: 2, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 2, # 'τ'
12: 2, # 'υ'
28: 1, # 'φ'
23: 1, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 2, # 'ύ'
27: 0, # 'ώ'
},
35: { # 'Π'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 2, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 2, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 2, # 'Ι'
44: 0, # 'Κ'
53: 2, # 'Λ'
38: 1, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 2, # 'Ο'
35: 0, # 'Π'
48: 2, # 'Ρ'
37: 0, # 'Σ'
33: 1, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 1, # 'Χ'
57: 2, # 'Ω'
17: 2, # 'ά'
18: 1, # 'έ'
22: 1, # 'ή'
15: 2, # 'ί'
1: 3, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 2, # 'η'
25: 0, # 'θ'
5: 2, # 'ι'
11: 0, # 'κ'
16: 2, # 'λ'
10: 0, # 'μ'
6: 2, # 'ν'
30: 0, # 'ξ'
4: 3, # 'ο'
9: 0, # 'π'
8: 3, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 2, # 'υ'
28: 0, # 'φ'
23: 2, # 'χ'
42: 0, # 'ψ'
24: 2, # 'ω'
19: 2, # 'ό'
26: 0, # 'ύ'
27: 3, # 'ώ'
},
48: { # 'Ρ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 2, # 'Α'
51: 0, # 'Β'
43: 1, # 'Γ'
41: 1, # 'Δ'
34: 2, # 'Ε'
40: 2, # 'Η'
52: 0, # 'Θ'
47: 2, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 2, # 'Ν'
59: 0, # 'Ξ'
39: 2, # 'Ο'
35: 0, # 'Π'
48: 2, # 'Ρ'
37: 0, # 'Σ'
33: 1, # 'Τ'
45: 1, # 'Υ'
56: 0, # 'Φ'
50: 1, # 'Χ'
57: 1, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 2, # 'ί'
1: 0, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 0, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 1, # 'ο'
9: 0, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 3, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 2, # 'ω'
19: 0, # 'ό'
26: 2, # 'ύ'
27: 0, # 'ώ'
},
37: { # 'Σ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 2, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 1, # 'Δ'
34: 2, # 'Ε'
40: 2, # 'Η'
52: 0, # 'Θ'
47: 2, # 'Ι'
44: 2, # 'Κ'
53: 0, # 'Λ'
38: 2, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 2, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 2, # 'Σ'
33: 2, # 'Τ'
45: 2, # 'Υ'
56: 0, # 'Φ'
50: 2, # 'Χ'
57: 2, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 2, # 'ή'
15: 2, # 'ί'
1: 2, # 'α'
29: 2, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 3, # 'η'
25: 0, # 'θ'
5: 2, # 'ι'
11: 2, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 2, # 'ο'
9: 2, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 3, # 'τ'
12: 3, # 'υ'
28: 0, # 'φ'
23: 2, # 'χ'
42: 0, # 'ψ'
24: 2, # 'ω'
19: 0, # 'ό'
26: 2, # 'ύ'
27: 2, # 'ώ'
},
33: { # 'Τ'
60: 0, # 'e'
55: 1, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 2, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 2, # 'Ε'
40: 2, # 'Η'
52: 0, # 'Θ'
47: 2, # 'Ι'
44: 2, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 2, # 'Ο'
35: 0, # 'Π'
48: 2, # 'Ρ'
37: 0, # 'Σ'
33: 1, # 'Τ'
45: 1, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 2, # 'Ω'
17: 2, # 'ά'
18: 2, # 'έ'
22: 0, # 'ή'
15: 2, # 'ί'
1: 3, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 2, # 'ε'
32: 0, # 'ζ'
13: 2, # 'η'
25: 0, # 'θ'
5: 2, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 2, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 3, # 'ο'
9: 0, # 'π'
8: 2, # 'ρ'
14: 0, # 'ς'
7: 2, # 'σ'
2: 0, # 'τ'
12: 2, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 2, # 'ό'
26: 2, # 'ύ'
27: 3, # 'ώ'
},
45: { # 'Υ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 2, # 'Γ'
41: 0, # 'Δ'
34: 1, # 'Ε'
40: 2, # 'Η'
52: 2, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 1, # 'Λ'
38: 2, # 'Μ'
49: 2, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 2, # 'Π'
48: 1, # 'Ρ'
37: 2, # 'Σ'
33: 2, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 1, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 0, # 'ι'
11: 0, # 'κ'
16: 2, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 0, # 'ο'
9: 3, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 0, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
56: { # 'Φ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 1, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 1, # 'Η'
52: 0, # 'Θ'
47: 2, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 2, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 2, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 2, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 2, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 2, # 'ο'
9: 0, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 2, # 'τ'
12: 2, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 1, # 'ύ'
27: 1, # 'ώ'
},
50: { # 'Χ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 1, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 2, # 'Ε'
40: 2, # 'Η'
52: 0, # 'Θ'
47: 2, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 1, # 'Ν'
59: 0, # 'Ξ'
39: 1, # 'Ο'
35: 0, # 'Π'
48: 2, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 1, # 'Χ'
57: 1, # 'Ω'
17: 2, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 2, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 2, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 0, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 2, # 'ο'
9: 0, # 'π'
8: 3, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 2, # 'τ'
12: 0, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 2, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
57: { # 'Ω'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 1, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 1, # 'Λ'
38: 0, # 'Μ'
49: 2, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 2, # 'Ρ'
37: 2, # 'Σ'
33: 2, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 0, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 0, # 'ο'
9: 0, # 'π'
8: 2, # 'ρ'
14: 2, # 'ς'
7: 2, # 'σ'
2: 0, # 'τ'
12: 0, # 'υ'
28: 0, # 'φ'
23: 1, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
17: { # 'ά'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 2, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 3, # 'β'
20: 3, # 'γ'
21: 3, # 'δ'
3: 3, # 'ε'
32: 3, # 'ζ'
13: 0, # 'η'
25: 3, # 'θ'
5: 2, # 'ι'
11: 3, # 'κ'
16: 3, # 'λ'
10: 3, # 'μ'
6: 3, # 'ν'
30: 3, # 'ξ'
4: 0, # 'ο'
9: 3, # 'π'
8: 3, # 'ρ'
14: 3, # 'ς'
7: 3, # 'σ'
2: 3, # 'τ'
12: 0, # 'υ'
28: 3, # 'φ'
23: 3, # 'χ'
42: 3, # 'ψ'
24: 2, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
18: { # 'έ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 3, # 'α'
29: 2, # 'β'
20: 3, # 'γ'
21: 2, # 'δ'
3: 3, # 'ε'
32: 2, # 'ζ'
13: 0, # 'η'
25: 3, # 'θ'
5: 0, # 'ι'
11: 3, # 'κ'
16: 3, # 'λ'
10: 3, # 'μ'
6: 3, # 'ν'
30: 3, # 'ξ'
4: 3, # 'ο'
9: 3, # 'π'
8: 3, # 'ρ'
14: 3, # 'ς'
7: 3, # 'σ'
2: 3, # 'τ'
12: 0, # 'υ'
28: 3, # 'φ'
23: 3, # 'χ'
42: 3, # 'ψ'
24: 2, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
22: { # 'ή'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 1, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 0, # 'β'
20: 3, # 'γ'
21: 3, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 3, # 'θ'
5: 0, # 'ι'
11: 3, # 'κ'
16: 2, # 'λ'
10: 3, # 'μ'
6: 3, # 'ν'
30: 2, # 'ξ'
4: 0, # 'ο'
9: 3, # 'π'
8: 3, # 'ρ'
14: 3, # 'ς'
7: 3, # 'σ'
2: 3, # 'τ'
12: 0, # 'υ'
28: 2, # 'φ'
23: 3, # 'χ'
42: 2, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
15: { # 'ί'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 3, # 'α'
29: 2, # 'β'
20: 3, # 'γ'
21: 3, # 'δ'
3: 3, # 'ε'
32: 3, # 'ζ'
13: 3, # 'η'
25: 3, # 'θ'
5: 0, # 'ι'
11: 3, # 'κ'
16: 3, # 'λ'
10: 3, # 'μ'
6: 3, # 'ν'
30: 3, # 'ξ'
4: 3, # 'ο'
9: 3, # 'π'
8: 3, # 'ρ'
14: 3, # 'ς'
7: 3, # 'σ'
2: 3, # 'τ'
12: 0, # 'υ'
28: 1, # 'φ'
23: 3, # 'χ'
42: 2, # 'ψ'
24: 3, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
1: { # 'α'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 2, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 2, # 'έ'
22: 0, # 'ή'
15: 3, # 'ί'
1: 0, # 'α'
29: 3, # 'β'
20: 3, # 'γ'
21: 3, # 'δ'
3: 2, # 'ε'
32: 3, # 'ζ'
13: 1, # 'η'
25: 3, # 'θ'
5: 3, # 'ι'
11: 3, # 'κ'
16: 3, # 'λ'
10: 3, # 'μ'
6: 3, # 'ν'
30: 3, # 'ξ'
4: 2, # 'ο'
9: 3, # 'π'
8: 3, # 'ρ'
14: 3, # 'ς'
7: 3, # 'σ'
2: 3, # 'τ'
12: 3, # 'υ'
28: 3, # 'φ'
23: 3, # 'χ'
42: 2, # 'ψ'
24: 0, # 'ω'
19: 2, # 'ό'
26: 2, # 'ύ'
27: 0, # 'ώ'
},
29: { # 'β'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 3, # 'ά'
18: 2, # 'έ'
22: 3, # 'ή'
15: 2, # 'ί'
1: 3, # 'α'
29: 0, # 'β'
20: 2, # 'γ'
21: 2, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 2, # 'η'
25: 0, # 'θ'
5: 3, # 'ι'
11: 0, # 'κ'
16: 3, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 3, # 'ο'
9: 0, # 'π'
8: 3, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 0, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 2, # 'ω'
19: 2, # 'ό'
26: 2, # 'ύ'
27: 2, # 'ώ'
},
20: { # 'γ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 3, # 'ά'
18: 3, # 'έ'
22: 3, # 'ή'
15: 3, # 'ί'
1: 3, # 'α'
29: 0, # 'β'
20: 3, # 'γ'
21: 0, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 3, # 'η'
25: 0, # 'θ'
5: 3, # 'ι'
11: 3, # 'κ'
16: 3, # 'λ'
10: 3, # 'μ'
6: 3, # 'ν'
30: 3, # 'ξ'
4: 3, # 'ο'
9: 0, # 'π'
8: 3, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 2, # 'υ'
28: 0, # 'φ'
23: 3, # 'χ'
42: 0, # 'ψ'
24: 3, # 'ω'
19: 3, # 'ό'
26: 2, # 'ύ'
27: 3, # 'ώ'
},
21: { # 'δ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 2, # 'ά'
18: 3, # 'έ'
22: 3, # 'ή'
15: 3, # 'ί'
1: 3, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 3, # 'η'
25: 0, # 'θ'
5: 3, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 3, # 'ο'
9: 0, # 'π'
8: 3, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 3, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 3, # 'ω'
19: 3, # 'ό'
26: 3, # 'ύ'
27: 3, # 'ώ'
},
3: { # 'ε'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 2, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 3, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 3, # 'ί'
1: 2, # 'α'
29: 3, # 'β'
20: 3, # 'γ'
21: 3, # 'δ'
3: 2, # 'ε'
32: 2, # 'ζ'
13: 0, # 'η'
25: 3, # 'θ'
5: 3, # 'ι'
11: 3, # 'κ'
16: 3, # 'λ'
10: 3, # 'μ'
6: 3, # 'ν'
30: 3, # 'ξ'
4: 2, # 'ο'
9: 3, # 'π'
8: 3, # 'ρ'
14: 3, # 'ς'
7: 3, # 'σ'
2: 3, # 'τ'
12: 3, # 'υ'
28: 3, # 'φ'
23: 3, # 'χ'
42: 2, # 'ψ'
24: 3, # 'ω'
19: 2, # 'ό'
26: 3, # 'ύ'
27: 2, # 'ώ'
},
32: { # 'ζ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 2, # 'ά'
18: 2, # 'έ'
22: 2, # 'ή'
15: 2, # 'ί'
1: 2, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 3, # 'η'
25: 0, # 'θ'
5: 2, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 3, # 'ο'
9: 0, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 1, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 3, # 'ω'
19: 2, # 'ό'
26: 0, # 'ύ'
27: 2, # 'ώ'
},
13: { # 'η'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 2, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 0, # 'β'
20: 3, # 'γ'
21: 2, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 3, # 'θ'
5: 0, # 'ι'
11: 3, # 'κ'
16: 3, # 'λ'
10: 3, # 'μ'
6: 3, # 'ν'
30: 2, # 'ξ'
4: 0, # 'ο'
9: 2, # 'π'
8: 3, # 'ρ'
14: 3, # 'ς'
7: 3, # 'σ'
2: 3, # 'τ'
12: 0, # 'υ'
28: 2, # 'φ'
23: 3, # 'χ'
42: 2, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
25: { # 'θ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 2, # 'ά'
18: 3, # 'έ'
22: 3, # 'ή'
15: 2, # 'ί'
1: 3, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 3, # 'η'
25: 0, # 'θ'
5: 3, # 'ι'
11: 0, # 'κ'
16: 1, # 'λ'
10: 3, # 'μ'
6: 2, # 'ν'
30: 0, # 'ξ'
4: 3, # 'ο'
9: 0, # 'π'
8: 3, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 3, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 3, # 'ω'
19: 3, # 'ό'
26: 3, # 'ύ'
27: 3, # 'ώ'
},
5: { # 'ι'
60: 0, # 'e'
55: 1, # 'o'
58: 0, # 't'
36: 2, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 1, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 3, # 'ά'
18: 3, # 'έ'
22: 3, # 'ή'
15: 0, # 'ί'
1: 3, # 'α'
29: 3, # 'β'
20: 3, # 'γ'
21: 3, # 'δ'
3: 3, # 'ε'
32: 2, # 'ζ'
13: 3, # 'η'
25: 3, # 'θ'
5: 0, # 'ι'
11: 3, # 'κ'
16: 3, # 'λ'
10: 3, # 'μ'
6: 3, # 'ν'
30: 3, # 'ξ'
4: 3, # 'ο'
9: 3, # 'π'
8: 3, # 'ρ'
14: 3, # 'ς'
7: 3, # 'σ'
2: 3, # 'τ'
12: 0, # 'υ'
28: 2, # 'φ'
23: 3, # 'χ'
42: 2, # 'ψ'
24: 3, # 'ω'
19: 3, # 'ό'
26: 0, # 'ύ'
27: 3, # 'ώ'
},
11: { # 'κ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 3, # 'ά'
18: 3, # 'έ'
22: 3, # 'ή'
15: 3, # 'ί'
1: 3, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 3, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 3, # 'η'
25: 2, # 'θ'
5: 3, # 'ι'
11: 3, # 'κ'
16: 3, # 'λ'
10: 3, # 'μ'
6: 2, # 'ν'
30: 0, # 'ξ'
4: 3, # 'ο'
9: 2, # 'π'
8: 3, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 3, # 'τ'
12: 3, # 'υ'
28: 2, # 'φ'
23: 2, # 'χ'
42: 0, # 'ψ'
24: 3, # 'ω'
19: 3, # 'ό'
26: 3, # 'ύ'
27: 3, # 'ώ'
},
16: { # 'λ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 3, # 'ά'
18: 3, # 'έ'
22: 3, # 'ή'
15: 3, # 'ί'
1: 3, # 'α'
29: 1, # 'β'
20: 2, # 'γ'
21: 1, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 3, # 'η'
25: 2, # 'θ'
5: 3, # 'ι'
11: 2, # 'κ'
16: 3, # 'λ'
10: 2, # 'μ'
6: 2, # 'ν'
30: 0, # 'ξ'
4: 3, # 'ο'
9: 3, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 3, # 'τ'
12: 3, # 'υ'
28: 2, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 3, # 'ω'
19: 3, # 'ό'
26: 3, # 'ύ'
27: 3, # 'ώ'
},
10: { # 'μ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 1, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 3, # 'ά'
18: 3, # 'έ'
22: 3, # 'ή'
15: 3, # 'ί'
1: 3, # 'α'
29: 3, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 3, # 'η'
25: 0, # 'θ'
5: 3, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 3, # 'μ'
6: 3, # 'ν'
30: 0, # 'ξ'
4: 3, # 'ο'
9: 3, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 2, # 'υ'
28: 3, # 'φ'
23: 0, # 'χ'
42: 2, # 'ψ'
24: 3, # 'ω'
19: 3, # 'ό'
26: 2, # 'ύ'
27: 2, # 'ώ'
},
6: { # 'ν'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 2, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 3, # 'ά'
18: 3, # 'έ'
22: 3, # 'ή'
15: 3, # 'ί'
1: 3, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 3, # 'δ'
3: 3, # 'ε'
32: 2, # 'ζ'
13: 3, # 'η'
25: 3, # 'θ'
5: 3, # 'ι'
11: 0, # 'κ'
16: 1, # 'λ'
10: 0, # 'μ'
6: 2, # 'ν'
30: 0, # 'ξ'
4: 3, # 'ο'
9: 0, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 3, # 'σ'
2: 3, # 'τ'
12: 3, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 3, # 'ω'
19: 3, # 'ό'
26: 3, # 'ύ'
27: 3, # 'ώ'
},
30: { # 'ξ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 2, # 'ά'
18: 3, # 'έ'
22: 3, # 'ή'
15: 2, # 'ί'
1: 3, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 3, # 'η'
25: 0, # 'θ'
5: 2, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 3, # 'ο'
9: 0, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 3, # 'τ'
12: 2, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 3, # 'ω'
19: 2, # 'ό'
26: 3, # 'ύ'
27: 1, # 'ώ'
},
4: { # 'ο'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 2, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 2, # 'έ'
22: 3, # 'ή'
15: 3, # 'ί'
1: 2, # 'α'
29: 3, # 'β'
20: 3, # 'γ'
21: 3, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 3, # 'η'
25: 3, # 'θ'
5: 3, # 'ι'
11: 3, # 'κ'
16: 3, # 'λ'
10: 3, # 'μ'
6: 3, # 'ν'
30: 2, # 'ξ'
4: 2, # 'ο'
9: 3, # 'π'
8: 3, # 'ρ'
14: 3, # 'ς'
7: 3, # 'σ'
2: 3, # 'τ'
12: 3, # 'υ'
28: 3, # 'φ'
23: 3, # 'χ'
42: 2, # 'ψ'
24: 2, # 'ω'
19: 1, # 'ό'
26: 3, # 'ύ'
27: 2, # 'ώ'
},
9: { # 'π'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 3, # 'ά'
18: 3, # 'έ'
22: 3, # 'ή'
15: 3, # 'ί'
1: 3, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 3, # 'η'
25: 0, # 'θ'
5: 3, # 'ι'
11: 0, # 'κ'
16: 3, # 'λ'
10: 0, # 'μ'
6: 2, # 'ν'
30: 0, # 'ξ'
4: 3, # 'ο'
9: 0, # 'π'
8: 3, # 'ρ'
14: 2, # 'ς'
7: 0, # 'σ'
2: 3, # 'τ'
12: 3, # 'υ'
28: 0, # 'φ'
23: 2, # 'χ'
42: 0, # 'ψ'
24: 3, # 'ω'
19: 3, # 'ό'
26: 2, # 'ύ'
27: 3, # 'ώ'
},
8: { # 'ρ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 3, # 'ά'
18: 3, # 'έ'
22: 3, # 'ή'
15: 3, # 'ί'
1: 3, # 'α'
29: 2, # 'β'
20: 3, # 'γ'
21: 2, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 3, # 'η'
25: 3, # 'θ'
5: 3, # 'ι'
11: 3, # 'κ'
16: 1, # 'λ'
10: 3, # 'μ'
6: 3, # 'ν'
30: 2, # 'ξ'
4: 3, # 'ο'
9: 2, # 'π'
8: 2, # 'ρ'
14: 0, # 'ς'
7: 2, # 'σ'
2: 3, # 'τ'
12: 3, # 'υ'
28: 3, # 'φ'
23: 3, # 'χ'
42: 0, # 'ψ'
24: 3, # 'ω'
19: 3, # 'ό'
26: 3, # 'ύ'
27: 3, # 'ώ'
},
14: { # 'ς'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 2, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 0, # 'θ'
5: 0, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 0, # 'ο'
9: 0, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 0, # 'τ'
12: 0, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
7: { # 'σ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 2, # 'ά'
18: 2, # 'έ'
22: 3, # 'ή'
15: 3, # 'ί'
1: 3, # 'α'
29: 3, # 'β'
20: 0, # 'γ'
21: 2, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 3, # 'η'
25: 3, # 'θ'
5: 3, # 'ι'
11: 3, # 'κ'
16: 2, # 'λ'
10: 3, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 3, # 'ο'
9: 3, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 3, # 'σ'
2: 3, # 'τ'
12: 3, # 'υ'
28: 3, # 'φ'
23: 3, # 'χ'
42: 0, # 'ψ'
24: 3, # 'ω'
19: 3, # 'ό'
26: 3, # 'ύ'
27: 2, # 'ώ'
},
2: { # 'τ'
60: 0, # 'e'
55: 2, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 3, # 'ά'
18: 3, # 'έ'
22: 3, # 'ή'
15: 3, # 'ί'
1: 3, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 3, # 'ε'
32: 2, # 'ζ'
13: 3, # 'η'
25: 0, # 'θ'
5: 3, # 'ι'
11: 2, # 'κ'
16: 2, # 'λ'
10: 3, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 3, # 'ο'
9: 0, # 'π'
8: 3, # 'ρ'
14: 0, # 'ς'
7: 3, # 'σ'
2: 3, # 'τ'
12: 3, # 'υ'
28: 2, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 3, # 'ω'
19: 3, # 'ό'
26: 3, # 'ύ'
27: 3, # 'ώ'
},
12: { # 'υ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 2, # 'ά'
18: 2, # 'έ'
22: 3, # 'ή'
15: 2, # 'ί'
1: 3, # 'α'
29: 2, # 'β'
20: 3, # 'γ'
21: 2, # 'δ'
3: 2, # 'ε'
32: 2, # 'ζ'
13: 2, # 'η'
25: 3, # 'θ'
5: 2, # 'ι'
11: 3, # 'κ'
16: 3, # 'λ'
10: 3, # 'μ'
6: 3, # 'ν'
30: 3, # 'ξ'
4: 3, # 'ο'
9: 3, # 'π'
8: 3, # 'ρ'
14: 3, # 'ς'
7: 3, # 'σ'
2: 3, # 'τ'
12: 0, # 'υ'
28: 2, # 'φ'
23: 3, # 'χ'
42: 2, # 'ψ'
24: 2, # 'ω'
19: 2, # 'ό'
26: 0, # 'ύ'
27: 2, # 'ώ'
},
28: { # 'φ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 3, # 'ά'
18: 3, # 'έ'
22: 3, # 'ή'
15: 3, # 'ί'
1: 3, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 2, # 'η'
25: 2, # 'θ'
5: 3, # 'ι'
11: 0, # 'κ'
16: 2, # 'λ'
10: 0, # 'μ'
6: 1, # 'ν'
30: 0, # 'ξ'
4: 3, # 'ο'
9: 0, # 'π'
8: 3, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 3, # 'τ'
12: 3, # 'υ'
28: 1, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 3, # 'ω'
19: 3, # 'ό'
26: 2, # 'ύ'
27: 2, # 'ώ'
},
23: { # 'χ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 3, # 'ά'
18: 2, # 'έ'
22: 3, # 'ή'
15: 3, # 'ί'
1: 3, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 2, # 'η'
25: 2, # 'θ'
5: 3, # 'ι'
11: 0, # 'κ'
16: 2, # 'λ'
10: 2, # 'μ'
6: 3, # 'ν'
30: 0, # 'ξ'
4: 3, # 'ο'
9: 0, # 'π'
8: 3, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 3, # 'τ'
12: 3, # 'υ'
28: 0, # 'φ'
23: 2, # 'χ'
42: 0, # 'ψ'
24: 3, # 'ω'
19: 3, # 'ό'
26: 3, # 'ύ'
27: 3, # 'ώ'
},
42: { # 'ψ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 2, # 'ά'
18: 2, # 'έ'
22: 1, # 'ή'
15: 2, # 'ί'
1: 2, # 'α'
29: 0, # 'β'
20: 0, # 'γ'
21: 0, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 3, # 'η'
25: 0, # 'θ'
5: 2, # 'ι'
11: 0, # 'κ'
16: 0, # 'λ'
10: 0, # 'μ'
6: 0, # 'ν'
30: 0, # 'ξ'
4: 2, # 'ο'
9: 0, # 'π'
8: 0, # 'ρ'
14: 0, # 'ς'
7: 0, # 'σ'
2: 2, # 'τ'
12: 1, # 'υ'
28: 0, # 'φ'
23: 0, # 'χ'
42: 0, # 'ψ'
24: 2, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
24: { # 'ω'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 1, # 'ά'
18: 0, # 'έ'
22: 2, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 2, # 'β'
20: 3, # 'γ'
21: 2, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 0, # 'η'
25: 3, # 'θ'
5: 2, # 'ι'
11: 0, # 'κ'
16: 2, # 'λ'
10: 3, # 'μ'
6: 3, # 'ν'
30: 0, # 'ξ'
4: 0, # 'ο'
9: 3, # 'π'
8: 3, # 'ρ'
14: 3, # 'ς'
7: 3, # 'σ'
2: 3, # 'τ'
12: 0, # 'υ'
28: 2, # 'φ'
23: 2, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
19: { # 'ό'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 3, # 'β'
20: 3, # 'γ'
21: 3, # 'δ'
3: 1, # 'ε'
32: 2, # 'ζ'
13: 2, # 'η'
25: 2, # 'θ'
5: 2, # 'ι'
11: 3, # 'κ'
16: 3, # 'λ'
10: 3, # 'μ'
6: 3, # 'ν'
30: 1, # 'ξ'
4: 2, # 'ο'
9: 3, # 'π'
8: 3, # 'ρ'
14: 3, # 'ς'
7: 3, # 'σ'
2: 3, # 'τ'
12: 0, # 'υ'
28: 2, # 'φ'
23: 3, # 'χ'
42: 2, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
26: { # 'ύ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 2, # 'α'
29: 2, # 'β'
20: 2, # 'γ'
21: 1, # 'δ'
3: 3, # 'ε'
32: 0, # 'ζ'
13: 2, # 'η'
25: 3, # 'θ'
5: 0, # 'ι'
11: 3, # 'κ'
16: 3, # 'λ'
10: 3, # 'μ'
6: 3, # 'ν'
30: 2, # 'ξ'
4: 3, # 'ο'
9: 3, # 'π'
8: 3, # 'ρ'
14: 3, # 'ς'
7: 3, # 'σ'
2: 3, # 'τ'
12: 0, # 'υ'
28: 2, # 'φ'
23: 2, # 'χ'
42: 2, # 'ψ'
24: 2, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
27: { # 'ώ'
60: 0, # 'e'
55: 0, # 'o'
58: 0, # 't'
36: 0, # '·'
61: 0, # 'Ά'
46: 0, # 'Έ'
54: 0, # 'Ό'
31: 0, # 'Α'
51: 0, # 'Β'
43: 0, # 'Γ'
41: 0, # 'Δ'
34: 0, # 'Ε'
40: 0, # 'Η'
52: 0, # 'Θ'
47: 0, # 'Ι'
44: 0, # 'Κ'
53: 0, # 'Λ'
38: 0, # 'Μ'
49: 0, # 'Ν'
59: 0, # 'Ξ'
39: 0, # 'Ο'
35: 0, # 'Π'
48: 0, # 'Ρ'
37: 0, # 'Σ'
33: 0, # 'Τ'
45: 0, # 'Υ'
56: 0, # 'Φ'
50: 0, # 'Χ'
57: 0, # 'Ω'
17: 0, # 'ά'
18: 0, # 'έ'
22: 0, # 'ή'
15: 0, # 'ί'
1: 0, # 'α'
29: 1, # 'β'
20: 0, # 'γ'
21: 3, # 'δ'
3: 0, # 'ε'
32: 0, # 'ζ'
13: 1, # 'η'
25: 2, # 'θ'
5: 2, # 'ι'
11: 0, # 'κ'
16: 2, # 'λ'
10: 3, # 'μ'
6: 3, # 'ν'
30: 1, # 'ξ'
4: 0, # 'ο'
9: 2, # 'π'
8: 3, # 'ρ'
14: 3, # 'ς'
7: 3, # 'σ'
2: 3, # 'τ'
12: 0, # 'υ'
28: 1, # 'φ'
23: 1, # 'χ'
42: 0, # 'ψ'
24: 0, # 'ω'
19: 0, # 'ό'
26: 0, # 'ύ'
27: 0, # 'ώ'
},
}
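# Editorial note (not from the generated table above): each inner value is a
# frequency category for the ordered character pair (outer key -> inner key).
# Our reading, following chardet's single-byte prober convention, is
# 3 = very common, 2 = common, 1 = rare, 0 = unseen or negligible in the
# training text.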
# 255: Undefined characters that did not exist in the training text
# 254: Carriage return / line feed
# 253: Symbols and punctuation that do not belong to words
# 252: Digits 0-9
# 251: Control characters
# Character Mapping Table(s):
WINDOWS_1253_GREEK_CHAR_TO_ORDER = {
0: 255, # '\x00'
1: 255, # '\x01'
2: 255, # '\x02'
3: 255, # '\x03'
4: 255, # '\x04'
5: 255, # '\x05'
6: 255, # '\x06'
7: 255, # '\x07'
8: 255, # '\x08'
9: 255, # '\t'
10: 254, # '\n'
11: 255, # '\x0b'
12: 255, # '\x0c'
13: 254, # '\r'
14: 255, # '\x0e'
15: 255, # '\x0f'
16: 255, # '\x10'
17: 255, # '\x11'
18: 255, # '\x12'
19: 255, # '\x13'
20: 255, # '\x14'
21: 255, # '\x15'
22: 255, # '\x16'
23: 255, # '\x17'
24: 255, # '\x18'
25: 255, # '\x19'
26: 255, # '\x1a'
27: 255, # '\x1b'
28: 255, # '\x1c'
29: 255, # '\x1d'
30: 255, # '\x1e'
31: 255, # '\x1f'
32: 253, # ' '
33: 253, # '!'
34: 253, # '"'
35: 253, # '#'
36: 253, # '$'
37: 253, # '%'
38: 253, # '&'
39: 253, # "'"
40: 253, # '('
41: 253, # ')'
42: 253, # '*'
43: 253, # '+'
44: 253, # ','
45: 253, # '-'
46: 253, # '.'
47: 253, # '/'
48: 252, # '0'
49: 252, # '1'
50: 252, # '2'
51: 252, # '3'
52: 252, # '4'
53: 252, # '5'
54: 252, # '6'
55: 252, # '7'
56: 252, # '8'
57: 252, # '9'
58: 253, # ':'
59: 253, # ';'
60: 253, # '<'
61: 253, # '='
62: 253, # '>'
63: 253, # '?'
64: 253, # '@'
65: 82, # 'A'
66: 100, # 'B'
67: 104, # 'C'
68: 94, # 'D'
69: 98, # 'E'
70: 101, # 'F'
71: 116, # 'G'
72: 102, # 'H'
73: 111, # 'I'
74: 187, # 'J'
75: 117, # 'K'
76: 92, # 'L'
77: 88, # 'M'
78: 113, # 'N'
79: 85, # 'O'
80: 79, # 'P'
81: 118, # 'Q'
82: 105, # 'R'
83: 83, # 'S'
84: 67, # 'T'
85: 114, # 'U'
86: 119, # 'V'
87: 95, # 'W'
88: 99, # 'X'
89: 109, # 'Y'
90: 188, # 'Z'
91: 253, # '['
92: 253, # '\\'
93: 253, # ']'
94: 253, # '^'
95: 253, # '_'
96: 253, # '`'
97: 72, # 'a'
98: 70, # 'b'
99: 80, # 'c'
100: 81, # 'd'
101: 60, # 'e'
102: 96, # 'f'
103: 93, # 'g'
104: 89, # 'h'
105: 68, # 'i'
106: 120, # 'j'
107: 97, # 'k'
108: 77, # 'l'
109: 86, # 'm'
110: 69, # 'n'
111: 55, # 'o'
112: 78, # 'p'
113: 115, # 'q'
114: 65, # 'r'
115: 66, # 's'
116: 58, # 't'
117: 76, # 'u'
118: 106, # 'v'
119: 103, # 'w'
120: 87, # 'x'
121: 107, # 'y'
122: 112, # 'z'
123: 253, # '{'
124: 253, # '|'
125: 253, # '}'
126: 253, # '~'
127: 253, # '\x7f'
128: 255, # '€'
129: 255, # None
130: 255, # '‚'
131: 255, # 'ƒ'
132: 255, # '„'
133: 255, # '…'
134: 255, # '†'
135: 255, # '‡'
136: 255, # None
137: 255, # '‰'
138: 255, # None
139: 255, # '‹'
140: 255, # None
141: 255, # None
142: 255, # None
143: 255, # None
144: 255, # None
145: 255, # '‘'
146: 255, # '’'
147: 255, # '“'
148: 255, # '”'
149: 255, # '•'
150: 255, # '–'
151: 255, # '—'
152: 255, # None
153: 255, # '™'
154: 255, # None
155: 255, # '›'
156: 255, # None
157: 255, # None
158: 255, # None
159: 255, # None
160: 253, # '\xa0'
161: 233, # '΅'
162: 61, # 'Ά'
163: 253, # '£'
164: 253, # '¤'
165: 253, # '¥'
166: 253, # '¦'
167: 253, # '§'
168: 253, # '¨'
169: 253, # '©'
170: 253, # None
171: 253, # '«'
172: 253, # '¬'
173: 74, # '\xad'
174: 253, # '®'
175: 253, # '―'
176: 253, # '°'
177: 253, # '±'
178: 253, # '²'
179: 253, # '³'
180: 247, # '΄'
181: 253, # 'µ'
182: 253, # '¶'
183: 36, # '·'
184: 46, # 'Έ'
185: 71, # 'Ή'
186: 73, # 'Ί'
187: 253, # '»'
188: 54, # 'Ό'
189: 253, # '½'
190: 108, # 'Ύ'
191: 123, # 'Ώ'
192: 110, # 'ΐ'
193: 31, # 'Α'
194: 51, # 'Β'
195: 43, # 'Γ'
196: 41, # 'Δ'
197: 34, # 'Ε'
198: 91, # 'Ζ'
199: 40, # 'Η'
200: 52, # 'Θ'
201: 47, # 'Ι'
202: 44, # 'Κ'
203: 53, # 'Λ'
204: 38, # 'Μ'
205: 49, # 'Ν'
206: 59, # 'Ξ'
207: 39, # 'Ο'
208: 35, # 'Π'
209: 48, # 'Ρ'
210: 250, # None
211: 37, # 'Σ'
212: 33, # 'Τ'
213: 45, # 'Υ'
214: 56, # 'Φ'
215: 50, # 'Χ'
216: 84, # 'Ψ'
217: 57, # 'Ω'
218: 120, # 'Ϊ'
219: 121, # 'Ϋ'
220: 17, # 'ά'
221: 18, # 'έ'
222: 22, # 'ή'
223: 15, # 'ί'
224: 124, # 'ΰ'
225: 1, # 'α'
226: 29, # 'β'
227: 20, # 'γ'
228: 21, # 'δ'
229: 3, # 'ε'
230: 32, # 'ζ'
231: 13, # 'η'
232: 25, # 'θ'
233: 5, # 'ι'
234: 11, # 'κ'
235: 16, # 'λ'
236: 10, # 'μ'
237: 6, # 'ν'
238: 30, # 'ξ'
239: 4, # 'ο'
240: 9, # 'π'
241: 8, # 'ρ'
242: 14, # 'ς'
243: 7, # 'σ'
244: 2, # 'τ'
245: 12, # 'υ'
246: 28, # 'φ'
247: 23, # 'χ'
248: 42, # 'ψ'
249: 24, # 'ω'
250: 64, # 'ϊ'
251: 75, # 'ϋ'
252: 19, # 'ό'
253: 26, # 'ύ'
254: 27, # 'ώ'
255: 253, # None
}
WINDOWS_1253_GREEK_MODEL = SingleByteCharSetModel(
charset_name="windows-1253",
language="Greek",
char_to_order_map=WINDOWS_1253_GREEK_CHAR_TO_ORDER,
language_model=GREEK_LANG_MODEL,
typical_positive_ratio=0.982851,
keep_ascii_letters=False,
alphabet="ΆΈΉΊΌΎΏΑΒΓΔΕΖΗΘΙΚΛΜΝΞΟΠΡΣΤΥΦΧΨΩάέήίαβγδεζηθικλμνξοπρςστυφχψωόύώ",
)
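# Illustrative sketch (editorial addition, not part of the generated module):
# how a char-to-order map and the language model combine to score raw bytes,
# loosely following chardet's SingleByteCharSetProber. The function name and
# the single-threshold scoring below are our own simplifications.
def _positive_bigram_ratio(data, char_to_order_map, language_model):
    hits = 0
    positives = 0
    prev_order = 255  # orders >= 64 fall outside the model and break the chain
    for byte in data:
        order = char_to_order_map.get(byte, 255)
        if prev_order < 64 and order < 64:
            hits += 1
            # Category 3 marks the most frequent bigrams (see note above).
            if language_model.get(prev_order, {}).get(order, 0) == 3:
                positives += 1
        prev_order = order
    # For genuine Greek text this ratio should approach the
    # typical_positive_ratio (~0.9829) declared on the models.
    return positives / hits if hits else 0.0

# Example:
#     _positive_bigram_ratio(raw_bytes, WINDOWS_1253_GREEK_CHAR_TO_ORDER,
#                            GREEK_LANG_MODEL)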
ISO_8859_7_GREEK_CHAR_TO_ORDER = {
0: 255, # '\x00'
1: 255, # '\x01'
2: 255, # '\x02'
3: 255, # '\x03'
4: 255, # '\x04'
5: 255, # '\x05'
6: 255, # '\x06'
7: 255, # '\x07'
8: 255, # '\x08'
9: 255, # '\t'
10: 254, # '\n'
11: 255, # '\x0b'
12: 255, # '\x0c'
13: 254, # '\r'
14: 255, # '\x0e'
15: 255, # '\x0f'
16: 255, # '\x10'
17: 255, # '\x11'
18: 255, # '\x12'
19: 255, # '\x13'
20: 255, # '\x14'
21: 255, # '\x15'
22: 255, # '\x16'
23: 255, # '\x17'
24: 255, # '\x18'
25: 255, # '\x19'
26: 255, # '\x1a'
27: 255, # '\x1b'
28: 255, # '\x1c'
29: 255, # '\x1d'
30: 255, # '\x1e'
31: 255, # '\x1f'
32: 253, # ' '
33: 253, # '!'
34: 253, # '"'
35: 253, # '#'
36: 253, # '$'
37: 253, # '%'
38: 253, # '&'
39: 253, # "'"
40: 253, # '('
41: 253, # ')'
42: 253, # '*'
43: 253, # '+'
44: 253, # ','
45: 253, # '-'
46: 253, # '.'
47: 253, # '/'
48: 252, # '0'
49: 252, # '1'
50: 252, # '2'
51: 252, # '3'
52: 252, # '4'
53: 252, # '5'
54: 252, # '6'
55: 252, # '7'
56: 252, # '8'
57: 252, # '9'
58: 253, # ':'
59: 253, # ';'
60: 253, # '<'
61: 253, # '='
62: 253, # '>'
63: 253, # '?'
64: 253, # '@'
65: 82, # 'A'
66: 100, # 'B'
67: 104, # 'C'
68: 94, # 'D'
69: 98, # 'E'
70: 101, # 'F'
71: 116, # 'G'
72: 102, # 'H'
73: 111, # 'I'
74: 187, # 'J'
75: 117, # 'K'
76: 92, # 'L'
77: 88, # 'M'
78: 113, # 'N'
79: 85, # 'O'
80: 79, # 'P'
81: 118, # 'Q'
82: 105, # 'R'
83: 83, # 'S'
84: 67, # 'T'
85: 114, # 'U'
86: 119, # 'V'
87: 95, # 'W'
88: 99, # 'X'
89: 109, # 'Y'
90: 188, # 'Z'
91: 253, # '['
92: 253, # '\\'
93: 253, # ']'
94: 253, # '^'
95: 253, # '_'
96: 253, # '`'
97: 72, # 'a'
98: 70, # 'b'
99: 80, # 'c'
100: 81, # 'd'
101: 60, # 'e'
102: 96, # 'f'
103: 93, # 'g'
104: 89, # 'h'
105: 68, # 'i'
106: 120, # 'j'
107: 97, # 'k'
108: 77, # 'l'
109: 86, # 'm'
110: 69, # 'n'
111: 55, # 'o'
112: 78, # 'p'
113: 115, # 'q'
114: 65, # 'r'
115: 66, # 's'
116: 58, # 't'
117: 76, # 'u'
118: 106, # 'v'
119: 103, # 'w'
120: 87, # 'x'
121: 107, # 'y'
122: 112, # 'z'
123: 253, # '{'
124: 253, # '|'
125: 253, # '}'
126: 253, # '~'
127: 253, # '\x7f'
128: 255, # '\x80'
129: 255, # '\x81'
130: 255, # '\x82'
131: 255, # '\x83'
132: 255, # '\x84'
133: 255, # '\x85'
134: 255, # '\x86'
135: 255, # '\x87'
136: 255, # '\x88'
137: 255, # '\x89'
138: 255, # '\x8a'
139: 255, # '\x8b'
140: 255, # '\x8c'
141: 255, # '\x8d'
142: 255, # '\x8e'
143: 255, # '\x8f'
144: 255, # '\x90'
145: 255, # '\x91'
146: 255, # '\x92'
147: 255, # '\x93'
148: 255, # '\x94'
149: 255, # '\x95'
150: 255, # '\x96'
151: 255, # '\x97'
152: 255, # '\x98'
153: 255, # '\x99'
154: 255, # '\x9a'
155: 255, # '\x9b'
156: 255, # '\x9c'
157: 255, # '\x9d'
158: 255, # '\x9e'
159: 255, # '\x9f'
160: 253, # '\xa0'
161: 233, # '‘'
162: 90, # '’'
163: 253, # '£'
164: 253, # '€'
165: 253, # '₯'
166: 253, # '¦'
167: 253, # '§'
168: 253, # '¨'
169: 253, # '©'
170: 253, # 'ͺ'
171: 253, # '«'
172: 253, # '¬'
173: 74, # '\xad'
174: 253, # None
175: 253, # '―'
176: 253, # '°'
177: 253, # '±'
178: 253, # '²'
179: 253, # '³'
180: 247, # '΄'
181: 248, # '΅'
182: 61, # 'Ά'
183: 36, # '·'
184: 46, # 'Έ'
185: 71, # 'Ή'
186: 73, # 'Ί'
187: 253, # '»'
188: 54, # 'Ό'
189: 253, # '½'
190: 108, # 'Ύ'
191: 123, # 'Ώ'
192: 110, # 'ΐ'
193: 31, # 'Α'
194: 51, # 'Β'
195: 43, # 'Γ'
196: 41, # 'Δ'
197: 34, # 'Ε'
198: 91, # 'Ζ'
199: 40, # 'Η'
200: 52, # 'Θ'
201: 47, # 'Ι'
202: 44, # 'Κ'
203: 53, # 'Λ'
204: 38, # 'Μ'
205: 49, # 'Ν'
206: 59, # 'Ξ'
207: 39, # 'Ο'
208: 35, # 'Π'
209: 48, # 'Ρ'
210: 250, # None
211: 37, # 'Σ'
212: 33, # 'Τ'
213: 45, # 'Υ'
214: 56, # 'Φ'
215: 50, # 'Χ'
216: 84, # 'Ψ'
217: 57, # 'Ω'
218: 120, # 'Ϊ'
219: 121, # 'Ϋ'
220: 17, # 'ά'
221: 18, # 'έ'
222: 22, # 'ή'
223: 15, # 'ί'
224: 124, # 'ΰ'
225: 1, # 'α'
226: 29, # 'β'
227: 20, # 'γ'
228: 21, # 'δ'
229: 3, # 'ε'
230: 32, # 'ζ'
231: 13, # 'η'
232: 25, # 'θ'
233: 5, # 'ι'
234: 11, # 'κ'
235: 16, # 'λ'
236: 10, # 'μ'
237: 6, # 'ν'
238: 30, # 'ξ'
239: 4, # 'ο'
240: 9, # 'π'
241: 8, # 'ρ'
242: 14, # 'ς'
243: 7, # 'σ'
244: 2, # 'τ'
245: 12, # 'υ'
246: 28, # 'φ'
247: 23, # 'χ'
248: 42, # 'ψ'
249: 24, # 'ω'
250: 64, # 'ϊ'
251: 75, # 'ϋ'
252: 19, # 'ό'
253: 26, # 'ύ'
254: 27, # 'ώ'
255: 253, # None
}
ISO_8859_7_GREEK_MODEL = SingleByteCharSetModel(
charset_name="ISO-8859-7",
language="Greek",
char_to_order_map=ISO_8859_7_GREEK_CHAR_TO_ORDER,
language_model=GREEK_LANG_MODEL,
typical_positive_ratio=0.982851,
keep_ascii_letters=False,
alphabet="ΆΈΉΊΌΎΏΑΒΓΔΕΖΗΘΙΚΛΜΝΞΟΠΡΣΤΥΦΧΨΩάέήίαβγδεζηθικλμνξοπρςστυφχψωόύώ",
)
| 21.491587 | 79 | 0.196921 | 12,890 | 94,520 | 1.448875 | 0.036385 | 0.010923 | 0.013065 | 0.016331 | 0.944367 | 0.934461 | 0.930392 | 0.926376 | 0.919415 | 0.919415 | 0 | 0.335899 | 0.57117 | 94,520 | 4,397 | 80 | 21.496475 | 0.121901 | 0.187643 | 0 | 0.96484 | 0 | 0 | 0.00218 | 0.001738 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.000228 | 0 | 0.000228 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
81e522499dfea74d16a534c69e343454abaa4b34 | 13 | py | Python | demo/change_bright.py | MikeShuang96/My_Project_New | a114218416eca1db5b36343f255b56601cac0b80 | [
"Apache-2.0"
] | 2 | 2019-03-13T10:11:37.000Z | 2020-05-05T10:14:01.000Z | demo/change_bright.py | MikeShuang96/My_Project_New | a114218416eca1db5b36343f255b56601cac0b80 | [
"Apache-2.0"
] | null | null | null | demo/change_bright.py | MikeShuang96/My_Project_New | a114218416eca1db5b36343f255b56601cac0b80 | [
"Apache-2.0"
] | 1 | 2019-02-25T06:58:27.000Z | 2019-02-25T06:58:27.000Z | import cv2
| 3.25 | 10 | 0.692308 | 2 | 13 | 4.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 0.307692 | 13 | 3 | 11 | 4.333333 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
c495cc9d1f757fd8dbdc333c11c76421c5e4573b | 127 | py | Python | marketing/views.py | xNovax/RoomScout | 287240a9d13f2b8f6ce9abdc95cf611671970fc3 | [
"MIT"
] | 24 | 2020-02-01T17:22:47.000Z | 2020-10-24T19:49:36.000Z | marketing/views.py | xNovax/RoomScout | 287240a9d13f2b8f6ce9abdc95cf611671970fc3 | [
"MIT"
] | 16 | 2020-02-01T14:30:15.000Z | 2020-08-13T20:49:56.000Z | marketing/views.py | aaronspindler/RoomScout | 287240a9d13f2b8f6ce9abdc95cf611671970fc3 | [
"MIT"
] | 6 | 2020-02-01T22:07:46.000Z | 2021-03-05T14:05:27.000Z | from django.shortcuts import render
def marketing_roommates(request):
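    """Render the roommates marketing page (marketing/roommates.html)."""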
return render(request, 'marketing/roommates.html')
| 21.166667 | 54 | 0.795276 | 15 | 127 | 6.666667 | 0.733333 | 0.36 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.11811 | 127 | 5 | 55 | 25.4 | 0.892857 | 0 | 0 | 0 | 0 | 0 | 0.188976 | 0.188976 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
c49ba3b0a56945b46530ba7efb5544e15d5eb272 | 9,863 | py | Python | src/target_driven_method/networks/target_driven_navigation_networks.py | aidkilda/understanding-drl-navigation | 0d637c2390a935ec1182d4f2d5165644d98d6404 | [
"MIT"
] | null | null | null | src/target_driven_method/networks/target_driven_navigation_networks.py | aidkilda/understanding-drl-navigation | 0d637c2390a935ec1182d4f2d5165644d98d6404 | [
"MIT"
] | null | null | null | src/target_driven_method/networks/target_driven_navigation_networks.py | aidkilda/understanding-drl-navigation | 0d637c2390a935ec1182d4f2d5165644d98d6404 | [
"MIT"
] | null | null | null | import tensorflow as tf
import numpy as np
from constants import ACTION_SIZE
from constants import HISTORY_LENGTH
from constants import HIDDEN_NEURONS
from target_driven_method.networks.network import ActorCriticNetwork
class TargetDrivenFFNetwork(ActorCriticNetwork):
"""Implementation of the target-driven deep siamese actor-critic network from [Zhu et al., ICRA 2017],
without scene-specific layers.
"""
def __init__(self,
input_size,
device="/cpu:0",
network_scope="network"):
ActorCriticNetwork.__init__(self, ACTION_SIZE, device, network_scope)
with tf.device(self._device):
with tf.variable_scope(network_scope):
with tf.variable_scope("input_layer"):
# state (input)
self.s = tf.placeholder("float", [None, input_size, HISTORY_LENGTH], name="state")
# target (input)
self.t = tf.placeholder("float", [None, input_size, HISTORY_LENGTH], name="target")
# flatten input
self.s_flat = tf.reshape(self.s, [-1, input_size * HISTORY_LENGTH], name="state_flat")
self.t_flat = tf.reshape(self.t, [-1, input_size * HISTORY_LENGTH], name="target_flat")
with tf.variable_scope("shared_siamese_layer"):
self.W_fc1, self.b_fc1 = \
self._fc_variable([input_size * HISTORY_LENGTH, HIDDEN_NEURONS], name="shared_siamese")
h_s_flat = tf.nn.relu(tf.matmul(self.s_flat, self.W_fc1) + self.b_fc1,
name="shared_siamese_state_flat")
h_t_flat = tf.nn.relu(tf.matmul(self.t_flat, self.W_fc1) + self.b_fc1,
name="shared_siamese_target_flat")
h_fc1 = tf.concat(values=[h_s_flat, h_t_flat], axis=1, name="h_shared_siamese")
self.observation_embedding = h_s_flat
with tf.variable_scope("shared_fusion_layer"):
self.W_fc2, self.b_fc2 = \
self._fc_variable([2 * HIDDEN_NEURONS, HIDDEN_NEURONS], name="shared_fusion")
h_fc2 = tf.nn.relu(tf.matmul(h_fc1, self.W_fc2) + self.b_fc2, name="h_shared_fusion")
with tf.variable_scope("fc_layer_3"):
                    self.W_fc3, self.b_fc3 = self._fc_variable([HIDDEN_NEURONS, HIDDEN_NEURONS], name="fc3")
h_fc3 = tf.nn.relu(tf.matmul(h_fc2, self.W_fc3) + self.b_fc3, name="h_fc3")
self.scene_specific_layer = h_fc3
with tf.variable_scope("policy_output_layer"):
self.W_policy, self.b_policy = self._fc_variable([HIDDEN_NEURONS, ACTION_SIZE], name="policy")
# policy (output)
pi_ = tf.matmul(h_fc3, self.W_policy) + self.b_policy
self.pi_ = pi_
self.pi = tf.nn.softmax(pi_, name="pi")
with tf.variable_scope("value_output_layer"):
self.W_value, self.b_value = self._fc_variable([HIDDEN_NEURONS, 1], name="value")
# value (output)
v_ = tf.matmul(h_fc3, self.W_value) + self.b_value
self.v = tf.reshape(v_, [-1], name="value")
def run_policy_and_value(self, sess, state, target):
pi_out, v_out = sess.run(
[self.pi, self.v],
feed_dict={
self.s: [state],
self.t: [target]
})
return pi_out[0], v_out[0]
def run_policy(self, sess, state, target):
pi_out = sess.run(
self.pi, feed_dict={
self.s: [state],
self.t: [target]
})
return pi_out[0]
def run_value(self, sess, state, target):
v_out = sess.run(
self.v, feed_dict={
self.s: [state],
self.t: [target]
})
return v_out[0]
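# Illustrative usage sketch (ours, not part of the original module): one
# inference step with the feed-forward network. Assumes a TF1-style session
# whose variables are already initialised; the zero-filled arrays stand in
# for real observation/target inputs of shape (input_size, HISTORY_LENGTH).
def _example_policy_step(sess, network, input_size):
    state = np.zeros((input_size, HISTORY_LENGTH), dtype=np.float32)
    target = np.zeros((input_size, HISTORY_LENGTH), dtype=np.float32)
    pi, value = network.run_policy_and_value(sess, state, target)
    # Sample an action from the softmax policy output.
    action = np.random.choice(ACTION_SIZE, p=pi)
    return action, value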
class TargetDrivenLSTMNetwork(ActorCriticNetwork):
"""Implementation of the target-driven deep siamese actor-critic network from [Zhu et al., ICRA 2017],
without scene-specific layers and with added LSTM layer.
"""
def __init__(self,
input_size,
device="/cpu:0",
network_scope="network"):
ActorCriticNetwork.__init__(self, ACTION_SIZE, device, network_scope)
with tf.device(self._device):
            with tf.variable_scope(network_scope):
with tf.variable_scope("input_layer"):
# state (input)
self.s = tf.placeholder("float", [None, input_size, HISTORY_LENGTH], name="state")
# target (input)
self.t = tf.placeholder("float", [None, input_size, HISTORY_LENGTH], name="target")
# flatten input
self.s_flat = tf.reshape(self.s, [-1, input_size * HISTORY_LENGTH], name="state_flat")
self.t_flat = tf.reshape(self.t, [-1, input_size * HISTORY_LENGTH], name="target_flat")
with tf.variable_scope("shared_siamese_layer"):
self.W_fc1, self.b_fc1 = \
self._fc_variable([input_size * HISTORY_LENGTH, HIDDEN_NEURONS], name="shared_siamese")
h_s_flat = tf.nn.relu(tf.matmul(self.s_flat, self.W_fc1) + self.b_fc1,
name="shared_siamese_state_flat")
h_t_flat = tf.nn.relu(tf.matmul(self.t_flat, self.W_fc1) + self.b_fc1,
name="shared_siamese_target_flat")
h_fc1 = tf.concat(values=[h_s_flat, h_t_flat], axis=1, name="h_shared_siamese")
self.observation_embedding = h_s_flat
with tf.variable_scope("shared_fusion_layer"):
self.W_fc2, self.b_fc2 = \
self._fc_variable([2 * HIDDEN_NEURONS, HIDDEN_NEURONS], name="shared_fusion")
h_fc2 = tf.nn.relu(tf.matmul(h_fc1, self.W_fc2) + self.b_fc2, name="h_shared_fusion")
                    h_fc_2_reshaped = tf.reshape(h_fc2, [1, -1, HIDDEN_NEURONS])
with tf.variable_scope("lstm_layer"):
self.lstm = tf.contrib.rnn.BasicLSTMCell(HIDDEN_NEURONS, state_is_tuple=True)
self.step_size = tf.placeholder(tf.float32, [1])
self.initial_lstm_state0 = tf.placeholder(tf.float32, [1, HIDDEN_NEURONS])
self.initial_lstm_state1 = tf.placeholder(tf.float32, [1, HIDDEN_NEURONS])
self.initial_lstm_state = tf.contrib.rnn.LSTMStateTuple(self.initial_lstm_state0,
self.initial_lstm_state1)
lstm_outputs, self.lstm_state = tf.nn.dynamic_rnn(self.lstm,
h_fc_2_reshaped,
initial_state=self.initial_lstm_state,
sequence_length=self.step_size,
time_major=False)
                    h_fc3 = tf.reshape(lstm_outputs, [-1, HIDDEN_NEURONS])
with tf.variable_scope("policy_output_layer"):
self.W_policy, self.b_policy = self._fc_variable([HIDDEN_NEURONS, ACTION_SIZE], name="policy")
# policy (output)
pi_ = tf.matmul(h_fc3, self.W_policy) + self.b_policy
self.pi_ = pi_
self.pi = tf.nn.softmax(pi_, name="pi")
with tf.variable_scope("value_output_layer"):
self.W_value, self.b_value = self._fc_variable([HIDDEN_NEURONS, 1], name="value")
# value (output)
v_ = tf.matmul(h_fc3, self.W_value) + self.b_value
self.v = tf.reshape(v_, [-1], name="value")
self.reset_state()
def reset_state(self):
self.lstm_state_out = tf.contrib.rnn.LSTMStateTuple(np.zeros([1, HIDDEN_NEURONS]),
np.zeros([1, HIDDEN_NEURONS]))
def run_policy_and_value(self, sess, state, target):
pi_out, v_out, self.lstm_state_out = sess.run(
[self.pi, self.v, self.lstm_state],
feed_dict={
self.s: [state],
self.t: [target],
self.initial_lstm_state0: self.lstm_state_out[0],
self.initial_lstm_state1: self.lstm_state_out[1],
self.step_size: [1]
})
return pi_out[0], v_out[0]
def run_policy(self, sess, state, target):
pi_out, self.lstm_state_out = sess.run(
[self.pi, self.lstm_state],
feed_dict={
self.s: [state],
self.t: [target],
self.initial_lstm_state0: self.lstm_state_out[0],
self.initial_lstm_state1: self.lstm_state_out[1],
self.step_size: [1]
})
return pi_out[0]
def run_value(self, sess, state, target):
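        # Value-only evaluation is used for bootstrapping at the end of a
        # rollout, so the recurrent state is saved here and restored below to
        # leave the policy's LSTM state untouched.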
prev_lstm_state_out = self.lstm_state_out
v_out, _ = sess.run(
[self.v, self.lstm_state],
feed_dict={
self.s: [state],
self.t: [target],
self.initial_lstm_state0: self.lstm_state_out[0],
self.initial_lstm_state1: self.lstm_state_out[1],
self.step_size: [1]
})
self.lstm_state_out = prev_lstm_state_out
return v_out[0] | 45.662037 | 114 | 0.53797 | 1,174 | 9,863 | 4.208688 | 0.099659 | 0.052621 | 0.039466 | 0.053835 | 0.836268 | 0.803279 | 0.788302 | 0.776159 | 0.776159 | 0.776159 | 0 | 0.016861 | 0.356585 | 9,863 | 216 | 115 | 45.662037 | 0.76174 | 0.044206 | 0 | 0.703226 | 0 | 0 | 0.060277 | 0.010863 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058065 | false | 0 | 0.03871 | 0 | 0.148387 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c4a03803b3666718fd51de577b192e217d0f8d65 | 79 | py | Python | mcconf/Bukkit.py | OmniTroid/mcconf | b248f0fce911b02e4cc8213c1b0e0ddc0105b6f8 | [
"MIT"
] | 1 | 2022-03-03T12:17:40.000Z | 2022-03-03T12:17:40.000Z | mcconf/Bukkit.py | OmniTroid/mcconf | b248f0fce911b02e4cc8213c1b0e0ddc0105b6f8 | [
"MIT"
] | null | null | null | mcconf/Bukkit.py | OmniTroid/mcconf | b248f0fce911b02e4cc8213c1b0e0ddc0105b6f8 | [
"MIT"
] | null | null | null | #TODO: implement
from .Provider import Provider
class Bukkit(Provider):
pass | 13.166667 | 30 | 0.78481 | 10 | 79 | 6.2 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.139241 | 79 | 6 | 31 | 13.166667 | 0.911765 | 0.189873 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
f200345f4d675826ea387ec756c09df773283aaf | 5,464 | py | Python | tests/test_riso8601.py | suhailpatel/riso8601 | e92611e034f0c9de56b04780a138dc1bf5a81ebf | [
"MIT"
] | 2 | 2020-07-06T09:11:05.000Z | 2020-12-11T12:53:11.000Z | tests/test_riso8601.py | suhailpatel/riso8601 | e92611e034f0c9de56b04780a138dc1bf5a81ebf | [
"MIT"
] | null | null | null | tests/test_riso8601.py | suhailpatel/riso8601 | e92611e034f0c9de56b04780a138dc1bf5a81ebf | [
"MIT"
] | null | null | null | import datetime
from datetime import timedelta, timezone
import pytest
import riso8601
def test_naive_times():
expected = datetime.datetime(2014, 1, 9, 21, 48)
assert expected == riso8601.parse_datetime("20140109T2148")
assert expected == riso8601.parse_datetime("2014-01-09T2148")
assert expected == riso8601.parse_datetime("20140109T21:48")
assert expected == riso8601.parse_datetime("2014-01-09T21:48")
def test_naive_times_with_seconds():
expected = datetime.datetime(2014, 1, 9, 21, 48, 53)
assert expected == riso8601.parse_datetime("20140109T214853")
assert expected == riso8601.parse_datetime("2014-01-09T214853")
assert expected == riso8601.parse_datetime("20140109T21:48:53")
assert expected == riso8601.parse_datetime("2014-01-09T21:48:53")
assert expected == riso8601.parse_datetime("2014-01-09T21:4853") # this one might be legally invalid?
assert expected == riso8601.parse_datetime("2014-01-09T2148:53") # this one might be legally invalid?
def test_naive_times_with_microseconds():
expected = datetime.datetime(2014, 1, 9, 21, 48, 53, 0)
assert expected == riso8601.parse_datetime("20140109T214853.000000")
assert expected == riso8601.parse_datetime("2014-01-09T214853.0")
assert expected == riso8601.parse_datetime("20140109T21:48:53.0")
assert expected == riso8601.parse_datetime("2014-01-09T21:48:53.000")
assert expected == riso8601.parse_datetime("2014-01-09T21:4853.0000") # this one might be legally invalid?
assert expected == riso8601.parse_datetime("2014-01-09T2148:53.00000") # this one might be legally invalid?
def test_edge_case_times():
assert datetime.datetime(2020, 1, 1, 0, 0) == riso8601.parse_datetime("2020-01-01T00:00")
assert datetime.datetime(2020, 1, 1, 0, 0, 0) == riso8601.parse_datetime("2020-01-01T00:00:00")
assert datetime.datetime(2020, 1, 1, 0, 0, 0, 0) == riso8601.parse_datetime("2020-01-01T00:00:00.000000")
assert datetime.datetime(2020, 12, 31, 23, 59) == riso8601.parse_datetime("2020-12-31T23:59")
assert datetime.datetime(2020, 12, 31, 23, 59, 59) == riso8601.parse_datetime("2020-12-31T23:59:59")
assert datetime.datetime(2020, 12, 31, 23, 59, 59, 999999) == riso8601.parse_datetime("2020-12-31T23:59:59.999999")
def test_tz_times():
assert datetime.datetime(2014, 1, 9, 21, 48, tzinfo=timezone.utc) == riso8601.parse_datetime("2014-01-09T21:48Z")
tz = timezone(timedelta(seconds=43200))
assert datetime.datetime(2014, 1, 9, 21, 48, tzinfo=tz) == riso8601.parse_datetime("2014-01-09T21:48+12")
assert datetime.datetime(2014, 1, 9, 21, 48, 30, tzinfo=tz) == riso8601.parse_datetime("2014-01-09T21:48:30+12")
assert datetime.datetime(2014, 1, 9, 21, 48, 30, 99999,
tzinfo=tz) == riso8601.parse_datetime("2014-01-09T21:48:30.99999+12")
assert datetime.datetime(2014, 1, 9, 21, 48, 30, 99999,
tzinfo=tz) == riso8601.parse_datetime("2014-01-09T21:48:30.99999+12:00")
tz = timezone(timedelta(seconds=-43200))
assert datetime.datetime(2014, 1, 9, 21, 48, tzinfo=tz) == riso8601.parse_datetime("2014-01-09T21:48-12")
assert datetime.datetime(2014, 1, 9, 21, 48, 30, tzinfo=tz) == riso8601.parse_datetime("2014-01-09T21:48:30-12")
assert datetime.datetime(2014, 1, 9, 21, 48, 30, 99999,
tzinfo=tz) == riso8601.parse_datetime("2014-01-09T21:48:30.99999-12")
assert datetime.datetime(2014, 1, 9, 21, 48, 30, 99999,
tzinfo=tz) == riso8601.parse_datetime("2014-01-09T21:48:30.99999-12:00")
tz = timezone(timedelta(seconds=1800))
assert datetime.datetime(2014, 1, 9, 21, 48, tzinfo=tz) == riso8601.parse_datetime("2014-01-09T21:48+00:30")
assert datetime.datetime(2014, 1, 9, 21, 48, 30, 99999,
tzinfo=tz) == riso8601.parse_datetime("2014-01-09T21:48:30.99999+00:30")
tz = timezone(timedelta(seconds=-1800))
assert datetime.datetime(2014, 1, 9, 21, 48, tzinfo=tz) == riso8601.parse_datetime("2014-01-09T21:48-00:30")
assert datetime.datetime(2014, 1, 9, 21, 48, 30, 99999,
tzinfo=tz) == riso8601.parse_datetime("2014-01-09T21:48:30.99999-00:30")
tz = timezone(timedelta(seconds=0))
assert datetime.datetime(2014, 1, 9, 21, 48, tzinfo=tz) == riso8601.parse_datetime("2014-01-09T21:48-00")
assert datetime.datetime(2014, 1, 9, 21, 48, 30, 99999,
tzinfo=tz) == riso8601.parse_datetime("2014-01-09T21:48:30.99999+00")
assert datetime.datetime(2014, 1, 9, 21, 48, tzinfo=tz) == riso8601.parse_datetime("2014-01-09T21:48-00:00")
assert datetime.datetime(2014, 1, 9, 21, 48, 30, 99999,
tzinfo=tz) == riso8601.parse_datetime("2014-01-09T21:48:30.99999+00:00")
def test_invalid_times():
invalids = [
"-5000-01-01T00:00:00", # bad year
"2014-00-01T00:00:00", # bad month
"2014-13-01T00:00:00", # bad month
"2014-10-00T00:00:00", # bad day
"2014-10-32T00:00:00", # bad day
"2014-10-32T24:00:00", # bad hour
"2014-10-32T00:60:00", # bad minute
"2014-10-32T00:00:60", # bad second
"2014-10-32T00:00", # no second
"2014-10-32T00:00:00.", # no microsecond
]
for dt_str in invalids:
with pytest.raises(Exception):
riso8601.parse_datetime(dt_str)
| 52.538462 | 119 | 0.661786 | 788 | 5,464 | 4.513959 | 0.106599 | 0.158561 | 0.236154 | 0.189767 | 0.884734 | 0.862243 | 0.813888 | 0.790554 | 0.63621 | 0.567051 | 0 | 0.292304 | 0.186676 | 5,464 | 103 | 120 | 53.048544 | 0.508101 | 0.043924 | 0 | 0.1 | 0 | 0 | 0.196967 | 0.094644 | 0 | 0 | 0 | 0 | 0.4875 | 1 | 0.075 | false | 0 | 0.05 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
f215f579c300b1e7273cf7b6b5bd3c094ecaa980 | 34,778 | py | Python | ddganAE/wandb/train_wandb_pred.py | acse-zrw20/DD-GAN-AE | 4b362b31c55c95a63156ed58320ed75bc7473cc0 | [
"MIT"
] | 1 | 2021-12-27T06:14:32.000Z | 2021-12-27T06:14:32.000Z | ddganAE/wandb/train_wandb_pred.py | acse-xl620/DD-GAN-AE | 4b362b31c55c95a63156ed58320ed75bc7473cc0 | [
"MIT"
] | null | null | null | ddganAE/wandb/train_wandb_pred.py | acse-xl620/DD-GAN-AE | 4b362b31c55c95a63156ed58320ed75bc7473cc0 | [
"MIT"
] | 3 | 2021-08-05T11:17:37.000Z | 2021-09-02T02:37:44.000Z | """
Functions used for weights and biases hyperparameter optimization of
predictive models on slug flow dataset.
"""
import wandb
import tensorflow as tf
import argparse
import os
import json
import keras
from sklearn.preprocessing import MinMaxScaler
from ddganAE.models import Predictive_adversarial, Predictive
from ddganAE.architectures.svdae import (
build_vinicius_encoder_decoder,
build_slimmer_vinicius_encoder_decoder,
build_smaller_vinicius_encoder_decoder,
build_dense_decoder,
build_deeper_dense_encoder,
build_dense_encoder,
build_slimmer_dense_decoder,
build_wider_dense_decoder,
build_wider_dense_encoder,
build_deeper_dense_decoder,
build_slimmer_dense_encoder,
)
from ddganAE.architectures.discriminators import (
build_custom_discriminator,
build_custom_wider_discriminator
)
from ddganAE.wandb.get_snapshots_3d_continuous import \
get_snapshots_3D
import numpy as np
__author__ = "Zef Wolffs"
__credits__ = []
__license__ = "MIT"
__version__ = "1.0.0"
__maintainer__ = "Zef Wolffs"
__email__ = "zefwolffs@gmail.com"
__status__ = "Development"
def train_wandb_pred_aae(config=None):
"""
Construct and subsequently train the model while reporting losses to
weights and biases platform. Weights and biases also controls
hyperparameters.
Args:
config (dict, optional): Dictionary with hyperparameters, set by
weights and biases. Defaults to None.
"""
with wandb.init(config=config, tags=["central_doms_pred_mse"]):
# If called by wandb.agent, as below,
# this config will be set by Sweep Controller
config = wandb.config
# Data processing
latent_vars = np.load(config.datafile)
nfiles = int(latent_vars.shape[0]/config.domains)
latent_vars_reshaped = np.moveaxis(
latent_vars.reshape(nfiles, config.domains, config.in_vars),
0, 2)
train_data = latent_vars_reshaped[:config.domains]
# Scaling the latent variables
scaler = MinMaxScaler((-1, 1))
train_data = scaler.fit_transform(
train_data.reshape(-1, 1)).reshape(train_data.shape)
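# MinMaxScaler scales per column, so the latent tensor is flattened to a single
# column here to obtain one global (-1, 1) scaling and then reshaped back.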
initializer = tf.keras.initializers.RandomNormal(
mean=0.0, stddev=0.05, seed=None
)
if config.optimizer == "nadam":
optimizer = tf.keras.optimizers.Nadam(
lr=config.learning_rate,
beta_1=config.momentum,
beta_2=config.beta_2,
)
elif config.optimizer == "adam":
optimizer = tf.keras.optimizers.Adam(
lr=config.learning_rate,
beta_1=config.momentum,
beta_2=config.beta_2,
)
elif config.optimizer == "sgd":
optimizer = tf.keras.optimizers.SGD(
learning_rate=config.learning_rate, momentum=config.momentum
)
if config.architecture == "dense":
encoder = build_dense_encoder(
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout,
)
decoder = build_dense_decoder(
config.in_vars,
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout,
final_act=config.final_act
)
elif config.architecture == "deeper_dense":
encoder = build_deeper_dense_encoder(
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout
)
decoder = build_deeper_dense_decoder(
config.in_vars,
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout,
final_act=config.final_act
)
elif config.architecture == "wider_dense":
encoder = build_wider_dense_encoder(
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout
)
decoder = build_wider_dense_decoder(
config.in_vars,
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout,
final_act=config.final_act
)
elif config.architecture == "slimmer_dense":
encoder = build_slimmer_dense_encoder(
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout
)
decoder = build_slimmer_dense_decoder(
config.in_vars,
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout,
final_act=config.final_act
)
elif config.architecture == "vinicius":
encoder, decoder = build_vinicius_encoder_decoder(
config.in_vars,
config.latent_vars,
initializer,
act=config.activation,
dense_act=config.dense_activation,
dropout=config.dropout,
reg=config.regularization,
batchnorm=config.batch_normalization,
final_act=config.final_act
)
elif config.architecture == "smaller_vinicius":
encoder, decoder = build_smaller_vinicius_encoder_decoder(
config.in_vars,
config.latent_vars,
initializer,
act=config.activation,
dense_act=config.dense_activation,
dropout=config.dropout,
reg=config.regularization,
batchnorm=config.batch_normalization,
final_act=config.final_act
)
elif config.architecture == "slimmer_vinicius":
encoder, decoder = build_slimmer_vinicius_encoder_decoder(
config.in_vars,
config.latent_vars,
initializer,
act=config.activation,
dense_act=config.dense_activation,
dropout=config.dropout,
reg=config.regularization,
batchnorm=config.batch_normalization,
final_act=config.final_act
)
if config.discriminator_architecture == "custom":
discriminator = build_custom_discriminator(
config.latent_vars, initializer, info=False
)
elif config.discriminator_architecture == "custom_wider":
discriminator = build_custom_wider_discriminator(
config.latent_vars, initializer, info=False
)
pred_adv = Predictive_adversarial(encoder, decoder, discriminator,
optimizer)
pred_adv.compile(config.in_vars, increment=config.increment)
pred_adv.train(
train_data,
config.epochs,
interval=config.interval,
batch_size=config.batch_size,
val_size=0.1,
wandb_log=True,
noise_std=config.noise_std,
n_discriminator=config.n_discriminator,
n_gradient_ascent=config.n_gradient_ascent
)
# Check how well the model actually performs to also predict the
# results
# Create boundaries and initial values arrays for prediction later
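# Domains 4 and 7 of the (by default ten) subdomains act as the boundary
# conditions and domains 5 and 6 supply the initial values, i.e. the
# prediction is evaluated on the central subdomains only (hence the
# "central_doms" tag on this run).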
boundaries = np.zeros((2, config.in_vars, nfiles))
boundaries[0] = train_data[4]
boundaries[1] = train_data[7]
init_values = np.zeros((2, config.in_vars))
init_values[0] = train_data[5][:, 0]
init_values[1] = train_data[6][:, 0]
predicted = pred_adv.predict(boundaries, init_values,
int(nfiles/config.interval)-1, iters=5)
train_data_int = train_data[4:8, :, ::config.interval]
mse = tf.keras.losses.MeanSquaredError()
mse_pred = mse(predicted[:, :, :int(nfiles/config.interval)-2],
train_data_int[:, :, :int(nfiles/config.interval)-2])\
.numpy()
log = {"prediction_mse": mse_pred}
wandb.log(log)
if config.savemodel:
dirname = "model_" + wandb.run.name
os.mkdir(dirname)
pred_adv.encoder.save(dirname + '/encoder')
pred_adv.decoder.save(dirname + '/decoder')
def train_wandb_pred_ae(config=None):
"""
Construct and subsequently train the model while reporting losses to
weights and biases platform. Weights and biases also controls
hyperparameters.
Args:
config (dict, optional): Dictionary with hyperparameters, set by
weights and biases. Defaults to None.
"""
with wandb.init(config=config, tags=["central_doms_pred_mse"]):
# If called by wandb.agent, as below,
# this config will be set by Sweep Controller
config = wandb.config
# Data processing
latent_vars = np.load(config.datafile)
nfiles = int(latent_vars.shape[0]/config.domains)
latent_vars_reshaped = np.moveaxis(
latent_vars.reshape(nfiles, config.domains, config.in_vars),
0, 2)
train_data = latent_vars_reshaped[:config.domains]
# Scaling the latent variables
scaler = MinMaxScaler((-1, 1))
train_data = scaler.fit_transform(
train_data.reshape(-1, 1)).reshape(train_data.shape)
initializer = tf.keras.initializers.RandomNormal(
mean=0.0, stddev=0.05, seed=None
)
if config.optimizer == "nadam":
optimizer = tf.keras.optimizers.Nadam(
lr=config.learning_rate,
beta_1=config.momentum,
beta_2=config.beta_2,
)
elif config.optimizer == "adam":
optimizer = tf.keras.optimizers.Adam(
lr=config.learning_rate,
beta_1=config.momentum,
beta_2=config.beta_2,
)
elif config.optimizer == "sgd":
optimizer = tf.keras.optimizers.SGD(
learning_rate=config.learning_rate, momentum=config.momentum
)
if config.architecture == "dense":
encoder = build_dense_encoder(
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout,
)
decoder = build_dense_decoder(
config.in_vars,
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout,
final_act=config.final_act
)
elif config.architecture == "deeper_dense":
encoder = build_deeper_dense_encoder(
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout
)
decoder = build_deeper_dense_decoder(
config.in_vars,
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout,
final_act=config.final_act
)
elif config.architecture == "wider_dense":
encoder = build_wider_dense_encoder(
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout
)
decoder = build_wider_dense_decoder(
config.in_vars,
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout,
final_act=config.final_act
)
elif config.architecture == "slimmer_dense":
encoder = build_slimmer_dense_encoder(
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout
)
decoder = build_slimmer_dense_decoder(
config.in_vars,
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout,
final_act=config.final_act
)
elif config.architecture == "vinicius":
encoder, decoder = build_vinicius_encoder_decoder(
config.in_vars,
config.latent_vars,
initializer,
act=config.activation,
dense_act=config.dense_activation,
dropout=config.dropout,
reg=config.regularization,
batchnorm=config.batch_normalization,
final_act=config.final_act
)
elif config.architecture == "smaller_vinicius":
encoder, decoder = build_smaller_vinicius_encoder_decoder(
config.in_vars,
config.latent_vars,
initializer,
act=config.activation,
dense_act=config.dense_activation,
dropout=config.dropout,
reg=config.regularization,
batchnorm=config.batch_normalization,
final_act=config.final_act
)
elif config.architecture == "slimmer_vinicius":
encoder, decoder = build_slimmer_vinicius_encoder_decoder(
config.in_vars,
config.latent_vars,
initializer,
act=config.activation,
dense_act=config.dense_activation,
dropout=config.dropout,
reg=config.regularization,
batchnorm=config.batch_normalization,
final_act=config.final_act
)
pred_adv = Predictive(encoder, decoder,
optimizer)
pred_adv.compile(config.in_vars, increment=config.increment)
pred_adv.train(
train_data,
config.epochs,
interval=config.interval,
batch_size=config.batch_size,
val_size=0.1,
wandb_log=True,
noise_std=config.noise_std
)
# Check how well the model actually performs to also predict the
# results
# Create boundaries and initial values arrays for prediction later
boundaries = np.zeros((2, config.in_vars, nfiles))
boundaries[0] = train_data[4]
boundaries[1] = train_data[7]
init_values = np.zeros((2, config.in_vars))
init_values[0] = train_data[5][:, 0]
init_values[1] = train_data[6][:, 0]
predicted = pred_adv.predict(boundaries, init_values,
int(nfiles/config.interval)-1, iters=5)
train_data_int = train_data[4:8, :, ::config.interval]
mse = tf.keras.losses.MeanSquaredError()
mse_pred = mse(predicted[:, :, :int(nfiles/config.interval)-2],
train_data_int[:, :, :int(nfiles/config.interval)-2])\
.numpy()
log = {"prediction_mse": mse_pred}
wandb.log(log)
if config.savemodel:
dirname = "model_" + wandb.run.name
os.mkdir(dirname)
pred_adv.encoder.save(dirname + '/encoder')
pred_adv.decoder.save(dirname + '/decoder')
def continuous_train_wandb_pred_aae(config=None):
"""
Construct and subsequently train the model while reporting losses to
weights and biases platform. Weights and biases also controls
hyperparameters.
Args:
config (dict, optional): Dictionary with hyperparameters, set by
weights and biases. Defaults to None.
"""
with wandb.init(config=config):
# If called by wandb.agent, as below,
# this config will be set by Sweep Controller
config = wandb.config
initializer = tf.keras.initializers.RandomNormal(
mean=0.0, stddev=0.05, seed=None
)
if config.optimizer == "nadam":
optimizer = tf.keras.optimizers.Nadam(
lr=config.learning_rate,
beta_1=config.momentum,
beta_2=config.beta_2,
)
elif config.optimizer == "adam":
optimizer = tf.keras.optimizers.Adam(
lr=config.learning_rate,
beta_1=config.momentum,
beta_2=config.beta_2,
)
elif config.optimizer == "sgd":
optimizer = tf.keras.optimizers.SGD(
learning_rate=config.learning_rate, momentum=config.momentum
)
if config.architecture == "dense":
encoder = build_dense_encoder(
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout,
)
decoder = build_dense_decoder(
config.in_vars,
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout,
final_act=config.final_act
)
elif config.architecture == "deeper_dense":
encoder = build_deeper_dense_encoder(
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout
)
decoder = build_deeper_dense_decoder(
config.in_vars,
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout,
final_act=config.final_act
)
elif config.architecture == "wider_dense":
encoder = build_wider_dense_encoder(
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout
)
decoder = build_wider_dense_decoder(
config.in_vars,
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout,
final_act=config.final_act
)
elif config.architecture == "slimmer_dense":
encoder = build_slimmer_dense_encoder(
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout
)
decoder = build_slimmer_dense_decoder(
config.in_vars,
config.latent_vars,
initializer,
info=False,
act=config.activation,
dropout=config.dropout,
final_act=config.final_act
)
elif config.architecture == "vinicius":
encoder, decoder = build_vinicius_encoder_decoder(
config.in_vars,
config.latent_vars,
initializer,
act=config.activation,
dense_act=config.dense_activation,
dropout=config.dropout,
reg=config.regularization,
batchnorm=config.batch_normalization,
final_act=config.final_act
)
elif config.architecture == "smaller_vinicius":
encoder, decoder = build_smaller_vinicius_encoder_decoder(
config.in_vars,
config.latent_vars,
initializer,
act=config.activation,
dense_act=config.dense_activation,
dropout=config.dropout,
reg=config.regularization,
batchnorm=config.batch_normalization,
final_act=config.final_act
)
elif config.architecture == "slimmer_vinicius":
encoder, decoder = build_slimmer_vinicius_encoder_decoder(
config.in_vars,
config.latent_vars,
initializer,
act=config.activation,
dense_act=config.dense_activation,
dropout=config.dropout,
reg=config.regularization,
batchnorm=config.batch_normalization,
final_act=config.final_act
)
if config.discriminator_architecture == "custom":
discriminator = build_custom_discriminator(
config.latent_vars, initializer, info=False
)
elif config.discriminator_architecture == "custom_wider":
discriminator = build_custom_wider_discriminator(
config.latent_vars, initializer, info=False
)
pred_adv = Predictive_adversarial(encoder, decoder, discriminator,
optimizer)
pred_adv.compile(config.in_vars, increment=config.increment)
nfiles = 800
for i in range(config.n_epochs):
# Data processing
grids = get_snapshots_3D(nfiles=nfiles,
ndomains=config.domains,
in_file_base=config.datafile,
save=False)
# Clamp the alpha field to [0, 1]: set values below 0 to 0 and values above 1 to 1
grids[:, :, :, :, 3][np.where(grids[:, :, :, :, 3] < 0)] = 0
grids[:, :, :, :, 3][np.where(grids[:, :, :, :, 3] > 1)] = 1
# Rescale all the velocity fields
scaler = MinMaxScaler()
grids[:, :, :, :, :3] = scaler.fit_transform(grids[:, :, :, :, :3]
.reshape(-1, 1))\
.reshape(grids[:, :, :, :, :3].shape)
initializer = tf.keras.initializers.RandomNormal(mean=0.0,
stddev=0.05,
seed=None)
optimizer = tf.keras.optimizers.Adam(lr=0.0005, beta_1=0.98,
beta_2=0.9)
encoder = keras.models.load_model(config.encoder_folder +
"/encoder")
latent_vars = encoder.predict(grids)
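# The pretrained CAE encoder (loaded from --encoder_folder) compresses every
# 3-D snapshot into config.in_vars latent values per subdomain; the predictive
# adversarial model below is then trained entirely in that latent space.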
latent_vars_reshaped = np.moveaxis(
latent_vars.reshape(nfiles, config.domains, config.in_vars),
0, 2)
train_data = latent_vars_reshaped
# Scaling the latent variables
scaler = MinMaxScaler((-1, 1))
train_data = scaler.fit_transform(
train_data.reshape(-1, 1)).reshape(train_data.shape)
# Generate a new set of training data every n epochs
pred_adv.train(
train_data,
config.epochs,
interval=config.interval,
batch_size=config.batch_size,
val_size=0.1,
wandb_log=True,
noise_std=config.noise_std,
n_discriminator=config.n_discriminator,
n_gradient_ascent=config.n_gradient_ascent
)
# Check how well the model actually performs to also predict the
# results
# Create boundaries and initial values arrays for prediction later
boundaries = np.zeros((2, config.in_vars, nfiles))
boundaries[0] = train_data[0]
boundaries[1] = train_data[3]
init_values = np.zeros((2, config.in_vars))
init_values[0] = train_data[1][:, 0]
init_values[1] = train_data[2][:, 0]
predicted = pred_adv.predict(boundaries, init_values,
int(nfiles/config.interval)-1,
iters=5)
train_data_int = train_data[:, :, ::config.interval]
mse = tf.keras.losses.MeanSquaredError()
mse_pred = mse(predicted[:, :, :int(nfiles/config.interval)-2],
train_data_int[:4, :, :int(nfiles/config.interval) -
2])\
.numpy()
log = {"prediction_mse": mse_pred}
wandb.log(log)
if config.savemodel:
dirname = "model_" + wandb.run.name
os.mkdir(dirname)
pred_adv.encoder.save(dirname + '/encoder')
pred_adv.decoder.save(dirname + '/decoder')
# Configuration options for hyperparameter optimization
Predictive_adversarial_sweep_config = {
"method": "bayes",
"metric": {"name": "prediction_mse", "goal": "minimize"},
"parameters": {
"architecture": {
"values": [
"dense",
"deeper_dense",
"wider_dense",
"slimmer_dense",
"vinicius",
"smaller_vinicius",
"slimmer_vinicius",
]
},
"activation": {"values": ["relu", "elu", "sigmoid", "tanh"]},
"discriminator_architecture": {"values": ["custom", "custom_wider"]},
"in_vars": {"values": [100]},
"dense_activation": {"values": ["relu", "linear"]},
"batch_size": {"values": [32, 64, 128]},
"learning_rate": {"values": [5e-3, 5e-4, 5e-5]},
"dropout": {"values": [0.3, 0.55, 0.8]},
"optimizer": {"values": ["nadam", "adam", "sgd"]},
"momentum": {"values": [0.8, 0.9, 0.98]},
"beta_2": {"values": [0.9, 0.999, 0.99999]},
"batch_normalization": {"values": [True, False]},
"regularization": {"values": [1e-3, 1e-4, 1e-5, 1e-6, 0]},
"savemodel": {"values": [False]},
"latent_vars": {"values": [100, 300, 500]},
"interval": {"values": [1, 2, 4, 6]},
"final_act": {
"values": [
"linear",
"sigmoid",
"tanh"
]
},
"noise_std": {"values": [0.00001, 0.001, 0.01, 0.05, 0.1]},
"increment": {"values": [True, False]},
"epochs": {"values": [200, 500, 1000, 2000]},
"n_discriminator": {"values": [1, 2, 4, 5]},
"n_gradient_ascent": {"values": [3, 8, 15, 30]},
"domains": {"values": [10]}
},
}
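# Each of the sweep dictionaries in this file is registered with wandb.sweep()
# in the CLI section at the bottom; wandb.agent() then repeatedly samples one
# entry from every "values" list (Bayesian or random search, per the dict's
# "method") and calls the matching train function with that configuration,
# minimising the logged "prediction_mse" metric.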
# Configuration options for hyperparameter optimization
Predictive_ae_sweep_config = {
"method": "random",
"metric": {"name": "prediction_mse", "goal": "minimize"},
"parameters": {
"architecture": {
"values": [
"dense",
"deeper_dense",
"wider_dense",
"slimmer_dense",
"vinicius",
"smaller_vinicius",
"slimmer_vinicius",
]
},
"activation": {"values": ["relu", "elu", "sigmoid", "tanh"]},
"in_vars": {"values": [20]},
"dense_activation": {"values": ["relu", "linear"]},
"batch_size": {"values": [32, 64, 128]},
"learning_rate": {"values": [5e-3, 5e-4, 5e-5]},
"dropout": {"values": [0.3, 0.55, 0.8]},
"optimizer": {"values": ["nadam", "adam", "sgd"]},
"momentum": {"values": [0.8, 0.9, 0.98]},
"beta_2": {"values": [0.9, 0.999, 0.99999]},
"batch_normalization": {"values": [True, False]},
"regularization": {"values": [1e-3, 1e-4, 1e-5, 1e-6, 0]},
"savemodel": {"values": [False]},
"latent_vars": {"values": [30, 60, 100, 300]},
"interval": {"values": [1, 2, 4, 6]},
"final_act": {
"values": [
"linear",
"sigmoid",
"tanh"
]
},
"noise_std": {"values": [0.00001, 0.001, 0.01, 0.05, 0.1]},
"increment": {"values": [True, False]},
"epochs": {"values": [100, 200, 500, 1000]},
"domains": {"values": [10]}
},
}
# Configuration options for hyperparameter optimization
Continuous_predictive_adversarial_sweep_config = {
"method": "random",
"metric": {"name": "prediction_mse", "goal": "minimize"},
"parameters": {
"architecture": {
"values": [
"dense",
"deeper_dense",
"wider_dense",
"slimmer_dense",
"vinicius",
"smaller_vinicius",
"slimmer_vinicius",
]
},
"activation": {"values": ["relu", "elu", "sigmoid", "tanh"]},
"discriminator_architecture": {"values": ["custom", "custom_wider"]},
"in_vars": {"values": [10]},
"dense_activation": {"values": ["relu", "linear"]},
"batch_size": {"values": [32, 64, 128]},
"learning_rate": {"values": [5e-3, 5e-4, 5e-5]},
"dropout": {"values": [0.3, 0.55, 0.8]},
"optimizer": {"values": ["nadam", "adam", "sgd"]},
"momentum": {"values": [0.8, 0.9, 0.98]},
"beta_2": {"values": [0.9, 0.999, 0.99999]},
"batch_normalization": {"values": [True, False]},
"regularization": {"values": [1e-3, 1e-4, 1e-5, 1e-6, 0]},
"savemodel": {"values": [False]},
"latent_vars": {"values": [30, 50, 100]},
"interval": {"values": [1, 2, 4, 6]},
"final_act": {
"values": [
"linear",
"sigmoid",
"tanh"
]
},
"noise_std": {"values": [0.00001, 0.001, 0.01, 0.05, 0.1]},
"increment": {"values": [True, False]},
"epochs": {"values": [50, 100, 150]},
"n_discriminator": {"values": [1]},
"n_gradient_ascent": {"values": [3, 8, 15, 30]},
"domains": {"values": [6]},
"n_epochs": {"values": [20]}
},
}
# Build a small CLI
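# Example invocations (illustrative only; flag values are placeholders):
#   python train_wandb_pred.py --model aae --niters 50 --savemodel True
#   python train_wandb_pred.py --continuous --encoder_folder <path_to_saved_cae>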
if __name__ == "__main__":
parser = argparse.ArgumentParser(description="Do hyperparameter \
optimization on slug flow dataset")
parser.add_argument('--datafile', type=str, nargs='?',
default="/home/zef/Documents/master/acse-9/DD-GAN-AE/\
submodules/DD-GAN/data/processed/cae_latent_sf_10vars_800steps_different.npy",
help='path to structured grid data file')
parser.add_argument('--savemodel', type=str, nargs='?',
default="False",
help='Whether or not to save the models, set "True" for \
saving')
parser.add_argument('--niters', type=int, nargs='?',
default=200,
help='Number of sweeps to execute')
parser.add_argument('--custom_config', type=str, nargs='?',
default=None,
help='json file with custom configurations for sweep')
parser.add_argument('--continuous', action='store_true', default=False,
help='whether to use continuous learning \
functionality')
parser.add_argument('--encoder_folder', type=str, nargs='?',
default=None,
help='folder with autoencoder for generating latent \
variables')
parser.add_argument('--model', type=str, nargs='?',
default=None,
help='Choose either ae (normal autoencoder) or aae \
(adversarial autoencoder)')
args = parser.parse_args()
arg_dict = vars(args)
if args.continuous:
if arg_dict['custom_config'] is not None:
with open(arg_dict["custom_config"]) as json_file:
Continuous_predictive_adversarial_sweep_config = \
json.load(json_file)
if arg_dict["savemodel"] == "True":
Continuous_predictive_adversarial_sweep_config['parameters'][
'savemodel'] = \
{'values': [True]}
Continuous_predictive_adversarial_sweep_config['parameters'][
'datafile'] = \
{'values': [arg_dict['datafile']]}
Continuous_predictive_adversarial_sweep_config['parameters'][
'encoder_folder'] = \
{'values': [arg_dict['encoder_folder']]}
sweep_id = wandb.sweep(Continuous_predictive_adversarial_sweep_config,
project='pred-aae', entity='zeff020')
wandb.agent(sweep_id, continuous_train_wandb_pred_aae,
count=arg_dict['niters'])
if args.model == "ae":
# Use the normal autoencoder for predictions
if arg_dict['custom_config'] is not None:
with open(arg_dict["custom_config"]) as json_file:
Predictive_ae_sweep_config = json.load(json_file)
if arg_dict["savemodel"] == "True":
Predictive_ae_sweep_config['parameters']['savemodel'] = \
{'values': [True]}
Predictive_ae_sweep_config['parameters']['datafile'] = \
{'values': [arg_dict['datafile']]}
sweep_id = wandb.sweep(Predictive_ae_sweep_config,
project='pred-ae', entity='zeff020')
wandb.agent(sweep_id, train_wandb_pred_ae, count=arg_dict['niters'])
elif args.model == "aae":
# Use the adversarial autoencoder for predictions
if arg_dict['custom_config'] is not None:
with open(arg_dict["custom_config"]) as json_file:
Predictive_adversarial_sweep_config = json.load(json_file)
if arg_dict["savemodel"] == "True":
Predictive_adversarial_sweep_config['parameters']['savemodel'] = \
{'values': [True]}
Predictive_adversarial_sweep_config['parameters']['datafile'] = \
{'values': [arg_dict['datafile']]}
sweep_id = wandb.sweep(Predictive_adversarial_sweep_config,
project='pred-aae', entity='zeff020')
wandb.agent(sweep_id, train_wandb_pred_aae, count=arg_dict['niters'])
| 37.597838 | 80 | 0.537696 | 3,353 | 34,778 | 5.363555 | 0.096033 | 0.031528 | 0.032918 | 0.055549 | 0.877169 | 0.860654 | 0.844139 | 0.840247 | 0.819506 | 0.816281 | 0 | 0.021464 | 0.356979 | 34,778 | 924 | 81 | 37.638528 | 0.782721 | 0.06231 | 0 | 0.732737 | 0 | 0 | 0.089137 | 0.002895 | 0 | 0 | 0 | 0 | 0 | 1 | 0.003836 | false | 0 | 0.015345 | 0 | 0.019182 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f221b4dd51eb94f00b9f56c3a80b1f6e1a43f03c | 132 | py | Python | Solutions/Training/Lesson_02/__init__.py | dev-11/codility-solutions | 01b0ce4a43b1390fe15f2daabea95e90b834fbfc | [
"MIT"
] | null | null | null | Solutions/Training/Lesson_02/__init__.py | dev-11/codility-solutions | 01b0ce4a43b1390fe15f2daabea95e90b834fbfc | [
"MIT"
] | null | null | null | Solutions/Training/Lesson_02/__init__.py | dev-11/codility-solutions | 01b0ce4a43b1390fe15f2daabea95e90b834fbfc | [
"MIT"
] | null | null | null | from .cyclic_rotation import solution as cyclic_rotation
from .odd_occurrences_in_array import solution as odd_occurrences_in_array
| 44 | 74 | 0.893939 | 20 | 132 | 5.5 | 0.5 | 0.254545 | 0.290909 | 0.381818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 132 | 2 | 75 | 66 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
1ee802cf12c08e0454f05b3d4678ac52361fd9b0 | 1,864 | py | Python | windc_data/windc_2_0_1/parse/bea_pce.py | uw-windc/windc_datastream | 8b6277acf943358a064f109a873852a82991ba3d | [
"MIT"
] | 3 | 2019-11-07T05:15:18.000Z | 2020-06-30T15:30:10.000Z | windc_data/windc_2_0_1/parse/bea_pce.py | uw-windc/windc_datastream | 8b6277acf943358a064f109a873852a82991ba3d | [
"MIT"
] | null | null | null | windc_data/windc_2_0_1/parse/bea_pce.py | uw-windc/windc_datastream | 8b6277acf943358a064f109a873852a82991ba3d | [
"MIT"
] | 4 | 2019-11-07T05:15:29.000Z | 2021-09-01T17:01:03.000Z | import pandas as pd
import os
def _saexp1(data_dir):
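# Reads the BEA SAEXP1 table (state-level personal consumption expenditure,
# 1997-2017) from data_dir, strips the stray quotes around GeoFIPS, melts the
# per-year columns into long (year, value) format and normalises the column
# dtypes. _saexp2 below applies the identical recipe to the SAEXP2 table.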
file = "SAEXP1_1997_2017_ALL_AREAS_.csv"
t = pd.read_csv(
os.path.join(data_dir, file),
index_col=None,
engine="c",
nrows=1440,
low_memory=False,
)
t["GeoFIPS"] = t["GeoFIPS"].replace({'"': ""}, regex=True)
t["GeoFIPS"] = t["GeoFIPS"].map(int)
# melt data
t = pd.melt(t, id_vars=t.keys()[0:9], var_name="year")
# typing
t["GeoFIPS"] = t["GeoFIPS"].map(str)
t["GeoName"] = t["GeoName"].map(str)
t["Region"] = t["Region"].map(str)
t["TableName"] = t["TableName"].map(str)
t["ComponentName"] = t["ComponentName"].map(str)
t["Unit"] = t["Unit"].map(str)
t["Line"] = t["Line"].map(str)
t["IndustryClassification"] = t["IndustryClassification"].map(str)
t["Description"] = t["Description"].map(str)
t["year"] = t["year"].map(str)
t["value"] = t["value"].map(float)
return t
def _saexp2(data_dir):
file = "SAEXP2_1997_2017_ALL_AREAS_.csv"
t = pd.read_csv(
os.path.join(data_dir, file),
index_col=None,
engine="c",
nrows=1440,
low_memory=False,
)
t["GeoFIPS"] = t["GeoFIPS"].replace({'"': ""}, regex=True)
t["GeoFIPS"] = t["GeoFIPS"].map(int)
# melt data
t = pd.melt(t, id_vars=t.keys()[0:9], var_name="year")
# typing
t["GeoFIPS"] = t["GeoFIPS"].map(str)
t["GeoName"] = t["GeoName"].map(str)
t["Region"] = t["Region"].map(str)
t["TableName"] = t["TableName"].map(str)
t["ComponentName"] = t["ComponentName"].map(str)
t["Unit"] = t["Unit"].map(str)
t["Line"] = t["Line"].map(str)
t["IndustryClassification"] = t["IndustryClassification"].map(str)
t["Description"] = t["Description"].map(str)
t["year"] = t["year"].map(str)
t["value"] = t["value"].map(float)
return t
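# Usage sketch (illustration only; the directory name is a placeholder and the
# raw BEA CSVs named above must already be present in it):
#
#     saexp1 = _saexp1("data/bea_pce")
#     saexp2 = _saexp2("data/bea_pce")
#
# Both return long-format DataFrames keyed by GeoFIPS and year.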
| 27.820896 | 70 | 0.562232 | 262 | 1,864 | 3.900763 | 0.221374 | 0.117417 | 0.136986 | 0.093933 | 0.925636 | 0.925636 | 0.925636 | 0.925636 | 0.925636 | 0.925636 | 0 | 0.021769 | 0.211373 | 1,864 | 66 | 71 | 28.242424 | 0.673469 | 0.017704 | 0 | 0.84 | 0 | 0 | 0.272727 | 0.082147 | 0 | 0 | 0 | 0 | 0 | 1 | 0.04 | false | 0 | 0.04 | 0 | 0.12 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4824fe816e4bb5e6d4053c8eba4306d6cdb15456 | 3,133 | py | Python | test/test_commandline_nested.py | Anselmoo/copy2hash | 5c8e6dd7a5503b2beb33396c036cf5b42cd6e2d1 | [
"MIT"
] | 2 | 2020-06-04T15:47:28.000Z | 2021-08-14T17:56:11.000Z | test/test_commandline_nested.py | Anselmoo/copy2hash | 5c8e6dd7a5503b2beb33396c036cf5b42cd6e2d1 | [
"MIT"
] | 13 | 2020-05-21T18:54:28.000Z | 2021-06-26T19:43:53.000Z | test/test_commandline_nested.py | Anselmoo/copy2hash | 5c8e6dd7a5503b2beb33396c036cf5b42cd6e2d1 | [
"MIT"
] | null | null | null | from copy2hash import copy2hash
from pathlib import Path
class TestNestedDirectories(object):
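# These tests drive copy2hash's command-line runner directly with an argument
# dict rather than going through argparse; each case hashes every *.txt file
# found recursively under test/ and writes the requested report formats. The
# trailing `assert 1` statements only mark that the call completed without
# raising.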
def test_local_nested_files_i(self):
args = {
"infile": list(Path("test").rglob("*.txt")),
"report": ["json"],
"report_name": "copy_report",
"sha": ["sha256"],
"directory": None,
"move": False,
"file_extension": False,
"file_suffix": False,
"no_file_extension": False,
"verbose": True,
"version": False,
}
copy2hash.command_line_runner(opt=args)
assert 1
def test_local_nested_files_ii(self):
# Double check concerning copy rights
args = {
"infile": list(Path("test").rglob("*.txt")),
"report": ["json"],
"report_name": "copy_report",
"sha": ["sha256"],
"directory": None,
"move": False,
"file_extension": False,
"file_suffix": False,
"no_file_extension": False,
"verbose": True,
"version": False,
}
copy2hash.command_line_runner(opt=args)
assert 1
def test_local_nested_files_iii(self):
args = {
"infile": list(Path("test").rglob("*.txt")),
"report": ["csv", "json", "pkl", "yaml", "txt", "xml"],
"report_name": "report4travis",
"sha": [
"sha1",
"sha224",
"sha256",
"sha384",
"sha512",
"blake2b",
"blake2s",
"md5",
"sha3_224",
"sha3_256",
"sha3_384",
"sha3_512",
"shake_128",
"shake_256",
],
"directory": None,
"move": False,
"file_extension": False,
"file_suffix": False,
"no_file_extension": False,
"verbose": True,
"version": False,
}
copy2hash.command_line_runner(opt=args)
assert 1
def test_local_nested_files_iiii(self):
# Double check concerning copy rights
args = {
"infile": list(Path("test").rglob("*.txt")),
"report": ["csv", "json", "pkl", "yaml", "txt", "xml"],
"report_name": "report4travis",
"sha": [
"sha1",
"sha224",
"sha256",
"sha384",
"sha512",
"blake2b",
"blake2s",
"md5",
"sha3_224",
"sha3_256",
"sha3_384",
"sha3_512",
"shake_128",
"shake_256",
],
"directory": None,
"move": False,
"file_extension": False,
"file_suffix": False,
"no_file_extension": False,
"verbose": True,
"version": False,
}
copy2hash.command_line_runner(opt=args)
assert 1
| 27.725664 | 67 | 0.431216 | 259 | 3,133 | 4.992278 | 0.266409 | 0.055684 | 0.111369 | 0.055684 | 0.929621 | 0.911833 | 0.911833 | 0.911833 | 0.911833 | 0.905646 | 0 | 0.052988 | 0.43377 | 3,133 | 112 | 68 | 27.973214 | 0.675874 | 0.022662 | 0 | 0.886598 | 0 | 0 | 0.23341 | 0 | 0 | 0 | 0 | 0 | 0.041237 | 1 | 0.041237 | false | 0 | 0.020619 | 0 | 0.072165 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
486bb74a319a55c7d1f21e10a9df3f1b3e0d49a6 | 6,351 | py | Python | py/HW3/option_models/sabr.py | zwc00098/ASP | bae3919534bc18fb6cfd732862b2173d924cf12f | [
"MIT"
] | null | null | null | py/HW3/option_models/sabr.py | zwc00098/ASP | bae3919534bc18fb6cfd732862b2173d924cf12f | [
"MIT"
] | null | null | null | py/HW3/option_models/sabr.py | zwc00098/ASP | bae3919534bc18fb6cfd732862b2173d924cf12f | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Tue Oct 10
@author: jaehyuk
"""
import numpy as np
import scipy.stats as ss
import scipy.optimize as sopt
import pyfeng as pf
import scipy.integrate as spint
'''
MC model class for Beta=1
'''
class ModelBsmMC:
beta = 1.0 # fixed (not used)
vov, rho = 0.0, 0.0
sigma, intr, divr = None, None, None
bsm_model = None
'''
You may define more members for MC: time step, etc
'''
def __init__(self, sigma, vov=0, rho=0.0, beta=1.0, intr=0, divr=0):
self.sigma = sigma
self.vov = vov
self.rho = rho
self.intr = intr
self.divr = divr
self.bsm_model = pf.Bsm(sigma, intr=intr, divr=divr)
def bsm_vol(self, strike, spot, texp=None, sigma=None):
'''
Compute the implied vol from the price returned by self.price().
Use self.bsm_model.impvol() method
'''
p = self.price(strike, spot, texp).mean(axis=0)
return self.bsm_model.impvol(p, strike, spot, texp)
def price(self, strike, spot, texp=None, sigma=None, cp=1):
'''
Your MC routine goes here
Generate paths for vol and price first. Then get prices (vector) for all strikes
You may fix the random number seed
'''
m = pf.BsmNdMc(self.sigma, rn_seed = 12345)
m.simulate(tobs = [texp], n_path = 10000)
# bind each strike explicitly instead of shadowing the `strike` argument so the
# payoff closure cannot pick up a stale loop variable
price = []
for k in strike:
payoff = lambda x, k=k: np.fmax(np.mean(x, axis=1) - k, 0)
price.append(m.price_european(spot, texp, payoff))
return np.array(price)
'''
MC model class for Beta=0
'''
class ModelNormalMC:
beta = 0.0 # fixed (not used)
vov, rho = 0.0, 0.0
sigma, intr, divr = None, None, None
normal_model = None
def __init__(self, sigma, vov=0, rho=0.0, beta=0.0, intr=0, divr=0):
self.sigma = sigma
self.vov = vov
self.rho = rho
self.intr = intr
self.divr = divr
self.normal_model = pf.Norm(sigma, intr=intr, divr=divr)
def norm_vol(self, strike, spot, texp=None, sigma=None):
'''
Compute the implied vol from the price returned by self.price().
Use self.normal_model.impvol() method
'''
p = self.price(strike, spot, texp).mean(axis=0)
return self.normal_model.impvol(p, strike, spot, texp)
def price(self, strike, spot, texp=None, sigma=None, cp=1):
'''
Your MC routine goes here
Generate paths for vol and price first. Then get prices (vector) for all strikes
You may fix the random number seed
'''
znorm = np.random.normal(size=10000)
forward = spot
prices = []
for k in strike:
price = forward + np.sqrt(texp) * znorm * self.sigma
price = np.mean(np.fmax(cp * (price - k), 0))
prices.append(price)
return np.array(prices)
'''
Conditional MC model class for Beta=1
'''
class ModelBsmCondMC:
beta = 1.0 # fixed (not used)
vov, rho = 0.0, 0.0
sigma, intr, divr = None, None, None
bsm_model = None
'''
You may define more members for MC: time step, etc
'''
def __init__(self, sigma, vov=0, rho=0.0, beta=1.0, intr=0, divr=0):
self.sigma = sigma
self.vov = vov
self.rho = rho
self.intr = intr
self.divr = divr
self.bsm_model = pf.Bsm(sigma, intr=intr, divr=divr)
def bsm_vol(self, strike, spot, texp=None):
'''
Should be the same as the bsm_vol method in ModelBsmMC (just copy & paste).
'''
p = self.price(strike, spot, texp).mean(axis=0)
return self.bsm_model.impvol(p, strike, spot, texp)
def price(self, strike, spot, texp=None, cp=1):
'''
Your MC routine goes here
Generate paths for vol only. Then compute integrated variance and BSM price.
Then get prices (vector) for all strikes
You may fix the random number seed
'''
m = pf.BsmNdMc(self.vov, rn_seed=12345)
tobs = np.arange(0, 101)/100*texp
_ = m.simulate(tobs = tobs, n_path=1000)
sigma_path = np.squeeze(m.path)
sigma_final = sigma_path[-1,:]
int_var = spint.simps(sigma_path**2, dx=1, axis=0)/100
price = []
model = pf.Bsm(sigma = np.sqrt((1 - self.rho ** 2) * np.mean(int_var)) * self.sigma , intr = self.intr, divr = self.divr)
for k in strike:
price.append(model.price(k, spot * np.exp(self.rho * (np.mean(sigma_final) * self.sigma - self.sigma) / self.vov - (self.rho ** 2) * (self.sigma ** 2) * texp * np.mean(int_var) / 2), texp))
return np.array(price)
'''
Conditional MC model class for Beta=0
'''
class ModelNormalCondMC:
beta = 0.0 # fixed (not used)
vov, rho = 0.0, 0.0
sigma, intr, divr = None, None, None
normal_model = None
def __init__(self, sigma, vov=0, rho=0.0, beta=0.0, intr=0, divr=0):
self.sigma = sigma
self.vov = vov
self.rho = rho
self.intr = intr
self.divr = divr
self.normal_model = pf.Norm(sigma, intr=intr, divr=divr)
def norm_vol(self, strike, spot, texp=None):
'''
Should be the same as the norm_vol method in ModelNormalMC (just copy & paste).
'''
p = self.price(strike, spot, texp).mean(axis=0)
return self.normal_model.impvol(p, strike, spot, texp)
def price(self, strike, spot, texp=None, cp=1):
'''
Your MC routine goes here
Generate paths for vol only. Then compute integrated variance and normal price.
You may fix the random number seed
'''
m = pf.BsmNdMc(self.vov, rn_seed=12345)
tobs = np.arange(0, 101)/100*texp
_ = m.simulate(tobs = tobs, n_path=1000)
sigma_path = np.squeeze(m.path)
sigma_final = sigma_path[-1,:]
int_var = spint.simps(sigma_path**2, dx=1, axis=0)/100
price = []
model = pf.Norm(sigma = np.sqrt((1 - self.rho ** 2) * np.mean(int_var)) * self.sigma , intr = self.intr, divr = self.divr)
for k in strike:
price.append(model.price(k, spot + self.rho * (np.mean(sigma_final) * self.sigma - self.sigma) / self.vov, texp))
return np.array(price)
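# --- Usage sketch (illustration only; the numbers are arbitrary) -------------
# All four model classes share the same interface: construct with the SABR
# parameters, then price a strike array, e.g.
#
#     model = ModelNormalMC(sigma=0.2, vov=0.3, rho=-0.25)
#     prices = model.price(np.array([0.9, 1.0, 1.1]), spot=1.0, texp=1.0)
#
# The *_vol() helpers are meant to invert those MC prices back into implied
# volatilities via the pyfeng analytic models, and the conditional MC classes
# (ModelBsmCondMC, ModelNormalCondMC) follow the same calling convention while
# simulating only the volatility path.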
| 34.704918 | 206 | 0.581483 | 931 | 6,351 | 3.902256 | 0.146079 | 0.01101 | 0.05395 | 0.039637 | 0.835122 | 0.816956 | 0.803193 | 0.775667 | 0.775667 | 0.775667 | 0 | 0.030762 | 0.293655 | 6,351 | 182 | 207 | 34.895604 | 0.779091 | 0.169422 | 0 | 0.735849 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.113208 | false | 0 | 0.04717 | 0 | 0.349057 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6f99ddab74a24507154ac4fbbd3b36ba16c5c33e | 2,099 | py | Python | tests/test_readme_rst.py | roksys/license-changer | ba11744b200e8237f72da600abf97d6a29623dfa | [
"MIT"
] | null | null | null | tests/test_readme_rst.py | roksys/license-changer | ba11744b200e8237f72da600abf97d6a29623dfa | [
"MIT"
] | null | null | null | tests/test_readme_rst.py | roksys/license-changer | ba11744b200e8237f72da600abf97d6a29623dfa | [
"MIT"
] | null | null | null | from utils import update_readme_rst
from textwrap import dedent
def test_pypy_classifier():
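# update_readme_rst is expected to drop the github tag, pypi downloads and
# license badges and insert a single pypi.org version badge, while the travis
# and coveralls badges and the surrounding text stay untouched.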
text1 = dedent("""\
================
Invenio-MARC21
================
.. image:: https://img.shields.io/travis/inveniosoftware/invenio-marc21.svg
:target: https://travis-ci.org/inveniosoftware/invenio-marc21
.. image:: https://img.shields.io/coveralls/inveniosoftware/invenio-marc21.svg
:target: https://coveralls.io/r/inveniosoftware/invenio-marc21
.. image:: https://img.shields.io/github/tag/inveniosoftware/invenio-marc21.svg
:target: https://github.com/inveniosoftware/invenio-marc21/releases
.. image:: https://img.shields.io/pypi/dm/invenio-marc21.svg
:target: https://pypi.python.org/pypi/invenio-marc21
.. image:: https://img.shields.io/github/license/inveniosoftware/invenio-marc21.svg
:target: https://github.com/inveniosoftware/invenio-marc21/blob/master/LICENSE
Invenio module with nice defaults for MARC21 overlay.
*This is an experimental developer preview release.*
* Free software: MIT license
* Documentation: https://invenio-marc21.readthedocs.io/""")
exp_text1 = dedent("""\
================
Invenio-MARC21
================
.. image:: https://img.shields.io/travis/inveniosoftware/invenio-marc21.svg
:target: https://travis-ci.org/inveniosoftware/invenio-marc21
.. image:: https://img.shields.io/coveralls/inveniosoftware/invenio-marc21.svg
:target: https://coveralls.io/r/inveniosoftware/invenio-marc21
.. image:: https://img.shields.io/pypi/v/invenio-marc21.svg
:target: https://pypi.org/pypi/invenio-marc21
Invenio module with nice defaults for MARC21 overlay.
*This is an experimental developer preview release.*
* Free software: MIT license
* Documentation: https://invenio-marc21.readthedocs.io/""")
assert update_readme_rst(text1) == exp_text1
| 37.482143 | 94 | 0.634588 | 227 | 2,099 | 5.832599 | 0.259912 | 0.196375 | 0.253776 | 0.120846 | 0.861782 | 0.861782 | 0.792296 | 0.792296 | 0.756798 | 0.756798 | 0 | 0.029091 | 0.213911 | 2,099 | 55 | 95 | 38.163636 | 0.773333 | 0 | 0 | 0.611111 | 0 | 0.055556 | 0.903764 | 0 | 0 | 0 | 0 | 0 | 0.027778 | 1 | 0.027778 | false | 0 | 0.055556 | 0 | 0.083333 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
6fcf3fa8d428ce836ec30a05ef91771040f6bec0 | 164 | py | Python | mp3frame/errors.py | ralic/gnu_pymp3frame | fb6a66b26e426f2e33217bcb0d68aba17c8244b0 | [
"MIT"
] | null | null | null | mp3frame/errors.py | ralic/gnu_pymp3frame | fb6a66b26e426f2e33217bcb0d68aba17c8244b0 | [
"MIT"
] | null | null | null | mp3frame/errors.py | ralic/gnu_pymp3frame | fb6a66b26e426f2e33217bcb0d68aba17c8244b0 | [
"MIT"
] | null | null | null | class MP3DataError(Exception): pass
class MP3ReservedError(MP3DataError): pass
class MP3UsageError(Exception): pass
class MP3ImplementationLimit(Exception): pass
| 23.428571 | 45 | 0.841463 | 16 | 164 | 8.625 | 0.4375 | 0.282609 | 0.26087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033333 | 0.085366 | 164 | 6 | 46 | 27.333333 | 0.886667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 7 |
6fe095545dab3e6d4b84b602e3c08bfdbbcef80d | 198 | py | Python | neon/errors.py | adaamz/neon-py | 3d75f723cdada210751fb141994fa17e683fedd7 | [
"BSD-3-Clause"
] | 24 | 2016-05-14T15:07:29.000Z | 2021-07-13T07:14:47.000Z | neon/errors.py | adaamz/neon-py | 3d75f723cdada210751fb141994fa17e683fedd7 | [
"BSD-3-Clause"
] | 9 | 2018-12-12T14:44:44.000Z | 2021-07-16T07:28:27.000Z | neon/errors.py | adaamz/neon-py | 3d75f723cdada210751fb141994fa17e683fedd7 | [
"BSD-3-Clause"
] | 2 | 2019-08-21T13:58:35.000Z | 2021-07-13T05:16:43.000Z | # -*- coding: utf-8 -*-
class TokenError(Exception):
"""Raised when tokenization ends up with an error."""
class ParserError(Exception):
"""Raised when parsing ends up with an error."""
| 19.8 | 57 | 0.666667 | 25 | 198 | 5.28 | 0.64 | 0.227273 | 0.287879 | 0.181818 | 0.257576 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006211 | 0.186869 | 198 | 9 | 58 | 22 | 0.813665 | 0.570707 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 7 |
d201a2eeb7c7b8892369af628bbc29793aaaa2fc | 115 | py | Python | sltxpkg/lithie/compile/__init__.py | EagleoutIce/sltx-inst | cb45346177c22fd5bf47f29cebf34f09f16b9a4b | [
"MIT"
] | 2 | 2020-09-28T20:27:29.000Z | 2020-10-07T20:30:58.000Z | sltxpkg/lithie/compile/__init__.py | EagleoutIce/sltx | be71e6245356b8c8a8e42b4a44ceee5d4da4e89c | [
"MIT"
] | null | null | null | sltxpkg/lithie/compile/__init__.py | EagleoutIce/sltx | be71e6245356b8c8a8e42b4a44ceee5d4da4e89c | [
"MIT"
] | null | null | null | import sltxpkg.lithie.compile.cooker
import sltxpkg.lithie.compile.latexmk_mos
import sltxpkg.lithie.compile.tools
| 28.75 | 41 | 0.869565 | 16 | 115 | 6.1875 | 0.5 | 0.393939 | 0.575758 | 0.787879 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.052174 | 115 | 3 | 42 | 38.333333 | 0.908257 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
d213bf981e238aad22a4cb4a563fdfbce64e4aea | 159,055 | py | Python | pysnmp-with-texts/ROOMALERT32E-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 8 | 2019-05-09T17:04:00.000Z | 2021-06-09T06:50:51.000Z | pysnmp-with-texts/ROOMALERT32E-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 4 | 2019-05-31T16:42:59.000Z | 2020-01-31T21:57:17.000Z | pysnmp-with-texts/ROOMALERT32E-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module ROOMALERT32E-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/ROOMALERT32E-MIB
# Produced by pysmi-0.3.4 at Wed May 1 14:58:18 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
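# Note: like all pysmi-generated MIB modules, this file is executed by pysnmp's
# MibBuilder, which supplies the `mibBuilder` object referenced below; it is
# not meant to be imported as a standalone Python module.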
Integer, OctetString, ObjectIdentifier = mibBuilder.importSymbols("ASN1", "Integer", "OctetString", "ObjectIdentifier")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
SingleValueConstraint, ConstraintsIntersection, ConstraintsUnion, ValueRangeConstraint, ValueSizeConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "SingleValueConstraint", "ConstraintsIntersection", "ConstraintsUnion", "ValueRangeConstraint", "ValueSizeConstraint")
ModuleCompliance, NotificationGroup = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "NotificationGroup")
ObjectIdentity, Counter64, Gauge32, Unsigned32, IpAddress, enterprises, Integer32, iso, MibScalar, MibTable, MibTableRow, MibTableColumn, Counter32, NotificationType, ModuleIdentity, Bits, NotificationType, MibIdentifier, TimeTicks = mibBuilder.importSymbols("SNMPv2-SMI", "ObjectIdentity", "Counter64", "Gauge32", "Unsigned32", "IpAddress", "enterprises", "Integer32", "iso", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "Counter32", "NotificationType", "ModuleIdentity", "Bits", "NotificationType", "MibIdentifier", "TimeTicks")
TextualConvention, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "DisplayString")
avtech = MibIdentifier((1, 3, 6, 1, 4, 1, 20916))
products = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1))
roomalert32E = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8))
sensors = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1))
internal = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 1))
temperature = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 1, 1))
humidity = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 1, 2))
power = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 1, 3))
heat_index = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 1, 4)).setLabel("heat-index")
analog = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 1, 5))
digital = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2))
digital_sen1 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 1)).setLabel("digital-sen1")
digital_sen2 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 2)).setLabel("digital-sen2")
digital_sen3 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 3)).setLabel("digital-sen3")
digital_sen4 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 4)).setLabel("digital-sen4")
digital_sen5 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 5)).setLabel("digital-sen5")
digital_sen6 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 6)).setLabel("digital-sen6")
digital_sen7 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 7)).setLabel("digital-sen7")
digital_sen8 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 8)).setLabel("digital-sen8")
switch = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 3))
wireless = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4))
wish_1 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1)).setLabel("wish-1")
wish_1_sensors = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4)).setLabel("wish-1-sensors")
wish_1_internal = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 1)).setLabel("wish-1-internal")
wish_1_external = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 2)).setLabel("wish-1-external")
wish_1_external_1 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 2, 1)).setLabel("wish-1-external-1")
wish_1_external_2 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 2, 2)).setLabel("wish-1-external-2")
wish_2 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2)).setLabel("wish-2")
wish_2_sensors = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4)).setLabel("wish-2-sensors")
wish_2_internal = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 1)).setLabel("wish-2-internal")
wish_2_external = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 2)).setLabel("wish-2-external")
wish_2_external_1 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 2, 1)).setLabel("wish-2-external-1")
wish_2_external_2 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 2, 2)).setLabel("wish-2-external-2")
wish_3 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3)).setLabel("wish-3")
wish_3_sensors = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4)).setLabel("wish-3-sensors")
wish_3_internal = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 1)).setLabel("wish-3-internal")
wish_3_external = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 2)).setLabel("wish-3-external")
wish_3_external_1 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 2, 1)).setLabel("wish-3-external-1")
wish_3_external_2 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 2, 2)).setLabel("wish-3-external-2")
wish_4 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4)).setLabel("wish-4")
wish_4_sensors = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4)).setLabel("wish-4-sensors")
wish_4_internal = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 1)).setLabel("wish-4-internal")
wish_4_external = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 2)).setLabel("wish-4-external")
wish_4_external_1 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 2, 1)).setLabel("wish-4-external-1")
wish_4_external_2 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 2, 2)).setLabel("wish-4-external-2")
wish_5 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5)).setLabel("wish-5")
wish_5_sensors = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4)).setLabel("wish-5-sensors")
wish_5_internal = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 1)).setLabel("wish-5-internal")
wish_5_external = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 2)).setLabel("wish-5-external")
wish_5_external_1 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 2, 1)).setLabel("wish-5-external-1")
wish_5_external_2 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 2, 2)).setLabel("wish-5-external-2")
wish_6 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6)).setLabel("wish-6")
wish_6_sensors = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4)).setLabel("wish-6-sensors")
wish_6_internal = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 1)).setLabel("wish-6-internal")
wish_6_external = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 2)).setLabel("wish-6-external")
wish_6_external_1 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 2, 1)).setLabel("wish-6-external-1")
wish_6_external_2 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 2, 2)).setLabel("wish-6-external-2")
wish_7 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7)).setLabel("wish-7")
wish_7_sensors = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4)).setLabel("wish-7-sensors")
wish_7_internal = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 1)).setLabel("wish-7-internal")
wish_7_external = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 2)).setLabel("wish-7-external")
wish_7_external_1 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 2, 1)).setLabel("wish-7-external-1")
wish_7_external_2 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 2, 2)).setLabel("wish-7-external-2")
wish_8 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8)).setLabel("wish-8")
wish_8_sensors = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4)).setLabel("wish-8-sensors")
wish_8_internal = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 1)).setLabel("wish-8-internal")
wish_8_external = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 2)).setLabel("wish-8-external")
wish_8_external_1 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 2, 1)).setLabel("wish-8-external-1")
wish_8_external_2 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 2, 2)).setLabel("wish-8-external-2")
wish_9 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9)).setLabel("wish-9")
wish_9_sensors = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4)).setLabel("wish-9-sensors")
wish_9_internal = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 1)).setLabel("wish-9-internal")
wish_9_external = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 2)).setLabel("wish-9-external")
wish_9_external_1 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 2, 1)).setLabel("wish-9-external-1")
wish_9_external_2 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 2, 2)).setLabel("wish-9-external-2")
wish_10 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10)).setLabel("wish-10")
wish_10_sensors = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4)).setLabel("wish-10-sensors")
wish_10_internal = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 1)).setLabel("wish-10-internal")
wish_10_external = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 2)).setLabel("wish-10-external")
wish_10_external_1 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 2, 1)).setLabel("wish-10-external-1")
wish_10_external_2 = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 2, 2)).setLabel("wish-10-external-2")
traps = MibIdentifier((1, 3, 6, 1, 4, 1, 20916, 1, 8, 2))
internal_tempf = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 1, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("internal-tempf").setMaxAccess("readonly")
if mibBuilder.loadTexts: internal_tempf.setStatus('mandatory')
if mibBuilder.loadTexts: internal_tempf.setDescription('The internal temperature reading in Fahrenheit. Because the SNMP Protocol does not support floating point numbers, values are scaled by 100 and should be divided by 100 to get the actual value.')
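# Editor's sketch (the helper name is illustrative, not part of the generated
# module): since every temperature, humidity and heat-index object in this MIB
# returns its reading multiplied by 100, manager-side code can recover the
# floating-point value with a one-line division.
def unscale_reading(raw_value, scale=100.0):
    """Convert a scaled Integer32 reading (e.g. 7325) back to a float (73.25)."""
    return int(raw_value) / scale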
internal_tempc = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 1, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("internal-tempc").setMaxAccess("readonly")
if mibBuilder.loadTexts: internal_tempc.setStatus('mandatory')
if mibBuilder.loadTexts: internal_tempc.setDescription('The internal temperature reading in Celsius. Because the SNMP Protocol does not support floating point numbers, values are scaled by 100 and should be divided by 100 to get the actual value.')
internal_humidity = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 1, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("internal-humidity").setMaxAccess("readonly")
if mibBuilder.loadTexts: internal_humidity.setStatus('mandatory')
if mibBuilder.loadTexts: internal_humidity.setDescription('The internal relative humidity reading in %RH. Because the SNMP Protocol does not support floating point numbers, values are scaled by 100 and should be divided by 100 to get the actual value.')
internal_heat_index = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 1, 4, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("internal-heat-index").setMaxAccess("readonly")
if mibBuilder.loadTexts: internal_heat_index.setStatus('optional')
if mibBuilder.loadTexts: internal_heat_index.setDescription('The internal heat index reading in Fahrenheit. Because the SNMP Protocol does not support floating point numbers, values are scaled by 100 and should be divided by 100 to get the actual value.')
internal_heat_indexC = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 1, 4, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("internal-heat-indexC").setMaxAccess("readonly")
if mibBuilder.loadTexts: internal_heat_indexC.setStatus('optional')
if mibBuilder.loadTexts: internal_heat_indexC.setDescription('The internal heat index reading in Celsius. Because the SNMP Protocol does not support floating point numbers, values are scaled by 100 and should be divided by 100 to get the actual value.')
internal_power = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 1, 3, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("internal-power").setMaxAccess("readonly")
if mibBuilder.loadTexts: internal_power.setStatus('mandatory')
if mibBuilder.loadTexts: internal_power.setDescription("The current status of the Room Alert 32E power supply. A '0' indicates the unit is running on AC/Utility power. A '1' indicates the unit is running on battery backup power.")
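# Editor's sketch (names illustrative): mapping the documented 0/1 power
# status onto readable text.
_POWER_STATUS = {0: 'AC/Utility power', 1: 'battery backup power'}

def describe_power_status(raw_value):
    """Translate an internal-power reading into its documented meaning."""
    return _POWER_STATUS.get(int(raw_value), 'unknown')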
internal_analog1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 1, 5, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("internal-analog1").setMaxAccess("readonly")
if mibBuilder.loadTexts: internal_analog1.setStatus('mandatory')
if mibBuilder.loadTexts: internal_analog1.setDescription('The current status of the Room Alert 32E analog input (0-5VDC).')
internal_analog2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 1, 5, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("internal-analog2").setMaxAccess("readonly")
if mibBuilder.loadTexts: internal_analog2.setStatus('mandatory')
if mibBuilder.loadTexts: internal_analog2.setDescription('The current status of the Room Alert 32E analog input (0-5VDC).')
digital_sen1_1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen1-1").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen1_1.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen1_1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor, this value represents the Current reading in Amperage.')
digital_sen1_2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen1-2").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen1_2.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen1_2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor, this value represents the Power reading in Watts.')
digital_sen1_3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen1-3").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen1_3.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen1_3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor, this value represents the Voltage reading in Volts.')
digital_sen1_4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 1, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen1-4").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen1_4.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen1_4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor, this value represents the Reference reading in Volts.')
digital_sen1_5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 1, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen1-5").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen1_5.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen1_5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
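# Editor's sketch: the five digital-senN-* objects are overloaded by sensor
# type. Assuming a Temp/Humidity sensor and the same divide-by-100 scaling
# documented for the internal sensors (an assumption; the descriptions above
# do not state the scaling for digital sensors), one way to decode a sensor's
# value set is:
def decode_temp_humidity(val1, val2, val3, val4, val5):
    """Decode digital-senN-1..5 readings from a Temp/Humidity sensor."""
    return {
        'temperature_c': val1 / 100.0,
        'temperature_f': val2 / 100.0,
        'humidity_pct': val3 / 100.0,
        'heat_index_f': val4 / 100.0,
        'heat_index_c': val5 / 100.0,
    }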
digital_sen2_1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen2-1").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen2_1.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen2_1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor, this value represents the Current reading in Amperage.')
digital_sen2_2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 2, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen2-2").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen2_2.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen2_2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor, this value represents the Power reading in Watts.')
digital_sen2_3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen2-3").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen2_3.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen2_3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor, this value represents the Voltage reading in Volts.')
digital_sen2_4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 2, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen2-4").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen2_4.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen2_4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor, this value represents the Reference reading in Volts.')
digital_sen2_5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 2, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen2-5").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen2_5.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen2_5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
digital_sen3_1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 3, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen3-1").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen3_1.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen3_1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor, this value represents the Current reading in Amperage.')
digital_sen3_2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 3, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen3-2").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen3_2.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen3_2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor, this value represents the Power reading in Watts.')
digital_sen3_3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 3, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen3-3").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen3_3.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen3_3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor, this value represents the Voltage reading in Volts.')
digital_sen3_4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 3, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen3-4").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen3_4.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen3_4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor, this value represents the Reference reading in Volts.')
digital_sen3_5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 3, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen3-5").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen3_5.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen3_5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
digital_sen4_1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 4, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen4-1").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen4_1.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen4_1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor, this value represents the Current reading in Amperage.')
digital_sen4_2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 4, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen4-2").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen4_2.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen4_2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor, this value represents the Power reading in Watts.')
digital_sen4_3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 4, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen4-3").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen4_3.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen4_3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor, this value represents the Voltage reading in Volts.')
digital_sen4_4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 4, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen4-4").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen4_4.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen4_4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor, this value represents the Reference reading in Volts.')
digital_sen4_5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 4, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen4-5").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen4_5.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen4_5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
digital_sen5_1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 5, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen5-1").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen5_1.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen5_1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor, this value represents the Current reading in Amperage.')
digital_sen5_2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 5, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen5-2").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen5_2.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen5_2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor, this value represents the Power reading in Watts.')
digital_sen5_3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 5, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen5-3").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen5_3.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen5_3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor, this value represents the Voltage reading in Volts.')
digital_sen5_4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 5, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen5-4").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen5_4.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen5_4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor, this value represents the Reference reading in Volts.')
digital_sen5_5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 5, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen5-5").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen5_5.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen5_5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
digital_sen6_1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 6, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen6-1").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen6_1.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen6_1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor, this value represents the Current reading in Amperage.')
digital_sen6_2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 6, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen6-2").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen6_2.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen6_2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor, this value represents the Power reading in Watts.')
digital_sen6_3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 6, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen6-3").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen6_3.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen6_3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor, this value represents the Voltage reading in Volts.')
digital_sen6_4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 6, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen6-4").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen6_4.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen6_4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor, this value represents the Reference reading in Volts.')
digital_sen6_5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 6, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen6-5").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen6_5.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen6_5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
digital_sen7_1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 7, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen7-1").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen7_1.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen7_1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor, this value represents the Current reading in Amperage.')
digital_sen7_2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 7, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen7-2").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen7_2.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen7_2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor, this value represents the Power reading in Watts.')
digital_sen7_3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 7, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen7-3").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen7_3.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen7_3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor, this value represents the Voltage reading in Volts.')
digital_sen7_4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 7, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen7-4").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen7_4.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen7_4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor, this value represents the Reference reading in Volts.')
digital_sen7_5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 7, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen7-5").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen7_5.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen7_5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
digital_sen8_1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 8, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen8-1").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen8_1.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen8_1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor, this value represents the Current reading in Amperage.')
digital_sen8_2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 8, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen8-2").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen8_2.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen8_2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor, this value represents the Power reading in Watts.')
digital_sen8_3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 8, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen8-3").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen8_3.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen8_3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor, this value represents the Voltage reading in Volts.')
digital_sen8_4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 8, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen8-4").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen8_4.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen8_4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor, this value represents the Reference reading in Volts.')
digital_sen8_5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 2, 8, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("digital-sen8-5").setMaxAccess("readonly")
if mibBuilder.loadTexts: digital_sen8_5.setStatus('mandatory')
if mibBuilder.loadTexts: digital_sen8_5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
switch_sen1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 3, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("switch-sen1").setMaxAccess("readonly")
if mibBuilder.loadTexts: switch_sen1.setStatus('mandatory')
if mibBuilder.loadTexts: switch_sen1.setDescription('The reading for switch sensor 1 (0 = OPEN, 1 = CLOSED).')
switch_sen2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 3, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("switch-sen2").setMaxAccess("readonly")
if mibBuilder.loadTexts: switch_sen2.setStatus('mandatory')
if mibBuilder.loadTexts: switch_sen2.setDescription('The reading for switch sensor 2 (0 = OPEN, 1 = CLOSED).')
switch_sen3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 3, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("switch-sen3").setMaxAccess("readonly")
if mibBuilder.loadTexts: switch_sen3.setStatus('mandatory')
if mibBuilder.loadTexts: switch_sen3.setDescription('The reading for switch sensor 3 (0 = OPEN, 1 = CLOSED).')
switch_sen4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 3, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("switch-sen4").setMaxAccess("readonly")
if mibBuilder.loadTexts: switch_sen4.setStatus('mandatory')
if mibBuilder.loadTexts: switch_sen4.setDescription('The reading for switch sensor 4 (0 = OPEN, 1 = CLOSED).')
switch_sen5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 3, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("switch-sen5").setMaxAccess("readonly")
if mibBuilder.loadTexts: switch_sen5.setStatus('mandatory')
if mibBuilder.loadTexts: switch_sen5.setDescription('The reading for switch sensor 5 (0 = OPEN, 1 = CLOSED).')
switch_sen6 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 3, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("switch-sen6").setMaxAccess("readonly")
if mibBuilder.loadTexts: switch_sen6.setStatus('mandatory')
if mibBuilder.loadTexts: switch_sen6.setDescription('The reading for switch sensor 6 (0 = OPEN, 1 = CLOSED).')
switch_sen7 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 3, 7), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("switch-sen7").setMaxAccess("readonly")
if mibBuilder.loadTexts: switch_sen7.setStatus('mandatory')
if mibBuilder.loadTexts: switch_sen7.setDescription('The reading for switch sensor 7 (0 = OPEN, 1 = CLOSED).')
switch_sen8 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 3, 8), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("switch-sen8").setMaxAccess("readonly")
if mibBuilder.loadTexts: switch_sen8.setStatus('mandatory')
if mibBuilder.loadTexts: switch_sen8.setDescription('The reading for switch sensor 8 (0 = OPEN, 1 = CLOSED).')
switch_sen9 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 3, 9), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("switch-sen9").setMaxAccess("readonly")
if mibBuilder.loadTexts: switch_sen9.setStatus('mandatory')
if mibBuilder.loadTexts: switch_sen9.setDescription('The reading for switch sensor 9 (0 = OPEN, 1 = CLOSED).')
switch_sen10 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 3, 10), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("switch-sen10").setMaxAccess("readonly")
if mibBuilder.loadTexts: switch_sen10.setStatus('mandatory')
if mibBuilder.loadTexts: switch_sen10.setDescription('The reading for switch sensor 10 (0 = OPEN, 1 = CLOSED).')
switch_sen11 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 3, 11), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("switch-sen11").setMaxAccess("readonly")
if mibBuilder.loadTexts: switch_sen11.setStatus('mandatory')
if mibBuilder.loadTexts: switch_sen11.setDescription('The reading for switch sensor 11 (0 = OPEN, 1 = CLOSED).')
switch_sen12 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 3, 12), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("switch-sen12").setMaxAccess("readonly")
if mibBuilder.loadTexts: switch_sen12.setStatus('mandatory')
if mibBuilder.loadTexts: switch_sen12.setDescription('The reading for switch sensor 12 (0 = OPEN, 1 = CLOSED).')
switch_sen13 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 3, 13), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("switch-sen13").setMaxAccess("readonly")
if mibBuilder.loadTexts: switch_sen13.setStatus('mandatory')
if mibBuilder.loadTexts: switch_sen13.setDescription('The reading for switch sensor 13 (0 = OPEN, 1 = CLOSED).')
switch_sen14 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 3, 14), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("switch-sen14").setMaxAccess("readonly")
if mibBuilder.loadTexts: switch_sen14.setStatus('mandatory')
if mibBuilder.loadTexts: switch_sen14.setDescription('The reading for switch sensor 14 (0 = OPEN, 1 = CLOSED).')
switch_sen15 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 3, 15), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("switch-sen15").setMaxAccess("readonly")
if mibBuilder.loadTexts: switch_sen15.setStatus('mandatory')
if mibBuilder.loadTexts: switch_sen15.setDescription('The reading for switch sensor 15 (0 = OPEN, 1 = CLOSED).')
switch_sen16 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 3, 16), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("switch-sen16").setMaxAccess("readonly")
if mibBuilder.loadTexts: switch_sen16.setStatus('mandatory')
if mibBuilder.loadTexts: switch_sen16.setDescription('The reading for switch sensor 16 (0 = OPEN, 1 = CLOSED).')
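# Editor's sketch: all sixteen switch objects above share the same 0/1
# encoding, so one helper covers them.
def switch_state(raw_value):
    """Map a switch-senN reading to its documented OPEN/CLOSED meaning."""
    return 'CLOSED' if int(raw_value) == 1 else 'OPEN'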
wish_1_enabled = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-1-enabled").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_enabled.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_enabled.setDescription("The current 'enabled' status for this WiSH/WiSPR Sensor. A '0' indicates the WiSH/WiSPR is disabled. A '1' indicates the WiSH/WiSPR is enabled.")
wish_1_serial_num = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 2), OctetString()).setLabel("wish-1-serial-num").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_serial_num.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_serial_num.setDescription('The unique serial number for this WiSH/WiSPR Sensor.')
wish_1_updates = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-1-updates").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_updates.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_updates.setDescription('The current update interval for this WiSH/WiSPR Sensor.')
wish_1_battery_voltage = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-1-battery-voltage").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_battery_voltage.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_battery_voltage.setDescription('The current voltage reading of the internal battery for this WiSH/WiSPR Sensor.')
wish_1_internal_tempc = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-1-internal-tempc").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_internal_tempc.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_internal_tempc.setDescription('The current temperature of the internal sensor in Celsius (C) for this WiSH/WiSPR Sensor.')
wish_1_internal_tempf = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-1-internal-tempf").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_internal_tempf.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_internal_tempf.setDescription('The current temperature of the internal sensor in Fahrenheit (F) for this WiSH/WiSPR Sensor.')
wish_1_external_1_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 2, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-1-external-1-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_external_1_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_external_1_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_1_external_1_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 2, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-1-external-1-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_external_1_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_external_1_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_1_external_1_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 2, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-1-external-1-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_external_1_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_external_1_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_1_external_1_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 2, 1, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-1-external-1-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_external_1_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_external_1_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_1_external_1_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 2, 1, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-1-external-1-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_external_1_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_external_1_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_1_external_1_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 2, 1, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-1-external-1-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_external_1_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_external_1_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_1_external_2_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 2, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-1-external-2-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_external_2_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_external_2_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_1_external_2_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 2, 2, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-1-external-2-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_external_2_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_external_2_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_1_external_2_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 2, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-1-external-2-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_external_2_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_external_2_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_1_external_2_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 2, 2, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-1-external-2-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_external_2_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_external_2_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_1_external_2_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 2, 2, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-1-external-2-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_external_2_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_external_2_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_1_external_2_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 2, 2, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-1-external-2-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_external_2_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_external_2_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_1_external_switch = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1, 4, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-1-external-switch").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_1_external_switch.setStatus('mandatory')
if mibBuilder.loadTexts: wish_1_external_switch.setDescription('The reading for switch sensor contacts of this WiSH/WiSPR Sensor (0 = OPEN, 1 = CLOSED).')
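# Editor's sketch (host, community and the function name are placeholders,
# not part of this module): walking the wish-1 subtree with pysnmp's
# high-level API collects every object defined above in one pass.
def walk_wish_1(host, community='public', port=161):
    """Yield (oid, value) var-binds for every object under wish-1."""
    from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                              ContextData, ObjectType, ObjectIdentity, nextCmd)
    for error_indication, error_status, _error_index, var_binds in nextCmd(
            SnmpEngine(), CommunityData(community),
            UdpTransportTarget((host, port)), ContextData(),
            ObjectType(ObjectIdentity((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 1))),
            lexicographicMode=False):
        if error_indication or error_status:
            break
        for var_bind in var_binds:
            yield var_bind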
wish_2_enabled = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-2-enabled").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_enabled.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_enabled.setDescription("The current 'enabled' status for this WiSH/WiSPR Sensor. A '0' indicates the WiSH/WiSPR is disabled. A '1' indicates the WiSH/WiSPR is enabled.")
wish_2_serial_num = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 2), OctetString()).setLabel("wish-2-serial-num").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_serial_num.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_serial_num.setDescription('The unique serial number for this WiSH/WiSPR Sensor.')
wish_2_updates = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-2-updates").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_updates.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_updates.setDescription('The current update interval for this WiSH/WiSPR Sensor.')
wish_2_battery_voltage = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-2-battery-voltage").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_battery_voltage.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_battery_voltage.setDescription('The current voltage reading of the internal battery for this WiSH/WiSPR Sensor.')
wish_2_internal_tempc = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-2-internal-tempc").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_internal_tempc.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_internal_tempc.setDescription('The current temperature of the internal sensor in Celsius (C) for this WiSH/WiSPR Sensor.')
wish_2_internal_tempf = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-2-internal-tempf").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_internal_tempf.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_internal_tempf.setDescription('The current temperature of the internal sensor in Fahrenheit (F) for this WiSH/WiSPR Sensor.')
wish_2_external_1_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 2, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-2-external-1-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_external_1_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_external_1_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_2_external_1_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 2, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-2-external-1-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_external_1_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_external_1_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_2_external_1_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 2, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-2-external-1-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_external_1_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_external_1_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_2_external_1_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 2, 1, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-2-external-1-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_external_1_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_external_1_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_2_external_1_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 2, 1, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-2-external-1-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_external_1_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_external_1_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_2_external_1_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 2, 1, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-2-external-1-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_external_1_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_external_1_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_2_external_2_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 2, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-2-external-2-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_external_2_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_external_2_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_2_external_2_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 2, 2, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-2-external-2-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_external_2_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_external_2_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_2_external_2_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 2, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-2-external-2-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_external_2_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_external_2_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_2_external_2_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 2, 2, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-2-external-2-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_external_2_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_external_2_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_2_external_2_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 2, 2, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-2-external-2-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_external_2_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_external_2_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_2_external_2_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 2, 2, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-2-external-2-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_external_2_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_external_2_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_2_external_switch = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 2, 4, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-2-external-switch").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_2_external_switch.setStatus('mandatory')
if mibBuilder.loadTexts: wish_2_external_switch.setDescription('The reading for switch sensor contacts of this WiSH/WiSPR Sensor (0 = OPEN, 1 = CLOSED).')
wish_3_enabled = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-3-enabled").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_enabled.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_enabled.setDescription("The current 'enabled' status for this WiSH/WiSPR Sensor. A '0' indicates the WiSH/WiSPR is disabled. A '1' indicates the WiSH/WiSPR is enabled.")
wish_3_serial_num = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 2), OctetString()).setLabel("wish-3-serial-num").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_serial_num.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_serial_num.setDescription('The unique serial number for this WiSH/WiSPR Sensor.')
wish_3_updates = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-3-updates").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_updates.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_updates.setDescription('The current update interval for this WiSH/WiSPR Sensor.')
wish_3_battery_voltage = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-3-battery-voltage").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_battery_voltage.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_battery_voltage.setDescription('The current voltage reading of the internal battery for this WiSH/WiSPR Sensor.')
wish_3_internal_tempc = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-3-internal-tempc").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_internal_tempc.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_internal_tempc.setDescription('The current temperature of the internal sensor in Celsius (C) for this WiSH/WiSPR Sensor.')
wish_3_internal_tempf = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-3-internal-tempf").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_internal_tempf.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_internal_tempf.setDescription('The current temperature of the internal sensor in Fahrenheit (F) for this WiSH/WiSPR Sensor.')
wish_3_external_1_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 2, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-3-external-1-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_external_1_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_external_1_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_3_external_1_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 2, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-3-external-1-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_external_1_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_external_1_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_3_external_1_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 2, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-3-external-1-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_external_1_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_external_1_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_3_external_1_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 2, 1, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-3-external-1-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_external_1_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_external_1_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_3_external_1_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 2, 1, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-3-external-1-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_external_1_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_external_1_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_3_external_1_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 2, 1, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-3-external-1-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_external_1_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_external_1_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_3_external_2_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 2, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-3-external-2-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_external_2_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_external_2_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_3_external_2_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 2, 2, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-3-external-2-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_external_2_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_external_2_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_3_external_2_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 2, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-3-external-2-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_external_2_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_external_2_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_3_external_2_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 2, 2, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-3-external-2-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_external_2_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_external_2_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_3_external_2_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 2, 2, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-3-external-2-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_external_2_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_external_2_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_3_external_2_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 2, 2, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-3-external-2-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_external_2_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_external_2_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_3_external_switch = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 3, 4, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-3-external-switch").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_3_external_switch.setStatus('mandatory')
if mibBuilder.loadTexts: wish_3_external_switch.setDescription('The reading for switch sensor contacts of this WiSH/WiSPR Sensor (0 = OPEN, 1 = CLOSED).')
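# WiSH/WiSPR Sensor 4: enabled flag, serial number, update interval, battery voltage,
# internal temperature (C/F), two external digital sensor ports, and the switch contact.
#
# Illustrative note (not part of the generated module): once this MIB is loaded, a manager
# could read any of these read-only scalars by OID with the pysnmp high-level API, e.g. the
# Sensor 4 battery voltage below. The agent address and community string are placeholders.
#
#   from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
#                             ContextData, ObjectType, ObjectIdentity, getCmd)
#   errInd, errStat, errIdx, varBinds = next(getCmd(
#       SnmpEngine(), CommunityData('public'),
#       UdpTransportTarget(('192.0.2.1', 161)), ContextData(),
#       ObjectType(ObjectIdentity('1.3.6.1.4.1.20916.1.8.1.4.4.4.1.1'))))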
wish_4_enabled = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-4-enabled").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_enabled.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_enabled.setDescription("The current 'enabled' status for this WiSH/WiSPR Sensor. A '0' indicates the WiSH/WiSPR is disabled. A '1' indicates the WiSH/WiSPR is enabled.")
wish_4_serial_num = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 2), OctetString()).setLabel("wish-4-serial-num").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_serial_num.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_serial_num.setDescription('The unique serial number for this WiSH/WiSPR Sensor.')
wish_4_updates = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-4-updates").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_updates.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_updates.setDescription('The current update interval for this WiSH/WiSPR Sensor.')
wish_4_battery_voltage = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-4-battery-voltage").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_battery_voltage.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_battery_voltage.setDescription('The current voltage reading of the internal battery for this WiSH/WiSPR Sensor.')
wish_4_internal_tempc = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-4-internal-tempc").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_internal_tempc.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_internal_tempc.setDescription('The current temperature of the internal sensor in Celsius (C) for this WiSH/WiSPR Sensor.')
wish_4_internal_tempf = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-4-internal-tempf").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_internal_tempf.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_internal_tempf.setDescription('The current temperature of the internal sensor in Fahrenheit (F) for this WiSH/WiSPR Sensor.')
wish_4_external_1_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 2, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-4-external-1-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_external_1_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_external_1_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_4_external_1_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 2, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-4-external-1-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_external_1_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_external_1_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_4_external_1_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 2, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-4-external-1-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_external_1_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_external_1_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_4_external_1_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 2, 1, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-4-external-1-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_external_1_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_external_1_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_4_external_1_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 2, 1, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-4-external-1-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_external_1_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_external_1_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_4_external_1_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 2, 1, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-4-external-1-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_external_1_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_external_1_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_4_external_2_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 2, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-4-external-2-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_external_2_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_external_2_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_4_external_2_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 2, 2, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-4-external-2-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_external_2_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_external_2_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_4_external_2_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 2, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-4-external-2-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_external_2_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_external_2_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_4_external_2_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 2, 2, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-4-external-2-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_external_2_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_external_2_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_4_external_2_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 2, 2, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-4-external-2-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_external_2_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_external_2_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_4_external_2_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 2, 2, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-4-external-2-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_external_2_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_external_2_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_4_external_switch = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 4, 4, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-4-external-switch").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_4_external_switch.setStatus('mandatory')
if mibBuilder.loadTexts: wish_4_external_switch.setDescription('The reading for switch sensor contacts of this WiSH/WiSPR Sensor (0 = OPEN, 1 = CLOSED).')
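# WiSH/WiSPR Sensor 5: enabled flag, serial number, update interval, battery voltage,
# internal temperature (C/F), two external digital sensor ports, and the switch contact.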
wish_5_enabled = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-5-enabled").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_enabled.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_enabled.setDescription("The current 'enabled' status for this WiSH/WiSPR Sensor. A '0' indicates the WiSH/WiSPR is disabled. A '1' indicates the WiSH/WiSPR is enabled.")
wish_5_serial_num = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 2), OctetString()).setLabel("wish-5-serial-num").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_serial_num.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_serial_num.setDescription('The unique serial number for this WiSH/WiSPR Sensor.')
wish_5_updates = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-5-updates").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_updates.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_updates.setDescription('The current update interval for this WiSH/WiSPR Sensor.')
wish_5_battery_voltage = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-5-battery-voltage").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_battery_voltage.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_battery_voltage.setDescription('The current voltage reading of the internal battery for this WiSH/WiSPR Sensor.')
wish_5_internal_tempc = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-5-internal-tempc").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_internal_tempc.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_internal_tempc.setDescription('The current temperature of the internal sensor in Celsius (C) for this WiSH/WiSPR Sensor.')
wish_5_internal_tempf = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-5-internal-tempf").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_internal_tempf.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_internal_tempf.setDescription('The current temperature of the internal sensor in Fahrenheit (F) for this WiSH/WiSPR Sensor.')
wish_5_external_1_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 2, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-5-external-1-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_external_1_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_external_1_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_5_external_1_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 2, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-5-external-1-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_external_1_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_external_1_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_5_external_1_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 2, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-5-external-1-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_external_1_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_external_1_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_5_external_1_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 2, 1, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-5-external-1-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_external_1_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_external_1_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_5_external_1_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 2, 1, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-5-external-1-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_external_1_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_external_1_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_5_external_1_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 2, 1, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-5-external-1-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_external_1_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_external_1_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_5_external_2_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 2, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-5-external-2-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_external_2_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_external_2_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_5_external_2_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 2, 2, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-5-external-2-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_external_2_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_external_2_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_5_external_2_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 2, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-5-external-2-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_external_2_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_external_2_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_5_external_2_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 2, 2, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-5-external-2-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_external_2_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_external_2_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_5_external_2_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 2, 2, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-5-external-2-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_external_2_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_external_2_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_5_external_2_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 2, 2, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-5-external-2-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_external_2_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_external_2_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_5_external_switch = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 5, 4, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-5-external-switch").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_5_external_switch.setStatus('mandatory')
if mibBuilder.loadTexts: wish_5_external_switch.setDescription('The reading for switch sensor contacts of this WiSH/WiSPR Sensor (0 = OPEN, 1 = CLOSED).')
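# WiSH/WiSPR Sensor 6: enabled flag, serial number, update interval, battery voltage,
# internal temperature (C/F), two external digital sensor ports, and the switch contact.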
wish_6_enabled = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-6-enabled").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_enabled.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_enabled.setDescription("The current 'enabled' status for this WiSH/WiSPR Sensor. A '0' indicates the WiSH/WiSPR is disabled. A '1' indicates the WiSH/WiSPR is enabled.")
wish_6_serial_num = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 2), OctetString()).setLabel("wish-6-serial-num").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_serial_num.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_serial_num.setDescription('The unique serial number for this WiSH/WiSPR Sensor.')
wish_6_updates = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-6-updates").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_updates.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_updates.setDescription('The current update interval for this WiSH/WiSPR Sensor.')
wish_6_battery_voltage = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-6-battery-voltage").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_battery_voltage.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_battery_voltage.setDescription('The current voltage reading of the internal battery for this WiSH/WiSPR Sensor.')
wish_6_internal_tempc = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-6-internal-tempc").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_internal_tempc.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_internal_tempc.setDescription('The current temperature of the internal sensor in Celsius (C) for this WiSH/WiSPR Sensor.')
wish_6_internal_tempf = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-6-internal-tempf").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_internal_tempf.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_internal_tempf.setDescription('The current temperature of the internal sensor in Fahrenheit (F) for this WiSH/WiSPR Sensor.')
wish_6_external_1_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 2, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-6-external-1-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_external_1_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_external_1_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_6_external_1_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 2, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-6-external-1-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_external_1_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_external_1_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_6_external_1_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 2, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-6-external-1-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_external_1_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_external_1_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_6_external_1_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 2, 1, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-6-external-1-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_external_1_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_external_1_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_6_external_1_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 2, 1, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-6-external-1-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_external_1_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_external_1_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_6_external_1_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 2, 1, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-6-external-1-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_external_1_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_external_1_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_6_external_2_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 2, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-6-external-2-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_external_2_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_external_2_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_6_external_2_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 2, 2, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-6-external-2-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_external_2_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_external_2_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_6_external_2_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 2, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-6-external-2-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_external_2_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_external_2_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_6_external_2_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 2, 2, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-6-external-2-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_external_2_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_external_2_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_6_external_2_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 2, 2, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-6-external-2-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_external_2_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_external_2_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_6_external_2_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 2, 2, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-6-external-2-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_external_2_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_external_2_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_6_external_switch = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 6, 4, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-6-external-switch").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_6_external_switch.setStatus('mandatory')
if mibBuilder.loadTexts: wish_6_external_switch.setDescription('The reading for switch sensor contacts of this WiSH/WiSPR Sensor (0 = OPEN, 1 = CLOSED).')
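# WiSH/WiSPR Sensor 7: enabled flag, serial number, update interval, battery voltage,
# internal temperature (C/F), two external digital sensor ports, and the switch contact.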
wish_7_enabled = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-7-enabled").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_enabled.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_enabled.setDescription("The current 'enabled' status for this WiSH/WiSPR Sensor. A '0' indicates the WiSH/WiSPR is disabled. A '1' indicates the WiSH/WiSPR is enabled.")
wish_7_serial_num = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 2), OctetString()).setLabel("wish-7-serial-num").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_serial_num.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_serial_num.setDescription('The unique serial number for this WiSH/WiSPR Sensor.')
wish_7_updates = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-7-updates").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_updates.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_updates.setDescription('The current update interval for this WiSH/WiSPR Sensor.')
wish_7_battery_voltage = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-7-battery-voltage").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_battery_voltage.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_battery_voltage.setDescription('The current voltage reading of the internal battery for this WiSH/WiSPR Sensor.')
wish_7_internal_tempc = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-7-internal-tempc").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_internal_tempc.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_internal_tempc.setDescription('The current temperature of the internal sensor in Celsius (C) for this WiSH/WiSPR Sensor.')
wish_7_internal_tempf = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-7-internal-tempf").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_internal_tempf.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_internal_tempf.setDescription('The current temperature of the internal sensor in Fahrenheit (F) for this WiSH/WiSPR Sensor.')
wish_7_external_1_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 2, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-7-external-1-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_external_1_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_external_1_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_7_external_1_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 2, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-7-external-1-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_external_1_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_external_1_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_7_external_1_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 2, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-7-external-1-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_external_1_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_external_1_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_7_external_1_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 2, 1, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-7-external-1-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_external_1_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_external_1_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_7_external_1_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 2, 1, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-7-external-1-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_external_1_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_external_1_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_7_external_1_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 2, 1, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-7-external-1-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_external_1_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_external_1_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_7_external_2_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 2, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-7-external-2-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_external_2_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_external_2_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_7_external_2_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 2, 2, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-7-external-2-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_external_2_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_external_2_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_7_external_2_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 2, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-7-external-2-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_external_2_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_external_2_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_7_external_2_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 2, 2, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-7-external-2-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_external_2_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_external_2_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_7_external_2_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 2, 2, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-7-external-2-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_external_2_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_external_2_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_7_external_2_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 2, 2, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-7-external-2-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_external_2_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_external_2_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_7_external_switch = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 7, 4, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-7-external-switch").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_7_external_switch.setStatus('mandatory')
if mibBuilder.loadTexts: wish_7_external_switch.setDescription('The reading for switch sensor contacts of this WiSH/WiSPR Sensor (0 = OPEN, 1 = CLOSED).')
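# WiSH/WiSPR Sensor 8: enabled flag, serial number, update interval, battery voltage,
# internal temperature (C/F), two external digital sensor ports, and the switch contact.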
wish_8_enabled = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-8-enabled").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_enabled.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_enabled.setDescription("The current 'enabled' status for this WiSH/WiSPR Sensor. A '0' indicates the WiSH/WiSPR is disabled. A '1' indicates the WiSH/WiSPR is enabled.")
wish_8_serial_num = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 2), OctetString()).setLabel("wish-8-serial-num").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_serial_num.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_serial_num.setDescription('The unique serial number for this WiSH/WiSPR Sensor.')
wish_8_updates = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-8-updates").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_updates.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_updates.setDescription('The current update interval for this WiSH/WiSPR Sensor.')
wish_8_battery_voltage = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-8-battery-voltage").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_battery_voltage.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_battery_voltage.setDescription('The current voltage reading of the internal battery for this WiSH/WiSPR Sensor.')
wish_8_internal_tempc = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-8-internal-tempc").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_internal_tempc.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_internal_tempc.setDescription('The current temperature of the internal sensor in Celsius (C) for this WiSH/WiSPR Sensor.')
wish_8_internal_tempf = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-8-internal-tempf").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_internal_tempf.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_internal_tempf.setDescription('The current temperature of the internal sensor in Fahrenheit (F) for this WiSH/WiSPR Sensor.')
wish_8_external_1_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 2, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-8-external-1-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_external_1_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_external_1_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_8_external_1_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 2, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-8-external-1-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_external_1_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_external_1_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_8_external_1_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 2, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-8-external-1-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_external_1_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_external_1_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_8_external_1_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 2, 1, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-8-external-1-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_external_1_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_external_1_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_8_external_1_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 2, 1, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-8-external-1-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_external_1_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_external_1_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_8_external_1_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 2, 1, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-8-external-1-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_external_1_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_external_1_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_8_external_2_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 2, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-8-external-2-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_external_2_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_external_2_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_8_external_2_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 2, 2, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-8-external-2-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_external_2_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_external_2_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_8_external_2_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 2, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-8-external-2-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_external_2_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_external_2_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_8_external_2_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 2, 2, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-8-external-2-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_external_2_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_external_2_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_8_external_2_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 2, 2, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-8-external-2-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_external_2_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_external_2_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_8_external_2_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 2, 2, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-8-external-2-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_external_2_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_external_2_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_8_external_switch = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 8, 4, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-8-external-switch").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_8_external_switch.setStatus('mandatory')
if mibBuilder.loadTexts: wish_8_external_switch.setDescription('The reading for switch sensor contacts of this WiSH/WiSPR Sensor (0 = OPEN, 1 = CLOSED).')
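# WiSH/WiSPR Sensor 9: enabled flag, serial number, update interval, battery voltage,
# internal temperature (C/F), two external digital sensor ports, and the switch contact.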
wish_9_enabled = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-9-enabled").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_enabled.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_enabled.setDescription("The current 'enabled' status for this WiSH/WiSPR Sensor. A '0' indicates the WiSH/WiSPR is disabled. A '1' indicates the WiSH/WiSPR is enabled.")
wish_9_serial_num = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 2), OctetString()).setLabel("wish-9-serial-num").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_serial_num.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_serial_num.setDescription('The unique serial number for this WiSH/WiSPR Sensor.')
wish_9_updates = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-9-updates").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_updates.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_updates.setDescription('The current update interval for this WiSH/WiSPR Sensor.')
wish_9_battery_voltage = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-9-battery-voltage").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_battery_voltage.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_battery_voltage.setDescription('The current voltage reading of the internal battery for this WiSH/WiSPR Sensor.')
wish_9_internal_tempc = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-9-internal-tempc").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_internal_tempc.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_internal_tempc.setDescription('The current temperature of the internal sensor in Celsius (C) for this WiSH/WiSPR Sensor.')
wish_9_internal_tempf = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-9-internal-tempf").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_internal_tempf.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_internal_tempf.setDescription('The current temperature of the internal sensor in Fahrenheit (F) for this WiSH/WiSPR Sensor.')
wish_9_external_1_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 2, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-9-external-1-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_external_1_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_external_1_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_9_external_1_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 2, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-9-external-1-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_external_1_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_external_1_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_9_external_1_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 2, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-9-external-1-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_external_1_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_external_1_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_9_external_1_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 2, 1, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-9-external-1-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_external_1_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_external_1_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_9_external_1_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 2, 1, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-9-external-1-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_external_1_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_external_1_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_9_external_1_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 2, 1, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-9-external-1-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_external_1_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_external_1_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_9_external_2_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 2, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-9-external-2-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_external_2_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_external_2_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_9_external_2_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 2, 2, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-9-external-2-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_external_2_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_external_2_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_9_external_2_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 2, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-9-external-2-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_external_2_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_external_2_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_9_external_2_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 2, 2, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-9-external-2-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_external_2_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_external_2_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_9_external_2_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 2, 2, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-9-external-2-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_external_2_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_external_2_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_9_external_2_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 2, 2, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-9-external-2-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_external_2_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_external_2_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_9_external_switch = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 9, 4, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-9-external-switch").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_9_external_switch.setStatus('mandatory')
if mibBuilder.loadTexts: wish_9_external_switch.setDescription('The reading for switch sensor contacts of this WiSH/WiSPR Sensor (0 = OPEN, 1 = CLOSED).')
wish_10_enabled = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-10-enabled").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_enabled.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_enabled.setDescription("The current 'enabled' status for this WiSH/WiSPR Sensor. A '0' indicates the WiSH/WiSPR is disabled. A '1' indicates the WiSH/WiSPR is enabled.")
wish_10_serial_num = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 2), OctetString()).setLabel("wish-10-serial-num").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_serial_num.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_serial_num.setDescription('The unique serial number for this WiSH/WiSPR Sensor.')
wish_10_updates = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-10-updates").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_updates.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_updates.setDescription('The current update interval for this WiSH/WiSPR Sensor.')
wish_10_battery_voltage = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-10-battery-voltage").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_battery_voltage.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_battery_voltage.setDescription('The current voltage reading of the internal battery for this WiSH/WiSPR Sensor.')
wish_10_internal_tempc = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-10-internal-tempc").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_internal_tempc.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_internal_tempc.setDescription('The current temperature of the internal sensor in Celsius (C) for this WiSH/WiSPR Sensor.')
wish_10_internal_tempf = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-10-internal-tempf").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_internal_tempf.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_internal_tempf.setDescription('The current temperature of the internal sensor in Fahrenheit (F) for this WiSH/WiSPR Sensor.')
wish_10_external_1_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 2, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-10-external-1-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_external_1_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_external_1_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_10_external_1_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 2, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-10-external-1-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_external_1_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_external_1_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_10_external_1_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 2, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-10-external-1-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_external_1_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_external_1_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_10_external_1_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 2, 1, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-10-external-1-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_external_1_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_external_1_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_10_external_1_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 2, 1, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-10-external-1-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_external_1_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_external_1_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_10_external_1_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 2, 1, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-10-external-1-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_external_1_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_external_1_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_10_external_2_type = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 2, 2, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-10-external-2-type").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_external_2_type.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_external_2_type.setDescription('The sensor type of the digital sensor attached to this digital sensor port on the WiSH/WiSPR Sensor.')
wish_10_external_2_val1 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 2, 2, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-10-external-2-val1").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_external_2_val1.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_external_2_val1.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Celsius. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Current reading in Amperage.')
wish_10_external_2_val2 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 2, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-10-external-2-val2").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_external_2_val2.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_external_2_val2.setDescription('If this sensor is a Temperature or Temp/Humidity sensor, this value represents the current temperature in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Power reading in Watts.')
wish_10_external_2_val3 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 2, 2, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-10-external-2-val3").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_external_2_val3.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_external_2_val3.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current relative humidity in % Relative Humidity. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Voltage reading in Volts.')
wish_10_external_2_val4 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 2, 2, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-10-external-2-val4").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_external_2_val4.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_external_2_val4.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Fahrenheit. If this sensor is a Digital Power Sensor and connection of a Digital Power Sensor is supported by your model, this value represents the Reference reading in Volts.')
wish_10_external_2_val5 = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 2, 2, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setLabel("wish-10-external-2-val5").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_external_2_val5.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_external_2_val5.setDescription('If this sensor is a Temp/Humidity sensor, this value represents the current heat index in Celsius.')
wish_10_external_switch = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 1, 4, 10, 4, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1))).setLabel("wish-10-external-switch").setMaxAccess("readonly")
if mibBuilder.loadTexts: wish_10_external_switch.setStatus('mandatory')
if mibBuilder.loadTexts: wish_10_external_switch.setDescription('The reading for switch sensor contacts of this WiSH/WiSPR Sensor (0 = OPEN, 1 = CLOSED).')
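# ---------------------------------------------------------------------------
# Hedged helper (illustrative, not generated by pysmi): the external val1..val5
# scalars above share one layout whose meaning depends on the attached sensor,
# per their DESCRIPTION texts.  This sketch labels a fetched set of values for
# the Temp/Humidity case only; the numeric codes reported by the *-type scalars
# are not defined in this MIB and are not assumed here.
def _example_label_temp_humidity_values(val1, val2, val3, val4, val5):
    return {
        'temperature_c': val1,          # val1: current temperature in Celsius
        'temperature_f': val2,          # val2: current temperature in Fahrenheit
        'relative_humidity_pct': val3,  # val3: relative humidity in %RH
        'heat_index_f': val4,           # val4: heat index in Fahrenheit
        'heat_index_c': val5,           # val5: heat index in Celsius
    }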
alarmmessage = MibScalar((1, 3, 6, 1, 4, 1, 20916, 1, 8, 2, 1), OctetString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: alarmmessage.setStatus('mandatory')
if mibBuilder.loadTexts: alarmmessage.setDescription('Last Alarm Message')
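# ---------------------------------------------------------------------------
# Hedged usage sketch (illustrative, not generated by pysmi): how a manager
# could read the 'alarmmessage' scalar defined above with pysnmp's high-level
# API.  The agent address, UDP port and community string are placeholders,
# not values taken from this MIB.
def _example_read_alarmmessage(host='192.0.2.1', community='public'):
    from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                              ContextData, ObjectType, ObjectIdentity, getCmd)
    # alarmmessage is a scalar object, so the instance OID carries a trailing .0
    error_indication, error_status, error_index, var_binds = next(
        getCmd(SnmpEngine(),
               CommunityData(community, mpModel=0),
               UdpTransportTarget((host, 161)),
               ContextData(),
               ObjectType(ObjectIdentity('1.3.6.1.4.1.20916.1.8.2.1.0'))))
    return None if error_indication else var_binds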
tempalarm1_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,1)).setLabel("tempalarm1-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: tempalarm1_32E.setDescription('A tempalarm1 trap signifies that the current temperature on external sensor 1 is outside the defined high or low threshold.')
room_alert_32E_snmp_trap = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,2)).setLabel("room-alert-32E-snmp-trap").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: room_alert_32E_snmp_trap.setDescription('A room-alert-32E-snmp-trap indicates that an alarm condition has occurred on the sensor indicated by the alarmMessage variable.')
tempalarm2_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,3)).setLabel("tempalarm2-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: tempalarm2_32E.setDescription('A tempalarm2 trap signifies that the current temperature on external sensor 2 is outside the defined high or low threshold.')
tempclear2_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,4)).setLabel("tempclear2-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: tempclear2_32E.setDescription('A tempclear2 trap signifies that the current temperature on external sensor 2 has returned to a normal condition and is within the defined high or low threshold.')
tempalarm3_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,5)).setLabel("tempalarm3-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: tempalarm3_32E.setDescription('A tempalarm3 trap signifies that the current temperature on external sensor 3 is outside the defined high or low threshold.')
tempclear3_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,6)).setLabel("tempclear3-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: tempclear3_32E.setDescription('A tempclear3 trap signifies that the current temperature on external sensor 3 has returned to a normal condition and is within the defined high or low threshold.')
humidityalarm1_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,7)).setLabel("humidityalarm1-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: humidityalarm1_32E.setDescription('A humidityalarm1 trap signifies that the current humidity on external sensor 1 is outside the defined high or low threshold.')
humidityclear1_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,8)).setLabel("humidityclear1-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: humidityclear1_32E.setDescription('A humidityclear1 trap signifies that the current humidity on external sensor 1 has returned to a normal condition and is within the defined high or low threshold.')
humidityalarm2_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,9)).setLabel("humidityalarm2-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: humidityalarm2_32E.setDescription('A humidityalarm2 trap signifies that the current humidity on external sensor 2 is outside the defined high or low threshold.')
humidityclear2_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,10)).setLabel("humidityclear2-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: humidityclear2_32E.setDescription('A humidityclear2 trap signifies that the current humidity on external sensor 2 has returned to a normal condition and is within the defined high or low threshold.')
humidityalarm3_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,11)).setLabel("humidityalarm3-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: humidityalarm3_32E.setDescription('A humidityalarm3 trap signifies that the current humidity on external sensor 3 is outside the defined high or low threshold.')
humidityclear3_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,12)).setLabel("humidityclear3-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: humidityclear3_32E.setDescription('A humidityclear3 trap signifies that the current humidity on external sensor 3 has returned to a normal condition and is within the defined high or low threshold.')
switchalarm1_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,13)).setLabel("switchalarm1-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: switchalarm1_32E.setDescription('A switchalarm1 trap signifies that switch sensor 1 is in an alarm state.')
switchclear1_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,14)).setLabel("switchclear1-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: switchclear1_32E.setDescription('A switchclear1 trap signifies that switch sensor 1 has returned to a normal state.')
switchalarm2_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,15)).setLabel("switchalarm2-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: switchalarm2_32E.setDescription('A switchalarm2 trap signifies that switch sensor 2 is in an alarm state.')
switchclear2_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,16)).setLabel("switchclear2-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: switchclear2_32E.setDescription('A switchclear2 trap signifies that switch sensor 2 has returned to a normal state.')
switchalarm3_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,17)).setLabel("switchalarm3-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: switchalarm3_32E.setDescription('A switchalarm3 trap signifies that switch sensor 3 is in an alarm state.')
switchclear3_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,18)).setLabel("switchclear3-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: switchclear3_32E.setDescription('A switchclear3 trap signifies that switch sensor 3 has returned to a normal state.')
switchalarm4_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,19)).setLabel("switchalarm4-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: switchalarm4_32E.setDescription('A switchalarm4 trap signifies that switch sensor 4 is in an alarm state.')
switchclear4_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,20)).setLabel("switchclear4-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: switchclear4_32E.setDescription('A switchclear4 trap signifies that switch sensor 4 has returned to a normal state.')
switchalarm5_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,21)).setLabel("switchalarm5-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: switchalarm5_32E.setDescription('A switchalarm5 trap signifies that switch sensor 5 is in an alarm state.')
switchclear5_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,22)).setLabel("switchclear5-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: switchclear5_32E.setDescription('A switchclear5 trap signifies that switch sensor 5 has returned to a normal state.')
switchalarm6_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,23)).setLabel("switchalarm6-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: switchalarm6_32E.setDescription('A switchalarm6 trap signifies that switch sensor 6 is in an alarm state.')
switchclear6_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,24)).setLabel("switchclear6-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: switchclear6_32E.setDescription('A switchclear6 trap signifies that switch sensor 6 has returned to a normal state.')
switchalarm7_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,25)).setLabel("switchalarm7-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: switchalarm7_32E.setDescription('A switchalarm7 trap signifies that switch sensor 7 is in an alarm state.')
switchclear7_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,26)).setLabel("switchclear7-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: switchclear7_32E.setDescription('A switchclear7 trap signifies that switch sensor 7 has returned to a normal state.')
switchalarm8_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,27)).setLabel("switchalarm8-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: switchalarm8_32E.setDescription('A switchalarm8 trap signifies that switch sensor 8 is in an alarm state.')
switchclear8_32E = NotificationType((1, 3, 6, 1, 4, 1, 20916, 1, 8) + (0,28)).setLabel("switchclear8-32E").setObjects(("ROOMALERT32E-MIB", "alarmmessage"))
if mibBuilder.loadTexts: switchclear8_32E.setDescription('A switchclear8 trap signifies that switch sensor 8 has returned to a normal state.')
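# ---------------------------------------------------------------------------
# Hedged sketch (illustrative, not generated by pysmi): a trap receiver that
# has loaded this module could map an incoming trap OID onto the notification
# objects defined above.  Only the two switch-sensor-1 notifications are shown.
def _example_classify_trap(trap_oid):
    # getName() on a pysnmp MIB node returns its object identifier.
    if tuple(trap_oid) == tuple(switchalarm1_32E.getName()):
        return 'switch sensor 1 went into alarm'
    if tuple(trap_oid) == tuple(switchclear1_32E.getName()):
        return 'switch sensor 1 returned to normal'
    return 'unhandled ROOMALERT32E-MIB notification'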
mibBuilder.exportSymbols("ROOMALERT32E-MIB", wish_1_external_1_val5=wish_1_external_1_val5, wish_3_internal_tempf=wish_3_internal_tempf, wish_1_external_1_val2=wish_1_external_1_val2, wish_3_enabled=wish_3_enabled, wish_7_external_2_val5=wish_7_external_2_val5, wish_7_updates=wish_7_updates, wish_3_serial_num=wish_3_serial_num, wish_3_external_1_val1=wish_3_external_1_val1, wish_8_internal_tempc=wish_8_internal_tempc, humidityclear3_32E=humidityclear3_32E, switchclear1_32E=switchclear1_32E, wish_9_updates=wish_9_updates, wish_6_external_1_val4=wish_6_external_1_val4, wish_2_external_1_val1=wish_2_external_1_val1, switchalarm6_32E=switchalarm6_32E, wish_2_external_1_val5=wish_2_external_1_val5, wish_6_external_2_val1=wish_6_external_2_val1, wish_2_external_1_val3=wish_2_external_1_val3, digital_sen5_2=digital_sen5_2, wish_2_external_2_val3=wish_2_external_2_val3, wish_7_external_2_val1=wish_7_external_2_val1, wish_7_external_1_val1=wish_7_external_1_val1, wish_7_external=wish_7_external, wish_7_external_1_val2=wish_7_external_1_val2, tempclear3_32E=tempclear3_32E, wish_9_internal_tempc=wish_9_internal_tempc, wish_1_external_2_type=wish_1_external_2_type, switch_sen13=switch_sen13, switchclear8_32E=switchclear8_32E, wish_1_sensors=wish_1_sensors, tempalarm1_32E=tempalarm1_32E, digital_sen5_3=digital_sen5_3, wish_5_external_1_val1=wish_5_external_1_val1, wish_10_enabled=wish_10_enabled, wish_10_internal_tempf=wish_10_internal_tempf, wish_10_external_2_type=wish_10_external_2_type, wish_2_battery_voltage=wish_2_battery_voltage, digital_sen1_6=digital_sen1_6, wish_6_external_2_type=wish_6_external_2_type, wish_10_internal_tempc=wish_10_internal_tempc, switch_sen3=switch_sen3, wish_2_external_2=wish_2_external_2, traps=traps, switchalarm1_32E=switchalarm1_32E, digital_sen2_3=digital_sen2_3, wish_9_external_2_val2=wish_9_external_2_val2, wish_3_external_2_val5=wish_3_external_2_val5, internal=internal, wish_1_external_2_val3=wish_1_external_2_val3, digital_sen4_4=digital_sen4_4, wish_2_external_1_val4=wish_2_external_1_val4, wish_3_external_1_val2=wish_3_external_1_val2, wish_4_external_2=wish_4_external_2, wish_4_external_1_val2=wish_4_external_1_val2, digital_sen7_1=digital_sen7_1, humidityalarm2_32E=humidityalarm2_32E, wish_3_external_2_type=wish_3_external_2_type, digital_sen4_2=digital_sen4_2, wish_2_updates=wish_2_updates, wish_2_external_1=wish_2_external_1, wish_9_external_1=wish_9_external_1, wish_2_external_2_val4=wish_2_external_2_val4, wish_10_external_1_type=wish_10_external_1_type, sensors=sensors, digital_sen4_5=digital_sen4_5, wish_1_updates=wish_1_updates, wish_10_internal=wish_10_internal, wish_8_external_1_val1=wish_8_external_1_val1, wish_4_internal=wish_4_internal, switch_sen4=switch_sen4, wish_8_serial_num=wish_8_serial_num, wish_7_external_2=wish_7_external_2, wish_7_serial_num=wish_7_serial_num, wish_1_external_2_val5=wish_1_external_2_val5, wish_5_external_1=wish_5_external_1, wish_9=wish_9, internal_heat_indexC=internal_heat_indexC, wish_9_enabled=wish_9_enabled, wish_4_external_1_val5=wish_4_external_1_val5, wish_6_internal_tempc=wish_6_internal_tempc, wish_5_external_2_val4=wish_5_external_2_val4, wish_3_external=wish_3_external, switch_sen10=switch_sen10, wish_8_external_2_val1=wish_8_external_2_val1, wish_3_external_1_val4=wish_3_external_1_val4, digital_sen1_3=digital_sen1_3, wish_5_internal_tempc=wish_5_internal_tempc, wish_6_external_1_val3=wish_6_external_1_val3, wish_8_battery_voltage=wish_8_battery_voltage, digital_sen8_2=digital_sen8_2, 
wish_9_external_2_val3=wish_9_external_2_val3, wish_1_battery_voltage=wish_1_battery_voltage, wish_9_internal=wish_9_internal, humidityclear1_32E=humidityclear1_32E, wish_10=wish_10, wish_2_external_2_val1=wish_2_external_2_val1, digital_sen5=digital_sen5, digital_sen7=digital_sen7, wish_4_external_2_val5=wish_4_external_2_val5, humidityalarm3_32E=humidityalarm3_32E, power=power, wish_1_external_1_val1=wish_1_external_1_val1, wish_7_internal_tempc=wish_7_internal_tempc, switchclear3_32E=switchclear3_32E, wish_8_updates=wish_8_updates, wish_4_external_1_val1=wish_4_external_1_val1, switch_sen6=switch_sen6, wish_7_external_switch=wish_7_external_switch, wish_4=wish_4, wish_3_sensors=wish_3_sensors, analog=analog, wish_10_external_switch=wish_10_external_switch, wish_2_external=wish_2_external, wish_8_external_2_val3=wish_8_external_2_val3, wish_2_enabled=wish_2_enabled, wish_6_internal=wish_6_internal, wish_1_external_2_val1=wish_1_external_2_val1, digital_sen3_2=digital_sen3_2, wish_1_external_2_val2=wish_1_external_2_val2, wish_6_external_2_val2=wish_6_external_2_val2, tempclear2_32E=tempclear2_32E, switchalarm2_32E=switchalarm2_32E, wish_1_external_1_val3=wish_1_external_1_val3, wish_1=wish_1, digital_sen7_5=digital_sen7_5, switch=switch, digital_sen3_5=digital_sen3_5, wish_5_external_2_type=wish_5_external_2_type, wish_4_external_2_val4=wish_4_external_2_val4, wish_6_external_1_type=wish_6_external_1_type, wish_1_enabled=wish_1_enabled, internal_heat_index=internal_heat_index, wish_6_internal_tempf=wish_6_internal_tempf, wish_10_external=wish_10_external, products=products, wish_4_external_1_type=wish_4_external_1_type, switchalarm7_32E=switchalarm7_32E, switchalarm8_32E=switchalarm8_32E, wish_3_battery_voltage=wish_3_battery_voltage, wish_8_external_2=wish_8_external_2, wish_10_serial_num=wish_10_serial_num, wish_1_external=wish_1_external, wish_4_external_2_type=wish_4_external_2_type, wish_10_external_1_val2=wish_10_external_1_val2, heat_index=heat_index, wish_7_external_2_val4=wish_7_external_2_val4, switchclear6_32E=switchclear6_32E, digital_sen6=digital_sen6, wish_2_external_1_type=wish_2_external_1_type, wish_5_internal=wish_5_internal, digital_sen1=digital_sen1, avtech=avtech, switchalarm5_32E=switchalarm5_32E, wish_6_external_2_val3=wish_6_external_2_val3, wish_9_external_2=wish_9_external_2, wish_1_external_2=wish_1_external_2, switch_sen16=switch_sen16, digital_sen8_3=digital_sen8_3, wish_2_internal_tempf=wish_2_internal_tempf, digital_sen6_1=digital_sen6_1, wish_7_enabled=wish_7_enabled, wish_3=wish_3, wish_9_external_1_type=wish_9_external_1_type, wish_4_battery_voltage=wish_4_battery_voltage, wish_9_battery_voltage=wish_9_battery_voltage, wish_2_external_2_type=wish_2_external_2_type, digital_sen4_3=digital_sen4_3, wish_9_external_2_val5=wish_9_external_2_val5, wish_6_serial_num=wish_6_serial_num, wish_1_internal_tempf=wish_1_internal_tempf, wish_2_external_2_val5=wish_2_external_2_val5, switchalarm4_32E=switchalarm4_32E, digital_sen3=digital_sen3, switch_sen11=switch_sen11, digital_sen5_1=digital_sen5_1, wish_7_battery_voltage=wish_7_battery_voltage, wish_10_external_1_val4=wish_10_external_1_val4, wish_4_internal_tempc=wish_4_internal_tempc, wish_7_sensors=wish_7_sensors, wish_3_external_1=wish_3_external_1, tempalarm2_32E=tempalarm2_32E, wish_3_external_1_val5=wish_3_external_1_val5, wish_6_external=wish_6_external, wish_3_external_2_val4=wish_3_external_2_val4, wish_5_enabled=wish_5_enabled, wish_5_external_1_val2=wish_5_external_1_val2, 
wish_7_external_2_val3=wish_7_external_2_val3, wish_9_external_1_val1=wish_9_external_1_val1, wish_6_external_1_val1=wish_6_external_1_val1, digital_sen3_1=digital_sen3_1, wish_10_external_1_val5=wish_10_external_1_val5, wish_9_external=wish_9_external, wish_5=wish_5, wish_8_external_1_type=wish_8_external_1_type, wish_7_internal=wish_7_internal, switch_sen5=switch_sen5, wish_2_external_switch=wish_2_external_switch, wish_4_updates=wish_4_updates, tempalarm3_32E=tempalarm3_32E, wish_8_external_2_val4=wish_8_external_2_val4, internal_analog2=internal_analog2, wish_6=wish_6, wish_10_external_1_val3=wish_10_external_1_val3, wish_5_updates=wish_5_updates, wish_7_external_1_val3=wish_7_external_1_val3, wish_5_external_2_val1=wish_5_external_2_val1, wish_4_external_2_val1=wish_4_external_2_val1, digital_sen8_4=digital_sen8_4, wish_3_external_switch=wish_3_external_switch, wish_3_external_1_type=wish_3_external_1_type, wish_5_external_switch=wish_5_external_switch, wish_6_external_switch=wish_6_external_switch, humidityalarm1_32E=humidityalarm1_32E, room_alert_32E_snmp_trap=room_alert_32E_snmp_trap, digital=digital, internal_tempc=internal_tempc, wish_8_external_1=wish_8_external_1, wish_7_external_1=wish_7_external_1, wish_9_external_2_val4=wish_9_external_2_val4, switch_sen7=switch_sen7, wish_4_external_1=wish_4_external_1, wish_4_external_2_val3=wish_4_external_2_val3, digital_sen2_4=digital_sen2_4, internal_analog1=internal_analog1, wish_4_serial_num=wish_4_serial_num, wish_9_internal_tempf=wish_9_internal_tempf, digital_sen3_3=digital_sen3_3, wish_4_internal_tempf=wish_4_internal_tempf, wish_5_external=wish_5_external, switchclear4_32E=switchclear4_32E, wish_1_external_1=wish_1_external_1, internal_power=internal_power, switchclear2_32E=switchclear2_32E, wish_10_external_2=wish_10_external_2, humidityclear2_32E=humidityclear2_32E, roomalert32E=roomalert32E, wish_4_external=wish_4_external, wish_6_external_2_val4=wish_6_external_2_val4, wish_6_updates=wish_6_updates, switch_sen1=switch_sen1, switch_sen2=switch_sen2, wish_10_external_1_val1=wish_10_external_1_val1, wish_6_external_1=wish_6_external_1, wish_5_battery_voltage=wish_5_battery_voltage, wish_3_external_2_val3=wish_3_external_2_val3, wish_2_external_1_val2=wish_2_external_1_val2, wireless=wireless, wish_4_external_switch=wish_4_external_switch, wish_10_external_2_val3=wish_10_external_2_val3, wish_5_external_2=wish_5_external_2, digital_sen4_1=digital_sen4_1, wish_7_internal_tempf=wish_7_internal_tempf, wish_2_sensors=wish_2_sensors, wish_6_external_1_val5=wish_6_external_1_val5)
mibBuilder.exportSymbols("ROOMALERT32E-MIB", wish_8_internal=wish_8_internal, wish_10_external_2_val5=wish_10_external_2_val5, wish_3_external_1_val3=wish_3_external_1_val3, wish_6_battery_voltage=wish_6_battery_voltage, internal_tempf=internal_tempf, wish_2_serial_num=wish_2_serial_num, wish_4_external_1_val3=wish_4_external_1_val3, wish_1_external_switch=wish_1_external_switch, wish_5_serial_num=wish_5_serial_num, switchclear7_32E=switchclear7_32E, wish_1_external_2_val4=wish_1_external_2_val4, digital_sen2=digital_sen2, wish_9_external_2_type=wish_9_external_2_type, wish_8_external=wish_8_external, digital_sen6_2=digital_sen6_2, wish_1_serial_num=wish_1_serial_num, digital_sen5_4=digital_sen5_4, wish_10_external_2_val2=wish_10_external_2_val2, wish_3_external_2_val1=wish_3_external_2_val1, wish_3_external_2_val2=wish_3_external_2_val2, wish_8_internal_tempf=wish_8_internal_tempf, wish_7_external_1_val5=wish_7_external_1_val5, wish_7_external_1_val4=wish_7_external_1_val4, wish_8_external_2_val2=wish_8_external_2_val2, temperature=temperature, wish_7_external_2_val2=wish_7_external_2_val2, digital_sen7_4=digital_sen7_4, wish_4_external_1_val4=wish_4_external_1_val4, wish_1_external_1_val4=wish_1_external_1_val4, wish_6_enabled=wish_6_enabled, wish_7=wish_7, wish_7_external_1_type=wish_7_external_1_type, digital_sen8_1=digital_sen8_1, wish_2_external_2_val2=wish_2_external_2_val2, wish_8_external_2_val5=wish_8_external_2_val5, wish_5_external_1_type=wish_5_external_1_type, digital_sen4=digital_sen4, digital_sen1_1=digital_sen1_1, digital_sen2_1=digital_sen2_1, digital_sen6_3=digital_sen6_3, wish_8_external_switch=wish_8_external_switch, wish_5_external_1_val5=wish_5_external_1_val5, digital_sen5_5=digital_sen5_5, wish_8_external_1_val2=wish_8_external_1_val2, digital_sen1_5=digital_sen1_5, wish_4_external_2_val2=wish_4_external_2_val2, switch_sen12=switch_sen12, wish_9_external_1_val2=wish_9_external_1_val2, wish_8=wish_8, wish_10_external_2_val4=wish_10_external_2_val4, internal_humidity=internal_humidity, wish_5_external_2_val3=wish_5_external_2_val3, digital_sen1_4=digital_sen1_4, wish_9_external_1_val5=wish_9_external_1_val5, wish_8_external_1_val5=wish_8_external_1_val5, wish_4_sensors=wish_4_sensors, digital_sen8_5=digital_sen8_5, switch_sen14=switch_sen14, digital_sen7_3=digital_sen7_3, wish_1_internal_tempc=wish_1_internal_tempc, wish_8_external_1_val3=wish_8_external_1_val3, wish_8_external_2_type=wish_8_external_2_type, wish_7_external_2_type=wish_7_external_2_type, wish_2_internal_tempc=wish_2_internal_tempc, wish_5_external_1_val4=wish_5_external_1_val4, wish_10_external_2_val1=wish_10_external_2_val1, switch_sen9=switch_sen9, switchclear5_32E=switchclear5_32E, digital_sen7_2=digital_sen7_2, wish_5_external_2_val2=wish_5_external_2_val2, wish_5_external_1_val3=wish_5_external_1_val3, digital_sen2_5=digital_sen2_5, wish_5_internal_tempf=wish_5_internal_tempf, digital_sen8=digital_sen8, wish_4_enabled=wish_4_enabled, wish_9_external_1_val3=wish_9_external_1_val3, wish_9_external_switch=wish_9_external_switch, wish_10_external_1=wish_10_external_1, wish_5_external_2_val5=wish_5_external_2_val5, wish_8_external_1_val4=wish_8_external_1_val4, wish_6_external_1_val2=wish_6_external_1_val2, digital_sen3_4=digital_sen3_4, digital_sen6_4=digital_sen6_4, wish_3_updates=wish_3_updates, digital_sen2_2=digital_sen2_2, humidity=humidity, wish_6_external_2_val5=wish_6_external_2_val5, switch_sen15=switch_sen15, wish_9_serial_num=wish_9_serial_num, wish_2=wish_2, 
wish_9_external_1_val4=wish_9_external_1_val4, switch_sen8=switch_sen8, wish_10_battery_voltage=wish_10_battery_voltage, wish_2_internal=wish_2_internal, wish_3_external_2=wish_3_external_2, wish_5_sensors=wish_5_sensors, wish_3_internal_tempc=wish_3_internal_tempc, wish_6_external_2=wish_6_external_2, digital_sen1_2=digital_sen1_2, wish_8_enabled=wish_8_enabled, switchalarm3_32E=switchalarm3_32E, wish_8_sensors=wish_8_sensors, wish_1_external_1_type=wish_1_external_1_type, wish_6_sensors=wish_6_sensors, wish_9_external_2_val1=wish_9_external_2_val1, wish_10_updates=wish_10_updates, alarmmessage=alarmmessage, wish_10_sensors=wish_10_sensors, wish_3_internal=wish_3_internal, wish_1_internal=wish_1_internal, wish_9_sensors=wish_9_sensors)
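# ---------------------------------------------------------------------------
# Hedged sketch (illustrative, not part of the generated module): resolving
# symbols from this compiled MIB with pysnmp's MibBuilder.  This assumes the
# file is available on the builder's MIB search path under the module name
# 'ROOMALERT32E-MIB'.
def _example_load_roomalert_mib():
    from pysnmp.smi import builder
    mib = builder.MibBuilder()
    mib.loadModules('ROOMALERT32E-MIB')
    (alarm,) = mib.importSymbols('ROOMALERT32E-MIB', 'alarmmessage')
    return alarm.getName()  # expected: (1, 3, 6, 1, 4, 1, 20916, 1, 8, 2, 1)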
| 173.073993 | 9,634 | 0.781227 | 25,316 | 159,055 | 4.733647 | 0.011416 | 0.010865 | 0.094278 | 0.079274 | 0.933535 | 0.892771 | 0.843371 | 0.774928 | 0.703305 | 0.564433 | 0 | 0.088877 | 0.097363 | 159,055 | 918 | 9,635 | 173.262527 | 0.745823 | 0.00205 | 0 | 0 | 0 | 0.164654 | 0.354807 | 0.022732 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.006586 | 0 | 0.006586 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
9657726c53151c5a1e1c651f1911c02640b28f93 | 629,084 | py | Python | openconfig/ydk/models/openconfig/openconfig_mpls.py | CiscoDevNet/ydk-py | 073731fea50694d0bc6cd8ebf10fec308dcc0aa9 | [
"ECL-2.0",
"Apache-2.0"
] | 177 | 2016-03-15T17:03:51.000Z | 2022-03-18T16:48:44.000Z | openconfig/ydk/models/openconfig/openconfig_mpls.py | CiscoDevNet/ydk-py | 073731fea50694d0bc6cd8ebf10fec308dcc0aa9 | [
"ECL-2.0",
"Apache-2.0"
] | 18 | 2016-03-30T10:45:22.000Z | 2020-07-14T16:28:13.000Z | openconfig/ydk/models/openconfig/openconfig_mpls.py | CiscoDevNet/ydk-py | 073731fea50694d0bc6cd8ebf10fec308dcc0aa9 | [
"ECL-2.0",
"Apache-2.0"
] | 85 | 2016-03-16T20:38:57.000Z | 2022-02-22T04:26:02.000Z | """ openconfig_mpls
This module provides data definitions for configuration of
Multiprotocol Label Switching (MPLS) and associated protocols for
signaling and traffic engineering.
RFC 3031\: Multiprotocol Label Switching Architecture
The MPLS / TE data model consists of several modules and
submodules as shown below. The top\-level MPLS module describes
the overall framework. Three types of LSPs are supported\:
i) traffic\-engineered (or constrained\-path)
ii) IGP\-congruent (LSPs that follow the IGP path)
iii) static LSPs which are not signaled
The structure of each of these LSP configurations is defined in
corresponding submodules. Companion modules define the relevant
configuration and operational data specific to key signaling
protocols used in operational practice.
+\-\-\-\-\-\-\-+
+\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\->\| MPLS \|<\-\-\-\-\-\-\-\-\-\-\-\-\-\-+
\| +\-\-\-\-\-\-\-+ \|
\| ^ \|
\| \| \|
+\-\-\-\-+\-\-\-\-\-+ +\-\-\-\-\-\-\-\-+\-\-\-\-\-\-\-+ +\-\-\-\-\-+\-\-\-\-\-+
\| TE LSPs \| \| IGP\-based LSPs \| \|static LSPs\|
\| \| \| \| \| \|
+\-\-\-\-\-\-\-\-\-\-+ +\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-+ +\-\-\-\-\-\-\-\-\-\-\-+
^ ^ ^ ^
\| +\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-+ \| +\-\-\-\-\-\-\-\-+
\| \| \| \|
\| +\-\-\-\-\-\-+ +\-+\-\-\-+\-+ +\-\-+\-\-+
+\-\-\-+ RSVP \| \|SEGMENT\| \| LDP \|
+\-\-\-\-\-\-+ \|ROUTING\| +\-\-\-\-\-+
+\-\-\-\-\-\-\-+
"""
import sys
from collections import OrderedDict
from ydk.types import Entity as _Entity_
from ydk.types import EntityPath, Identity, Enum, YType, YLeaf, YLeafList, YList, LeafDataList, Bits, Empty, Decimal64
from ydk.types import Entity, EntityPath, Identity, Enum, YType, YLeaf, YLeafList, YList, LeafDataList, Bits, Empty, Decimal64
from ydk.filters import YFilter
from ydk.errors import YError, YModelError
from ydk.errors.error_handler import handle_type_error as _handle_type_error
class CspfTieBreaking(Enum):
"""
CspfTieBreaking (Enum Class)
type to indicate the CSPF selection policy when
multiple equal cost paths are available
.. data:: RANDOM = 0
CSPF calculation selects a random path among
multiple equal-cost paths to the destination
.. data:: LEAST_FILL = 1
CSPF calculation selects the path with greatest
available bandwidth
.. data:: MOST_FILL = 2
CSPF calculation selects the path with the least
available bandwidth
"""
RANDOM = Enum.YLeaf(0, "RANDOM")
LEAST_FILL = Enum.YLeaf(1, "LEAST_FILL")
MOST_FILL = Enum.YLeaf(2, "MOST_FILL")
class MplsHopType(Enum):
"""
MplsHopType (Enum Class)
enumerated type for specifying loose or strict
paths
.. data:: LOOSE = 0
loose hop in an explicit path
.. data:: STRICT = 1
strict hop in an explicit path
"""
LOOSE = Enum.YLeaf(0, "LOOSE")
STRICT = Enum.YLeaf(1, "STRICT")
class MplsSrlgFloodingType(Enum):
"""
MplsSrlgFloodingType (Enum Class)
Enumerated type for specifying how the SRLG is flooded
.. data:: FLOODED_SRLG = 0
SRLG is flooded in the IGP
.. data:: STATIC_SRLG = 1
SRLG is not flooded, the members are
statically configured
"""
FLOODED_SRLG = Enum.YLeaf(0, "FLOODED_SRLG")
STATIC_SRLG = Enum.YLeaf(1, "STATIC_SRLG")
class TeBandwidthType(Enum):
"""
TeBandwidthType (Enum Class)
enumerated type for specifying whether bandwidth is
explicitly specified or automatically computed
.. data:: SPECIFIED = 0
Bandwidth is explicitly specified
.. data:: AUTO = 1
Bandwidth is automatically computed
"""
SPECIFIED = Enum.YLeaf(0, "SPECIFIED")
AUTO = Enum.YLeaf(1, "AUTO")
class TeMetricType(Enum):
"""
TeMetricType (Enum Class)
union type for setting the LSP TE metric to a
static value, or to track the IGP metric
.. data:: IGP = 0
set the LSP metric to track the underlying
IGP metric
"""
IGP = Enum.YLeaf(0, "IGP")
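# Hedged helper (illustrative, not part of the generated bindings): leafs typed
# against the enumerations above are assigned the class attributes themselves;
# this sketch just picks a CSPF tie-breaking policy per the descriptions in
# CspfTieBreaking.
def _example_pick_tie_breaking(prefer_headroom=True):
    # LEAST_FILL prefers the path with the greatest available bandwidth,
    # MOST_FILL the one with the least.
    return CspfTieBreaking.LEAST_FILL if prefer_headroom else CspfTieBreaking.MOST_FILL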
class Mpls(_Entity_):
"""
Anchor point for mpls configuration and operational
data
.. attribute:: global_
general mpls configuration applicable to any type of LSP and signaling protocol \- label ranges, entropy label support may be added here
**type**\: :py:class:`Global <ydk.models.openconfig.openconfig_mpls.Mpls.Global>`
.. attribute:: te_global_attributes
traffic\-engineering global attributes
**type**\: :py:class:`TeGlobalAttributes <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes>`
.. attribute:: te_interface_attributes
traffic engineering attributes specific for interfaces
**type**\: :py:class:`TeInterfaceAttributes <ydk.models.openconfig.openconfig_mpls.Mpls.TeInterfaceAttributes>`
.. attribute:: signaling_protocols
top\-level signaling protocol configuration
**type**\: :py:class:`SignalingProtocols <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols>`
.. attribute:: lsps
LSP definitions and configuration
**type**\: :py:class:`Lsps <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls, self).__init__()
self._top_entity = None
self.yang_name = "mpls"
self.yang_parent_name = "openconfig-mpls"
self.is_top_level_class = True
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("global", ("global_", Mpls.Global)), ("te-global-attributes", ("te_global_attributes", Mpls.TeGlobalAttributes)), ("te-interface-attributes", ("te_interface_attributes", Mpls.TeInterfaceAttributes)), ("signaling-protocols", ("signaling_protocols", Mpls.SignalingProtocols)), ("lsps", ("lsps", Mpls.Lsps))])
self._leafs = OrderedDict()
self.global_ = Mpls.Global()
self.global_.parent = self
self._children_name_map["global_"] = "global"
self.te_global_attributes = Mpls.TeGlobalAttributes()
self.te_global_attributes.parent = self
self._children_name_map["te_global_attributes"] = "te-global-attributes"
self.te_interface_attributes = Mpls.TeInterfaceAttributes()
self.te_interface_attributes.parent = self
self._children_name_map["te_interface_attributes"] = "te-interface-attributes"
self.signaling_protocols = Mpls.SignalingProtocols()
self.signaling_protocols.parent = self
self._children_name_map["signaling_protocols"] = "signaling-protocols"
self.lsps = Mpls.Lsps()
self.lsps.parent = self
self._children_name_map["lsps"] = "lsps"
self._segment_path = lambda: "openconfig-mpls:mpls"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls, [], name, value)
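    # -----------------------------------------------------------------------
    # Hedged note (illustrative, not generated): Mpls is a top-level entity,
    # so it is the object handed to YDK services.  A typical read, with the
    # provider details being placeholders, might look like:
    #
    #     from ydk.services import CRUDService
    #     from ydk.providers import NetconfServiceProvider
    #     provider = NetconfServiceProvider(address='198.51.100.1',
    #                                       username='admin', password='admin')
    #     mpls_oper = CRUDService().read(provider, Mpls())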
class Global(_Entity_):
"""
general mpls configuration applicable to any
type of LSP and signaling protocol \- label ranges,
entropy label support may be added here
.. attribute:: config
Top level global MPLS configuration
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Global.Config>`
.. attribute:: state
Top level global MPLS state
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Global.State>`
**config**\: False
.. attribute:: interface_attributes
Parameters related to MPLS interfaces
**type**\: :py:class:`InterfaceAttributes <ydk.models.openconfig.openconfig_mpls.Mpls.Global.InterfaceAttributes>`
.. attribute:: reserved_label_blocks
A range of labels starting with the start\-label and up\-to and including the end label that should be allocated as reserved. These labels should not be utilised by any dynamic label allocation on the local system unless the allocating protocol is explicitly configured to specify that allocation of labels should be out of the label block specified
**type**\: :py:class:`ReservedLabelBlocks <ydk.models.openconfig.openconfig_mpls.Mpls.Global.ReservedLabelBlocks>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Global, self).__init__()
self.yang_name = "global"
self.yang_parent_name = "mpls"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.Global.Config)), ("state", ("state", Mpls.Global.State)), ("interface-attributes", ("interface_attributes", Mpls.Global.InterfaceAttributes)), ("reserved-label-blocks", ("reserved_label_blocks", Mpls.Global.ReservedLabelBlocks))])
self._leafs = OrderedDict()
self.config = Mpls.Global.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Global.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self.interface_attributes = Mpls.Global.InterfaceAttributes()
self.interface_attributes.parent = self
self._children_name_map["interface_attributes"] = "interface-attributes"
self.reserved_label_blocks = Mpls.Global.ReservedLabelBlocks()
self.reserved_label_blocks.parent = self
self._children_name_map["reserved_label_blocks"] = "reserved-label-blocks"
self._segment_path = lambda: "global"
self._absolute_path = lambda: "openconfig-mpls:mpls/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Global, [], name, value)
class Config(_Entity_):
"""
Top level global MPLS configuration
.. attribute:: null_label
The null\-label type used, implicit or explicit
**type**\: :py:class:`NULLLABELTYPE <ydk.models.openconfig.openconfig_mpls_types.NULLLABELTYPE>`
**default value**\: oc-mplst:IMPLICIT
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Global.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "global"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('null_label', (YLeaf(YType.identityref, 'null-label'), [('ydk.models.openconfig.openconfig_mpls_types', 'NULLLABELTYPE')])),
])
self.null_label = None
self._segment_path = lambda: "config"
self._absolute_path = lambda: "openconfig-mpls:mpls/global/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Global.Config, ['null_label'], name, value)
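        # -------------------------------------------------------------------
        # Hedged note (illustrative, not generated): null_label is an
        # identityref, so callers typically assign an identity class from the
        # companion openconfig_mpls_types bindings rather than a plain string.
        # The EXPLICIT class name below is assumed from that module:
        #
        #     from ydk.models.openconfig import openconfig_mpls_types as oc_mplst
        #     cfg = Mpls.Global.Config()
        #     cfg.null_label = oc_mplst.EXPLICIT()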
class State(_Entity_):
"""
Top level global MPLS state
.. attribute:: null_label
The null\-label type used, implicit or explicit
**type**\: :py:class:`NULLLABELTYPE <ydk.models.openconfig.openconfig_mpls_types.NULLLABELTYPE>`
**config**\: False
**default value**\: oc-mplst:IMPLICIT
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Global.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "global"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('null_label', (YLeaf(YType.identityref, 'null-label'), [('ydk.models.openconfig.openconfig_mpls_types', 'NULLLABELTYPE')])),
])
self.null_label = None
self._segment_path = lambda: "state"
self._absolute_path = lambda: "openconfig-mpls:mpls/global/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Global.State, ['null_label'], name, value)
class InterfaceAttributes(_Entity_):
"""
Parameters related to MPLS interfaces
.. attribute:: interface
List of TE interfaces
**type**\: list of :py:class:`Interface <ydk.models.openconfig.openconfig_mpls.Mpls.Global.InterfaceAttributes.Interface>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Global.InterfaceAttributes, self).__init__()
self.yang_name = "interface-attributes"
self.yang_parent_name = "global"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("interface", ("interface", Mpls.Global.InterfaceAttributes.Interface))])
self._leafs = OrderedDict()
self.interface = YList(self)
self._segment_path = lambda: "interface-attributes"
self._absolute_path = lambda: "openconfig-mpls:mpls/global/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Global.InterfaceAttributes, [], name, value)
class Interface(_Entity_):
"""
List of TE interfaces
.. attribute:: interface_id (key)
Reference to the interface id list key
**type**\: str
**refers to**\: :py:class:`interface_id <ydk.models.openconfig.openconfig_mpls.Mpls.Global.InterfaceAttributes.Interface.Config>`
.. attribute:: config
Configuration parameters related to MPLS interfaces\:
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Global.InterfaceAttributes.Interface.Config>`
.. attribute:: state
State parameters related to TE interfaces
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Global.InterfaceAttributes.Interface.State>`
**config**\: False
.. attribute:: interface_ref
Reference to an interface or subinterface
**type**\: :py:class:`InterfaceRef <ydk.models.openconfig.openconfig_mpls.Mpls.Global.InterfaceAttributes.Interface.InterfaceRef>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Global.InterfaceAttributes.Interface, self).__init__()
self.yang_name = "interface"
self.yang_parent_name = "interface-attributes"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['interface_id']
self._child_classes = OrderedDict([("config", ("config", Mpls.Global.InterfaceAttributes.Interface.Config)), ("state", ("state", Mpls.Global.InterfaceAttributes.Interface.State)), ("interface-ref", ("interface_ref", Mpls.Global.InterfaceAttributes.Interface.InterfaceRef))])
self._leafs = OrderedDict([
('interface_id', (YLeaf(YType.str, 'interface-id'), ['str'])),
])
self.interface_id = None
self.config = Mpls.Global.InterfaceAttributes.Interface.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Global.InterfaceAttributes.Interface.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self.interface_ref = Mpls.Global.InterfaceAttributes.Interface.InterfaceRef()
self.interface_ref.parent = self
self._children_name_map["interface_ref"] = "interface-ref"
self._segment_path = lambda: "interface" + "[interface-id='" + str(self.interface_id) + "']"
self._absolute_path = lambda: "openconfig-mpls:mpls/global/interface-attributes/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Global.InterfaceAttributes.Interface, ['interface_id'], name, value)
class Config(_Entity_):
"""
Configuration parameters related to MPLS interfaces\:
.. attribute:: interface_id
Identifier for the MPLS interface
**type**\: str
.. attribute:: mpls_enabled
Enable MPLS forwarding on this interface
**type**\: bool
**default value**\: false
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Global.InterfaceAttributes.Interface.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface_id', (YLeaf(YType.str, 'interface-id'), ['str'])),
('mpls_enabled', (YLeaf(YType.boolean, 'mpls-enabled'), ['bool'])),
])
self.interface_id = None
self.mpls_enabled = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Global.InterfaceAttributes.Interface.Config, ['interface_id', 'mpls_enabled'], name, value)
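            # ---------------------------------------------------------------
            # Hedged sketch (illustrative, not generated): populating one
            # entry of the interface list defined above.  'eth0' is a
            # placeholder name; pushing the entity to a device would also
            # need a YDK service provider and CRUD service (not shown).
            #
            #     mpls = Mpls()
            #     iface = Mpls.Global.InterfaceAttributes.Interface()
            #     iface.interface_id = 'eth0'
            #     iface.config.interface_id = 'eth0'
            #     iface.config.mpls_enabled = True
            #     mpls.global_.interface_attributes.interface.append(iface)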
class State(_Entity_):
"""
State parameters related to TE interfaces
.. attribute:: interface_id
Identifier for the MPLS interface
**type**\: str
**config**\: False
.. attribute:: mpls_enabled
Enable MPLS forwarding on this interface
**type**\: bool
**config**\: False
**default value**\: false
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Global.InterfaceAttributes.Interface.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface_id', (YLeaf(YType.str, 'interface-id'), ['str'])),
('mpls_enabled', (YLeaf(YType.boolean, 'mpls-enabled'), ['bool'])),
])
self.interface_id = None
self.mpls_enabled = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Global.InterfaceAttributes.Interface.State, ['interface_id', 'mpls_enabled'], name, value)
class InterfaceRef(_Entity_):
"""
Reference to an interface or subinterface
.. attribute:: config
Configured reference to interface / subinterface
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Global.InterfaceAttributes.Interface.InterfaceRef.Config>`
.. attribute:: state
Operational state for interface\-ref
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Global.InterfaceAttributes.Interface.InterfaceRef.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Global.InterfaceAttributes.Interface.InterfaceRef, self).__init__()
self.yang_name = "interface-ref"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.Global.InterfaceAttributes.Interface.InterfaceRef.Config)), ("state", ("state", Mpls.Global.InterfaceAttributes.Interface.InterfaceRef.State))])
self._leafs = OrderedDict()
self.config = Mpls.Global.InterfaceAttributes.Interface.InterfaceRef.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Global.InterfaceAttributes.Interface.InterfaceRef.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "interface-ref"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Global.InterfaceAttributes.Interface.InterfaceRef, [], name, value)
class Config(_Entity_):
"""
Configured reference to interface / subinterface
.. attribute:: interface
Reference to a base interface. If a reference to a subinterface is required, this leaf must be specified to indicate the base interface
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_interfaces.Interfaces.Interface>`
.. attribute:: subinterface
Reference to a subinterface \-\- this requires the base interface to be specified using the interface leaf in this container. If only a reference to a base interface is required, this leaf should not be set
**type**\: int
**range:** 0..4294967295
**refers to**\: :py:class:`index <ydk.models.openconfig.openconfig_interfaces.Interfaces.Interface.Subinterfaces.Subinterface>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Global.InterfaceAttributes.Interface.InterfaceRef.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "interface-ref"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('subinterface', (YLeaf(YType.str, 'subinterface'), ['int'])),
])
self.interface = None
self.subinterface = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Global.InterfaceAttributes.Interface.InterfaceRef.Config, ['interface', 'subinterface'], name, value)
class State(_Entity_):
"""
Operational state for interface\-ref
.. attribute:: interface
Reference to a base interface. If a reference to a subinterface is required, this leaf must be specified to indicate the base interface
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_interfaces.Interfaces.Interface>`
**config**\: False
.. attribute:: subinterface
Reference to a subinterface \-\- this requires the base interface to be specified using the interface leaf in this container. If only a reference to a base interface is required, this leaf should not be set
**type**\: int
**range:** 0..4294967295
**refers to**\: :py:class:`index <ydk.models.openconfig.openconfig_interfaces.Interfaces.Interface.Subinterfaces.Subinterface>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Global.InterfaceAttributes.Interface.InterfaceRef.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "interface-ref"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('subinterface', (YLeaf(YType.str, 'subinterface'), ['int'])),
])
self.interface = None
self.subinterface = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Global.InterfaceAttributes.Interface.InterfaceRef.State, ['interface', 'subinterface'], name, value)
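# --- Illustrative usage sketch (user code, not part of the generated bindings) ---
# How interface-ref ties an MPLS interface entry back to a base interface and,
# optionally, a subinterface index; the names and index below are hypothetical.
from ydk.models.openconfig import openconfig_mpls

iface = openconfig_mpls.Mpls.Global.InterfaceAttributes.Interface()
iface.interface_id = 'GigabitEthernet0/0/0/0.100'
iface.interface_ref.config.interface = 'GigabitEthernet0/0/0/0'  # leafref to /interfaces/interface/name
iface.interface_ref.config.subinterface = 100                    # leafref to the subinterface index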
class ReservedLabelBlocks(_Entity_):
"""
A range of labels starting with the start\-label and up\-to and including
the end label that should be allocated as reserved. These labels should
not be utilised by any dynamic label allocation on the local system unless
the allocating protocol is explicitly configured to specify that
allocation of labels should be out of the label block specified.
.. attribute:: reserved_label_block
A range of labels starting with the start\-label up to and including the end label that should be allocated for use by a specific protocol
**type**\: list of :py:class:`ReservedLabelBlock <ydk.models.openconfig.openconfig_mpls.Mpls.Global.ReservedLabelBlocks.ReservedLabelBlock>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Global.ReservedLabelBlocks, self).__init__()
self.yang_name = "reserved-label-blocks"
self.yang_parent_name = "global"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("reserved-label-block", ("reserved_label_block", Mpls.Global.ReservedLabelBlocks.ReservedLabelBlock))])
self._leafs = OrderedDict()
self.reserved_label_block = YList(self)
self._segment_path = lambda: "reserved-label-blocks"
self._absolute_path = lambda: "openconfig-mpls:mpls/global/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Global.ReservedLabelBlocks, [], name, value)
class ReservedLabelBlock(_Entity_):
"""
A range of labels starting with the start\-label up to and including
the end label that should be allocated for use by a specific protocol.
.. attribute:: local_id (key)
A reference to a unique local identifier for this label block
**type**\: str
**refers to**\: :py:class:`local_id <ydk.models.openconfig.openconfig_mpls.Mpls.Global.ReservedLabelBlocks.ReservedLabelBlock.Config>`
.. attribute:: config
Configuration parameters relating to the label block
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Global.ReservedLabelBlocks.ReservedLabelBlock.Config>`
.. attribute:: state
State parameters relating to the label block
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Global.ReservedLabelBlocks.ReservedLabelBlock.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Global.ReservedLabelBlocks.ReservedLabelBlock, self).__init__()
self.yang_name = "reserved-label-block"
self.yang_parent_name = "reserved-label-blocks"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['local_id']
self._child_classes = OrderedDict([("config", ("config", Mpls.Global.ReservedLabelBlocks.ReservedLabelBlock.Config)), ("state", ("state", Mpls.Global.ReservedLabelBlocks.ReservedLabelBlock.State))])
self._leafs = OrderedDict([
('local_id', (YLeaf(YType.str, 'local-id'), ['str'])),
])
self.local_id = None
self.config = Mpls.Global.ReservedLabelBlocks.ReservedLabelBlock.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Global.ReservedLabelBlocks.ReservedLabelBlock.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "reserved-label-block" + "[local-id='" + str(self.local_id) + "']"
self._absolute_path = lambda: "openconfig-mpls:mpls/global/reserved-label-blocks/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Global.ReservedLabelBlocks.ReservedLabelBlock, ['local_id'], name, value)
class Config(_Entity_):
"""
Configuration parameters relating to the label block.
.. attribute:: local_id
A local identifier for the global label block allocation
**type**\: str
.. attribute:: lower_bound
Lower bound of the global label block. The block is defined to include this label
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
.. attribute:: upper_bound
Upper bound for the global label block. The block is defined to include this label
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Global.ReservedLabelBlocks.ReservedLabelBlock.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "reserved-label-block"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('local_id', (YLeaf(YType.str, 'local-id'), ['str'])),
('lower_bound', (YLeaf(YType.str, 'lower-bound'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
('upper_bound', (YLeaf(YType.str, 'upper-bound'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
])
self.local_id = None
self.lower_bound = None
self.upper_bound = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Global.ReservedLabelBlocks.ReservedLabelBlock.Config, ['local_id', 'lower_bound', 'upper_bound'], name, value)
class State(_Entity_):
"""
State parameters relating to the label block.
.. attribute:: local_id
A local identifier for the global label block allocation
**type**\: str
**config**\: False
.. attribute:: lower_bound
Lower bound of the global label block. The block is defined to include this label
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
**config**\: False
.. attribute:: upper_bound
Upper bound for the global label block. The block is defined to include this label
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Global.ReservedLabelBlocks.ReservedLabelBlock.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "reserved-label-block"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('local_id', (YLeaf(YType.str, 'local-id'), ['str'])),
('lower_bound', (YLeaf(YType.str, 'lower-bound'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
('upper_bound', (YLeaf(YType.str, 'upper-bound'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
])
self.local_id = None
self.lower_bound = None
self.upper_bound = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Global.ReservedLabelBlocks.ReservedLabelBlock.State, ['local_id', 'lower_bound', 'upper_bound'], name, value)
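# --- Illustrative usage sketch (user code, not part of the generated bindings) ---
# Reserving a label block for static allocation; the local-id and bounds are
# made-up values inside the documented 16..1048575 range.
from ydk.models.openconfig import openconfig_mpls

block = openconfig_mpls.Mpls.Global.ReservedLabelBlocks.ReservedLabelBlock()
block.local_id = 'static-block-1'        # hypothetical local identifier
block.config.local_id = block.local_id
block.config.lower_bound = 16000         # block includes this label
block.config.upper_bound = 23999         # block includes this label
# Append to mpls.global_.reserved_label_blocks.reserved_label_block (assumed attribute
# path on a populated Mpls() instance) before pushing the configuration.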
class TeGlobalAttributes(_Entity_):
"""
traffic\-engineering global attributes
.. attribute:: srlgs
Shared risk link groups attributes
**type**\: :py:class:`Srlgs <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.Srlgs>`
.. attribute:: mpls_admin_groups
Top\-level container for admin\-groups configuration and state
**type**\: :py:class:`MplsAdminGroups <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.MplsAdminGroups>`
.. attribute:: te_lsp_timers
Definition for delays associated with setup and cleanup of TE LSPs
**type**\: :py:class:`TeLspTimers <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.TeLspTimers>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeGlobalAttributes, self).__init__()
self.yang_name = "te-global-attributes"
self.yang_parent_name = "mpls"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("srlgs", ("srlgs", Mpls.TeGlobalAttributes.Srlgs)), ("mpls-admin-groups", ("mpls_admin_groups", Mpls.TeGlobalAttributes.MplsAdminGroups)), ("te-lsp-timers", ("te_lsp_timers", Mpls.TeGlobalAttributes.TeLspTimers))])
self._leafs = OrderedDict()
self.srlgs = Mpls.TeGlobalAttributes.Srlgs()
self.srlgs.parent = self
self._children_name_map["srlgs"] = "srlgs"
self.mpls_admin_groups = Mpls.TeGlobalAttributes.MplsAdminGroups()
self.mpls_admin_groups.parent = self
self._children_name_map["mpls_admin_groups"] = "mpls-admin-groups"
self.te_lsp_timers = Mpls.TeGlobalAttributes.TeLspTimers()
self.te_lsp_timers.parent = self
self._children_name_map["te_lsp_timers"] = "te-lsp-timers"
self._segment_path = lambda: "te-global-attributes"
self._absolute_path = lambda: "openconfig-mpls:mpls/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeGlobalAttributes, [], name, value)
class Srlgs(_Entity_):
"""
Shared risk link groups attributes
.. attribute:: srlg
List of shared risk link groups
**type**\: list of :py:class:`Srlg <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.Srlgs.Srlg>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeGlobalAttributes.Srlgs, self).__init__()
self.yang_name = "srlgs"
self.yang_parent_name = "te-global-attributes"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("srlg", ("srlg", Mpls.TeGlobalAttributes.Srlgs.Srlg))])
self._leafs = OrderedDict()
self.srlg = YList(self)
self._segment_path = lambda: "srlgs"
self._absolute_path = lambda: "openconfig-mpls:mpls/te-global-attributes/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeGlobalAttributes.Srlgs, [], name, value)
class Srlg(_Entity_):
"""
List of shared risk link groups
.. attribute:: name (key)
The SRLG group identifier
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.Srlgs.Srlg.Config>`
.. attribute:: config
Configuration parameters related to the SRLG
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.Srlgs.Srlg.Config>`
.. attribute:: state
State parameters related to the SRLG
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.Srlgs.Srlg.State>`
**config**\: False
.. attribute:: static_srlg_members
SRLG members for static (not flooded) SRLGs
**type**\: :py:class:`StaticSrlgMembers <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeGlobalAttributes.Srlgs.Srlg, self).__init__()
self.yang_name = "srlg"
self.yang_parent_name = "srlgs"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['name']
self._child_classes = OrderedDict([("config", ("config", Mpls.TeGlobalAttributes.Srlgs.Srlg.Config)), ("state", ("state", Mpls.TeGlobalAttributes.Srlgs.Srlg.State)), ("static-srlg-members", ("static_srlg_members", Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers))])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
])
self.name = None
self.config = Mpls.TeGlobalAttributes.Srlgs.Srlg.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.TeGlobalAttributes.Srlgs.Srlg.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self.static_srlg_members = Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers()
self.static_srlg_members.parent = self
self._children_name_map["static_srlg_members"] = "static-srlg-members"
self._segment_path = lambda: "srlg" + "[name='" + str(self.name) + "']"
self._absolute_path = lambda: "openconfig-mpls:mpls/te-global-attributes/srlgs/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeGlobalAttributes.Srlgs.Srlg, ['name'], name, value)
class Config(_Entity_):
"""
Configuration parameters related to the SRLG
.. attribute:: name
SRLG group identifier
**type**\: str
.. attribute:: value
group ID for the SRLG
**type**\: int
**range:** 0..4294967295
.. attribute:: cost
The cost of the SRLG to the computation algorithm
**type**\: int
**range:** 0..4294967295
.. attribute:: flooding_type
The type of SRLG, either flooded in the IGP or statically configured
**type**\: :py:class:`MplsSrlgFloodingType <ydk.models.openconfig.openconfig_mpls.MplsSrlgFloodingType>`
**default value**\: FLOODED_SRLG
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeGlobalAttributes.Srlgs.Srlg.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "srlg"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
('value', (YLeaf(YType.uint32, 'value'), ['int'])),
('cost', (YLeaf(YType.uint32, 'cost'), ['int'])),
('flooding_type', (YLeaf(YType.enumeration, 'flooding-type'), [('ydk.models.openconfig.openconfig_mpls', 'MplsSrlgFloodingType', '')])),
])
self.name = None
self.value = None
self.cost = None
self.flooding_type = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeGlobalAttributes.Srlgs.Srlg.Config, ['name', 'value', 'cost', 'flooding_type'], name, value)
class State(_Entity_):
"""
State parameters related to the SRLG
.. attribute:: name
SRLG group identifier
**type**\: str
**config**\: False
.. attribute:: value
group ID for the SRLG
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: cost
The cost of the SRLG to the computation algorithm
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: flooding_type
The type of SRLG, either flooded in the IGP or statically configured
**type**\: :py:class:`MplsSrlgFloodingType <ydk.models.openconfig.openconfig_mpls.MplsSrlgFloodingType>`
**config**\: False
**default value**\: FLOODED_SRLG
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeGlobalAttributes.Srlgs.Srlg.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "srlg"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
('value', (YLeaf(YType.uint32, 'value'), ['int'])),
('cost', (YLeaf(YType.uint32, 'cost'), ['int'])),
('flooding_type', (YLeaf(YType.enumeration, 'flooding-type'), [('ydk.models.openconfig.openconfig_mpls', 'MplsSrlgFloodingType', '')])),
])
self.name = None
self.value = None
self.cost = None
self.flooding_type = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeGlobalAttributes.Srlgs.Srlg.State, ['name', 'value', 'cost', 'flooding_type'], name, value)
class StaticSrlgMembers(_Entity_):
"""
SRLG members for static (not flooded) SRLGs
.. attribute:: members_list
List of SRLG members, which are expressed as IP address endpoints of links contained in the SRLG
**type**\: list of :py:class:`MembersList <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers.MembersList>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers, self).__init__()
self.yang_name = "static-srlg-members"
self.yang_parent_name = "srlg"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("members-list", ("members_list", Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers.MembersList))])
self._leafs = OrderedDict()
self.members_list = YList(self)
self._segment_path = lambda: "static-srlg-members"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers, [], name, value)
class MembersList(_Entity_):
"""
List of SRLG members, which are expressed
as IP address endpoints of links contained in the
SRLG
.. attribute:: from_address (key)
The from address of the link in the SRLG
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
**refers to**\: :py:class:`from_address <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers.MembersList.Config>`
.. attribute:: config
Configuration parameters relating to the SRLG members
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers.MembersList.Config>`
.. attribute:: state
State parameters relating to the SRLG members
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers.MembersList.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers.MembersList, self).__init__()
self.yang_name = "members-list"
self.yang_parent_name = "static-srlg-members"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = ['from_address']
self._child_classes = OrderedDict([("config", ("config", Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers.MembersList.Config)), ("state", ("state", Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers.MembersList.State))])
self._leafs = OrderedDict([
('from_address', (YLeaf(YType.str, 'from-address'), ['str'])),
])
self.from_address = None
self.config = Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers.MembersList.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers.MembersList.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "members-list" + "[from-address='" + str(self.from_address) + "']"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers.MembersList, ['from_address'], name, value)
class Config(_Entity_):
"""
Configuration parameters relating to the
SRLG members
.. attribute:: from_address
IP address of the a\-side of the SRLG link
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
.. attribute:: to_address
IP address of the z\-side of the SRLG link
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers.MembersList.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "members-list"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('from_address', (YLeaf(YType.str, 'from-address'), ['str','str'])),
('to_address', (YLeaf(YType.str, 'to-address'), ['str','str'])),
])
self.from_address = None
self.to_address = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers.MembersList.Config, ['from_address', 'to_address'], name, value)
class State(_Entity_):
"""
State parameters relating to the SRLG
members
.. attribute:: from_address
IP address of the a\-side of the SRLG link
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
**config**\: False
.. attribute:: to_address
IP address of the z\-side of the SRLG link
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers.MembersList.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "members-list"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('from_address', (YLeaf(YType.str, 'from-address'), ['str','str'])),
('to_address', (YLeaf(YType.str, 'to-address'), ['str','str'])),
])
self.from_address = None
self.to_address = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers.MembersList.State, ['from_address', 'to_address'], name, value)
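# --- Illustrative usage sketch (user code, not part of the generated bindings) ---
# A statically configured SRLG with one member link. The group name, value, cost,
# addresses and the STATIC_SRLG enum member name are illustrative assumptions.
from ydk.models.openconfig import openconfig_mpls

srlg = openconfig_mpls.Mpls.TeGlobalAttributes.Srlgs.Srlg()
srlg.name = 'fiber-duct-1'
srlg.config.name = srlg.name
srlg.config.value = 100                                            # SRLG group ID
srlg.config.cost = 50                                              # cost used by path computation
srlg.config.flooding_type = openconfig_mpls.MplsSrlgFloodingType.STATIC_SRLG

member = openconfig_mpls.Mpls.TeGlobalAttributes.Srlgs.Srlg.StaticSrlgMembers.MembersList()
member.from_address = '192.0.2.1'                                  # a-side link endpoint
member.config.from_address = member.from_address
member.config.to_address = '192.0.2.2'                             # z-side link endpoint
srlg.static_srlg_members.members_list.append(member)
# srlg would then be appended to mpls.te_global_attributes.srlgs.srlg (assumed attribute path).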
class MplsAdminGroups(_Entity_):
"""
Top\-level container for admin\-groups configuration
and state
.. attribute:: admin_group
configuration of value to name mapping for mpls affinities/admin\-groups
**type**\: list of :py:class:`AdminGroup <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeGlobalAttributes.MplsAdminGroups, self).__init__()
self.yang_name = "mpls-admin-groups"
self.yang_parent_name = "te-global-attributes"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("admin-group", ("admin_group", Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup))])
self._leafs = OrderedDict()
self.admin_group = YList(self)
self._segment_path = lambda: "mpls-admin-groups"
self._absolute_path = lambda: "openconfig-mpls:mpls/te-global-attributes/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeGlobalAttributes.MplsAdminGroups, [], name, value)
class AdminGroup(_Entity_):
"""
configuration of value to name mapping
for mpls affinities/admin\-groups
.. attribute:: admin_group_name (key)
name for mpls admin\-group
**type**\: str
**refers to**\: :py:class:`admin_group_name <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup.Config>`
.. attribute:: config
Configurable items for admin\-groups
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup.Config>`
.. attribute:: state
Operational state for admin\-groups
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup, self).__init__()
self.yang_name = "admin-group"
self.yang_parent_name = "mpls-admin-groups"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['admin_group_name']
self._child_classes = OrderedDict([("config", ("config", Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup.Config)), ("state", ("state", Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup.State))])
self._leafs = OrderedDict([
('admin_group_name', (YLeaf(YType.str, 'admin-group-name'), ['str'])),
])
self.admin_group_name = None
self.config = Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "admin-group" + "[admin-group-name='" + str(self.admin_group_name) + "']"
self._absolute_path = lambda: "openconfig-mpls:mpls/te-global-attributes/mpls-admin-groups/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup, ['admin_group_name'], name, value)
class Config(_Entity_):
"""
Configurable items for admin\-groups
.. attribute:: admin_group_name
name for mpls admin\-group
**type**\: str
.. attribute:: bit_position
bit\-position value for mpls admin\-group. The value for the admin group is an integer that represents one of the bit positions in the admin\-group bitmask. Values between 0 and 31 are interpreted as the original limit of 32 admin groups. Values >=32 are interpreted as extended admin group values as per RFC7308
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "admin-group"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('admin_group_name', (YLeaf(YType.str, 'admin-group-name'), ['str'])),
('bit_position', (YLeaf(YType.uint32, 'bit-position'), ['int'])),
])
self.admin_group_name = None
self.bit_position = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup.Config, ['admin_group_name', 'bit_position'], name, value)
class State(_Entity_):
"""
Operational state for admin\-groups
.. attribute:: admin_group_name
name for mpls admin\-group
**type**\: str
**config**\: False
.. attribute:: bit_position
bit\-position value for mpls admin\-group. The value for the admin group is an integer that represents one of the bit positions in the admin\-group bitmask. Values between 0 and 31 are interpreted as the original limit of 32 admin groups. Values >=32 are interpreted as extended admin group values as per RFC7308
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "admin-group"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('admin_group_name', (YLeaf(YType.str, 'admin-group-name'), ['str'])),
('bit_position', (YLeaf(YType.uint32, 'bit-position'), ['int'])),
])
self.admin_group_name = None
self.bit_position = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup.State, ['admin_group_name', 'bit_position'], name, value)
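# --- Illustrative usage sketch (user code, not part of the generated bindings) ---
# Mapping an admin-group (affinity) name to a bit position; the name and bit
# position are hypothetical.
from ydk.models.openconfig import openconfig_mpls

group = openconfig_mpls.Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup()
group.admin_group_name = 'RED'
group.config.admin_group_name = group.admin_group_name
group.config.bit_position = 2    # 0..31 = original admin groups, >=32 = extended groups (RFC 7308)
# Append to mpls.te_global_attributes.mpls_admin_groups.admin_group (assumed attribute path).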
class TeLspTimers(_Entity_):
"""
Definition for delays associated with setup
and cleanup of TE LSPs
.. attribute:: config
Configuration parameters related to timers for TE LSPs
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.TeLspTimers.Config>`
.. attribute:: state
State related to timers for TE LSPs
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.TeLspTimers.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeGlobalAttributes.TeLspTimers, self).__init__()
self.yang_name = "te-lsp-timers"
self.yang_parent_name = "te-global-attributes"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.TeGlobalAttributes.TeLspTimers.Config)), ("state", ("state", Mpls.TeGlobalAttributes.TeLspTimers.State))])
self._leafs = OrderedDict()
self.config = Mpls.TeGlobalAttributes.TeLspTimers.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.TeGlobalAttributes.TeLspTimers.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "te-lsp-timers"
self._absolute_path = lambda: "openconfig-mpls:mpls/te-global-attributes/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeGlobalAttributes.TeLspTimers, [], name, value)
class Config(_Entity_):
"""
Configuration parameters related
to timers for TE LSPs
.. attribute:: install_delay
delay the use of newly installed te lsp for a specified amount of time
**type**\: int
**range:** 0..3600
**units**\: seconds
.. attribute:: cleanup_delay
delay the removal of old te lsp for a specified amount of time
**type**\: int
**range:** 0..65535
**units**\: seconds
.. attribute:: reoptimize_timer
frequency of reoptimization of a traffic engineered LSP
**type**\: int
**range:** 0..65535
**units**\: seconds
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeGlobalAttributes.TeLspTimers.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "te-lsp-timers"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('install_delay', (YLeaf(YType.uint16, 'install-delay'), ['int'])),
('cleanup_delay', (YLeaf(YType.uint16, 'cleanup-delay'), ['int'])),
('reoptimize_timer', (YLeaf(YType.uint16, 'reoptimize-timer'), ['int'])),
])
self.install_delay = None
self.cleanup_delay = None
self.reoptimize_timer = None
self._segment_path = lambda: "config"
self._absolute_path = lambda: "openconfig-mpls:mpls/te-global-attributes/te-lsp-timers/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeGlobalAttributes.TeLspTimers.Config, ['install_delay', 'cleanup_delay', 'reoptimize_timer'], name, value)
class State(_Entity_):
"""
State related to timers for TE LSPs
.. attribute:: install_delay
delay the use of newly installed te lsp for a specified amount of time
**type**\: int
**range:** 0..3600
**config**\: False
**units**\: seconds
.. attribute:: cleanup_delay
delay the removal of old te lsp for a specified amount of time
**type**\: int
**range:** 0..65535
**config**\: False
**units**\: seconds
.. attribute:: reoptimize_timer
frequency of reoptimization of a traffic engineered LSP
**type**\: int
**range:** 0..65535
**config**\: False
**units**\: seconds
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeGlobalAttributes.TeLspTimers.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "te-lsp-timers"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('install_delay', (YLeaf(YType.uint16, 'install-delay'), ['int'])),
('cleanup_delay', (YLeaf(YType.uint16, 'cleanup-delay'), ['int'])),
('reoptimize_timer', (YLeaf(YType.uint16, 'reoptimize-timer'), ['int'])),
])
self.install_delay = None
self.cleanup_delay = None
self.reoptimize_timer = None
self._segment_path = lambda: "state"
self._absolute_path = lambda: "openconfig-mpls:mpls/te-global-attributes/te-lsp-timers/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeGlobalAttributes.TeLspTimers.State, ['install_delay', 'cleanup_delay', 'reoptimize_timer'], name, value)
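# --- Illustrative usage sketch (user code, not part of the generated bindings) ---
# Global TE LSP setup/cleanup/reoptimization timers; the values are arbitrary
# examples within the documented ranges, and the te_global_attributes attribute
# path on Mpls() is assumed from the class layout above.
from ydk.models.openconfig import openconfig_mpls

mpls = openconfig_mpls.Mpls()
timers = mpls.te_global_attributes.te_lsp_timers
timers.config.install_delay = 10        # seconds, 0..3600
timers.config.cleanup_delay = 30        # seconds, 0..65535
timers.config.reoptimize_timer = 600    # seconds, 0..65535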
class TeInterfaceAttributes(_Entity_):
"""
traffic engineering attributes specific
for interfaces
.. attribute:: interface
List of TE interfaces
**type**\: list of :py:class:`Interface <ydk.models.openconfig.openconfig_mpls.Mpls.TeInterfaceAttributes.Interface>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeInterfaceAttributes, self).__init__()
self.yang_name = "te-interface-attributes"
self.yang_parent_name = "mpls"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("interface", ("interface", Mpls.TeInterfaceAttributes.Interface))])
self._leafs = OrderedDict()
self.interface = YList(self)
self._segment_path = lambda: "te-interface-attributes"
self._absolute_path = lambda: "openconfig-mpls:mpls/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeInterfaceAttributes, [], name, value)
class Interface(_Entity_):
"""
List of TE interfaces
.. attribute:: interface_id (key)
Reference to the interface id list key
**type**\: str
**refers to**\: :py:class:`interface_id <ydk.models.openconfig.openconfig_mpls.Mpls.TeInterfaceAttributes.Interface.Config>`
.. attribute:: config
Configuration parameters related to TE interfaces\:
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.TeInterfaceAttributes.Interface.Config>`
.. attribute:: state
State parameters related to TE interfaces
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.TeInterfaceAttributes.Interface.State>`
**config**\: False
.. attribute:: interface_ref
Reference to an interface or subinterface
**type**\: :py:class:`InterfaceRef <ydk.models.openconfig.openconfig_mpls.Mpls.TeInterfaceAttributes.Interface.InterfaceRef>`
.. attribute:: igp_flooding_bandwidth
Interface bandwidth change percentages that trigger update events into the IGP traffic engineering database (TED)
**type**\: :py:class:`IgpFloodingBandwidth <ydk.models.openconfig.openconfig_mpls.Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeInterfaceAttributes.Interface, self).__init__()
self.yang_name = "interface"
self.yang_parent_name = "te-interface-attributes"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['interface_id']
self._child_classes = OrderedDict([("config", ("config", Mpls.TeInterfaceAttributes.Interface.Config)), ("state", ("state", Mpls.TeInterfaceAttributes.Interface.State)), ("interface-ref", ("interface_ref", Mpls.TeInterfaceAttributes.Interface.InterfaceRef)), ("igp-flooding-bandwidth", ("igp_flooding_bandwidth", Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth))])
self._leafs = OrderedDict([
('interface_id', (YLeaf(YType.str, 'interface-id'), ['str'])),
])
self.interface_id = None
self.config = Mpls.TeInterfaceAttributes.Interface.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.TeInterfaceAttributes.Interface.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self.interface_ref = Mpls.TeInterfaceAttributes.Interface.InterfaceRef()
self.interface_ref.parent = self
self._children_name_map["interface_ref"] = "interface-ref"
self.igp_flooding_bandwidth = Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth()
self.igp_flooding_bandwidth.parent = self
self._children_name_map["igp_flooding_bandwidth"] = "igp-flooding-bandwidth"
self._segment_path = lambda: "interface" + "[interface-id='" + str(self.interface_id) + "']"
self._absolute_path = lambda: "openconfig-mpls:mpls/te-interface-attributes/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeInterfaceAttributes.Interface, ['interface_id'], name, value)
class Config(_Entity_):
"""
Configuration parameters related to TE interfaces\:
.. attribute:: interface_id
Id of the interface
**type**\: str
.. attribute:: te_metric
TE specific metric for the link
**type**\: int
**range:** 0..4294967295
.. attribute:: srlg_membership
list of references to named shared risk link groups that the interface belongs to
**type**\: list of str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.Srlgs.Srlg>`
.. attribute:: admin_group
list of admin groups (by name) on the interface
**type**\: list of str
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeInterfaceAttributes.Interface.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface_id', (YLeaf(YType.str, 'interface-id'), ['str'])),
('te_metric', (YLeaf(YType.uint32, 'te-metric'), ['int'])),
('srlg_membership', (YLeafList(YType.str, 'srlg-membership'), ['str'])),
('admin_group', (YLeafList(YType.str, 'admin-group'), ['str'])),
])
self.interface_id = None
self.te_metric = None
self.srlg_membership = []
self.admin_group = []
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeInterfaceAttributes.Interface.Config, ['interface_id', 'te_metric', 'srlg_membership', 'admin_group'], name, value)
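# --- Illustrative usage sketch (user code, not part of the generated bindings) ---
# Per-interface TE attributes: TE metric, SRLG membership and admin-groups.
# The interface name, metric and group names are hypothetical and must match
# SRLGs/admin-groups configured elsewhere in the model.
from ydk.models.openconfig import openconfig_mpls

te_if = openconfig_mpls.Mpls.TeInterfaceAttributes.Interface()
te_if.interface_id = 'GigabitEthernet0/0/0/1'
te_if.config.interface_id = te_if.interface_id
te_if.config.te_metric = 15
te_if.config.srlg_membership.append('fiber-duct-1')   # references a configured SRLG name
te_if.config.admin_group.append('RED')                # references a configured admin-group name
# Append to mpls.te_interface_attributes.interface (assumed attribute path) before pushing.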
class State(_Entity_):
"""
State parameters related to TE interfaces
.. attribute:: interface_id
Id of the interface
**type**\: str
**config**\: False
.. attribute:: te_metric
TE specific metric for the link
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: srlg_membership
list of references to named shared risk link groups that the interface belongs to
**type**\: list of str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.Srlgs.Srlg>`
**config**\: False
.. attribute:: admin_group
list of admin groups (by name) on the interface
**type**\: list of str
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeInterfaceAttributes.Interface.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface_id', (YLeaf(YType.str, 'interface-id'), ['str'])),
('te_metric', (YLeaf(YType.uint32, 'te-metric'), ['int'])),
('srlg_membership', (YLeafList(YType.str, 'srlg-membership'), ['str'])),
('admin_group', (YLeafList(YType.str, 'admin-group'), ['str'])),
])
self.interface_id = None
self.te_metric = None
self.srlg_membership = []
self.admin_group = []
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeInterfaceAttributes.Interface.State, ['interface_id', 'te_metric', 'srlg_membership', 'admin_group'], name, value)
class InterfaceRef(_Entity_):
"""
Reference to an interface or subinterface
.. attribute:: config
Configured reference to interface / subinterface
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.TeInterfaceAttributes.Interface.InterfaceRef.Config>`
.. attribute:: state
Operational state for interface\-ref
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.TeInterfaceAttributes.Interface.InterfaceRef.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeInterfaceAttributes.Interface.InterfaceRef, self).__init__()
self.yang_name = "interface-ref"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.TeInterfaceAttributes.Interface.InterfaceRef.Config)), ("state", ("state", Mpls.TeInterfaceAttributes.Interface.InterfaceRef.State))])
self._leafs = OrderedDict()
self.config = Mpls.TeInterfaceAttributes.Interface.InterfaceRef.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.TeInterfaceAttributes.Interface.InterfaceRef.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "interface-ref"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeInterfaceAttributes.Interface.InterfaceRef, [], name, value)
class Config(_Entity_):
"""
Configured reference to interface / subinterface
.. attribute:: interface
Reference to a base interface. If a reference to a subinterface is required, this leaf must be specified to indicate the base interface
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_interfaces.Interfaces.Interface>`
.. attribute:: subinterface
Reference to a subinterface \-\- this requires the base interface to be specified using the interface leaf in this container. If only a reference to a base interface is required, this leaf should not be set
**type**\: int
**range:** 0..4294967295
**refers to**\: :py:class:`index <ydk.models.openconfig.openconfig_interfaces.Interfaces.Interface.Subinterfaces.Subinterface>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeInterfaceAttributes.Interface.InterfaceRef.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "interface-ref"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('subinterface', (YLeaf(YType.str, 'subinterface'), ['int'])),
])
self.interface = None
self.subinterface = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeInterfaceAttributes.Interface.InterfaceRef.Config, ['interface', 'subinterface'], name, value)
class State(_Entity_):
"""
Operational state for interface\-ref
.. attribute:: interface
Reference to a base interface. If a reference to a subinterface is required, this leaf must be specified to indicate the base interface
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_interfaces.Interfaces.Interface>`
**config**\: False
.. attribute:: subinterface
Reference to a subinterface \-\- this requires the base interface to be specified using the interface leaf in this container. If only a reference to a base interface is required, this leaf should not be set
**type**\: int
**range:** 0..4294967295
**refers to**\: :py:class:`index <ydk.models.openconfig.openconfig_interfaces.Interfaces.Interface.Subinterfaces.Subinterface>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeInterfaceAttributes.Interface.InterfaceRef.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "interface-ref"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('subinterface', (YLeaf(YType.str, 'subinterface'), ['int'])),
])
self.interface = None
self.subinterface = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeInterfaceAttributes.Interface.InterfaceRef.State, ['interface', 'subinterface'], name, value)
class IgpFloodingBandwidth(_Entity_):
"""
Interface bandwidth change percentages
that trigger update events into the IGP traffic
engineering database (TED)
.. attribute:: config
Configuration parameters for TED update threshold
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth.Config>`
.. attribute:: state
State parameters for TED update threshold
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth, self).__init__()
self.yang_name = "igp-flooding-bandwidth"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth.Config)), ("state", ("state", Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth.State))])
self._leafs = OrderedDict()
self.config = Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "igp-flooding-bandwidth"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth, [], name, value)
class Config(_Entity_):
"""
Configuration parameters for TED
update threshold
.. attribute:: threshold_type
The type of threshold that should be used to specify the values at which bandwidth is flooded. DELTA indicates that the local system should flood IGP updates when a change in reserved bandwidth >= the specified delta occurs on the interface. Where THRESHOLD\_CROSSED is specified, the local system should trigger an update (and hence flood) the reserved bandwidth when the reserved bandwidth changes such that it crosses, or becomes equal to one of the threshold values
**type**\: :py:class:`ThresholdType <ydk.models.openconfig.openconfig_mpls.Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth.Config.ThresholdType>`
.. attribute:: delta_percentage
The percentage of the maximum\-reservable\-bandwidth considered as the delta that results in an IGP update being flooded
**type**\: int
**range:** 0..100
.. attribute:: threshold_specification
This value specifies whether a single set of threshold values should be used for both increasing and decreasing bandwidth when determining whether to trigger updated bandwidth values to be flooded in the IGP TE extensions. MIRRORED\-UP\-DOWN indicates that a single value (or set of values) should be used for both increasing and decreasing values, where SEPARATE\-UP\-DOWN specifies that the increasing and decreasing values will be separately specified
**type**\: :py:class:`ThresholdSpecification <ydk.models.openconfig.openconfig_mpls.Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth.Config.ThresholdSpecification>`
.. attribute:: up_thresholds
The thresholds (expressed as a percentage of the maximum reservable bandwidth) at which bandwidth updates are to be triggered when the bandwidth is increasing
**type**\: list of int
**range:** 0..100
.. attribute:: down_thresholds
The thresholds (expressed as a percentage of the maximum reservable bandwidth) at which bandwidth updates are to be triggered when the bandwidth is decreasing
**type**\: list of int
**range:** 0..100
.. attribute:: up_down_thresholds
The thresholds (expressed as a percentage of the maximum reservable bandwidth of the interface) at which bandwidth updates are flooded \- used both when the bandwidth is increasing and decreasing
**type**\: list of int
**range:** 0..100
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "igp-flooding-bandwidth"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('threshold_type', (YLeaf(YType.enumeration, 'threshold-type'), [('ydk.models.openconfig.openconfig_mpls', 'Mpls', 'TeInterfaceAttributes.Interface.IgpFloodingBandwidth.Config.ThresholdType')])),
('delta_percentage', (YLeaf(YType.uint8, 'delta-percentage'), ['int'])),
('threshold_specification', (YLeaf(YType.enumeration, 'threshold-specification'), [('ydk.models.openconfig.openconfig_mpls', 'Mpls', 'TeInterfaceAttributes.Interface.IgpFloodingBandwidth.Config.ThresholdSpecification')])),
('up_thresholds', (YLeafList(YType.uint8, 'up-thresholds'), ['int'])),
('down_thresholds', (YLeafList(YType.uint8, 'down-thresholds'), ['int'])),
('up_down_thresholds', (YLeafList(YType.uint8, 'up-down-thresholds'), ['int'])),
])
self.threshold_type = None
self.delta_percentage = None
self.threshold_specification = None
self.up_thresholds = []
self.down_thresholds = []
self.up_down_thresholds = []
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth.Config, ['threshold_type', 'delta_percentage', 'threshold_specification', 'up_thresholds', 'down_thresholds', 'up_down_thresholds'], name, value)
class ThresholdSpecification(Enum):
"""
ThresholdSpecification (Enum Class)
This value specifies whether a single set of threshold
values should be used for both increasing and decreasing
bandwidth when determining whether to trigger updated
bandwidth values to be flooded in the IGP TE extensions.
MIRRORED\-UP\-DOWN indicates that a single value (or set of
values) should be used for both increasing and decreasing
values, where SEPARATE\-UP\-DOWN specifies that the increasing
and decreasing values will be separately specified
.. data:: MIRRORED_UP_DOWN = 0
MIRRORED_UP_DOWN indicates that a single set of
threshold values should be used for both increasing
and decreasing bandwidth when determining whether
to trigger updated bandwidth values to be flooded
in the IGP TE extensions.
.. data:: SEPARATE_UP_DOWN = 1
SEPARATE_UP_DOWN indicates that separate
threshold values should be used for the increasing
and decreasing bandwidth when determining whether
to trigger updated bandwidth values to be flooded
in the IGP TE extensions.
"""
MIRRORED_UP_DOWN = Enum.YLeaf(0, "MIRRORED_UP_DOWN")
SEPARATE_UP_DOWN = Enum.YLeaf(1, "SEPARATE_UP_DOWN")
class ThresholdType(Enum):
"""
ThresholdType (Enum Class)
The type of threshold that should be used to specify the
values at which bandwidth is flooded. DELTA indicates that
the local system should flood IGP updates when a change in
reserved bandwidth >= the specified delta occurs on the
interface. Where THRESHOLD\_CROSSED is specified, the local
system should trigger an update (and hence flood) the
reserved bandwidth when the reserved bandwidth changes such
that it crosses, or becomes equal to one of the threshold
values
.. data:: DELTA = 0
DELTA indicates that the local
system should flood IGP updates when a
change in reserved bandwidth >= the specified
delta occurs on the interface.
.. data:: THRESHOLD_CROSSED = 1
THRESHOLD-CROSSED indicates that
the local system should trigger an update (and
hence flood) the reserved bandwidth when the
reserved bandwidth changes such that it crosses,
or becomes equal to one of the threshold values.
"""
DELTA = Enum.YLeaf(0, "DELTA")
THRESHOLD_CROSSED = Enum.YLeaf(1, "THRESHOLD_CROSSED")
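# Usage sketch (not part of the generated bindings): configuring the TED flooding
# thresholds modelled by the Config class above with YDK's CRUDService. The
# interface key name ('interface_id'), device address and credentials are
# illustrative assumptions only.
#
#     from ydk.services import CRUDService
#     from ydk.providers import NetconfServiceProvider
#
#     provider = NetconfServiceProvider(address='10.0.0.1', username='admin', password='admin')
#     crud = CRUDService()
#
#     mpls = Mpls()
#     iface = Mpls.TeInterfaceAttributes.Interface()
#     iface.interface_id = 'GigabitEthernet0/0/0/0'   # hypothetical interface key value
#     cfg = iface.igp_flooding_bandwidth.config
#     # flood an IGP update whenever reserved bandwidth changes by >= 10% of max-reservable-bandwidth
#     cfg.threshold_type = Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth.Config.ThresholdType.DELTA
#     cfg.delta_percentage = 10
#     mpls.te_interface_attributes.interface.append(iface)
#     crud.create(provider, mpls)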
class State(_Entity_):
"""
State parameters for TED update threshold
.. attribute:: threshold_type
The type of threshold that should be used to specify the values at which bandwidth is flooded. DELTA indicates that the local system should flood IGP updates when a change in reserved bandwidth >= the specified delta occurs on the interface. Where THRESHOLD\_CROSSED is specified, the local system should trigger an update (and hence flood) the reserved bandwidth when the reserved bandwidth changes such that it crosses, or becomes equal to one of the threshold values
**type**\: :py:class:`ThresholdType <ydk.models.openconfig.openconfig_mpls.Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth.State.ThresholdType>`
**config**\: False
.. attribute:: delta_percentage
The percentage of the maximum\-reservable\-bandwidth considered as the delta that results in an IGP update being flooded
**type**\: int
**range:** 0..100
**config**\: False
.. attribute:: threshold_specification
This value specifies whether a single set of threshold values should be used for both increasing and decreasing bandwidth when determining whether to trigger updated bandwidth values to be flooded in the IGP TE extensions. MIRRORED\-UP\-DOWN indicates that a single value (or set of values) should be used for both increasing and decreasing values, where SEPARATE\-UP\-DOWN specifies that the increasing and decreasing values will be separately specified
**type**\: :py:class:`ThresholdSpecification <ydk.models.openconfig.openconfig_mpls.Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth.State.ThresholdSpecification>`
**config**\: False
.. attribute:: up_thresholds
The thresholds (expressed as a percentage of the maximum reservable bandwidth) at which bandwidth updates are to be triggered when the bandwidth is increasing
**type**\: list of int
**range:** 0..100
**config**\: False
.. attribute:: down_thresholds
The thresholds (expressed as a percentage of the maximum reservable bandwidth) at which bandwidth updates are to be triggered when the bandwidth is decreasing
**type**\: list of int
**range:** 0..100
**config**\: False
.. attribute:: up_down_thresholds
The thresholds (expressed as a percentage of the maximum reservable bandwidth of the interface) at which bandwidth updates are flooded \- used both when the bandwidth is increasing and decreasing
**type**\: list of int
**range:** 0..100
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "igp-flooding-bandwidth"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('threshold_type', (YLeaf(YType.enumeration, 'threshold-type'), [('ydk.models.openconfig.openconfig_mpls', 'Mpls', 'TeInterfaceAttributes.Interface.IgpFloodingBandwidth.State.ThresholdType')])),
('delta_percentage', (YLeaf(YType.uint8, 'delta-percentage'), ['int'])),
('threshold_specification', (YLeaf(YType.enumeration, 'threshold-specification'), [('ydk.models.openconfig.openconfig_mpls', 'Mpls', 'TeInterfaceAttributes.Interface.IgpFloodingBandwidth.State.ThresholdSpecification')])),
('up_thresholds', (YLeafList(YType.uint8, 'up-thresholds'), ['int'])),
('down_thresholds', (YLeafList(YType.uint8, 'down-thresholds'), ['int'])),
('up_down_thresholds', (YLeafList(YType.uint8, 'up-down-thresholds'), ['int'])),
])
self.threshold_type = None
self.delta_percentage = None
self.threshold_specification = None
self.up_thresholds = []
self.down_thresholds = []
self.up_down_thresholds = []
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.TeInterfaceAttributes.Interface.IgpFloodingBandwidth.State, ['threshold_type', 'delta_percentage', 'threshold_specification', 'up_thresholds', 'down_thresholds', 'up_down_thresholds'], name, value)
class ThresholdSpecification(Enum):
"""
ThresholdSpecification (Enum Class)
This value specifies whether a single set of threshold
values should be used for both increasing and decreasing
bandwidth when determining whether to trigger updated
bandwidth values to be flooded in the IGP TE extensions.
MIRRORED\-UP\-DOWN indicates that a single value (or set of
values) should be used for both increasing and decreasing
values, where SEPARATE\-UP\-DOWN specifies that the increasing
and decreasing values will be separately specified
.. data:: MIRRORED_UP_DOWN = 0
MIRRORED_UP_DOWN indicates that a single set of
threshold values should be used for both increasing
and decreasing bandwidth when determining whether
to trigger updated bandwidth values to be flooded
in the IGP TE extensions.
.. data:: SEPARATE_UP_DOWN = 1
SEPARATE_UP_DOWN indicates that separate
threshold values should be used for the increasing
and decreasing bandwidth when determining whether
to trigger updated bandwidth values to be flooded
in the IGP TE extensions.
"""
MIRRORED_UP_DOWN = Enum.YLeaf(0, "MIRRORED_UP_DOWN")
SEPARATE_UP_DOWN = Enum.YLeaf(1, "SEPARATE_UP_DOWN")
class ThresholdType(Enum):
"""
ThresholdType (Enum Class)
The type of threshold that should be used to specify the
values at which bandwidth is flooded. DELTA indicates that
the local system should flood IGP updates when a change in
reserved bandwidth >= the specified delta occurs on the
interface. Where THRESHOLD\_CROSSED is specified, the local
system should trigger an update (and hence flood) the
reserved bandwidth when the reserved bandwidth changes such
that it crosses, or becomes equal to one of the threshold
values
.. data:: DELTA = 0
DELTA indicates that the local
system should flood IGP updates when a
change in reserved bandwidth >= the specified
delta occurs on the interface.
.. data:: THRESHOLD_CROSSED = 1
THRESHOLD-CROSSED indicates that
the local system should trigger an update (and
hence flood) the reserved bandwidth when the
reserved bandwidth changes such that it crosses,
or becomes equal to one of the threshold values.
"""
DELTA = Enum.YLeaf(0, "DELTA")
THRESHOLD_CROSSED = Enum.YLeaf(1, "THRESHOLD_CROSSED")
class SignalingProtocols(_Entity_):
"""
top\-level signaling protocol configuration
.. attribute:: rsvp_te
RSVP\-TE global signaling protocol configuration
**type**\: :py:class:`RsvpTe <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe>`
.. attribute:: ldp
LDP global signaling configuration
**type**\: :py:class:`Ldp <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.Ldp>`
.. attribute:: segment_routing
MPLS\-specific Segment Routing configuration and operational state parameters
**type**\: :py:class:`SegmentRouting <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols, self).__init__()
self.yang_name = "signaling-protocols"
self.yang_parent_name = "mpls"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("rsvp-te", ("rsvp_te", Mpls.SignalingProtocols.RsvpTe)), ("ldp", ("ldp", Mpls.SignalingProtocols.Ldp)), ("segment-routing", ("segment_routing", Mpls.SignalingProtocols.SegmentRouting))])
self._leafs = OrderedDict()
self.rsvp_te = Mpls.SignalingProtocols.RsvpTe()
self.rsvp_te.parent = self
self._children_name_map["rsvp_te"] = "rsvp-te"
self.ldp = Mpls.SignalingProtocols.Ldp()
self.ldp.parent = self
self._children_name_map["ldp"] = "ldp"
self.segment_routing = Mpls.SignalingProtocols.SegmentRouting()
self.segment_routing.parent = self
self._children_name_map["segment_routing"] = "segment-routing"
self._segment_path = lambda: "signaling-protocols"
self._absolute_path = lambda: "openconfig-mpls:mpls/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols, [], name, value)
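# Usage sketch (illustrative only): retrieving the signaling-protocols subtree
# ('openconfig-mpls:mpls/signaling-protocols') by reading the top-level Mpls
# entity and navigating into it, assuming 'provider' is an established
# NetconfServiceProvider as in the earlier sketch.
#
#     from ydk.services import CRUDService
#
#     crud = CRUDService()
#     mpls = crud.read(provider, Mpls())          # empty top-level filter reads the whole tree
#     rsvp_te = mpls.signaling_protocols.rsvp_te  # RSVP-TE sessions, neighbors, global state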
class RsvpTe(_Entity_):
"""
RSVP\-TE global signaling protocol configuration
.. attribute:: sessions
Enclosing container for sessions
**type**\: :py:class:`Sessions <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Sessions>`
.. attribute:: neighbors
Configuration and state for RSVP neighbors connecting to the device
**type**\: :py:class:`Neighbors <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Neighbors>`
.. attribute:: global_
Platform wide RSVP configuration and state
**type**\: :py:class:`Global <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Global>`
.. attribute:: interface_attributes
Attributes relating to RSVP\-TE enabled interfaces
**type**\: :py:class:`InterfaceAttributes <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe, self).__init__()
self.yang_name = "rsvp-te"
self.yang_parent_name = "signaling-protocols"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("sessions", ("sessions", Mpls.SignalingProtocols.RsvpTe.Sessions)), ("neighbors", ("neighbors", Mpls.SignalingProtocols.RsvpTe.Neighbors)), ("global", ("global_", Mpls.SignalingProtocols.RsvpTe.Global)), ("interface-attributes", ("interface_attributes", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes))])
self._leafs = OrderedDict()
self.sessions = Mpls.SignalingProtocols.RsvpTe.Sessions()
self.sessions.parent = self
self._children_name_map["sessions"] = "sessions"
self.neighbors = Mpls.SignalingProtocols.RsvpTe.Neighbors()
self.neighbors.parent = self
self._children_name_map["neighbors"] = "neighbors"
self.global_ = Mpls.SignalingProtocols.RsvpTe.Global()
self.global_.parent = self
self._children_name_map["global_"] = "global"
self.interface_attributes = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes()
self.interface_attributes.parent = self
self._children_name_map["interface_attributes"] = "interface-attributes"
self._segment_path = lambda: "rsvp-te"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe, [], name, value)
class Sessions(_Entity_):
"""
Enclosing container for sessions
.. attribute:: session
List of RSVP sessions
**type**\: list of :py:class:`Session <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Sessions.Session>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Sessions, self).__init__()
self.yang_name = "sessions"
self.yang_parent_name = "rsvp-te"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("session", ("session", Mpls.SignalingProtocols.RsvpTe.Sessions.Session))])
self._leafs = OrderedDict()
self.session = YList(self)
self._segment_path = lambda: "sessions"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/rsvp-te/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Sessions, [], name, value)
class Session(_Entity_):
"""
List of RSVP sessions
.. attribute:: local_index (key)
Reference to the local index for the RSVP session
**type**\: int
**range:** 0..18446744073709551615
**refers to**\: :py:class:`local_index <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Sessions.Session.State>`
**config**\: False
.. attribute:: record_route_objects
Enclosing container for MPLS RRO objects associated with the traffic engineered tunnel
**type**\: :py:class:`RecordRouteObjects <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Sessions.Session.RecordRouteObjects>`
**config**\: False
.. attribute:: state
Operational state parameters relating to the RSVP session
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Sessions.Session.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Sessions.Session, self).__init__()
self.yang_name = "session"
self.yang_parent_name = "sessions"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['local_index']
self._child_classes = OrderedDict([("record-route-objects", ("record_route_objects", Mpls.SignalingProtocols.RsvpTe.Sessions.Session.RecordRouteObjects)), ("state", ("state", Mpls.SignalingProtocols.RsvpTe.Sessions.Session.State))])
self._leafs = OrderedDict([
('local_index', (YLeaf(YType.str, 'local-index'), ['int'])),
])
self.local_index = None
self.record_route_objects = Mpls.SignalingProtocols.RsvpTe.Sessions.Session.RecordRouteObjects()
self.record_route_objects.parent = self
self._children_name_map["record_route_objects"] = "record-route-objects"
self.state = Mpls.SignalingProtocols.RsvpTe.Sessions.Session.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "session" + "[local-index='" + str(self.local_index) + "']"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/rsvp-te/sessions/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Sessions.Session, ['local_index'], name, value)
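# Usage sketch (illustrative only): iterating the operational RSVP session list
# after a read, assuming 'mpls' holds a populated Mpls instance as returned by
# CRUDService.read in the earlier sketch.
#
#     for session in mpls.signaling_protocols.rsvp_te.sessions.session:
#         st = session.state
#         print(st.local_index, st.source_address, st.destination_address,
#               st.tunnel_id, st.lsp_id, st.session_name, st.status)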
class RecordRouteObjects(_Entity_):
"""
Enclosing container for MPLS RRO objects associated with the
traffic engineered tunnel.
.. attribute:: record_route_object
Read\-only list of record route objects associated with the traffic engineered tunnel. Each entry in the list may contain a hop IP address, MPLS label allocated at the hop, and the flags associated with the entry
**type**\: list of :py:class:`RecordRouteObject <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Sessions.Session.RecordRouteObjects.RecordRouteObject>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Sessions.Session.RecordRouteObjects, self).__init__()
self.yang_name = "record-route-objects"
self.yang_parent_name = "session"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("record-route-object", ("record_route_object", Mpls.SignalingProtocols.RsvpTe.Sessions.Session.RecordRouteObjects.RecordRouteObject))])
self._leafs = OrderedDict()
self.record_route_object = YList(self)
self._segment_path = lambda: "record-route-objects"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Sessions.Session.RecordRouteObjects, [], name, value)
class RecordRouteObject(_Entity_):
"""
Read\-only list of record route objects associated with the
traffic engineered tunnel. Each entry in the list
may contain a hop IP address, MPLS label allocated
at the hop, and the flags associated with the entry.
.. attribute:: index (key)
Reference to the index of the record route object. The index is used to indicate the ordering of hops in the path
**type**\: int
**range:** 0..255
**refers to**\: :py:class:`index <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Sessions.Session.RecordRouteObjects.RecordRouteObject.State>`
**config**\: False
.. attribute:: state
Information related to RRO objects. The hop, label, and optional flags are present for each entry in the list
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Sessions.Session.RecordRouteObjects.RecordRouteObject.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Sessions.Session.RecordRouteObjects.RecordRouteObject, self).__init__()
self.yang_name = "record-route-object"
self.yang_parent_name = "record-route-objects"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = ['index']
self._child_classes = OrderedDict([("state", ("state", Mpls.SignalingProtocols.RsvpTe.Sessions.Session.RecordRouteObjects.RecordRouteObject.State))])
self._leafs = OrderedDict([
('index', (YLeaf(YType.str, 'index'), ['int'])),
])
self.index = None
self.state = Mpls.SignalingProtocols.RsvpTe.Sessions.Session.RecordRouteObjects.RecordRouteObject.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "record-route-object" + "[index='" + str(self.index) + "']"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Sessions.Session.RecordRouteObjects.RecordRouteObject, ['index'], name, value)
class State(_Entity_):
"""
Information related to RRO objects. The hop, label, and
optional flags are present for each entry in the list.
.. attribute:: index
Index of object in the list. Used for ordering
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: address
IP router hop for RRO entry
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
**config**\: False
.. attribute:: reported_label
Label reported for RRO hop
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
**config**\: False
.. attribute:: reported_flags
Subobject flags for MPLS label
**type**\: int
**range:** 0..255
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Sessions.Session.RecordRouteObjects.RecordRouteObject.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "record-route-object"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('index', (YLeaf(YType.uint8, 'index'), ['int'])),
('address', (YLeaf(YType.str, 'address'), ['str','str'])),
('reported_label', (YLeaf(YType.str, 'reported-label'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
('reported_flags', (YLeaf(YType.uint8, 'reported-flags'), ['int'])),
])
self.index = None
self.address = None
self.reported_label = None
self.reported_flags = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Sessions.Session.RecordRouteObjects.RecordRouteObject.State, ['index', 'address', 'reported_label', 'reported_flags'], name, value)
class State(_Entity_):
"""
Operational state parameters relating to the
RSVP session
.. attribute:: local_index
The index used to identify the RSVP session on the local network element. This index is generated by the device and is unique only to the local network element
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: source_address
Origin address of RSVP session
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
**config**\: False
.. attribute:: destination_address
Destination address of RSVP session
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
**config**\: False
.. attribute:: tunnel_id
The tunnel ID is an identifier used in the RSVP session, which remains constant over the life of the tunnel
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: lsp_id
The LSP ID distinguishes between two LSPs originated from the same headend, and is commonly used to distinguish RSVP sessions during make before break operations
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: session_name
The signaled name of this RSVP session
**type**\: str
**config**\: False
.. attribute:: status
Enumeration of RSVP session states
**type**\: :py:class:`Status <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Sessions.Session.State.Status>`
**config**\: False
.. attribute:: type
The type/role of the RSVP session, signifying the session's role on the current device, such as a transit session vs. an ingress session
**type**\: :py:class:`LSPROLE <ydk.models.openconfig.openconfig_mpls_types.LSPROLE>`
**config**\: False
.. attribute:: protection_requested
The type of protection requested for the RSVP session
**type**\: :py:class:`PROTECTIONTYPE <ydk.models.openconfig.openconfig_mpls_types.PROTECTIONTYPE>`
**config**\: False
.. attribute:: label_in
Incoming MPLS label associated with this RSVP session
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
**config**\: False
.. attribute:: label_out
Outgoing MPLS label associated with this RSVP session
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
**config**\: False
.. attribute:: sender_tspec
Operational state statistics relating to the SENDER\_TSPEC received for the RSVP session
**type**\: :py:class:`SenderTspec <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Sessions.Session.State.SenderTspec>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Sessions.Session.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "session"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("sender-tspec", ("sender_tspec", Mpls.SignalingProtocols.RsvpTe.Sessions.Session.State.SenderTspec))])
self._leafs = OrderedDict([
('local_index', (YLeaf(YType.uint64, 'local-index'), ['int'])),
('source_address', (YLeaf(YType.str, 'source-address'), ['str','str'])),
('destination_address', (YLeaf(YType.str, 'destination-address'), ['str','str'])),
('tunnel_id', (YLeaf(YType.uint16, 'tunnel-id'), ['int'])),
('lsp_id', (YLeaf(YType.uint16, 'lsp-id'), ['int'])),
('session_name', (YLeaf(YType.str, 'session-name'), ['str'])),
('status', (YLeaf(YType.enumeration, 'status'), [('ydk.models.openconfig.openconfig_mpls', 'Mpls', 'SignalingProtocols.RsvpTe.Sessions.Session.State.Status')])),
('type', (YLeaf(YType.identityref, 'type'), [('ydk.models.openconfig.openconfig_mpls_types', 'LSPROLE')])),
('protection_requested', (YLeaf(YType.identityref, 'protection-requested'), [('ydk.models.openconfig.openconfig_mpls_types', 'PROTECTIONTYPE')])),
('label_in', (YLeaf(YType.str, 'label-in'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
('label_out', (YLeaf(YType.str, 'label-out'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
])
self.local_index = None
self.source_address = None
self.destination_address = None
self.tunnel_id = None
self.lsp_id = None
self.session_name = None
self.status = None
self.type = None
self.protection_requested = None
self.label_in = None
self.label_out = None
self.sender_tspec = Mpls.SignalingProtocols.RsvpTe.Sessions.Session.State.SenderTspec()
self.sender_tspec.parent = self
self._children_name_map["sender_tspec"] = "sender-tspec"
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Sessions.Session.State, ['local_index', 'source_address', 'destination_address', 'tunnel_id', 'lsp_id', 'session_name', 'status', 'type', 'protection_requested', 'label_in', 'label_out'], name, value)
class Status(Enum):
"""
Status (Enum Class)
Enumeration of RSVP session states
.. data:: UP = 0
RSVP session is up
.. data:: DOWN = 1
RSVP session is down
"""
UP = Enum.YLeaf(0, "UP")
DOWN = Enum.YLeaf(1, "DOWN")
class SenderTspec(_Entity_):
"""
Operational state statistics relating to the SENDER\_TSPEC
received for the RSVP session
.. attribute:: rate
The rate at which the head\-end device generates traffic, expressed in bytes per second
**type**\: str
**length:** 4..4
**config**\: False
**units**\: Bps
.. attribute:: size
The size of the token bucket that is used to determine the rate at which the head\-end device generates traffic, expressed in bytes per second
**type**\: str
**length:** 4..4
**config**\: False
**units**\: bytes per second
.. attribute:: peak_data_rate
The maximum rate at which the head\-end device sends traffic
**type**\: union of the below types:
**type**\: str
**length:** 4..4
**type**\: :py:class:`PeakDataRate <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Sessions.Session.State.SenderTspec.PeakDataRate>`
**config**\: False
**units**\: bytes per second
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Sessions.Session.State.SenderTspec, self).__init__()
self.yang_name = "sender-tspec"
self.yang_parent_name = "state"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('rate', (YLeaf(YType.str, 'rate'), ['str'])),
('size', (YLeaf(YType.str, 'size'), ['str'])),
('peak_data_rate', (YLeaf(YType.str, 'peak-data-rate'), ['str',('ydk.models.openconfig.openconfig_mpls', 'Mpls', 'SignalingProtocols.RsvpTe.Sessions.Session.State.SenderTspec.PeakDataRate')])),
])
self.rate = None
self.size = None
self.peak_data_rate = None
self._segment_path = lambda: "sender-tspec"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Sessions.Session.State.SenderTspec, ['rate', 'size', 'peak_data_rate'], name, value)
class PeakDataRate(Enum):
"""
PeakDataRate (Enum Class)
The maximum rate at which the head\-end
device sends traffic.
.. data:: INFINITY = 0
The head-end device has no maximum data rate.
"""
INFINITY = Enum.YLeaf(0, "INFINITY")
class Neighbors(_Entity_):
"""
Configuration and state for RSVP neighbors connecting
to the device
.. attribute:: neighbor
List of RSVP neighbors of the local system
**type**\: list of :py:class:`Neighbor <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Neighbors.Neighbor>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Neighbors, self).__init__()
self.yang_name = "neighbors"
self.yang_parent_name = "rsvp-te"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("neighbor", ("neighbor", Mpls.SignalingProtocols.RsvpTe.Neighbors.Neighbor))])
self._leafs = OrderedDict()
self.neighbor = YList(self)
self._segment_path = lambda: "neighbors"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/rsvp-te/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Neighbors, [], name, value)
class Neighbor(_Entity_):
"""
List of RSVP neighbors of the local system
.. attribute:: address (key)
Reference to the address of the RSVP neighbor
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
**refers to**\: :py:class:`address <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Neighbors.Neighbor.State>`
**config**\: False
.. attribute:: state
Operational state parameters relating to the RSVP neighbor
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Neighbors.Neighbor.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Neighbors.Neighbor, self).__init__()
self.yang_name = "neighbor"
self.yang_parent_name = "neighbors"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['address']
self._child_classes = OrderedDict([("state", ("state", Mpls.SignalingProtocols.RsvpTe.Neighbors.Neighbor.State))])
self._leafs = OrderedDict([
('address', (YLeaf(YType.str, 'address'), ['str'])),
])
self.address = None
self.state = Mpls.SignalingProtocols.RsvpTe.Neighbors.Neighbor.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "neighbor" + "[address='" + str(self.address) + "']"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/rsvp-te/neighbors/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Neighbors.Neighbor, ['address'], name, value)
class State(_Entity_):
"""
Operational state parameters relating to the
RSVP neighbor
.. attribute:: address
Address of RSVP neighbor
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
**config**\: False
.. attribute:: detected_interface
Interface where RSVP neighbor was detected
**type**\: str
**config**\: False
.. attribute:: neighbor_status
Enumeration of possible RSVP neighbor states
**type**\: :py:class:`NeighborStatus <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Neighbors.Neighbor.State.NeighborStatus>`
**config**\: False
.. attribute:: refresh_reduction
Support of the neighbor for RSVP refresh reduction
**type**\: bool
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Neighbors.Neighbor.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "neighbor"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('address', (YLeaf(YType.str, 'address'), ['str','str'])),
('detected_interface', (YLeaf(YType.str, 'detected-interface'), ['str'])),
('neighbor_status', (YLeaf(YType.enumeration, 'neighbor-status'), [('ydk.models.openconfig.openconfig_mpls', 'Mpls', 'SignalingProtocols.RsvpTe.Neighbors.Neighbor.State.NeighborStatus')])),
('refresh_reduction', (YLeaf(YType.boolean, 'refresh-reduction'), ['bool'])),
])
self.address = None
self.detected_interface = None
self.neighbor_status = None
self.refresh_reduction = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Neighbors.Neighbor.State, ['address', 'detected_interface', 'neighbor_status', 'refresh_reduction'], name, value)
class NeighborStatus(Enum):
"""
NeighborStatus (Enum Class)
Enumeration of possible RSVP neighbor states
.. data:: UP = 0
RSVP hello messages are detected from the neighbor
.. data:: DOWN = 1
RSVP neighbor not detected as up, due to a
communication failure or an IGP notification that
the neighbor is unavailable
"""
UP = Enum.YLeaf(0, "UP")
DOWN = Enum.YLeaf(1, "DOWN")
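# Usage sketch (illustrative only): inspecting RSVP neighbor liveness from the
# same populated 'mpls' instance read earlier.
#
#     for neighbor in mpls.signaling_protocols.rsvp_te.neighbors.neighbor:
#         st = neighbor.state
#         print(st.address, st.detected_interface, st.neighbor_status, st.refresh_reduction)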
class Global(_Entity_):
"""
Platform wide RSVP configuration and state
.. attribute:: graceful_restart
Operational state and configuration parameters relating to graceful\-restart for RSVP
**type**\: :py:class:`GracefulRestart <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Global.GracefulRestart>`
.. attribute:: soft_preemption
Protocol options relating to RSVP soft preemption
**type**\: :py:class:`SoftPreemption <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Global.SoftPreemption>`
.. attribute:: hellos
Top level container for RSVP hello parameters
**type**\: :py:class:`Hellos <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Global.Hellos>`
.. attribute:: state
Platform wide RSVP state, including counters
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Global.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Global, self).__init__()
self.yang_name = "global"
self.yang_parent_name = "rsvp-te"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("graceful-restart", ("graceful_restart", Mpls.SignalingProtocols.RsvpTe.Global.GracefulRestart)), ("soft-preemption", ("soft_preemption", Mpls.SignalingProtocols.RsvpTe.Global.SoftPreemption)), ("hellos", ("hellos", Mpls.SignalingProtocols.RsvpTe.Global.Hellos)), ("state", ("state", Mpls.SignalingProtocols.RsvpTe.Global.State))])
self._leafs = OrderedDict()
self.graceful_restart = Mpls.SignalingProtocols.RsvpTe.Global.GracefulRestart()
self.graceful_restart.parent = self
self._children_name_map["graceful_restart"] = "graceful-restart"
self.soft_preemption = Mpls.SignalingProtocols.RsvpTe.Global.SoftPreemption()
self.soft_preemption.parent = self
self._children_name_map["soft_preemption"] = "soft-preemption"
self.hellos = Mpls.SignalingProtocols.RsvpTe.Global.Hellos()
self.hellos.parent = self
self._children_name_map["hellos"] = "hellos"
self.state = Mpls.SignalingProtocols.RsvpTe.Global.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "global"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/rsvp-te/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Global, [], name, value)
class GracefulRestart(_Entity_):
"""
Operational state and configuration parameters relating to
graceful\-restart for RSVP
.. attribute:: config
Configuration parameters relating to graceful\-restart
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Global.GracefulRestart.Config>`
.. attribute:: state
State information associated with RSVP graceful\-restart
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Global.GracefulRestart.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Global.GracefulRestart, self).__init__()
self.yang_name = "graceful-restart"
self.yang_parent_name = "global"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.SignalingProtocols.RsvpTe.Global.GracefulRestart.Config)), ("state", ("state", Mpls.SignalingProtocols.RsvpTe.Global.GracefulRestart.State))])
self._leafs = OrderedDict()
self.config = Mpls.SignalingProtocols.RsvpTe.Global.GracefulRestart.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.SignalingProtocols.RsvpTe.Global.GracefulRestart.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "graceful-restart"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/rsvp-te/global/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Global.GracefulRestart, [], name, value)
class Config(_Entity_):
"""
Configuration parameters relating to
graceful\-restart
.. attribute:: enable
Enables graceful restart on the node
**type**\: bool
**default value**\: false
.. attribute:: restart_time
Graceful restart time (seconds)
**type**\: int
**range:** 0..4294967295
.. attribute:: recovery_time
RSVP state recovery time
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Global.GracefulRestart.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "graceful-restart"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('enable', (YLeaf(YType.boolean, 'enable'), ['bool'])),
('restart_time', (YLeaf(YType.uint32, 'restart-time'), ['int'])),
('recovery_time', (YLeaf(YType.uint32, 'recovery-time'), ['int'])),
])
self.enable = None
self.restart_time = None
self.recovery_time = None
self._segment_path = lambda: "config"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/rsvp-te/global/graceful-restart/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Global.GracefulRestart.Config, ['enable', 'restart_time', 'recovery_time'], name, value)
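# Usage sketch (illustrative only): enabling RSVP graceful restart via the Config
# class above. The timer values are arbitrary examples; 'provider' and 'crud' are
# assumed to exist as in the earlier sketches.
#
#     mpls = Mpls()
#     gr_cfg = mpls.signaling_protocols.rsvp_te.global_.graceful_restart.config
#     gr_cfg.enable = True
#     gr_cfg.restart_time = 120     # seconds
#     gr_cfg.recovery_time = 120    # seconds
#     crud.update(provider, mpls)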
class State(_Entity_):
"""
State information associated with
RSVP graceful\-restart
.. attribute:: enable
Enables graceful restart on the node
**type**\: bool
**config**\: False
**default value**\: false
.. attribute:: restart_time
Graceful restart time (seconds)
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: recovery_time
RSVP state recovery time
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Global.GracefulRestart.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "graceful-restart"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('enable', (YLeaf(YType.boolean, 'enable'), ['bool'])),
('restart_time', (YLeaf(YType.uint32, 'restart-time'), ['int'])),
('recovery_time', (YLeaf(YType.uint32, 'recovery-time'), ['int'])),
])
self.enable = None
self.restart_time = None
self.recovery_time = None
self._segment_path = lambda: "state"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/rsvp-te/global/graceful-restart/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Global.GracefulRestart.State, ['enable', 'restart_time', 'recovery_time'], name, value)
class SoftPreemption(_Entity_):
"""
Protocol options relating to RSVP
soft preemption
.. attribute:: config
Configuration parameters relating to RSVP soft preemption support
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Global.SoftPreemption.Config>`
.. attribute:: state
State parameters relating to RSVP soft preemption support
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Global.SoftPreemption.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Global.SoftPreemption, self).__init__()
self.yang_name = "soft-preemption"
self.yang_parent_name = "global"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.SignalingProtocols.RsvpTe.Global.SoftPreemption.Config)), ("state", ("state", Mpls.SignalingProtocols.RsvpTe.Global.SoftPreemption.State))])
self._leafs = OrderedDict()
self.config = Mpls.SignalingProtocols.RsvpTe.Global.SoftPreemption.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.SignalingProtocols.RsvpTe.Global.SoftPreemption.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "soft-preemption"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/rsvp-te/global/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Global.SoftPreemption, [], name, value)
class Config(_Entity_):
"""
Configuration parameters relating to RSVP
soft preemption support
.. attribute:: enable
Enables soft preemption on a node
**type**\: bool
**default value**\: false
.. attribute:: soft_preemption_timeout
Timeout value for soft preemption to revert to hard preemption. The default timeout for soft\-preemption is 30 seconds \- after which the local system reverts to hard pre\-emption
**type**\: int
**range:** 0..65535
**default value**\: 30
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Global.SoftPreemption.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "soft-preemption"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('enable', (YLeaf(YType.boolean, 'enable'), ['bool'])),
('soft_preemption_timeout', (YLeaf(YType.uint16, 'soft-preemption-timeout'), ['int'])),
])
self.enable = None
self.soft_preemption_timeout = None
self._segment_path = lambda: "config"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/rsvp-te/global/soft-preemption/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Global.SoftPreemption.Config, ['enable', 'soft_preemption_timeout'], name, value)
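# Usage sketch (illustrative only): turning on RSVP soft preemption with a
# non-default timeout (the modelled default is 30 seconds), reusing the 'mpls'
# instance from the previous sketch before calling crud.update.
#
#     sp_cfg = mpls.signaling_protocols.rsvp_te.global_.soft_preemption.config
#     sp_cfg.enable = True
#     sp_cfg.soft_preemption_timeout = 60   # seconds before reverting to hard preemption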
class State(_Entity_):
"""
State parameters relating to RSVP
soft preemption support
.. attribute:: enable
Enables soft preemption on a node
**type**\: bool
**config**\: False
**default value**\: false
.. attribute:: soft_preemption_timeout
Timeout value for soft preemption to revert to hard preemption. The default timeout for soft\-preemption is 30 seconds \- after which the local system reverts to hard pre\-emption
**type**\: int
**range:** 0..65535
**config**\: False
**default value**\: 30
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Global.SoftPreemption.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "soft-preemption"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('enable', (YLeaf(YType.boolean, 'enable'), ['bool'])),
('soft_preemption_timeout', (YLeaf(YType.uint16, 'soft-preemption-timeout'), ['int'])),
])
self.enable = None
self.soft_preemption_timeout = None
self._segment_path = lambda: "state"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/rsvp-te/global/soft-preemption/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Global.SoftPreemption.State, ['enable', 'soft_preemption_timeout'], name, value)
class Hellos(_Entity_):
"""
Top level container for RSVP hello parameters
.. attribute:: config
Configuration parameters relating to RSVP hellos
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Global.Hellos.Config>`
.. attribute:: state
State information associated with RSVP hellos
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Global.Hellos.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Global.Hellos, self).__init__()
self.yang_name = "hellos"
self.yang_parent_name = "global"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.SignalingProtocols.RsvpTe.Global.Hellos.Config)), ("state", ("state", Mpls.SignalingProtocols.RsvpTe.Global.Hellos.State))])
self._leafs = OrderedDict()
self.config = Mpls.SignalingProtocols.RsvpTe.Global.Hellos.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.SignalingProtocols.RsvpTe.Global.Hellos.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "hellos"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/rsvp-te/global/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Global.Hellos, [], name, value)
class Config(_Entity_):
"""
Configuration parameters relating to RSVP
hellos
.. attribute:: hello_interval
set the interval in ms between RSVP hello messages
**type**\: int
**range:** 1000..60000
**units**\: milliseconds
**default value**\: 9000
.. attribute:: refresh_reduction
enables all RSVP refresh reduction features, i.e. message bundling, RSVP message ID, reliable message delivery and summary refresh
**type**\: bool
**default value**\: true
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Global.Hellos.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "hellos"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('hello_interval', (YLeaf(YType.uint16, 'hello-interval'), ['int'])),
('refresh_reduction', (YLeaf(YType.boolean, 'refresh-reduction'), ['bool'])),
])
self.hello_interval = None
self.refresh_reduction = None
self._segment_path = lambda: "config"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/rsvp-te/global/hellos/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Global.Hellos.Config, ['hello_interval', 'refresh_reduction'], name, value)
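# Usage sketch (illustrative only): tuning RSVP hello parameters via the Config
# class above; hello_interval must respect the modelled range 1000..60000 ms.
#
#     hellos_cfg = mpls.signaling_protocols.rsvp_te.global_.hellos.config
#     hellos_cfg.hello_interval = 5000      # milliseconds (modelled default is 9000)
#     hellos_cfg.refresh_reduction = True
#     crud.update(provider, mpls)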
class State(_Entity_):
"""
State information associated with RSVP hellos
.. attribute:: hello_interval
set the interval in ms between RSVP hello messages
**type**\: int
**range:** 1000..60000
**config**\: False
**units**\: milliseconds
**default value**\: 9000
.. attribute:: refresh_reduction
enables all RSVP refresh reduction features, i.e. message bundling, RSVP message ID, reliable message delivery and summary refresh
**type**\: bool
**config**\: False
**default value**\: true
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Global.Hellos.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "hellos"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('hello_interval', (YLeaf(YType.uint16, 'hello-interval'), ['int'])),
('refresh_reduction', (YLeaf(YType.boolean, 'refresh-reduction'), ['bool'])),
])
self.hello_interval = None
self.refresh_reduction = None
self._segment_path = lambda: "state"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/rsvp-te/global/hellos/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Global.Hellos.State, ['hello_interval', 'refresh_reduction'], name, value)
class State(_Entity_):
"""
Platform wide RSVP state, including counters
.. attribute:: counters
Platform wide RSVP statistics and counters
**type**\: :py:class:`Counters <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Global.State.Counters>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Global.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "global"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("counters", ("counters", Mpls.SignalingProtocols.RsvpTe.Global.State.Counters))])
self._leafs = OrderedDict()
self.counters = Mpls.SignalingProtocols.RsvpTe.Global.State.Counters()
self.counters.parent = self
self._children_name_map["counters"] = "counters"
self._segment_path = lambda: "state"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/rsvp-te/global/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Global.State, [], name, value)
class Counters(_Entity_):
"""
Platform wide RSVP statistics and counters
.. attribute:: path_timeouts
TODO
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: reservation_timeouts
TODO
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: rate_limited_messages
RSVP messages dropped due to rate limiting
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_path_messages
Number of received RSVP Path messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_path_error_messages
Number of received RSVP Path Error messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_path_tear_messages
Number of received RSVP Path Tear messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_reservation_messages
Number of received RSVP Resv messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_reservation_error_messages
Number of received RSVP Resv Error messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_reservation_tear_messages
Number of received RSVP Resv Tear messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_hello_messages
Number of received RSVP hello messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_srefresh_messages
Number of received RSVP summary refresh messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_ack_messages
Number of received RSVP refresh reduction ack messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_path_messages
Number of sent RSVP PATH messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_path_error_messages
Number of sent RSVP Path Error messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_path_tear_messages
Number of sent RSVP Path Tear messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_reservation_messages
Number of sent RSVP Resv messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_reservation_error_messages
Number of sent RSVP Resv Error messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_reservation_tear_messages
Number of sent RSVP Resv Tear messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_hello_messages
Number of sent RSVP hello messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_srefresh_messages
Number of sent RSVP summary refresh messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_ack_messages
Number of sent RSVP refresh reduction ack messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.Global.State.Counters, self).__init__()
self.yang_name = "counters"
self.yang_parent_name = "state"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('path_timeouts', (YLeaf(YType.uint64, 'path-timeouts'), ['int'])),
('reservation_timeouts', (YLeaf(YType.uint64, 'reservation-timeouts'), ['int'])),
('rate_limited_messages', (YLeaf(YType.uint64, 'rate-limited-messages'), ['int'])),
('in_path_messages', (YLeaf(YType.uint64, 'in-path-messages'), ['int'])),
('in_path_error_messages', (YLeaf(YType.uint64, 'in-path-error-messages'), ['int'])),
('in_path_tear_messages', (YLeaf(YType.uint64, 'in-path-tear-messages'), ['int'])),
('in_reservation_messages', (YLeaf(YType.uint64, 'in-reservation-messages'), ['int'])),
('in_reservation_error_messages', (YLeaf(YType.uint64, 'in-reservation-error-messages'), ['int'])),
('in_reservation_tear_messages', (YLeaf(YType.uint64, 'in-reservation-tear-messages'), ['int'])),
('in_hello_messages', (YLeaf(YType.uint64, 'in-hello-messages'), ['int'])),
('in_srefresh_messages', (YLeaf(YType.uint64, 'in-srefresh-messages'), ['int'])),
('in_ack_messages', (YLeaf(YType.uint64, 'in-ack-messages'), ['int'])),
('out_path_messages', (YLeaf(YType.uint64, 'out-path-messages'), ['int'])),
('out_path_error_messages', (YLeaf(YType.uint64, 'out-path-error-messages'), ['int'])),
('out_path_tear_messages', (YLeaf(YType.uint64, 'out-path-tear-messages'), ['int'])),
('out_reservation_messages', (YLeaf(YType.uint64, 'out-reservation-messages'), ['int'])),
('out_reservation_error_messages', (YLeaf(YType.uint64, 'out-reservation-error-messages'), ['int'])),
('out_reservation_tear_messages', (YLeaf(YType.uint64, 'out-reservation-tear-messages'), ['int'])),
('out_hello_messages', (YLeaf(YType.uint64, 'out-hello-messages'), ['int'])),
('out_srefresh_messages', (YLeaf(YType.uint64, 'out-srefresh-messages'), ['int'])),
('out_ack_messages', (YLeaf(YType.uint64, 'out-ack-messages'), ['int'])),
])
self.path_timeouts = None
self.reservation_timeouts = None
self.rate_limited_messages = None
self.in_path_messages = None
self.in_path_error_messages = None
self.in_path_tear_messages = None
self.in_reservation_messages = None
self.in_reservation_error_messages = None
self.in_reservation_tear_messages = None
self.in_hello_messages = None
self.in_srefresh_messages = None
self.in_ack_messages = None
self.out_path_messages = None
self.out_path_error_messages = None
self.out_path_tear_messages = None
self.out_reservation_messages = None
self.out_reservation_error_messages = None
self.out_reservation_tear_messages = None
self.out_hello_messages = None
self.out_srefresh_messages = None
self.out_ack_messages = None
self._segment_path = lambda: "counters"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/rsvp-te/global/state/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.Global.State.Counters, ['path_timeouts', 'reservation_timeouts', 'rate_limited_messages', 'in_path_messages', 'in_path_error_messages', 'in_path_tear_messages', 'in_reservation_messages', 'in_reservation_error_messages', 'in_reservation_tear_messages', 'in_hello_messages', 'in_srefresh_messages', 'in_ack_messages', 'out_path_messages', 'out_path_error_messages', 'out_path_tear_messages', 'out_reservation_messages', 'out_reservation_error_messages', 'out_reservation_tear_messages', 'out_hello_messages', 'out_srefresh_messages', 'out_ack_messages'], name, value)
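# --- Illustrative usage sketch (not part of the generated bindings) ---
# Reading the platform-wide RSVP counters above with a CRUD read;
# `provider` is assumed to be a NetconfServiceProvider as in the
# earlier sketch.
#
#   from ydk.services import CRUDService
#   from ydk.models.openconfig import openconfig_mpls
#
#   read_filter = openconfig_mpls.Mpls()
#   mpls = CRUDService().read(provider, read_filter)
#   counters = mpls.signaling_protocols.rsvp_te.global_.state.counters
#   print(counters.in_path_messages, counters.out_path_messages)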
class InterfaceAttributes(_Entity_):
"""
Attributes relating to RSVP\-TE enabled interfaces
.. attribute:: interface
list of per\-interface RSVP configurations
**type**\: list of :py:class:`Interface <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes, self).__init__()
self.yang_name = "interface-attributes"
self.yang_parent_name = "rsvp-te"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("interface", ("interface", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface))])
self._leafs = OrderedDict()
self.interface = YList(self)
self._segment_path = lambda: "interface-attributes"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/rsvp-te/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes, [], name, value)
class Interface(_Entity_):
"""
list of per\-interface RSVP configurations
.. attribute:: interface_id (key)
reference to the interface\-id data
**type**\: str
**refers to**\: :py:class:`interface_id <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Config>`
.. attribute:: config
Configuration of per\-interface RSVP parameters
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Config>`
.. attribute:: state
Per\-interface RSVP protocol and state information
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.State>`
**config**\: False
.. attribute:: interface_ref
Reference to an interface or subinterface
**type**\: :py:class:`InterfaceRef <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.InterfaceRef>`
.. attribute:: bandwidth_reservations
Enclosing container for bandwidth reservation
**type**\: :py:class:`BandwidthReservations <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.BandwidthReservations>`
.. attribute:: hellos
Top level container for RSVP hello parameters
**type**\: :py:class:`Hellos <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Hellos>`
.. attribute:: authentication
Configuration and state parameters relating to RSVP authentication as per RFC2747
**type**\: :py:class:`Authentication <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Authentication>`
.. attribute:: subscription
Bandwidth percentage reservable by RSVP on an interface
**type**\: :py:class:`Subscription <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Subscription>`
.. attribute:: protection
link\-protection (NHOP) related configuration
**type**\: :py:class:`Protection <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Protection>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface, self).__init__()
self.yang_name = "interface"
self.yang_parent_name = "interface-attributes"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['interface_id']
self._child_classes = OrderedDict([("config", ("config", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Config)), ("state", ("state", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.State)), ("interface-ref", ("interface_ref", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.InterfaceRef)), ("bandwidth-reservations", ("bandwidth_reservations", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.BandwidthReservations)), ("hellos", ("hellos", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Hellos)), ("authentication", ("authentication", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Authentication)), ("subscription", ("subscription", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Subscription)), ("protection", ("protection", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Protection))])
self._leafs = OrderedDict([
('interface_id', (YLeaf(YType.str, 'interface-id'), ['str'])),
])
self.interface_id = None
self.config = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self.interface_ref = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.InterfaceRef()
self.interface_ref.parent = self
self._children_name_map["interface_ref"] = "interface-ref"
self.bandwidth_reservations = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.BandwidthReservations()
self.bandwidth_reservations.parent = self
self._children_name_map["bandwidth_reservations"] = "bandwidth-reservations"
self.hellos = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Hellos()
self.hellos.parent = self
self._children_name_map["hellos"] = "hellos"
self.authentication = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Authentication()
self.authentication.parent = self
self._children_name_map["authentication"] = "authentication"
self.subscription = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Subscription()
self.subscription.parent = self
self._children_name_map["subscription"] = "subscription"
self.protection = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Protection()
self.protection.parent = self
self._children_name_map["protection"] = "protection"
self._segment_path = lambda: "interface" + "[interface-id='" + str(self.interface_id) + "']"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/rsvp-te/interface-attributes/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface, ['interface_id'], name, value)
class Config(_Entity_):
"""
Configuration of per\-interface RSVP parameters
.. attribute:: interface_id
Identifier for the interface
**type**\: str
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface_id', (YLeaf(YType.str, 'interface-id'), ['str'])),
])
self.interface_id = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Config, ['interface_id'], name, value)
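# --- Illustrative usage sketch (not part of the generated bindings) ---
# Creating one entry in the per-interface RSVP list; the interface name
# is a placeholder. Both the list key (interface_id) and the config leaf
# of the same name are set, mirroring the OpenConfig config/state split.
#
#   iface = openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface()
#   iface.interface_id = 'eth0'
#   iface.config.interface_id = 'eth0'
#   mpls.signaling_protocols.rsvp_te.interface_attributes.interface.append(iface)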
class State(_Entity_):
"""
Per\-interface RSVP protocol and state information
.. attribute:: interface_id
Identifier for the interface
**type**\: str
**config**\: False
.. attribute:: counters
Interface specific RSVP statistics and counters
**type**\: :py:class:`Counters <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.State.Counters>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("counters", ("counters", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.State.Counters))])
self._leafs = OrderedDict([
('interface_id', (YLeaf(YType.str, 'interface-id'), ['str'])),
])
self.interface_id = None
self.counters = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.State.Counters()
self.counters.parent = self
self._children_name_map["counters"] = "counters"
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.State, ['interface_id'], name, value)
class Counters(_Entity_):
"""
Interface specific RSVP statistics and counters
.. attribute:: in_path_messages
Number of received RSVP Path messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_path_error_messages
Number of received RSVP Path Error messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_path_tear_messages
Number of received RSVP Path Tear messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_reservation_messages
Number of received RSVP Resv messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_reservation_error_messages
Number of received RSVP Resv Error messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_reservation_tear_messages
Number of received RSVP Resv Tear messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_hello_messages
Number of received RSVP hello messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_srefresh_messages
Number of received RSVP summary refresh messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_ack_messages
Number of received RSVP refresh reduction ack messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_path_messages
Number of sent RSVP PATH messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_path_error_messages
Number of sent RSVP Path Error messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_path_tear_messages
Number of sent RSVP Path Tear messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_reservation_messages
Number of sent RSVP Resv messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_reservation_error_messages
Number of sent RSVP Resv Error messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_reservation_tear_messages
Number of sent RSVP Resv Tear messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_hello_messages
Number of sent RSVP hello messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_srefresh_messages
Number of sent RSVP summary refresh messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_ack_messages
Number of sent RSVP refresh reduction ack messages
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.State.Counters, self).__init__()
self.yang_name = "counters"
self.yang_parent_name = "state"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('in_path_messages', (YLeaf(YType.uint64, 'in-path-messages'), ['int'])),
('in_path_error_messages', (YLeaf(YType.uint64, 'in-path-error-messages'), ['int'])),
('in_path_tear_messages', (YLeaf(YType.uint64, 'in-path-tear-messages'), ['int'])),
('in_reservation_messages', (YLeaf(YType.uint64, 'in-reservation-messages'), ['int'])),
('in_reservation_error_messages', (YLeaf(YType.uint64, 'in-reservation-error-messages'), ['int'])),
('in_reservation_tear_messages', (YLeaf(YType.uint64, 'in-reservation-tear-messages'), ['int'])),
('in_hello_messages', (YLeaf(YType.uint64, 'in-hello-messages'), ['int'])),
('in_srefresh_messages', (YLeaf(YType.uint64, 'in-srefresh-messages'), ['int'])),
('in_ack_messages', (YLeaf(YType.uint64, 'in-ack-messages'), ['int'])),
('out_path_messages', (YLeaf(YType.uint64, 'out-path-messages'), ['int'])),
('out_path_error_messages', (YLeaf(YType.uint64, 'out-path-error-messages'), ['int'])),
('out_path_tear_messages', (YLeaf(YType.uint64, 'out-path-tear-messages'), ['int'])),
('out_reservation_messages', (YLeaf(YType.uint64, 'out-reservation-messages'), ['int'])),
('out_reservation_error_messages', (YLeaf(YType.uint64, 'out-reservation-error-messages'), ['int'])),
('out_reservation_tear_messages', (YLeaf(YType.uint64, 'out-reservation-tear-messages'), ['int'])),
('out_hello_messages', (YLeaf(YType.uint64, 'out-hello-messages'), ['int'])),
('out_srefresh_messages', (YLeaf(YType.uint64, 'out-srefresh-messages'), ['int'])),
('out_ack_messages', (YLeaf(YType.uint64, 'out-ack-messages'), ['int'])),
])
self.in_path_messages = None
self.in_path_error_messages = None
self.in_path_tear_messages = None
self.in_reservation_messages = None
self.in_reservation_error_messages = None
self.in_reservation_tear_messages = None
self.in_hello_messages = None
self.in_srefresh_messages = None
self.in_ack_messages = None
self.out_path_messages = None
self.out_path_error_messages = None
self.out_path_tear_messages = None
self.out_reservation_messages = None
self.out_reservation_error_messages = None
self.out_reservation_tear_messages = None
self.out_hello_messages = None
self.out_srefresh_messages = None
self.out_ack_messages = None
self._segment_path = lambda: "counters"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.State.Counters, ['in_path_messages', 'in_path_error_messages', 'in_path_tear_messages', 'in_reservation_messages', 'in_reservation_error_messages', 'in_reservation_tear_messages', 'in_hello_messages', 'in_srefresh_messages', 'in_ack_messages', 'out_path_messages', 'out_path_error_messages', 'out_path_tear_messages', 'out_reservation_messages', 'out_reservation_error_messages', 'out_reservation_tear_messages', 'out_hello_messages', 'out_srefresh_messages', 'out_ack_messages'], name, value)
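# --- Illustrative usage sketch (not part of the generated bindings) ---
# Reading counters for a single interface by populating the list key in
# the read filter; the interface name and `provider` are placeholders.
#
#   read_filter = openconfig_mpls.Mpls()
#   iface = openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface()
#   iface.interface_id = 'eth0'
#   read_filter.signaling_protocols.rsvp_te.interface_attributes.interface.append(iface)
#   result = CRUDService().read(provider, read_filter)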
class InterfaceRef(_Entity_):
"""
Reference to an interface or subinterface
.. attribute:: config
Configured reference to interface / subinterface
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.InterfaceRef.Config>`
.. attribute:: state
Operational state for interface\-ref
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.InterfaceRef.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.InterfaceRef, self).__init__()
self.yang_name = "interface-ref"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.InterfaceRef.Config)), ("state", ("state", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.InterfaceRef.State))])
self._leafs = OrderedDict()
self.config = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.InterfaceRef.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.InterfaceRef.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "interface-ref"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.InterfaceRef, [], name, value)
class Config(_Entity_):
"""
Configured reference to interface / subinterface
.. attribute:: interface
Reference to a base interface. If a reference to a subinterface is required, this leaf must be specified to indicate the base interface
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_interfaces.Interfaces.Interface>`
.. attribute:: subinterface
Reference to a subinterface \-\- this requires the base interface to be specified using the interface leaf in this container. If only a reference to a base interface is required, this leaf should not be set
**type**\: int
**range:** 0..4294967295
**refers to**\: :py:class:`index <ydk.models.openconfig.openconfig_interfaces.Interfaces.Interface.Subinterfaces.Subinterface>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.InterfaceRef.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "interface-ref"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('subinterface', (YLeaf(YType.str, 'subinterface'), ['int'])),
])
self.interface = None
self.subinterface = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.InterfaceRef.Config, ['interface', 'subinterface'], name, value)
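# --- Illustrative usage sketch (not part of the generated bindings) ---
# Pointing the interface-ref at a base interface plus a subinterface
# index; per the descriptions above, `interface` must name the base
# interface whenever `subinterface` is set. Names are placeholders and
# `iface` is assumed to be an Interface list entry as in the earlier sketch.
#
#   iface.interface_ref.config.interface = 'eth0'
#   iface.interface_ref.config.subinterface = 0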
class State(_Entity_):
"""
Operational state for interface\-ref
.. attribute:: interface
Reference to a base interface. If a reference to a subinterface is required, this leaf must be specified to indicate the base interface
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_interfaces.Interfaces.Interface>`
**config**\: False
.. attribute:: subinterface
Reference to a subinterface \-\- this requires the base interface to be specified using the interface leaf in this container. If only a reference to a base interface is required, this leaf should not be set
**type**\: int
**range:** 0..4294967295
**refers to**\: :py:class:`index <ydk.models.openconfig.openconfig_interfaces.Interfaces.Interface.Subinterfaces.Subinterface>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.InterfaceRef.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "interface-ref"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('subinterface', (YLeaf(YType.str, 'subinterface'), ['int'])),
])
self.interface = None
self.subinterface = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.InterfaceRef.State, ['interface', 'subinterface'], name, value)
class BandwidthReservations(_Entity_):
"""
Enclosing container for bandwidth reservation
.. attribute:: bandwidth_reservation
Available and reserved bandwidth by priority on the interface
**type**\: list of :py:class:`BandwidthReservation <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.BandwidthReservations.BandwidthReservation>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.BandwidthReservations, self).__init__()
self.yang_name = "bandwidth-reservations"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("bandwidth-reservation", ("bandwidth_reservation", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.BandwidthReservations.BandwidthReservation))])
self._leafs = OrderedDict()
self.bandwidth_reservation = YList(self)
self._segment_path = lambda: "bandwidth-reservations"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.BandwidthReservations, [], name, value)
class BandwidthReservation(_Entity_):
"""
Available and reserved bandwidth by priority on
the interface.
.. attribute:: priority (key)
Reference to the RSVP priority level
**type**\: union of the below types:
**type**\: int
**range:** 0..7
**type**\: :py:class:`Priority <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.BandwidthReservations.BandwidthReservation.State.Priority>`
**refers to**\: :py:class:`priority <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.BandwidthReservations.BandwidthReservation.State>`
**config**\: False
.. attribute:: state
Operational state parameters relating to a bandwidth reservation at a certain priority
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.BandwidthReservations.BandwidthReservation.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.BandwidthReservations.BandwidthReservation, self).__init__()
self.yang_name = "bandwidth-reservation"
self.yang_parent_name = "bandwidth-reservations"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = ['priority']
self._child_classes = OrderedDict([("state", ("state", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.BandwidthReservations.BandwidthReservation.State))])
self._leafs = OrderedDict([
('priority', (YLeaf(YType.str, 'priority'), ['str'])),
])
self.priority = None
self.state = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.BandwidthReservations.BandwidthReservation.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "bandwidth-reservation" + "[priority='" + str(self.priority) + "']"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.BandwidthReservations.BandwidthReservation, ['priority'], name, value)
class State(_Entity_):
"""
Operational state parameters relating to a
bandwidth reservation at a certain priority
.. attribute:: priority
RSVP priority level for LSPs traversing the interface
**type**\: union of the below types:
**type**\: int
**range:** 0..7
**type**\: :py:class:`Priority <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.BandwidthReservations.BandwidthReservation.State.Priority>`
**config**\: False
.. attribute:: available_bandwidth
Bandwidth currently available within the priority level, or for the entire interface when the priority is set to ALL
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: reserved_bandwidth
Bandwidth currently reserved within the priority level, or the sum of all priority levels when the keyword is set to ALL
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: active_reservations_count
Number of active RSVP reservations in the associated priority, or the sum of all reservations when the priority level is set to ALL
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: highwater_mark
Maximum bandwidth reserved on the interface within the priority, or across all priorities in the case that the priority level is set to ALL
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.BandwidthReservations.BandwidthReservation.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "bandwidth-reservation"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('priority', (YLeaf(YType.str, 'priority'), ['int',('ydk.models.openconfig.openconfig_mpls', 'Mpls', 'SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.BandwidthReservations.BandwidthReservation.State.Priority')])),
('available_bandwidth', (YLeaf(YType.uint64, 'available-bandwidth'), ['int'])),
('reserved_bandwidth', (YLeaf(YType.uint64, 'reserved-bandwidth'), ['int'])),
('active_reservations_count', (YLeaf(YType.uint64, 'active-reservations-count'), ['int'])),
('highwater_mark', (YLeaf(YType.uint64, 'highwater-mark'), ['int'])),
])
self.priority = None
self.available_bandwidth = None
self.reserved_bandwidth = None
self.active_reservations_count = None
self.highwater_mark = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.BandwidthReservations.BandwidthReservation.State, ['priority', 'available_bandwidth', 'reserved_bandwidth', 'active_reservations_count', 'highwater_mark'], name, value)
class Priority(Enum):
"""
Priority (Enum Class)
RSVP priority level for LSPs traversing the interface
.. data:: ALL = 0
The ALL keyword represents the overall
state of the interface - i.e., the union
of all of the priority levels
"""
ALL = Enum.YLeaf(0, "ALL")
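# --- Illustrative usage sketch (not part of the generated bindings) ---
# Walking the read-only bandwidth-reservation list after a CRUD read;
# the `priority` key is a union of an integer in 0..7 and the ALL enum,
# so entries may carry either form. `iface` is assumed to be a populated
# Interface read back from a device.
#
#   for rsv in iface.bandwidth_reservations.bandwidth_reservation:
#       st = rsv.state
#       print(rsv.priority, st.available_bandwidth, st.reserved_bandwidth)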
class Hellos(_Entity_):
"""
Top level container for RSVP hello parameters
.. attribute:: config
Configuration parameters relating to RSVP hellos
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Hellos.Config>`
.. attribute:: state
State information associated with RSVP hellos
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Hellos.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Hellos, self).__init__()
self.yang_name = "hellos"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Hellos.Config)), ("state", ("state", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Hellos.State))])
self._leafs = OrderedDict()
self.config = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Hellos.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Hellos.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "hellos"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Hellos, [], name, value)
class Config(_Entity_):
"""
Configuration parameters relating to RSVP
hellos
.. attribute:: hello_interval
set the interval in ms between RSVP hello messages
**type**\: int
**range:** 1000..60000
**units**\: milliseconds
**default value**\: 9000
.. attribute:: refresh_reduction
enables all RSVP refresh reduction message bundling, RSVP message ID, reliable message delivery and summary refresh
**type**\: bool
**default value**\: true
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Hellos.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "hellos"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('hello_interval', (YLeaf(YType.uint16, 'hello-interval'), ['int'])),
('refresh_reduction', (YLeaf(YType.boolean, 'refresh-reduction'), ['bool'])),
])
self.hello_interval = None
self.refresh_reduction = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Hellos.Config, ['hello_interval', 'refresh_reduction'], name, value)
class State(_Entity_):
"""
State information associated with RSVP hellos
.. attribute:: hello_interval
set the interval in ms between RSVP hello messages
**type**\: int
**range:** 1000..60000
**config**\: False
**units**\: milliseconds
**default value**\: 9000
.. attribute:: refresh_reduction
enables all RSVP refresh reduction message bundling, RSVP message ID, reliable message delivery and summary refresh
**type**\: bool
**config**\: False
**default value**\: true
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Hellos.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "hellos"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('hello_interval', (YLeaf(YType.uint16, 'hello-interval'), ['int'])),
('refresh_reduction', (YLeaf(YType.boolean, 'refresh-reduction'), ['bool'])),
])
self.hello_interval = None
self.refresh_reduction = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Hellos.State, ['hello_interval', 'refresh_reduction'], name, value)
class Authentication(_Entity_):
"""
Configuration and state parameters relating to RSVP
authentication as per RFC2747
.. attribute:: config
Configuration parameters relating to authentication
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Authentication.Config>`
.. attribute:: state
State information associated with authentication
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Authentication.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Authentication, self).__init__()
self.yang_name = "authentication"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Authentication.Config)), ("state", ("state", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Authentication.State))])
self._leafs = OrderedDict()
self.config = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Authentication.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Authentication.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "authentication"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Authentication, [], name, value)
class Config(_Entity_):
"""
Configuration parameters relating
to authentication
.. attribute:: enable
Enables RSVP authentication on the node
**type**\: bool
**default value**\: false
.. attribute:: authentication_key
authenticate RSVP signaling messages
**type**\: str
**length:** 1..32
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Authentication.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "authentication"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('enable', (YLeaf(YType.boolean, 'enable'), ['bool'])),
('authentication_key', (YLeaf(YType.str, 'authentication-key'), ['str'])),
])
self.enable = None
self.authentication_key = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Authentication.Config, ['enable', 'authentication_key'], name, value)
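# --- Illustrative usage sketch (not part of the generated bindings) ---
# Enabling per-interface RSVP authentication (RFC 2747); the key string
# is a placeholder and, per the docstring above, must be 1..32
# characters long.
#
#   iface.authentication.config.enable = True
#   iface.authentication.config.authentication_key = 'example-key'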
class State(_Entity_):
"""
State information associated
with authentication
.. attribute:: enable
Enables RSVP authentication on the node
**type**\: bool
**config**\: False
**default value**\: false
.. attribute:: authentication_key
authenticate RSVP signaling messages
**type**\: str
**length:** 1..32
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Authentication.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "authentication"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('enable', (YLeaf(YType.boolean, 'enable'), ['bool'])),
('authentication_key', (YLeaf(YType.str, 'authentication-key'), ['str'])),
])
self.enable = None
self.authentication_key = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Authentication.State, ['enable', 'authentication_key'], name, value)
class Subscription(_Entity_):
"""
Bandwidth percentage reservable by RSVP
on an interface
.. attribute:: config
Configuration parameters relating to RSVP subscription options
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Subscription.Config>`
.. attribute:: state
State parameters relating to RSVP subscription options
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Subscription.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Subscription, self).__init__()
self.yang_name = "subscription"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Subscription.Config)), ("state", ("state", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Subscription.State))])
self._leafs = OrderedDict()
self.config = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Subscription.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Subscription.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "subscription"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Subscription, [], name, value)
class Config(_Entity_):
"""
Configuration parameters relating to RSVP
subscription options
.. attribute:: subscription
percentage of the interface bandwidth that RSVP can reserve
**type**\: int
**range:** 0..100
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Subscription.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "subscription"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('subscription', (YLeaf(YType.uint8, 'subscription'), ['int'])),
])
self.subscription = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Subscription.Config, ['subscription'], name, value)
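# --- Illustrative usage sketch (not part of the generated bindings) ---
# Capping RSVP-reservable bandwidth at a percentage of the interface
# bandwidth; the value is an integer in 0..100.
#
#   iface.subscription.config.subscription = 80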
class State(_Entity_):
"""
State parameters relating to RSVP
subscription options
.. attribute:: subscription
percentage of the interface bandwidth that RSVP can reserve
**type**\: int
**range:** 0..100
**config**\: False
.. attribute:: calculated_absolute_subscription_bw
The calculated absolute value of the bandwidth which is reservable to RSVP\-TE on the interface prior to any adjustments that may be made from external sources
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
**units**\: kbps
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Subscription.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "subscription"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('subscription', (YLeaf(YType.uint8, 'subscription'), ['int'])),
('calculated_absolute_subscription_bw', (YLeaf(YType.uint64, 'calculated-absolute-subscription-bw'), ['int'])),
])
self.subscription = None
self.calculated_absolute_subscription_bw = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Subscription.State, ['subscription', 'calculated_absolute_subscription_bw'], name, value)
class Protection(_Entity_):
"""
link\-protection (NHOP) related configuration
.. attribute:: config
Configuration for link\-protection
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Protection.Config>`
.. attribute:: state
State for link\-protection
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Protection.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Protection, self).__init__()
self.yang_name = "protection"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Protection.Config)), ("state", ("state", Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Protection.State))])
self._leafs = OrderedDict()
self.config = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Protection.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Protection.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "protection"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Protection, [], name, value)
class Config(_Entity_):
"""
Configuration for link\-protection
.. attribute:: link_protection_style_requested
Style of mpls frr protection desired\: link, link\-node, or unprotected
**type**\: :py:class:`PROTECTIONTYPE <ydk.models.openconfig.openconfig_mpls_types.PROTECTIONTYPE>`
**default value**\: oc-mplst:LINK_NODE_PROTECTION_REQUESTED
.. attribute:: bypass_optimize_interval
interval between periodic optimization of the bypass LSPs
**type**\: int
**range:** 0..65535
**units**\: seconds
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Protection.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "protection"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('link_protection_style_requested', (YLeaf(YType.identityref, 'link-protection-style-requested'), [('ydk.models.openconfig.openconfig_mpls_types', 'PROTECTIONTYPE')])),
('bypass_optimize_interval', (YLeaf(YType.uint16, 'bypass-optimize-interval'), ['int'])),
])
self.link_protection_style_requested = None
self.bypass_optimize_interval = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Protection.Config, ['link_protection_style_requested', 'bypass_optimize_interval'], name, value)
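# --- Illustrative usage sketch (not part of the generated bindings) ---
# Tuning link-protection; only the plain integer leaf is shown here
# because link-protection-style-requested takes an identity value from
# openconfig_mpls_types and defaults to LINK_NODE_PROTECTION_REQUESTED.
#
#   iface.protection.config.bypass_optimize_interval = 600  # seconds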
class State(_Entity_):
"""
State for link\-protection
.. attribute:: link_protection_style_requested
Style of mpls frr protection desired\: link, link\-node, or unprotected
**type**\: :py:class:`PROTECTIONTYPE <ydk.models.openconfig.openconfig_mpls_types.PROTECTIONTYPE>`
**config**\: False
**default value**\: oc-mplst:LINK_NODE_PROTECTION_REQUESTED
.. attribute:: bypass_optimize_interval
interval between periodic optimization of the bypass LSPs
**type**\: int
**range:** 0..65535
**config**\: False
**units**\: seconds
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Protection.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "protection"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('link_protection_style_requested', (YLeaf(YType.identityref, 'link-protection-style-requested'), [('ydk.models.openconfig.openconfig_mpls_types', 'PROTECTIONTYPE')])),
('bypass_optimize_interval', (YLeaf(YType.uint16, 'bypass-optimize-interval'), ['int'])),
])
self.link_protection_style_requested = None
self.bypass_optimize_interval = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.RsvpTe.InterfaceAttributes.Interface.Protection.State, ['link_protection_style_requested', 'bypass_optimize_interval'], name, value)
class Ldp(_Entity_):
"""
LDP global signaling configuration
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.Ldp, self).__init__()
self.yang_name = "ldp"
self.yang_parent_name = "signaling-protocols"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict()
self._segment_path = lambda: "ldp"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/%s" % self._segment_path()
self._is_frozen = True
class SegmentRouting(_Entity_):
"""
MPLS\-specific Segment Routing configuration and operational state
parameters
.. attribute:: aggregate_sid_counters
Per\-SID counters aggregated across all interfaces on the local system
**type**\: :py:class:`AggregateSidCounters <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.AggregateSidCounters>`
.. attribute:: interfaces
Interface related Segment Routing parameters
**type**\: :py:class:`Interfaces <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.Interfaces>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.SegmentRouting, self).__init__()
self.yang_name = "segment-routing"
self.yang_parent_name = "signaling-protocols"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("aggregate-sid-counters", ("aggregate_sid_counters", Mpls.SignalingProtocols.SegmentRouting.AggregateSidCounters)), ("interfaces", ("interfaces", Mpls.SignalingProtocols.SegmentRouting.Interfaces))])
self._leafs = OrderedDict()
self.aggregate_sid_counters = Mpls.SignalingProtocols.SegmentRouting.AggregateSidCounters()
self.aggregate_sid_counters.parent = self
self._children_name_map["aggregate_sid_counters"] = "aggregate-sid-counters"
self.interfaces = Mpls.SignalingProtocols.SegmentRouting.Interfaces()
self.interfaces.parent = self
self._children_name_map["interfaces"] = "interfaces"
self._segment_path = lambda: "segment-routing"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.SegmentRouting, [], name, value)
class AggregateSidCounters(_Entity_):
"""
Per\-SID counters aggregated across all interfaces on the local system
.. attribute:: aggregate_sid_counter
Counters aggregated across all of the interfaces of the local system corresponding to traffic received or forwarded with a particular SID
**type**\: list of :py:class:`AggregateSidCounter <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.AggregateSidCounters.AggregateSidCounter>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.SegmentRouting.AggregateSidCounters, self).__init__()
self.yang_name = "aggregate-sid-counters"
self.yang_parent_name = "segment-routing"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("aggregate-sid-counter", ("aggregate_sid_counter", Mpls.SignalingProtocols.SegmentRouting.AggregateSidCounters.AggregateSidCounter))])
self._leafs = OrderedDict()
self.aggregate_sid_counter = YList(self)
self._segment_path = lambda: "aggregate-sid-counters"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/segment-routing/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.SegmentRouting.AggregateSidCounters, [], name, value)
class AggregateSidCounter(_Entity_):
"""
Counters aggregated across all of the interfaces of the local
system corresponding to traffic received or forwarded with a
particular SID
.. attribute:: mpls_label (key)
The MPLS label representing the segment identifier
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
**refers to**\: :py:class:`mpls_label <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.AggregateSidCounters.AggregateSidCounter.State>`
**config**\: False
.. attribute:: state
State parameters for per\-SID statistics
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.AggregateSidCounters.AggregateSidCounter.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.SegmentRouting.AggregateSidCounters.AggregateSidCounter, self).__init__()
self.yang_name = "aggregate-sid-counter"
self.yang_parent_name = "aggregate-sid-counters"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['mpls_label']
self._child_classes = OrderedDict([("state", ("state", Mpls.SignalingProtocols.SegmentRouting.AggregateSidCounters.AggregateSidCounter.State))])
self._leafs = OrderedDict([
('mpls_label', (YLeaf(YType.str, 'mpls-label'), ['str'])),
])
self.mpls_label = None
self.state = Mpls.SignalingProtocols.SegmentRouting.AggregateSidCounters.AggregateSidCounter.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "aggregate-sid-counter" + "[mpls-label='" + str(self.mpls_label) + "']"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/segment-routing/aggregate-sid-counters/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.SegmentRouting.AggregateSidCounters.AggregateSidCounter, ['mpls_label'], name, value)
class State(_Entity_):
"""
State parameters for per\-SID statistics
.. attribute:: mpls_label
The MPLS label used for the segment identifier
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
**config**\: False
.. attribute:: in_pkts
A cumulative counter of the packets received within the context which have matched a label corresponding to an SR Segment Identifier
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_octets
The cumulative counter of the total bytes received within the context which have matched a label corresponding to an SR Segment Identifier
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_pkts
A cumulative counter of the total number of packets transmitted by the local system within the context which have a label imposed that corresponds to an SR Segment Identifier
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_octets
A cumulative counter of the total bytes transmitted by the local system within the context which have a label imposed that corresponds to an SR Segment Identifier
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.SegmentRouting.AggregateSidCounters.AggregateSidCounter.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "aggregate-sid-counter"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('mpls_label', (YLeaf(YType.str, 'mpls-label'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
('in_pkts', (YLeaf(YType.uint64, 'in-pkts'), ['int'])),
('in_octets', (YLeaf(YType.uint64, 'in-octets'), ['int'])),
('out_pkts', (YLeaf(YType.uint64, 'out-pkts'), ['int'])),
('out_octets', (YLeaf(YType.uint64, 'out-octets'), ['int'])),
])
self.mpls_label = None
self.in_pkts = None
self.in_octets = None
self.out_pkts = None
self.out_octets = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.SegmentRouting.AggregateSidCounters.AggregateSidCounter.State, ['mpls_label', 'in_pkts', 'in_octets', 'out_pkts', 'out_octets'], name, value)
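# Usage sketch (editor's note, not generated code): a minimal, hypothetical
# example of reading the aggregate per-SID counters above with YDK's
# CRUDService; the device address and credentials are placeholders.
#
#   from ydk.services import CRUDService
#   from ydk.providers import NetconfServiceProvider
#   from ydk.models.openconfig import openconfig_mpls
#
#   provider = NetconfServiceProvider(address="192.0.2.1", port=830,
#                                     username="admin", password="admin")
#   crud = CRUDService()
#   mpls = crud.read(provider, openconfig_mpls.Mpls())
#   sr = mpls.signaling_protocols.segment_routing
#   for counter in sr.aggregate_sid_counters.aggregate_sid_counter:
#       print(counter.mpls_label, counter.state.in_pkts, counter.state.out_pkts)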
class Interfaces(_Entity_):
"""
Interface related Segment Routing parameters.
.. attribute:: interface
Parameters and MPLS\-specific configuration relating to Segment Routing on an interface
**type**\: list of :py:class:`Interface <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.SegmentRouting.Interfaces, self).__init__()
self.yang_name = "interfaces"
self.yang_parent_name = "segment-routing"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("interface", ("interface", Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface))])
self._leafs = OrderedDict()
self.interface = YList(self)
self._segment_path = lambda: "interfaces"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/segment-routing/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.SegmentRouting.Interfaces, [], name, value)
class Interface(_Entity_):
"""
Parameters and MPLS\-specific configuration relating to Segment
Routing on an interface.
.. attribute:: interface_id (key)
A reference to the ID for the interface for which SR is configured
**type**\: str
**refers to**\: :py:class:`interface_id <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.Config>`
.. attribute:: config
MPLS\-specific Segment Routing configuration parameters related to an interface
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.Config>`
.. attribute:: state
MPLS\-specific Segment Routing operational state parameters related to an interface
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.State>`
**config**\: False
.. attribute:: sid_counters
Per\-SID statistics for MPLS
**type**\: :py:class:`SidCounters <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters>`
.. attribute:: interface_ref
Reference to an interface or subinterface
**type**\: :py:class:`InterfaceRef <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.InterfaceRef>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface, self).__init__()
self.yang_name = "interface"
self.yang_parent_name = "interfaces"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['interface_id']
self._child_classes = OrderedDict([("config", ("config", Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.Config)), ("state", ("state", Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.State)), ("sid-counters", ("sid_counters", Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters)), ("interface-ref", ("interface_ref", Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.InterfaceRef))])
self._leafs = OrderedDict([
('interface_id', (YLeaf(YType.str, 'interface-id'), ['str'])),
])
self.interface_id = None
self.config = Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self.sid_counters = Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters()
self.sid_counters.parent = self
self._children_name_map["sid_counters"] = "sid-counters"
self.interface_ref = Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.InterfaceRef()
self.interface_ref.parent = self
self._children_name_map["interface_ref"] = "interface-ref"
self._segment_path = lambda: "interface" + "[interface-id='" + str(self.interface_id) + "']"
self._absolute_path = lambda: "openconfig-mpls:mpls/signaling-protocols/segment-routing/interfaces/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface, ['interface_id'], name, value)
class Config(_Entity_):
"""
MPLS\-specific Segment Routing configuration parameters
related to an interface.
.. attribute:: interface_id
A unique identifier for the interface
**type**\: str
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface_id', (YLeaf(YType.str, 'interface-id'), ['str'])),
])
self.interface_id = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.Config, ['interface_id'], name, value)
class State(_Entity_):
"""
MPLS\-specific Segment Routing operational state parameters
related to an interface.
.. attribute:: interface_id
A unique identifier for the interface
**type**\: str
**config**\: False
.. attribute:: in_pkts
A cumulative counter of the packets received within the context which have matched a label corresponding to an SR Segment Identifier
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_octets
The cumulative counter of the total bytes received within the context which have matched a label corresponding to an SR Segment Identifier
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_pkts
A cumulative counter of the total number of packets transmitted by the local system within the context which have a label imposed that corresponds to an SR Segment Identifier
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_octets
A cumulative counter of the total bytes transmitted by the local system within the context which have a label imposed that corresponds to an SR Segment Identifier
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface_id', (YLeaf(YType.str, 'interface-id'), ['str'])),
('in_pkts', (YLeaf(YType.uint64, 'in-pkts'), ['int'])),
('in_octets', (YLeaf(YType.uint64, 'in-octets'), ['int'])),
('out_pkts', (YLeaf(YType.uint64, 'out-pkts'), ['int'])),
('out_octets', (YLeaf(YType.uint64, 'out-octets'), ['int'])),
])
self.interface_id = None
self.in_pkts = None
self.in_octets = None
self.out_pkts = None
self.out_octets = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.State, ['interface_id', 'in_pkts', 'in_octets', 'out_pkts', 'out_octets'], name, value)
class SidCounters(_Entity_):
"""
Per\-SID statistics for MPLS
.. attribute:: sid_counter
Per segment identifier counters for the MPLS dataplane
**type**\: list of :py:class:`SidCounter <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters, self).__init__()
self.yang_name = "sid-counters"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("sid-counter", ("sid_counter", Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter))])
self._leafs = OrderedDict()
self.sid_counter = YList(self)
self._segment_path = lambda: "sid-counters"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters, [], name, value)
class SidCounter(_Entity_):
"""
Per segment identifier counters for the MPLS dataplane.
.. attribute:: mpls_label (key)
The MPLS label representing the segment identifier
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
**refers to**\: :py:class:`mpls_label <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.State>`
**config**\: False
.. attribute:: state
State parameters for per\-SID statistics
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.State>`
**config**\: False
.. attribute:: forwarding_classes
Per\-SID per\-forwarding class counters for Segment Routing
**type**\: :py:class:`ForwardingClasses <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.ForwardingClasses>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter, self).__init__()
self.yang_name = "sid-counter"
self.yang_parent_name = "sid-counters"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = ['mpls_label']
self._child_classes = OrderedDict([("state", ("state", Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.State)), ("forwarding-classes", ("forwarding_classes", Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.ForwardingClasses))])
self._leafs = OrderedDict([
('mpls_label', (YLeaf(YType.str, 'mpls-label'), ['str'])),
])
self.mpls_label = None
self.state = Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self.forwarding_classes = Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.ForwardingClasses()
self.forwarding_classes.parent = self
self._children_name_map["forwarding_classes"] = "forwarding-classes"
self._segment_path = lambda: "sid-counter" + "[mpls-label='" + str(self.mpls_label) + "']"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter, ['mpls_label'], name, value)
class State(_Entity_):
"""
State parameters for per\-SID statistics
.. attribute:: mpls_label
The MPLS label used for the segment identifier
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
**config**\: False
.. attribute:: in_pkts
A cumulative counter of the packets received within the context which have matched a label corresponding to an SR Segment Identifier
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_octets
The cumulative counter of the total bytes received within the context which have matched a label corresponding to an SR Segment Identifier
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_pkts
A cumulative counter of the total number of packets transmitted by the local system within the context which have a label imposed that corresponds to an SR Segment Identifier
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_octets
A cumulative counter of the total bytes transmitted by the local system within the context which have a label imposed that corresponds to an SR Segment Identifier
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "sid-counter"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('mpls_label', (YLeaf(YType.str, 'mpls-label'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
('in_pkts', (YLeaf(YType.uint64, 'in-pkts'), ['int'])),
('in_octets', (YLeaf(YType.uint64, 'in-octets'), ['int'])),
('out_pkts', (YLeaf(YType.uint64, 'out-pkts'), ['int'])),
('out_octets', (YLeaf(YType.uint64, 'out-octets'), ['int'])),
])
self.mpls_label = None
self.in_pkts = None
self.in_octets = None
self.out_pkts = None
self.out_octets = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.State, ['mpls_label', 'in_pkts', 'in_octets', 'out_pkts', 'out_octets'], name, value)
class ForwardingClasses(_Entity_):
"""
Per\-SID per\-forwarding class counters for Segment Routing.
.. attribute:: forwarding_class
SID entries for the forwarding class associated with the referenced MPLS EXP
**type**\: list of :py:class:`ForwardingClass <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.ForwardingClasses.ForwardingClass>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.ForwardingClasses, self).__init__()
self.yang_name = "forwarding-classes"
self.yang_parent_name = "sid-counter"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("forwarding-class", ("forwarding_class", Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.ForwardingClasses.ForwardingClass))])
self._leafs = OrderedDict()
self.forwarding_class = YList(self)
self._segment_path = lambda: "forwarding-classes"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.ForwardingClasses, [], name, value)
class ForwardingClass(_Entity_):
"""
SID entries for the forwarding class associated with the
referenced MPLS EXP.
.. attribute:: exp (key)
Reference to the value of the EXP bits of the segment identifier
**type**\: int
**range:** 0..7
**refers to**\: :py:class:`exp <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.ForwardingClasses.ForwardingClass.State>`
**config**\: False
.. attribute:: state
Per\-SID, per forwarding class counters for Segment Routing with the MPLS dataplane
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.ForwardingClasses.ForwardingClass.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.ForwardingClasses.ForwardingClass, self).__init__()
self.yang_name = "forwarding-class"
self.yang_parent_name = "forwarding-classes"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = ['exp']
self._child_classes = OrderedDict([("state", ("state", Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.ForwardingClasses.ForwardingClass.State))])
self._leafs = OrderedDict([
('exp', (YLeaf(YType.str, 'exp'), ['int'])),
])
self.exp = None
self.state = Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.ForwardingClasses.ForwardingClass.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "forwarding-class" + "[exp='" + str(self.exp) + "']"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.ForwardingClasses.ForwardingClass, ['exp'], name, value)
class State(_Entity_):
"""
Per\-SID, per forwarding class counters for Segment Routing
with the MPLS dataplane
.. attribute:: exp
The value of the MPLS EXP (experimental) or Traffic Class bits that the SID statistics relate to. Packets received with an MPLS label value equal to the SID's MPLS label and EXP bits equal to this value should be counted towards the associated ingress statistics. Packets that are forwarded to the destination MPLS label corresponding to the SID should be counted towards this value. In the egress direction, where forwarding follows a SID value that requires PHP at the local node, packets should still be counted towards the egress counters
**type**\: int
**range:** 0..7
**config**\: False
.. attribute:: in_pkts
A cumulative counter of the packets received within the context which have matched a label corresponding to an SR Segment Identifier
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: in_octets
The cumulative counter of the total bytes received within the context which have matched a label corresponding to an SR Segment Identifier
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_pkts
A cumulative counter of the total number of packets transmitted by the local system within the context which have a label imposed that corresponds to an SR Segment Identifier
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: out_octets
A cumulative counter of the total bytes transmitted by the local system within the context which have a label imposed that corresponds to an SR Segment Identifier
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.ForwardingClasses.ForwardingClass.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "forwarding-class"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('exp', (YLeaf(YType.uint8, 'exp'), ['int'])),
('in_pkts', (YLeaf(YType.uint64, 'in-pkts'), ['int'])),
('in_octets', (YLeaf(YType.uint64, 'in-octets'), ['int'])),
('out_pkts', (YLeaf(YType.uint64, 'out-pkts'), ['int'])),
('out_octets', (YLeaf(YType.uint64, 'out-octets'), ['int'])),
])
self.exp = None
self.in_pkts = None
self.in_octets = None
self.out_pkts = None
self.out_octets = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.SidCounters.SidCounter.ForwardingClasses.ForwardingClass.State, ['exp', 'in_pkts', 'in_octets', 'out_pkts', 'out_octets'], name, value)
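# Usage sketch (editor's note, not generated code): walking the per-interface,
# per-SID, per-forwarding-class counters modeled above, assuming `mpls` is an
# Mpls instance already populated by a read such as the one shown earlier.
#
#   sr = mpls.signaling_protocols.segment_routing
#   for intf in sr.interfaces.interface:
#       for sid in intf.sid_counters.sid_counter:
#           for fc in sid.forwarding_classes.forwarding_class:
#               print(intf.interface_id, sid.mpls_label, fc.exp,
#                     fc.state.in_octets, fc.state.out_octets)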
class InterfaceRef(_Entity_):
"""
Reference to an interface or subinterface
.. attribute:: config
Configured reference to interface / subinterface
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.InterfaceRef.Config>`
.. attribute:: state
Operational state for interface\-ref
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.InterfaceRef.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.InterfaceRef, self).__init__()
self.yang_name = "interface-ref"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.InterfaceRef.Config)), ("state", ("state", Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.InterfaceRef.State))])
self._leafs = OrderedDict()
self.config = Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.InterfaceRef.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.InterfaceRef.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "interface-ref"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.InterfaceRef, [], name, value)
class Config(_Entity_):
"""
Configured reference to interface / subinterface
.. attribute:: interface
Reference to a base interface. If a reference to a subinterface is required, this leaf must be specified to indicate the base interface
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_interfaces.Interfaces.Interface>`
.. attribute:: subinterface
Reference to a subinterface \-\- this requires the base interface to be specified using the interface leaf in this container. If only a reference to a base interface is required, this leaf should not be set
**type**\: int
**range:** 0..4294967295
**refers to**\: :py:class:`index <ydk.models.openconfig.openconfig_interfaces.Interfaces.Interface.Subinterfaces.Subinterface>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.InterfaceRef.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "interface-ref"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('subinterface', (YLeaf(YType.str, 'subinterface'), ['int'])),
])
self.interface = None
self.subinterface = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.InterfaceRef.Config, ['interface', 'subinterface'], name, value)
class State(_Entity_):
"""
Operational state for interface\-ref
.. attribute:: interface
Reference to a base interface. If a reference to a subinterface is required, this leaf must be specified to indicate the base interface
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_interfaces.Interfaces.Interface>`
**config**\: False
.. attribute:: subinterface
Reference to a subinterface \-\- this requires the base interface to be specified using the interface leaf in this container. If only a reference to a base interface is required, this leaf should not be set
**type**\: int
**range:** 0..4294967295
**refers to**\: :py:class:`index <ydk.models.openconfig.openconfig_interfaces.Interfaces.Interface.Subinterfaces.Subinterface>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.InterfaceRef.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "interface-ref"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('subinterface', (YLeaf(YType.str, 'subinterface'), ['int'])),
])
self.interface = None
self.subinterface = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface.InterfaceRef.State, ['interface', 'subinterface'], name, value)
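# Usage sketch (editor's note, not generated code): a hypothetical example of
# enabling Segment Routing on an interface, including the interface-ref
# binding; "eth0" is a placeholder interface name.
#
#   from ydk.models.openconfig import openconfig_mpls
#
#   intf = openconfig_mpls.Mpls.SignalingProtocols.SegmentRouting.Interfaces.Interface()
#   intf.interface_id = "eth0"
#   intf.config.interface_id = "eth0"
#   intf.interface_ref.config.interface = "eth0"
#   intf.interface_ref.config.subinterface = 0
#
#   mpls = openconfig_mpls.Mpls()
#   mpls.signaling_protocols.segment_routing.interfaces.interface.append(intf)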
class Lsps(_Entity_):
"""
LSP definitions and configuration
.. attribute:: constrained_path
traffic\-engineered LSPs supporting different path computation and signaling methods
**type**\: :py:class:`ConstrainedPath <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath>`
.. attribute:: unconstrained_path
LSPs that use the IGP\-determined path, i.e., non traffic\-engineered, or non constrained\-path
**type**\: :py:class:`UnconstrainedPath <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.UnconstrainedPath>`
.. attribute:: static_lsps
statically configured LSPs, without dynamic signaling
**type**\: :py:class:`StaticLsps <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.StaticLsps>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps, self).__init__()
self.yang_name = "lsps"
self.yang_parent_name = "mpls"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("constrained-path", ("constrained_path", Mpls.Lsps.ConstrainedPath)), ("unconstrained-path", ("unconstrained_path", Mpls.Lsps.UnconstrainedPath)), ("static-lsps", ("static_lsps", Mpls.Lsps.StaticLsps))])
self._leafs = OrderedDict()
self.constrained_path = Mpls.Lsps.ConstrainedPath()
self.constrained_path.parent = self
self._children_name_map["constrained_path"] = "constrained-path"
self.unconstrained_path = Mpls.Lsps.UnconstrainedPath()
self.unconstrained_path.parent = self
self._children_name_map["unconstrained_path"] = "unconstrained-path"
self.static_lsps = Mpls.Lsps.StaticLsps()
self.static_lsps.parent = self
self._children_name_map["static_lsps"] = "static-lsps"
self._segment_path = lambda: "lsps"
self._absolute_path = lambda: "openconfig-mpls:mpls/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps, [], name, value)
class ConstrainedPath(_Entity_):
"""
traffic\-engineered LSPs supporting different
path computation and signaling methods
.. attribute:: named_explicit_paths
Enclosing container for the named explicit paths
**type**\: :py:class:`NamedExplicitPaths <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.NamedExplicitPaths>`
.. attribute:: tunnels
Enclosing container for tunnels
**type**\: :py:class:`Tunnels <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath, self).__init__()
self.yang_name = "constrained-path"
self.yang_parent_name = "lsps"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("named-explicit-paths", ("named_explicit_paths", Mpls.Lsps.ConstrainedPath.NamedExplicitPaths)), ("tunnels", ("tunnels", Mpls.Lsps.ConstrainedPath.Tunnels))])
self._leafs = OrderedDict()
self.named_explicit_paths = Mpls.Lsps.ConstrainedPath.NamedExplicitPaths()
self.named_explicit_paths.parent = self
self._children_name_map["named_explicit_paths"] = "named-explicit-paths"
self.tunnels = Mpls.Lsps.ConstrainedPath.Tunnels()
self.tunnels.parent = self
self._children_name_map["tunnels"] = "tunnels"
self._segment_path = lambda: "constrained-path"
self._absolute_path = lambda: "openconfig-mpls:mpls/lsps/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath, [], name, value)
class NamedExplicitPaths(_Entity_):
"""
Enclosing container for the named explicit paths
.. attribute:: named_explicit_path
A list of explicit paths
**type**\: list of :py:class:`NamedExplicitPath <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.NamedExplicitPaths, self).__init__()
self.yang_name = "named-explicit-paths"
self.yang_parent_name = "constrained-path"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("named-explicit-path", ("named_explicit_path", Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath))])
self._leafs = OrderedDict()
self.named_explicit_path = YList(self)
self._segment_path = lambda: "named-explicit-paths"
self._absolute_path = lambda: "openconfig-mpls:mpls/lsps/constrained-path/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.NamedExplicitPaths, [], name, value)
class NamedExplicitPath(_Entity_):
"""
A list of explicit paths
.. attribute:: name (key)
A string name that uniquely identifies an explicit path
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.Config>`
.. attribute:: config
Configuration parameters relating to named explicit paths
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.Config>`
.. attribute:: state
Operational state parameters relating to the named explicit paths
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.State>`
**config**\: False
.. attribute:: explicit_route_objects
Enclosing container for EROs
**type**\: :py:class:`ExplicitRouteObjects <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath, self).__init__()
self.yang_name = "named-explicit-path"
self.yang_parent_name = "named-explicit-paths"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['name']
self._child_classes = OrderedDict([("config", ("config", Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.Config)), ("state", ("state", Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.State)), ("explicit-route-objects", ("explicit_route_objects", Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects))])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
])
self.name = None
self.config = Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self.explicit_route_objects = Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects()
self.explicit_route_objects.parent = self
self._children_name_map["explicit_route_objects"] = "explicit-route-objects"
self._segment_path = lambda: "named-explicit-path" + "[name='" + str(self.name) + "']"
self._absolute_path = lambda: "openconfig-mpls:mpls/lsps/constrained-path/named-explicit-paths/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath, ['name'], name, value)
class Config(_Entity_):
"""
Configuration parameters relating to named explicit
paths
.. attribute:: name
A string name that uniquely identifies an explicit path
**type**\: str
.. attribute:: sid_selection_mode
The restrictions placed on the SIDs to be selected by the calculation method for the explicit path when it is instantiated for a SR\-TE LSP
**type**\: :py:class:`SidSelectionMode <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.Config.SidSelectionMode>`
**default value**\: MIXED_MODE
.. attribute:: sid_protection_required
When this value is set to true, only SIDs that are protected are to be selected by the calculating method when the explicit path is instantiated by a SR\-TE LSP
**type**\: bool
**default value**\: false
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "named-explicit-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
('sid_selection_mode', (YLeaf(YType.enumeration, 'sid-selection-mode'), [('ydk.models.openconfig.openconfig_mpls', 'Mpls', 'Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.Config.SidSelectionMode')])),
('sid_protection_required', (YLeaf(YType.boolean, 'sid-protection-required'), ['bool'])),
])
self.name = None
self.sid_selection_mode = None
self.sid_protection_required = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.Config, ['name', 'sid_selection_mode', 'sid_protection_required'], name, value)
class SidSelectionMode(Enum):
"""
SidSelectionMode (Enum Class)
The restrictions placed on the SIDs to be selected by the
calculation method for the explicit path when it is
instantiated for a SR\-TE LSP
.. data:: ADJ_SID_ONLY = 0
The SR-TE tunnel should only use adjacency SIDs
to build the SID stack to be pushed for the LSP
.. data:: MIXED_MODE = 1
The SR-TE tunnel can use a mix of adjacency
and prefix SIDs to build the SID stack to be pushed
to the LSP
"""
ADJ_SID_ONLY = Enum.YLeaf(0, "ADJ_SID_ONLY")
MIXED_MODE = Enum.YLeaf(1, "MIXED_MODE")
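# Usage sketch (editor's note, not generated code): a hypothetical example of
# defining a named explicit path restricted to adjacency SIDs, using the
# SidSelectionMode enum above; "path-1" is a placeholder name and `mpls` is an
# Mpls instance as in the earlier sketches.
#
#   Path = openconfig_mpls.Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath
#   nep = Path()
#   nep.name = "path-1"
#   nep.config.name = "path-1"
#   nep.config.sid_selection_mode = Path.Config.SidSelectionMode.ADJ_SID_ONLY
#   nep.config.sid_protection_required = True
#   mpls.lsps.constrained_path.named_explicit_paths.named_explicit_path.append(nep)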
class State(_Entity_):
"""
Operational state parameters relating to the named
explicit paths
.. attribute:: name
A string name that uniquely identifies an explicit path
**type**\: str
**config**\: False
.. attribute:: sid_selection_mode
The restrictions placed on the SIDs to be selected by the calculation method for the explicit path when it is instantiated for a SR\-TE LSP
**type**\: :py:class:`SidSelectionMode <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.State.SidSelectionMode>`
**config**\: False
**default value**\: MIXED_MODE
.. attribute:: sid_protection_required
When this value is set to true, only SIDs that are protected are to be selected by the calculating method when the explicit path is instantiated by a SR\-TE LSP
**type**\: bool
**config**\: False
**default value**\: false
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "named-explicit-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
('sid_selection_mode', (YLeaf(YType.enumeration, 'sid-selection-mode'), [('ydk.models.openconfig.openconfig_mpls', 'Mpls', 'Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.State.SidSelectionMode')])),
('sid_protection_required', (YLeaf(YType.boolean, 'sid-protection-required'), ['bool'])),
])
self.name = None
self.sid_selection_mode = None
self.sid_protection_required = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.State, ['name', 'sid_selection_mode', 'sid_protection_required'], name, value)
class SidSelectionMode(Enum):
"""
SidSelectionMode (Enum Class)
The restrictions placed on the SIDs to be selected by the
calculation method for the explicit path when it is
instantiated for a SR\-TE LSP
.. data:: ADJ_SID_ONLY = 0
The SR-TE tunnel should only use adjacency SIDs
to build the SID stack to be pushed for the LSP
.. data:: MIXED_MODE = 1
The SR-TE tunnel can use a mix of adjacency
and prefix SIDs to build the SID stack to be pushed
to the LSP
"""
ADJ_SID_ONLY = Enum.YLeaf(0, "ADJ_SID_ONLY")
MIXED_MODE = Enum.YLeaf(1, "MIXED_MODE")
class ExplicitRouteObjects(_Entity_):
"""
Enclosing container for EROs
.. attribute:: explicit_route_object
List of explicit route objects
**type**\: list of :py:class:`ExplicitRouteObject <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects.ExplicitRouteObject>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects, self).__init__()
self.yang_name = "explicit-route-objects"
self.yang_parent_name = "named-explicit-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("explicit-route-object", ("explicit_route_object", Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects.ExplicitRouteObject))])
self._leafs = OrderedDict()
self.explicit_route_object = YList(self)
self._segment_path = lambda: "explicit-route-objects"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects, [], name, value)
class ExplicitRouteObject(_Entity_):
"""
List of explicit route objects
.. attribute:: index (key)
Index of this explicit route object, to express the order of hops in path
**type**\: int
**range:** 0..255
**refers to**\: :py:class:`index <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects.ExplicitRouteObject.Config>`
.. attribute:: config
Configuration parameters relating to an explicit route
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects.ExplicitRouteObject.Config>`
.. attribute:: state
State parameters relating to an explicit route
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects.ExplicitRouteObject.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects.ExplicitRouteObject, self).__init__()
self.yang_name = "explicit-route-object"
self.yang_parent_name = "explicit-route-objects"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = ['index']
self._child_classes = OrderedDict([("config", ("config", Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects.ExplicitRouteObject.Config)), ("state", ("state", Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects.ExplicitRouteObject.State))])
self._leafs = OrderedDict([
('index', (YLeaf(YType.str, 'index'), ['int'])),
])
self.index = None
self.config = Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects.ExplicitRouteObject.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects.ExplicitRouteObject.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "explicit-route-object" + "[index='" + str(self.index) + "']"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects.ExplicitRouteObject, ['index'], name, value)
class Config(_Entity_):
"""
Configuration parameters relating to an explicit
route
.. attribute:: address
router hop for the LSP path
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
.. attribute:: hop_type
strict or loose hop
**type**\: :py:class:`MplsHopType <ydk.models.openconfig.openconfig_mpls.MplsHopType>`
.. attribute:: index
Index of this explicit route object to express the order of hops in the path
**type**\: int
**range:** 0..255
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects.ExplicitRouteObject.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "explicit-route-object"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('address', (YLeaf(YType.str, 'address'), ['str','str'])),
('hop_type', (YLeaf(YType.enumeration, 'hop-type'), [('ydk.models.openconfig.openconfig_mpls', 'MplsHopType', '')])),
('index', (YLeaf(YType.uint8, 'index'), ['int'])),
])
self.address = None
self.hop_type = None
self.index = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects.ExplicitRouteObject.Config, ['address', 'hop_type', 'index'], name, value)
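# Usage sketch (editor's note, not generated code): a hypothetical example of
# appending a strict IPv4 hop to the ERO list above, assuming `nep` is the
# NamedExplicitPath built in the previous sketch and that the MplsHopType enum
# (with a STRICT member) is defined elsewhere in this module.
#
#   ERO = openconfig_mpls.Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects.ExplicitRouteObject
#   ero = ERO()
#   ero.index = 0
#   ero.config.index = 0
#   ero.config.address = "192.0.2.10"
#   ero.config.hop_type = openconfig_mpls.MplsHopType.STRICT
#   nep.explicit_route_objects.explicit_route_object.append(ero)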
class State(_Entity_):
"""
State parameters relating to an explicit route
.. attribute:: address
router hop for the LSP path
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
**config**\: False
.. attribute:: hop_type
strict or loose hop
**type**\: :py:class:`MplsHopType <ydk.models.openconfig.openconfig_mpls.MplsHopType>`
**config**\: False
.. attribute:: index
Index of this explicit route object to express the order of hops in the path
**type**\: int
**range:** 0..255
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects.ExplicitRouteObject.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "explicit-route-object"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('address', (YLeaf(YType.str, 'address'), ['str','str'])),
('hop_type', (YLeaf(YType.enumeration, 'hop-type'), [('ydk.models.openconfig.openconfig_mpls', 'MplsHopType', '')])),
('index', (YLeaf(YType.uint8, 'index'), ['int'])),
])
self.address = None
self.hop_type = None
self.index = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.ExplicitRouteObjects.ExplicitRouteObject.State, ['address', 'hop_type', 'index'], name, value)
class Tunnels(_Entity_):
"""
Enclosing container for tunnels
.. attribute:: tunnel
List of TE tunnels. This list contains only the LSPs that the current device originates (i.e., for which it is the head\-end). Where the signaling protocol utilised for an LSP allows a mid\-point or tail device to be aware of the LSP (e.g., RSVP\-TE), then the associated sessions are maintained per protocol
**type**\: list of :py:class:`Tunnel <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels, self).__init__()
self.yang_name = "tunnels"
self.yang_parent_name = "constrained-path"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("tunnel", ("tunnel", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel))])
self._leafs = OrderedDict()
self.tunnel = YList(self)
self._segment_path = lambda: "tunnels"
self._absolute_path = lambda: "openconfig-mpls:mpls/lsps/constrained-path/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels, [], name, value)
class Tunnel(_Entity_):
"""
List of TE tunnels. This list contains only the LSPs that the
current device originates (i.e., for which it is the head\-end).
Where the signaling protocol utilised for an LSP allows a mid\-point
or tail device to be aware of the LSP (e.g., RSVP\-TE), then the
associated sessions are maintained per protocol
.. attribute:: name (key)
The tunnel name
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Config>`
.. attribute:: config
Configuration parameters related to TE tunnels\:
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Config>`
.. attribute:: state
State parameters related to TE tunnels
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.State>`
**config**\: False
.. attribute:: bandwidth
Bandwidth configuration for TE LSPs
**type**\: :py:class:`Bandwidth <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth>`
.. attribute:: p2p_tunnel_attributes
Parameters related to LSPs of type P2P
**type**\: :py:class:`P2pTunnelAttributes <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel, self).__init__()
self.yang_name = "tunnel"
self.yang_parent_name = "tunnels"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['name']
self._child_classes = OrderedDict([("config", ("config", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Config)), ("state", ("state", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.State)), ("bandwidth", ("bandwidth", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth)), ("p2p-tunnel-attributes", ("p2p_tunnel_attributes", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes))])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
])
self.name = None
self.config = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self.bandwidth = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth()
self.bandwidth.parent = self
self._children_name_map["bandwidth"] = "bandwidth"
self.p2p_tunnel_attributes = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes()
self.p2p_tunnel_attributes.parent = self
self._children_name_map["p2p_tunnel_attributes"] = "p2p-tunnel-attributes"
self._segment_path = lambda: "tunnel" + "[name='" + str(self.name) + "']"
self._absolute_path = lambda: "openconfig-mpls:mpls/lsps/constrained-path/tunnels/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel, ['name'], name, value)
class Config(_Entity_):
"""
Configuration parameters related to TE tunnels\:
.. attribute:: name
The tunnel name
**type**\: str
.. attribute:: type
Tunnel type, p2p or p2mp
**type**\: :py:class:`TUNNELTYPE <ydk.models.openconfig.openconfig_mpls_types.TUNNELTYPE>`
.. attribute:: signaling_protocol
Signaling protocol used to set up this tunnel
**type**\: :py:class:`PATHSETUPPROTOCOL <ydk.models.openconfig.openconfig_mpls_types.PATHSETUPPROTOCOL>`
.. attribute:: description
optional text description for the tunnel
**type**\: str
.. attribute:: admin_status
TE tunnel administrative state
**type**\: :py:class:`TUNNELADMINSTATUS <ydk.models.openconfig.openconfig_mpls_types.TUNNELADMINSTATUS>`
**default value**\: oc-mplst:ADMIN_UP
.. attribute:: preference
Specifies a preference for this tunnel. A lower number signifies a better preference
**type**\: int
**range:** 1..255
.. attribute:: metric_type
The type of metric specification that should be used to set the LSP(s) metric
**type**\: :py:class:`LSPMETRICTYPE <ydk.models.openconfig.openconfig_mpls_types.LSPMETRICTYPE>`
**default value**\: oc-mplst:LSP_METRIC_INHERITED
.. attribute:: metric
The value of the metric that should be specified. The value supplied in this leaf is used in conjunction with the metric type to determine the value of the metric used by the system. Where the metric\-type is set to LSP\_METRIC\_ABSOLUTE \- the value of this leaf is used directly; where it is set to LSP\_METRIC\_RELATIVE, the relevant (positive or negative) offset is used to formulate the metric; where metric\-type is LSP\_METRIC\_INHERITED, the value of this leaf is not utilised
**type**\: int
**range:** \-2147483648..2147483647
.. attribute:: shortcut_eligible
Whether this LSP is considered to be eligible for use as a shortcut in the IGP. In the case that this leaf is set to true, the IGP SPF calculation uses the metric specified to determine whether traffic should be carried over this LSP
**type**\: bool
**default value**\: true
.. attribute:: protection_style_requested
style of mpls frr protection desired\: can be link, link\-node or unprotected
**type**\: :py:class:`PROTECTIONTYPE <ydk.models.openconfig.openconfig_mpls_types.PROTECTIONTYPE>`
**default value**\: oc-mplst:UNPROTECTED
.. attribute:: reoptimize_timer
frequency of reoptimization of a traffic engineered LSP
**type**\: int
**range:** 0..65535
**units**\: seconds
.. attribute:: source
RSVP\-TE tunnel source address
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
.. attribute:: soft_preemption
Enables RSVP soft\-preemption on this LSP
**type**\: bool
**default value**\: false
.. attribute:: setup_priority
RSVP\-TE preemption priority during LSP setup, lower is higher priority; default 7 indicates that LSP will not preempt established LSPs during setup
**type**\: int
**range:** 0..7
**default value**\: 7
.. attribute:: hold_priority
preemption priority once the LSP is established, lower is higher priority; default 0 indicates other LSPs will not preempt the LSPs once established
**type**\: int
**range:** 0..7
**default value**\: 0
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "tunnel"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
('type', (YLeaf(YType.identityref, 'type'), [('ydk.models.openconfig.openconfig_mpls_types', 'TUNNELTYPE')])),
('signaling_protocol', (YLeaf(YType.identityref, 'signaling-protocol'), [('ydk.models.openconfig.openconfig_mpls_types', 'PATHSETUPPROTOCOL')])),
('description', (YLeaf(YType.str, 'description'), ['str'])),
('admin_status', (YLeaf(YType.identityref, 'admin-status'), [('ydk.models.openconfig.openconfig_mpls_types', 'TUNNELADMINSTATUS')])),
('preference', (YLeaf(YType.uint8, 'preference'), ['int'])),
('metric_type', (YLeaf(YType.identityref, 'metric-type'), [('ydk.models.openconfig.openconfig_mpls_types', 'LSPMETRICTYPE')])),
('metric', (YLeaf(YType.int32, 'metric'), ['int'])),
('shortcut_eligible', (YLeaf(YType.boolean, 'shortcut-eligible'), ['bool'])),
('protection_style_requested', (YLeaf(YType.identityref, 'protection-style-requested'), [('ydk.models.openconfig.openconfig_mpls_types', 'PROTECTIONTYPE')])),
('reoptimize_timer', (YLeaf(YType.uint16, 'reoptimize-timer'), ['int'])),
('source', (YLeaf(YType.str, 'source'), ['str','str'])),
('soft_preemption', (YLeaf(YType.boolean, 'soft-preemption'), ['bool'])),
('setup_priority', (YLeaf(YType.uint8, 'setup-priority'), ['int'])),
('hold_priority', (YLeaf(YType.uint8, 'hold-priority'), ['int'])),
])
self.name = None
self.type = None
self.signaling_protocol = None
self.description = None
self.admin_status = None
self.preference = None
self.metric_type = None
self.metric = None
self.shortcut_eligible = None
self.protection_style_requested = None
self.reoptimize_timer = None
self.source = None
self.soft_preemption = None
self.setup_priority = None
self.hold_priority = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Config, ['name', 'type', 'signaling_protocol', 'description', 'admin_status', 'preference', 'metric_type', 'metric', 'shortcut_eligible', 'protection_style_requested', 'reoptimize_timer', 'source', 'soft_preemption', 'setup_priority', 'hold_priority'], name, value)
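# Usage sketch (not part of the generated bindings): populating this config
# container with ydk-py's CRUD service. CRUDService and NetconfServiceProvider
# are standard ydk-py APIs; the device address, credentials and tunnel name are
# hypothetical, and the identity class names (P2P, LSPMETRICABSOLUTE) are
# assumed from the openconfig_mpls_types naming shown in the docstrings above.
# Illustrates the metric-type/metric interplay: with LSP_METRIC_ABSOLUTE the
# metric value is used directly.
#
#   from ydk.services import CRUDService
#   from ydk.providers import NetconfServiceProvider
#   from ydk.models.openconfig import openconfig_mpls, openconfig_mpls_types
#
#   mpls = openconfig_mpls.Mpls()
#   Tunnel = openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel
#   tunnel = Tunnel()
#   tunnel.name = "pe1-to-pe2"                                # hypothetical key
#   tunnel.config.name = tunnel.name
#   tunnel.config.type = openconfig_mpls_types.P2P()          # identity instance
#   tunnel.config.metric_type = openconfig_mpls_types.LSPMETRICABSOLUTE()
#   tunnel.config.metric = 100                                # used directly (ABSOLUTE)
#   tunnel.config.setup_priority = 7                          # never preempts at setup
#   tunnel.config.hold_priority = 0                           # never preempted once up
#   mpls.lsps.constrained_path.tunnels.tunnel.append(tunnel)
#
#   provider = NetconfServiceProvider(address="192.0.2.1",
#                                     username="admin", password="admin")
#   CRUDService().create(provider, mpls)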
class State(_Entity_):
"""
State parameters related to TE tunnels
.. attribute:: name
The tunnel name
**type**\: str
**config**\: False
.. attribute:: type
Tunnel type, p2p or p2mp
**type**\: :py:class:`TUNNELTYPE <ydk.models.openconfig.openconfig_mpls_types.TUNNELTYPE>`
**config**\: False
.. attribute:: signaling_protocol
Signaling protocol used to set up this tunnel
**type**\: :py:class:`PATHSETUPPROTOCOL <ydk.models.openconfig.openconfig_mpls_types.PATHSETUPPROTOCOL>`
**config**\: False
.. attribute:: description
optional text description for the tunnel
**type**\: str
**config**\: False
.. attribute:: admin_status
TE tunnel administrative state
**type**\: :py:class:`TUNNELADMINSTATUS <ydk.models.openconfig.openconfig_mpls_types.TUNNELADMINSTATUS>`
**config**\: False
**default value**\: oc-mplst:ADMIN_UP
.. attribute:: preference
Specifies a preference for this tunnel. A lower number signifies a better preference
**type**\: int
**range:** 1..255
**config**\: False
.. attribute:: metric_type
The type of metric specification that should be used to set the LSP(s) metric
**type**\: :py:class:`LSPMETRICTYPE <ydk.models.openconfig.openconfig_mpls_types.LSPMETRICTYPE>`
**config**\: False
**default value**\: oc-mplst:LSP_METRIC_INHERITED
.. attribute:: metric
The value of the metric that should be specified. The value supplied in this leaf is used in conjunction with the metric type to determine the value of the metric used by the system. Where the metric\-type is set to LSP\_METRIC\_ABSOLUTE \- the value of this leaf is used directly; where it is set to LSP\_METRIC\_RELATIVE, the relevant (positive or negative) offset is used to formulate the metric; where metric\-type is LSP\_METRIC\_INHERITED, the value of this leaf is not utilised
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: shortcut_eligible
Whether this LSP is considered to be eligible for use as a shortcut in the IGP. In the case that this leaf is set to true, the IGP SPF calculation uses the metric specified to determine whether traffic should be carried over this LSP
**type**\: bool
**config**\: False
**default value**\: true
.. attribute:: protection_style_requested
Style of MPLS FRR protection desired\: can be link, link\-node, or unprotected
**type**\: :py:class:`PROTECTIONTYPE <ydk.models.openconfig.openconfig_mpls_types.PROTECTIONTYPE>`
**config**\: False
**default value**\: oc-mplst:UNPROTECTED
.. attribute:: reoptimize_timer
frequency of reoptimization of a traffic engineered LSP
**type**\: int
**range:** 0..65535
**config**\: False
**units**\: seconds
.. attribute:: source
RSVP\-TE tunnel source address
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
**config**\: False
.. attribute:: soft_preemption
Enables RSVP soft\-preemption on this LSP
**type**\: bool
**config**\: False
**default value**\: false
.. attribute:: setup_priority
RSVP\-TE preemption priority during LSP setup, lower is higher priority; default 7 indicates that the LSP will not preempt established LSPs during setup
**type**\: int
**range:** 0..7
**config**\: False
**default value**\: 7
.. attribute:: hold_priority
preemption priority once the LSP is established, lower is higher priority; default 0 indicates other LSPs will not preempt this LSP once established
**type**\: int
**range:** 0..7
**config**\: False
**default value**\: 0
.. attribute:: oper_status
The operational status of the TE tunnel
**type**\: :py:class:`LSPOPERSTATUS <ydk.models.openconfig.openconfig_mpls_types.LSPOPERSTATUS>`
**config**\: False
.. attribute:: role
The lsp role at the current node, whether it is headend, transit or tailend
**type**\: :py:class:`LSPROLE <ydk.models.openconfig.openconfig_mpls_types.LSPROLE>`
**config**\: False
.. attribute:: counters
State data for MPLS label switched paths. This state data is specific to a single label switched path
**type**\: :py:class:`Counters <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.State.Counters>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "tunnel"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("counters", ("counters", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.State.Counters))])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
('type', (YLeaf(YType.identityref, 'type'), [('ydk.models.openconfig.openconfig_mpls_types', 'TUNNELTYPE')])),
('signaling_protocol', (YLeaf(YType.identityref, 'signaling-protocol'), [('ydk.models.openconfig.openconfig_mpls_types', 'PATHSETUPPROTOCOL')])),
('description', (YLeaf(YType.str, 'description'), ['str'])),
('admin_status', (YLeaf(YType.identityref, 'admin-status'), [('ydk.models.openconfig.openconfig_mpls_types', 'TUNNELADMINSTATUS')])),
('preference', (YLeaf(YType.uint8, 'preference'), ['int'])),
('metric_type', (YLeaf(YType.identityref, 'metric-type'), [('ydk.models.openconfig.openconfig_mpls_types', 'LSPMETRICTYPE')])),
('metric', (YLeaf(YType.int32, 'metric'), ['int'])),
('shortcut_eligible', (YLeaf(YType.boolean, 'shortcut-eligible'), ['bool'])),
('protection_style_requested', (YLeaf(YType.identityref, 'protection-style-requested'), [('ydk.models.openconfig.openconfig_mpls_types', 'PROTECTIONTYPE')])),
('reoptimize_timer', (YLeaf(YType.uint16, 'reoptimize-timer'), ['int'])),
('source', (YLeaf(YType.str, 'source'), ['str','str'])),
('soft_preemption', (YLeaf(YType.boolean, 'soft-preemption'), ['bool'])),
('setup_priority', (YLeaf(YType.uint8, 'setup-priority'), ['int'])),
('hold_priority', (YLeaf(YType.uint8, 'hold-priority'), ['int'])),
('oper_status', (YLeaf(YType.identityref, 'oper-status'), [('ydk.models.openconfig.openconfig_mpls_types', 'LSPOPERSTATUS')])),
('role', (YLeaf(YType.identityref, 'role'), [('ydk.models.openconfig.openconfig_mpls_types', 'LSPROLE')])),
])
self.name = None
self.type = None
self.signaling_protocol = None
self.description = None
self.admin_status = None
self.preference = None
self.metric_type = None
self.metric = None
self.shortcut_eligible = None
self.protection_style_requested = None
self.reoptimize_timer = None
self.source = None
self.soft_preemption = None
self.setup_priority = None
self.hold_priority = None
self.oper_status = None
self.role = None
self.counters = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.State.Counters()
self.counters.parent = self
self._children_name_map["counters"] = "counters"
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.State, ['name', 'type', 'signaling_protocol', 'description', 'admin_status', 'preference', 'metric_type', 'metric', 'shortcut_eligible', 'protection_style_requested', 'reoptimize_timer', 'source', 'soft_preemption', 'setup_priority', 'hold_priority', 'oper_status', 'role'], name, value)
class Counters(_Entity_):
"""
State data for MPLS label switched paths. This state
data is specific to a single label switched path.
.. attribute:: bytes
Number of bytes that have been forwarded over the label switched path
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: packets
Number of packets that have been forwarded over the label switched path
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: path_changes
Number of path changes for the label switched path
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: state_changes
Number of state changes for the label switched path
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: online_time
Indication of the time the label switched path transitioned to an Oper Up or in\-service state
**type**\: str
**pattern:** ^[0\-9]{4}\\\-[0\-9]{2}\\\-[0\-9]{2}T[0\-9]{2}\:[0\-9]{2}\:[0\-9]{2}(\\.[0\-9]+)?(Z\|[+\-][0\-9]{2}\:[0\-9]{2})$
**config**\: False
.. attribute:: current_path_time
Indicates the time the LSP switched onto its current path. This is reset upon a LSP path change
**type**\: str
**pattern:** ^[0\-9]{4}\\\-[0\-9]{2}\\\-[0\-9]{2}T[0\-9]{2}\:[0\-9]{2}\:[0\-9]{2}(\\.[0\-9]+)?(Z\|[+\-][0\-9]{2}\:[0\-9]{2})$
**config**\: False
.. attribute:: next_reoptimization_time
Indicates the next scheduled time the LSP will be reoptimized
**type**\: str
**pattern:** ^[0\-9]{4}\\\-[0\-9]{2}\\\-[0\-9]{2}T[0\-9]{2}\:[0\-9]{2}\:[0\-9]{2}(\\.[0\-9]+)?(Z\|[+\-][0\-9]{2}\:[0\-9]{2})$
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.State.Counters, self).__init__()
self.yang_name = "counters"
self.yang_parent_name = "state"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('bytes', (YLeaf(YType.uint64, 'bytes'), ['int'])),
('packets', (YLeaf(YType.uint64, 'packets'), ['int'])),
('path_changes', (YLeaf(YType.uint64, 'path-changes'), ['int'])),
('state_changes', (YLeaf(YType.uint64, 'state-changes'), ['int'])),
('online_time', (YLeaf(YType.str, 'online-time'), ['str'])),
('current_path_time', (YLeaf(YType.str, 'current-path-time'), ['str'])),
('next_reoptimization_time', (YLeaf(YType.str, 'next-reoptimization-time'), ['str'])),
])
self.bytes = None
self.packets = None
self.path_changes = None
self.state_changes = None
self.online_time = None
self.current_path_time = None
self.next_reoptimization_time = None
self._segment_path = lambda: "counters"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.State.Counters, ['bytes', 'packets', 'path_changes', 'state_changes', 'online_time', 'current_path_time', 'next_reoptimization_time'], name, value)
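# Read sketch (continuing the hypothetical provider from the sketch after
# Tunnel.Config): counters live under state and are config False, so they are
# retrieved with a CRUD read rather than set.
#
#   read_filter = openconfig_mpls.Mpls()
#   mpls_state = CRUDService().read(provider, read_filter)
#   for t in mpls_state.lsps.constrained_path.tunnels.tunnel:
#       c = t.state.counters
#       print(t.name, c.bytes, c.packets, c.path_changes, c.state_changes)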
class Bandwidth(_Entity_):
"""
Bandwidth configuration for TE LSPs
.. attribute:: config
Configuration parameters related to bandwidth on TE tunnels\:
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.Config>`
.. attribute:: state
State parameters related to bandwidth configuration of TE tunnels
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.State>`
**config**\: False
.. attribute:: auto_bandwidth
Parameters related to auto\-bandwidth
**type**\: :py:class:`AutoBandwidth <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth, self).__init__()
self.yang_name = "bandwidth"
self.yang_parent_name = "tunnel"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.Config)), ("state", ("state", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.State)), ("auto-bandwidth", ("auto_bandwidth", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth))])
self._leafs = OrderedDict()
self.config = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self.auto_bandwidth = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth()
self.auto_bandwidth.parent = self
self._children_name_map["auto_bandwidth"] = "auto-bandwidth"
self._segment_path = lambda: "bandwidth"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth, [], name, value)
class Config(_Entity_):
"""
Configuration parameters related to bandwidth on TE
tunnels\:
.. attribute:: specification_type
The method used for setting the bandwidth, either explicitly specified or configured
**type**\: :py:class:`TeBandwidthType <ydk.models.openconfig.openconfig_mpls.TeBandwidthType>`
**default value**\: SPECIFIED
.. attribute:: set_bandwidth
set bandwidth explicitly, e.g., using offline calculation
**type**\: int
**range:** 0..18446744073709551615
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "bandwidth"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('specification_type', (YLeaf(YType.enumeration, 'specification-type'), [('ydk.models.openconfig.openconfig_mpls', 'TeBandwidthType', '')])),
('set_bandwidth', (YLeaf(YType.uint64, 'set-bandwidth'), ['int'])),
])
self.specification_type = None
self.set_bandwidth = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.Config, ['specification_type', 'set_bandwidth'], name, value)
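# Sketch (continuing the tunnel built in the sketch after Tunnel.Config):
# explicit bandwidth. Assumes the TeBandwidthType enum exposes SPECIFIED/AUTO
# as in the openconfig-mpls model; the bandwidth value is hypothetical.
#
#   tunnel.bandwidth.config.specification_type = \
#       openconfig_mpls.TeBandwidthType.SPECIFIED
#   tunnel.bandwidth.config.set_bandwidth = 500000   # explicit value, e.g. offline-computed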
class State(_Entity_):
"""
State parameters related to bandwidth
configuration of TE tunnels
.. attribute:: specification_type
The method used for setting the bandwidth, either explicitly specified or configured
**type**\: :py:class:`TeBandwidthType <ydk.models.openconfig.openconfig_mpls.TeBandwidthType>`
**config**\: False
**default value**\: SPECIFIED
.. attribute:: set_bandwidth
set bandwidth explicitly, e.g., using offline calculation
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: signaled_bandwidth
The currently signaled bandwidth of the LSP. In the case where the bandwidth is specified explicitly, then this will match the value of the set\-bandwidth leaf; in cases where the bandwidth is dynamically computed by the system, the current value of the bandwidth should be reflected
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "bandwidth"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('specification_type', (YLeaf(YType.enumeration, 'specification-type'), [('ydk.models.openconfig.openconfig_mpls', 'TeBandwidthType', '')])),
('set_bandwidth', (YLeaf(YType.uint64, 'set-bandwidth'), ['int'])),
('signaled_bandwidth', (YLeaf(YType.uint64, 'signaled-bandwidth'), ['int'])),
])
self.specification_type = None
self.set_bandwidth = None
self.signaled_bandwidth = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.State, ['specification_type', 'set_bandwidth', 'signaled_bandwidth'], name, value)
class AutoBandwidth(_Entity_):
"""
Parameters related to auto\-bandwidth
.. attribute:: config
Configuration parameters relating to MPLS auto\-bandwidth on the tunnel
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Config>`
.. attribute:: state
State parameters relating to MPLS auto\-bandwidth on the tunnel
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.State>`
**config**\: False
.. attribute:: overflow
configuration of MPLS overflow bandwidth adjustment for the LSP
**type**\: :py:class:`Overflow <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Overflow>`
.. attribute:: underflow
configuration of MPLS underflow bandwidth adjustment for the LSP
**type**\: :py:class:`Underflow <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Underflow>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth, self).__init__()
self.yang_name = "auto-bandwidth"
self.yang_parent_name = "bandwidth"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Config)), ("state", ("state", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.State)), ("overflow", ("overflow", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Overflow)), ("underflow", ("underflow", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Underflow))])
self._leafs = OrderedDict()
self.config = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self.overflow = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Overflow()
self.overflow.parent = self
self._children_name_map["overflow"] = "overflow"
self.underflow = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Underflow()
self.underflow.parent = self
self._children_name_map["underflow"] = "underflow"
self._segment_path = lambda: "auto-bandwidth"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth, [], name, value)
class Config(_Entity_):
"""
Configuration parameters relating to MPLS
auto\-bandwidth on the tunnel.
.. attribute:: enabled
enables mpls auto\-bandwidth on the lsp
**type**\: bool
**default value**\: false
.. attribute:: min_bw
set the minimum bandwidth in Kbps for an auto\-bandwidth LSP
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: max_bw
set the maximum bandwidth in Kbps for an auto\-bandwidth LSP
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: adjust_interval
time in seconds between adjustments to LSP bandwidth
**type**\: int
**range:** 0..4294967295
.. attribute:: adjust_threshold
percentage difference between the LSP's specified bandwidth and its current bandwidth allocation \-\- if the difference is greater than the specified percentage, auto\-bandwidth adjustment is triggered
**type**\: int
**range:** 0..100
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "auto-bandwidth"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('enabled', (YLeaf(YType.boolean, 'enabled'), ['bool'])),
('min_bw', (YLeaf(YType.uint64, 'min-bw'), ['int'])),
('max_bw', (YLeaf(YType.uint64, 'max-bw'), ['int'])),
('adjust_interval', (YLeaf(YType.uint32, 'adjust-interval'), ['int'])),
('adjust_threshold', (YLeaf(YType.uint8, 'adjust-threshold'), ['int'])),
])
self.enabled = None
self.min_bw = None
self.max_bw = None
self.adjust_interval = None
self.adjust_threshold = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Config, ['enabled', 'min_bw', 'max_bw', 'adjust_interval', 'adjust_threshold'], name, value)
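# Sketch (continuing the tunnel built above): auto-bandwidth bounds and
# adjustment cadence; all values are hypothetical.
#
#   ab = tunnel.bandwidth.auto_bandwidth
#   ab.config.enabled = True
#   ab.config.min_bw = 10000          # Kbps floor for the auto-bandwidth LSP
#   ab.config.max_bw = 1000000        # Kbps ceiling
#   ab.config.adjust_interval = 300   # seconds between bandwidth adjustments
#   ab.config.adjust_threshold = 10   # percent delta that triggers adjustment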
class State(_Entity_):
"""
State parameters relating to MPLS
auto\-bandwidth on the tunnel.
.. attribute:: enabled
enables mpls auto\-bandwidth on the lsp
**type**\: bool
**config**\: False
**default value**\: false
.. attribute:: min_bw
set the minimum bandwidth in Kbps for an auto\-bandwidth LSP
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: max_bw
set the maximum bandwidth in Kbps for an auto\-bandwidth LSP
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: adjust_interval
time in seconds between adjustments to LSP bandwidth
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: adjust_threshold
percentage difference between the LSP's specified bandwidth and its current bandwidth allocation \-\- if the difference is greater than the specified percentage, auto\-bandwidth adjustment is triggered
**type**\: int
**range:** 0..100
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "auto-bandwidth"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('enabled', (YLeaf(YType.boolean, 'enabled'), ['bool'])),
('min_bw', (YLeaf(YType.uint64, 'min-bw'), ['int'])),
('max_bw', (YLeaf(YType.uint64, 'max-bw'), ['int'])),
('adjust_interval', (YLeaf(YType.uint32, 'adjust-interval'), ['int'])),
('adjust_threshold', (YLeaf(YType.uint8, 'adjust-threshold'), ['int'])),
])
self.enabled = None
self.min_bw = None
self.max_bw = None
self.adjust_interval = None
self.adjust_threshold = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.State, ['enabled', 'min_bw', 'max_bw', 'adjust_interval', 'adjust_threshold'], name, value)
class Overflow(_Entity_):
"""
configuration of MPLS overflow bandwidth
adjustment for the LSP
.. attribute:: config
Config information for MPLS overflow bandwidth adjustment
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Overflow.Config>`
.. attribute:: state
State information for MPLS overflow bandwidth adjustment
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Overflow.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Overflow, self).__init__()
self.yang_name = "overflow"
self.yang_parent_name = "auto-bandwidth"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Overflow.Config)), ("state", ("state", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Overflow.State))])
self._leafs = OrderedDict()
self.config = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Overflow.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Overflow.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "overflow"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Overflow, [], name, value)
class Config(_Entity_):
"""
Config information for MPLS overflow bandwidth
adjustment
.. attribute:: enabled
enables mpls lsp bandwidth overflow adjustment on the lsp
**type**\: bool
**default value**\: false
.. attribute:: overflow_threshold
bandwidth percentage change to trigger an overflow event
**type**\: int
**range:** 0..100
.. attribute:: trigger_event_count
number of consecutive overflow sample events needed to trigger an overflow adjustment
**type**\: int
**range:** 0..65535
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Overflow.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "overflow"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('enabled', (YLeaf(YType.boolean, 'enabled'), ['bool'])),
('overflow_threshold', (YLeaf(YType.uint8, 'overflow-threshold'), ['int'])),
('trigger_event_count', (YLeaf(YType.uint16, 'trigger-event-count'), ['int'])),
])
self.enabled = None
self.overflow_threshold = None
self.trigger_event_count = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Overflow.Config, ['enabled', 'overflow_threshold', 'trigger_event_count'], name, value)
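# Sketch: overflow triggering (hypothetical values). An overflow adjustment
# fires only after trigger-event-count consecutive samples exceed the
# overflow-threshold percentage change.
#
#   ovf = tunnel.bandwidth.auto_bandwidth.overflow
#   ovf.config.enabled = True
#   ovf.config.overflow_threshold = 15    # percent change per sample
#   ovf.config.trigger_event_count = 3    # consecutive samples required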
class State(_Entity_):
"""
State information for MPLS overflow bandwidth
adjustment
.. attribute:: enabled
enables mpls lsp bandwidth overflow adjustment on the lsp
**type**\: bool
**config**\: False
**default value**\: false
.. attribute:: overflow_threshold
bandwidth percentage change to trigger an overflow event
**type**\: int
**range:** 0..100
**config**\: False
.. attribute:: trigger_event_count
number of consecutive overflow sample events needed to trigger an overflow adjustment
**type**\: int
**range:** 0..65535
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Overflow.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "overflow"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('enabled', (YLeaf(YType.boolean, 'enabled'), ['bool'])),
('overflow_threshold', (YLeaf(YType.uint8, 'overflow-threshold'), ['int'])),
('trigger_event_count', (YLeaf(YType.uint16, 'trigger-event-count'), ['int'])),
])
self.enabled = None
self.overflow_threshold = None
self.trigger_event_count = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Overflow.State, ['enabled', 'overflow_threshold', 'trigger_event_count'], name, value)
class Underflow(_Entity_):
"""
configuration of MPLS underflow bandwidth
adjustment for the LSP
.. attribute:: config
Config information for MPLS underflow bandwidth adjustment
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Underflow.Config>`
.. attribute:: state
State information for MPLS underflow bandwidth adjustment
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Underflow.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Underflow, self).__init__()
self.yang_name = "underflow"
self.yang_parent_name = "auto-bandwidth"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Underflow.Config)), ("state", ("state", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Underflow.State))])
self._leafs = OrderedDict()
self.config = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Underflow.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Underflow.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "underflow"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Underflow, [], name, value)
class Config(_Entity_):
"""
Config information for MPLS underflow bandwidth
adjustment
.. attribute:: enabled
enables bandwidth underflow adjustment on the lsp
**type**\: bool
**default value**\: false
.. attribute:: underflow_threshold
bandwidth percentage change to trigger an underflow event
**type**\: int
**range:** 0..100
.. attribute:: trigger_event_count
number of consecutive underflow sample events needed to trigger an underflow adjustment
**type**\: int
**range:** 0..65535
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Underflow.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "underflow"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('enabled', (YLeaf(YType.boolean, 'enabled'), ['bool'])),
('underflow_threshold', (YLeaf(YType.uint8, 'underflow-threshold'), ['int'])),
('trigger_event_count', (YLeaf(YType.uint16, 'trigger-event-count'), ['int'])),
])
self.enabled = None
self.underflow_threshold = None
self.trigger_event_count = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Underflow.Config, ['enabled', 'underflow_threshold', 'trigger_event_count'], name, value)
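# Sketch: underflow mirrors overflow, shrinking the reservation after
# trigger-event-count consecutive samples fall below the underflow-threshold
# percentage change (hypothetical values).
#
#   unf = tunnel.bandwidth.auto_bandwidth.underflow
#   unf.config.enabled = True
#   unf.config.underflow_threshold = 20
#   unf.config.trigger_event_count = 5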
class State(_Entity_):
"""
State information for MPLS underflow bandwidth
adjustment
.. attribute:: enabled
enables bandwidth underflow adjustment on the lsp
**type**\: bool
**config**\: False
**default value**\: false
.. attribute:: underflow_threshold
bandwidth percentage change to trigger an underflow event
**type**\: int
**range:** 0..100
**config**\: False
.. attribute:: trigger_event_count
number of consecutive underflow sample events needed to trigger an underflow adjustment
**type**\: int
**range:** 0..65535
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Underflow.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "underflow"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('enabled', (YLeaf(YType.boolean, 'enabled'), ['bool'])),
('underflow_threshold', (YLeaf(YType.uint8, 'underflow-threshold'), ['int'])),
('trigger_event_count', (YLeaf(YType.uint16, 'trigger-event-count'), ['int'])),
])
self.enabled = None
self.underflow_threshold = None
self.trigger_event_count = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.Bandwidth.AutoBandwidth.Underflow.State, ['enabled', 'underflow_threshold', 'trigger_event_count'], name, value)
class P2pTunnelAttributes(_Entity_):
"""
Parameters related to LSPs of type P2P
.. attribute:: config
Configuration parameters for P2P LSPs
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.Config>`
.. attribute:: state
State parameters for P2P LSPs
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.State>`
**config**\: False
.. attribute:: p2p_primary_path
Primary paths associated with the LSP
**type**\: :py:class:`P2pPrimaryPath <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath>`
.. attribute:: p2p_secondary_paths
Secondary paths for the LSP
**type**\: :py:class:`P2pSecondaryPaths <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes, self).__init__()
self.yang_name = "p2p-tunnel-attributes"
self.yang_parent_name = "tunnel"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.Config)), ("state", ("state", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.State)), ("p2p-primary-path", ("p2p_primary_path", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath)), ("p2p-secondary-paths", ("p2p_secondary_paths", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths))])
self._leafs = OrderedDict()
self.config = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self.p2p_primary_path = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath()
self.p2p_primary_path.parent = self
self._children_name_map["p2p_primary_path"] = "p2p-primary-path"
self.p2p_secondary_paths = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths()
self.p2p_secondary_paths.parent = self
self._children_name_map["p2p_secondary_paths"] = "p2p-secondary-paths"
self._segment_path = lambda: "p2p-tunnel-attributes"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes, [], name, value)
class Config(_Entity_):
"""
Configuration parameters for P2P LSPs
.. attribute:: destination
P2P tunnel destination address
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "p2p-tunnel-attributes"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('destination', (YLeaf(YType.str, 'destination'), ['str','str'])),
])
self.destination = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.Config, ['destination'], name, value)
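# Sketch: destination is a union of the IPv4/IPv6 string patterns documented
# above, so a plain address string is assigned (address hypothetical).
#
#   tunnel.p2p_tunnel_attributes.config.destination = "198.51.100.2"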
class State(_Entity_):
"""
State parameters for P2P LSPs
.. attribute:: destination
P2P tunnel destination address
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "p2p-tunnel-attributes"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('destination', (YLeaf(YType.str, 'destination'), ['str','str'])),
])
self.destination = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.State, ['destination'], name, value)
class P2pPrimaryPath(_Entity_):
"""
Primary paths associated with the LSP
.. attribute:: p2p_primary_path
List of p2p primary paths for a tunnel
**type**\: list of :py:class:`P2pPrimaryPath_ <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath, self).__init__()
self.yang_name = "p2p-primary-path"
self.yang_parent_name = "p2p-tunnel-attributes"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("p2p-primary-path", ("p2p_primary_path", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_))])
self._leafs = OrderedDict()
self.p2p_primary_path = YList(self)
self._segment_path = lambda: "p2p-primary-path"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath, [], name, value)
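# Sketch: p2p_primary_path is a keyed YList; entries are appended and the
# 'name' key is reflected in the generated segment path
# (p2p-primary-path[name='...']). Path name hypothetical.
#
#   P2pPrimaryPath_ = Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_
#   path = P2pPrimaryPath_()
#   path.name = "primary-1"
#   tunnel.p2p_tunnel_attributes.p2p_primary_path.p2p_primary_path.append(path)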
class P2pPrimaryPath_(_Entity_):
"""
List of p2p primary paths for a tunnel
.. attribute:: name (key)
Path name
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.Config>`
.. attribute:: config
Configuration parameters related to paths
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.Config>`
.. attribute:: state
State parameters related to paths
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.State>`
**config**\: False
.. attribute:: candidate_secondary_paths
The set of candidate secondary paths which may be used for this primary path. When secondary paths are specified in the list the path of the secondary LSP in use must be restricted to those path options referenced. The priority of the secondary paths is specified within the list. Higher priority values are less preferred \- that is to say that a path with priority 0 is the most preferred path. In the case that the list is empty, any secondary path option may be utilised when the current primary path is in use
**type**\: :py:class:`CandidateSecondaryPaths <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths>`
.. attribute:: admin_groups
Top\-level container for include/exclude constraints for link affinities
**type**\: :py:class:`AdminGroups <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.AdminGroups>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_, self).__init__()
self.yang_name = "p2p-primary-path"
self.yang_parent_name = "p2p-primary-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = ['name']
self._child_classes = OrderedDict([("config", ("config", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.Config)), ("state", ("state", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.State)), ("candidate-secondary-paths", ("candidate_secondary_paths", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths)), ("admin-groups", ("admin_groups", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.AdminGroups))])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
])
self.name = None
self.config = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self.candidate_secondary_paths = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths()
self.candidate_secondary_paths.parent = self
self._children_name_map["candidate_secondary_paths"] = "candidate-secondary-paths"
self.admin_groups = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.AdminGroups()
self.admin_groups.parent = self
self._children_name_map["admin_groups"] = "admin-groups"
self._segment_path = lambda: "p2p-primary-path" + "[name='" + str(self.name) + "']"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_, ['name'], name, value)
class Config(_Entity_):
"""
Configuration parameters related to paths
.. attribute:: name
Path name
**type**\: str
.. attribute:: path_computation_method
The method used for computing the path, either locally computed, queried from a server or not computed at all (explicitly configured)
**type**\: :py:class:`PATHCOMPUTATIONMETHOD <ydk.models.openconfig.openconfig_mpls_types.PATHCOMPUTATIONMETHOD>`
**default value**\: oc-mplst:LOCALLY_COMPUTED
.. attribute:: use_cspf
Flag to enable CSPF for locally computed LSPs
**type**\: bool
.. attribute:: cspf_tiebreaker
Determine the tie\-breaking method to choose between equally desirable paths during CSPF computation
**type**\: :py:class:`CspfTieBreaking <ydk.models.openconfig.openconfig_mpls.CspfTieBreaking>`
.. attribute:: path_computation_server
Address of the external path computation server
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
.. attribute:: explicit_path_name
reference to a defined path
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.Config>`
.. attribute:: preference
Specifies a preference for this path. The lower the number, the higher the preference
**type**\: int
**range:** 1..255
.. attribute:: setup_priority
RSVP\-TE preemption priority during LSP setup, lower is higher priority; default 7 indicates that the LSP will not preempt established LSPs during setup
**type**\: int
**range:** 0..7
**default value**\: 7
.. attribute:: hold_priority
preemption priority once the LSP is established, lower is higher priority; default 0 indicates other LSPs will not preempt this LSP once established
**type**\: int
**range:** 0..7
**default value**\: 0
.. attribute:: retry_timer
sets the time between attempts to establish the LSP
**type**\: int
**range:** 1..600
**units**\: seconds
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "p2p-primary-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
('path_computation_method', (YLeaf(YType.identityref, 'path-computation-method'), [('ydk.models.openconfig.openconfig_mpls_types', 'PATHCOMPUTATIONMETHOD')])),
('use_cspf', (YLeaf(YType.boolean, 'use-cspf'), ['bool'])),
('cspf_tiebreaker', (YLeaf(YType.enumeration, 'cspf-tiebreaker'), [('ydk.models.openconfig.openconfig_mpls', 'CspfTieBreaking', '')])),
('path_computation_server', (YLeaf(YType.str, 'path-computation-server'), ['str','str'])),
('explicit_path_name', (YLeaf(YType.str, 'explicit-path-name'), ['str'])),
('preference', (YLeaf(YType.uint8, 'preference'), ['int'])),
('setup_priority', (YLeaf(YType.uint8, 'setup-priority'), ['int'])),
('hold_priority', (YLeaf(YType.uint8, 'hold-priority'), ['int'])),
('retry_timer', (YLeaf(YType.uint16, 'retry-timer'), ['int'])),
])
self.name = None
self.path_computation_method = None
self.use_cspf = None
self.cspf_tiebreaker = None
self.path_computation_server = None
self.explicit_path_name = None
self.preference = None
self.setup_priority = None
self.hold_priority = None
self.retry_timer = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.Config, ['name', 'path_computation_method', 'use_cspf', 'cspf_tiebreaker', 'path_computation_server', 'explicit_path_name', 'preference', 'setup_priority', 'hold_priority', 'retry_timer'], name, value)
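# Sketch (continuing the path entry appended above): an explicitly routed
# primary path. The EXPLICITLYDEFINED identity name is an assumption based on
# the openconfig_mpls_types naming convention; the explicit path name is
# hypothetical and must reference a configured named explicit path.
#
#   path.config.name = path.name
#   path.config.path_computation_method = \
#       openconfig_mpls_types.EXPLICITLYDEFINED()
#   path.config.explicit_path_name = "ero-pe1-pe2"
#   path.config.setup_priority = 7
#   path.config.hold_priority = 0
#   path.config.retry_timer = 30   # seconds between LSP setup attempts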
class State(_Entity_):
"""
State parameters related to paths
.. attribute:: name
Path name
**type**\: str
**config**\: False
.. attribute:: path_computation_method
The method used for computing the path, either locally computed, queried from a server or not computed at all (explicitly configured)
**type**\: :py:class:`PATHCOMPUTATIONMETHOD <ydk.models.openconfig.openconfig_mpls_types.PATHCOMPUTATIONMETHOD>`
**config**\: False
**default value**\: oc-mplst:LOCALLY_COMPUTED
.. attribute:: use_cspf
Flag to enable CSPF for locally computed LSPs
**type**\: bool
**config**\: False
.. attribute:: cspf_tiebreaker
Determine the tie\-breaking method to choose between equally desirable paths during CSPF computation
**type**\: :py:class:`CspfTieBreaking <ydk.models.openconfig.openconfig_mpls.CspfTieBreaking>`
**config**\: False
.. attribute:: path_computation_server
Address of the external path computation server
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
**config**\: False
.. attribute:: explicit_path_name
reference to a defined path
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.Config>`
**config**\: False
.. attribute:: preference
Specifies a preference for this path. The lower the number, the higher the preference
**type**\: int
**range:** 1..255
**config**\: False
.. attribute:: setup_priority
RSVP\-TE preemption priority during LSP setup, lower is higher priority; default 7 indicates that the LSP will not preempt established LSPs during setup
**type**\: int
**range:** 0..7
**config**\: False
**default value**\: 7
.. attribute:: hold_priority
preemption priority once the LSP is established, lower is higher priority; default 0 indicates other LSPs will not preempt this LSP once established
**type**\: int
**range:** 0..7
**config**\: False
**default value**\: 0
.. attribute:: retry_timer
sets the time between attempts to establish the LSP
**type**\: int
**range:** 1..600
**config**\: False
**units**\: seconds
.. attribute:: associated_rsvp_session
If the signalling protocol specified for this path is RSVP\-TE, this leaf provides a reference to the associated session within the RSVP\-TE protocol sessions list, such that details of the signaling can be retrieved
**type**\: int
**range:** 0..18446744073709551615
**refers to**\: :py:class:`local_index <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Sessions.Session>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "p2p-primary-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
('path_computation_method', (YLeaf(YType.identityref, 'path-computation-method'), [('ydk.models.openconfig.openconfig_mpls_types', 'PATHCOMPUTATIONMETHOD')])),
('use_cspf', (YLeaf(YType.boolean, 'use-cspf'), ['bool'])),
('cspf_tiebreaker', (YLeaf(YType.enumeration, 'cspf-tiebreaker'), [('ydk.models.openconfig.openconfig_mpls', 'CspfTieBreaking', '')])),
('path_computation_server', (YLeaf(YType.str, 'path-computation-server'), ['str','str'])),
('explicit_path_name', (YLeaf(YType.str, 'explicit-path-name'), ['str'])),
('preference', (YLeaf(YType.uint8, 'preference'), ['int'])),
('setup_priority', (YLeaf(YType.uint8, 'setup-priority'), ['int'])),
('hold_priority', (YLeaf(YType.uint8, 'hold-priority'), ['int'])),
('retry_timer', (YLeaf(YType.uint16, 'retry-timer'), ['int'])),
('associated_rsvp_session', (YLeaf(YType.str, 'associated-rsvp-session'), ['int'])),
])
self.name = None
self.path_computation_method = None
self.use_cspf = None
self.cspf_tiebreaker = None
self.path_computation_server = None
self.explicit_path_name = None
self.preference = None
self.setup_priority = None
self.hold_priority = None
self.retry_timer = None
self.associated_rsvp_session = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.State, ['name', 'path_computation_method', 'use_cspf', 'cspf_tiebreaker', 'path_computation_server', 'explicit_path_name', 'preference', 'setup_priority', 'hold_priority', 'retry_timer', 'associated_rsvp_session'], name, value)
class CandidateSecondaryPaths(_Entity_):
"""
The set of candidate secondary paths which may be used
for this primary path. When secondary paths are specified
in the list the path of the secondary LSP in use must be
restricted to those path options referenced. The
priority of the secondary paths is specified within the
list. Higher priority values are less preferred \- that is
to say that a path with priority 0 is the most preferred
path. In the case that the list is empty, any secondary
path option may be utilised when the current primary path
is in use.
.. attribute:: candidate_secondary_path
List of secondary paths which may be utilised when the current primary path is in use
**type**\: list of :py:class:`CandidateSecondaryPath <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths.CandidateSecondaryPath>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths, self).__init__()
self.yang_name = "candidate-secondary-paths"
self.yang_parent_name = "p2p-primary-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("candidate-secondary-path", ("candidate_secondary_path", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths.CandidateSecondaryPath))])
self._leafs = OrderedDict()
self.candidate_secondary_path = YList(self)
self._segment_path = lambda: "candidate-secondary-paths"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths, [], name, value)
class CandidateSecondaryPath(_Entity_):
"""
List of secondary paths which may be utilised when the
current primary path is in use
.. attribute:: secondary_path (key)
A reference to the secondary path option reference which acts as the key of the candidate\-secondary\-path list
**type**\: str
**refers to**\: :py:class:`secondary_path <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths.CandidateSecondaryPath.Config>`
.. attribute:: config
Configuration parameters relating to the candidate secondary path
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths.CandidateSecondaryPath.Config>`
.. attribute:: state
Operational state parameters relating to the candidate secondary path
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths.CandidateSecondaryPath.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths.CandidateSecondaryPath, self).__init__()
self.yang_name = "candidate-secondary-path"
self.yang_parent_name = "candidate-secondary-paths"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = ['secondary_path']
self._child_classes = OrderedDict([("config", ("config", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths.CandidateSecondaryPath.Config)), ("state", ("state", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths.CandidateSecondaryPath.State))])
self._leafs = OrderedDict([
('secondary_path', (YLeaf(YType.str, 'secondary-path'), ['str'])),
])
self.secondary_path = None
self.config = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths.CandidateSecondaryPath.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths.CandidateSecondaryPath.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "candidate-secondary-path" + "[secondary-path='" + str(self.secondary_path) + "']"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths.CandidateSecondaryPath, ['secondary_path'], name, value)
class Config(_Entity_):
"""
Configuration parameters relating to the candidate
secondary path
.. attribute:: secondary_path
A reference to the secondary path that should be utilised when the containing primary path option is in use
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.Config>`
.. attribute:: priority
The priority of the specified secondary path option. Higher priority options are less preferable \- such that a secondary path reference with a priority of 0 is the most preferred
**type**\: int
**range:** 0..65535
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths.CandidateSecondaryPath.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "candidate-secondary-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('secondary_path', (YLeaf(YType.str, 'secondary-path'), ['str'])),
('priority', (YLeaf(YType.uint16, 'priority'), ['int'])),
])
self.secondary_path = None
self.priority = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths.CandidateSecondaryPath.Config, ['secondary_path', 'priority'], name, value)
class State(_Entity_):
"""
Operational state parameters relating to the candidate
secondary path
.. attribute:: secondary_path
A reference to the secondary path that should be utilised when the containing primary path option is in use
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.Config>`
**config**\: False
.. attribute:: priority
The priority of the specified secondary path option. Higher priority options are less preferable \- such that a secondary path reference with a priority of 0 is the most preferred
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: active
Indicates the currently active path option that has been selected from the candidate secondary paths
**type**\: bool
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths.CandidateSecondaryPath.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "candidate-secondary-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('secondary_path', (YLeaf(YType.str, 'secondary-path'), ['str'])),
('priority', (YLeaf(YType.uint16, 'priority'), ['int'])),
('active', (YLeaf(YType.boolean, 'active'), ['bool'])),
])
self.secondary_path = None
self.priority = None
self.active = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths.CandidateSecondaryPath.State, ['secondary_path', 'priority', 'active'], name, value)
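# Illustrative usage (a minimal sketch, not part of the generated bindings):
# populating the candidate-secondary-paths list above so that secondary path
# "SEC1" (priority 0, most preferred) is tried before "SEC2" (priority 10).
# The entity `primary_path` and the path names are hypothetical.
#
#   csp1 = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths.CandidateSecondaryPath()
#   csp1.secondary_path = "SEC1"
#   csp1.config.secondary_path = "SEC1"
#   csp1.config.priority = 0
#   csp2 = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.CandidateSecondaryPaths.CandidateSecondaryPath()
#   csp2.secondary_path = "SEC2"
#   csp2.config.secondary_path = "SEC2"
#   csp2.config.priority = 10
#   primary_path.candidate_secondary_paths.candidate_secondary_path.append(csp1)
#   primary_path.candidate_secondary_paths.candidate_secondary_path.append(csp2)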
class AdminGroups(_Entity_):
"""
Top\-level container for include/exclude constraints for
link affinities
.. attribute:: config
Configuration data
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.AdminGroups.Config>`
.. attribute:: state
Operational state data
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.AdminGroups.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.AdminGroups, self).__init__()
self.yang_name = "admin-groups"
self.yang_parent_name = "p2p-primary-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.AdminGroups.Config)), ("state", ("state", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.AdminGroups.State))])
self._leafs = OrderedDict()
self.config = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.AdminGroups.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.AdminGroups.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "admin-groups"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.AdminGroups, [], name, value)
class Config(_Entity_):
"""
Configuration data
.. attribute:: exclude_group
list of references to named admin\-groups to exclude in path calculation
**type**\: list of str
**refers to**\: :py:class:`admin_group_name <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup>`
.. attribute:: include_all_group
list of references to named admin\-groups of which all must be included
**type**\: list of str
**refers to**\: :py:class:`admin_group_name <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup>`
.. attribute:: include_any_group
list of references to named admin\-groups of which one must be included
**type**\: list of str
**refers to**\: :py:class:`admin_group_name <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.AdminGroups.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "admin-groups"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('exclude_group', (YLeafList(YType.str, 'exclude-group'), ['str'])),
('include_all_group', (YLeafList(YType.str, 'include-all-group'), ['str'])),
('include_any_group', (YLeafList(YType.str, 'include-any-group'), ['str'])),
])
self.exclude_group = []
self.include_all_group = []
self.include_any_group = []
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.AdminGroups.Config, ['exclude_group', 'include_all_group', 'include_any_group'], name, value)
class State(_Entity_):
"""
Operational state data
.. attribute:: exclude_group
list of references to named admin\-groups to exclude in path calculation
**type**\: list of str
**refers to**\: :py:class:`admin_group_name <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup>`
**config**\: False
.. attribute:: include_all_group
list of references to named admin\-groups of which all must be included
**type**\: list of str
**refers to**\: :py:class:`admin_group_name <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup>`
**config**\: False
.. attribute:: include_any_group
list of references to named admin\-groups of which one must be included
**type**\: list of str
**refers to**\: :py:class:`admin_group_name <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.AdminGroups.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "admin-groups"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('exclude_group', (YLeafList(YType.str, 'exclude-group'), ['str'])),
('include_all_group', (YLeafList(YType.str, 'include-all-group'), ['str'])),
('include_any_group', (YLeafList(YType.str, 'include-any-group'), ['str'])),
])
self.exclude_group = []
self.include_all_group = []
self.include_any_group = []
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pPrimaryPath.P2pPrimaryPath_.AdminGroups.State, ['exclude_group', 'include_all_group', 'include_any_group'], name, value)
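# Illustrative usage (a minimal sketch): applying link-affinity constraints to
# the primary path through the admin-groups container defined above. The group
# names are hypothetical and must match entries configured under
# mpls/te-global-attributes/mpls-admin-groups.
#
#   ag = primary_path.admin_groups               # assumes a populated P2pPrimaryPath_ entity
#   ag.config.exclude_group.append("RED")        # never use RED-coloured links
#   ag.config.include_any_group.append("BLUE")
#   ag.config.include_any_group.append("GREEN")  # at least one of BLUE/GREEN required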
class P2pSecondaryPaths(_Entity_):
"""
Secondary paths for the LSP
.. attribute:: p2p_secondary_path
List of p2p secondary paths for a tunnel
**type**\: list of :py:class:`P2pSecondaryPath <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths, self).__init__()
self.yang_name = "p2p-secondary-paths"
self.yang_parent_name = "p2p-tunnel-attributes"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("p2p-secondary-path", ("p2p_secondary_path", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath))])
self._leafs = OrderedDict()
self.p2p_secondary_path = YList(self)
self._segment_path = lambda: "p2p-secondary-paths"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths, [], name, value)
class P2pSecondaryPath(_Entity_):
"""
List of p2p secondary paths for a tunnel
.. attribute:: name (key)
Path name
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.Config>`
.. attribute:: config
Configuration parameters related to paths
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.Config>`
.. attribute:: state
State parameters related to paths
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.State>`
**config**\: False
.. attribute:: admin_groups
Top\-level container for include/exclude constraints for link affinities
**type**\: :py:class:`AdminGroups <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.AdminGroups>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath, self).__init__()
self.yang_name = "p2p-secondary-path"
self.yang_parent_name = "p2p-secondary-paths"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = ['name']
self._child_classes = OrderedDict([("config", ("config", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.Config)), ("state", ("state", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.State)), ("admin-groups", ("admin_groups", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.AdminGroups))])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
])
self.name = None
self.config = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self.admin_groups = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.AdminGroups()
self.admin_groups.parent = self
self._children_name_map["admin_groups"] = "admin-groups"
self._segment_path = lambda: "p2p-secondary-path" + "[name='" + str(self.name) + "']"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath, ['name'], name, value)
class Config(_Entity_):
"""
Configuration parameters related to paths
.. attribute:: name
Path name
**type**\: str
.. attribute:: path_computation_method
The method used for computing the path, either locally computed, queried from a server or not computed at all (explicitly configured)
**type**\: :py:class:`PATHCOMPUTATIONMETHOD <ydk.models.openconfig.openconfig_mpls_types.PATHCOMPUTATIONMETHOD>`
**default value**\: oc-mplst:LOCALLY_COMPUTED
.. attribute:: use_cspf
Flag to enable CSPF for locally computed LSPs
**type**\: bool
.. attribute:: cspf_tiebreaker
Determines the tie\-breaking method used to choose between equally desirable paths during CSPF computation
**type**\: :py:class:`CspfTieBreaking <ydk.models.openconfig.openconfig_mpls.CspfTieBreaking>`
.. attribute:: path_computation_server
Address of the external path computation server
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
.. attribute:: explicit_path_name
reference to a defined path
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.Config>`
.. attribute:: preference
Specifies a preference for this path. The lower the number, the higher the preference
**type**\: int
**range:** 1..255
.. attribute:: setup_priority
RSVP\-TE preemption priority during LSP setup, lower is higher priority; default 7 indicates that the LSP will not preempt established LSPs during setup
**type**\: int
**range:** 0..7
**default value**\: 7
.. attribute:: hold_priority
preemption priority once the LSP is established, lower is higher priority; default 0 indicates that other LSPs will not preempt this LSP once it is established
**type**\: int
**range:** 0..7
**default value**\: 0
.. attribute:: retry_timer
sets the time between attempts to establish the LSP
**type**\: int
**range:** 1..600
**units**\: seconds
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "p2p-secondary-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
('path_computation_method', (YLeaf(YType.identityref, 'path-computation-method'), [('ydk.models.openconfig.openconfig_mpls_types', 'PATHCOMPUTATIONMETHOD')])),
('use_cspf', (YLeaf(YType.boolean, 'use-cspf'), ['bool'])),
('cspf_tiebreaker', (YLeaf(YType.enumeration, 'cspf-tiebreaker'), [('ydk.models.openconfig.openconfig_mpls', 'CspfTieBreaking', '')])),
('path_computation_server', (YLeaf(YType.str, 'path-computation-server'), ['str','str'])),
('explicit_path_name', (YLeaf(YType.str, 'explicit-path-name'), ['str'])),
('preference', (YLeaf(YType.uint8, 'preference'), ['int'])),
('setup_priority', (YLeaf(YType.uint8, 'setup-priority'), ['int'])),
('hold_priority', (YLeaf(YType.uint8, 'hold-priority'), ['int'])),
('retry_timer', (YLeaf(YType.uint16, 'retry-timer'), ['int'])),
])
self.name = None
self.path_computation_method = None
self.use_cspf = None
self.cspf_tiebreaker = None
self.path_computation_server = None
self.explicit_path_name = None
self.preference = None
self.setup_priority = None
self.hold_priority = None
self.retry_timer = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.Config, ['name', 'path_computation_method', 'use_cspf', 'cspf_tiebreaker', 'path_computation_server', 'explicit_path_name', 'preference', 'setup_priority', 'hold_priority', 'retry_timer'], name, value)
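# Illustrative usage (a minimal sketch): configuring a named secondary path that
# follows a pre-defined explicit path with the most preferred preference value
# (1) and default-style priorities. All names and values are hypothetical.
#
#   sp = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath()
#   sp.name = "BACKUP-1"
#   sp.config.name = "BACKUP-1"
#   sp.config.explicit_path_name = "EXP-PATH-1"  # must reference a named-explicit-path
#   sp.config.preference = 1                     # lower number = more preferred
#   sp.config.setup_priority = 7                 # will not preempt at setup
#   sp.config.hold_priority = 0                  # will not be preempted once up
#   sp.config.retry_timer = 30                   # seconds between setup attempts
#   p2p_tunnel_attributes.p2p_secondary_paths.p2p_secondary_path.append(sp)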
class State(_Entity_):
"""
State parameters related to paths
.. attribute:: name
Path name
**type**\: str
**config**\: False
.. attribute:: path_computation_method
The method used for computing the path, either locally computed, queried from a server or not computed at all (explicitly configured)
**type**\: :py:class:`PATHCOMPUTATIONMETHOD <ydk.models.openconfig.openconfig_mpls_types.PATHCOMPUTATIONMETHOD>`
**config**\: False
**default value**\: oc-mplst:LOCALLY_COMPUTED
.. attribute:: use_cspf
Flag to enable CSPF for locally computed LSPs
**type**\: bool
**config**\: False
.. attribute:: cspf_tiebreaker
Determines the tie\-breaking method used to choose between equally desirable paths during CSPF computation
**type**\: :py:class:`CspfTieBreaking <ydk.models.openconfig.openconfig_mpls.CspfTieBreaking>`
**config**\: False
.. attribute:: path_computation_server
Address of the external path computation server
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
**config**\: False
.. attribute:: explicit_path_name
reference to a defined path
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.NamedExplicitPaths.NamedExplicitPath.Config>`
**config**\: False
.. attribute:: preference
Specifies a preference for this path. The lower the number, the higher the preference
**type**\: int
**range:** 1..255
**config**\: False
.. attribute:: setup_priority
RSVP\-TE preemption priority during LSP setup, lower is higher priority; default 7 indicates that the LSP will not preempt established LSPs during setup
**type**\: int
**range:** 0..7
**config**\: False
**default value**\: 7
.. attribute:: hold_priority
preemption priority once the LSP is established, lower is higher priority; default 0 indicates that other LSPs will not preempt this LSP once it is established
**type**\: int
**range:** 0..7
**config**\: False
**default value**\: 0
.. attribute:: retry_timer
sets the time between attempts to establish the LSP
**type**\: int
**range:** 1..600
**config**\: False
**units**\: seconds
.. attribute:: associated_rsvp_session
If the signalling protocol specified for this path is RSVP\-TE, this leaf provides a reference to the associated session within the RSVP\-TE protocol sessions list, such that details of the signaling can be retrieved
**type**\: int
**range:** 0..18446744073709551615
**refers to**\: :py:class:`local_index <ydk.models.openconfig.openconfig_mpls.Mpls.SignalingProtocols.RsvpTe.Sessions.Session>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "p2p-secondary-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
('path_computation_method', (YLeaf(YType.identityref, 'path-computation-method'), [('ydk.models.openconfig.openconfig_mpls_types', 'PATHCOMPUTATIONMETHOD')])),
('use_cspf', (YLeaf(YType.boolean, 'use-cspf'), ['bool'])),
('cspf_tiebreaker', (YLeaf(YType.enumeration, 'cspf-tiebreaker'), [('ydk.models.openconfig.openconfig_mpls', 'CspfTieBreaking', '')])),
('path_computation_server', (YLeaf(YType.str, 'path-computation-server'), ['str','str'])),
('explicit_path_name', (YLeaf(YType.str, 'explicit-path-name'), ['str'])),
('preference', (YLeaf(YType.uint8, 'preference'), ['int'])),
('setup_priority', (YLeaf(YType.uint8, 'setup-priority'), ['int'])),
('hold_priority', (YLeaf(YType.uint8, 'hold-priority'), ['int'])),
('retry_timer', (YLeaf(YType.uint16, 'retry-timer'), ['int'])),
('associated_rsvp_session', (YLeaf(YType.str, 'associated-rsvp-session'), ['int'])),
])
self.name = None
self.path_computation_method = None
self.use_cspf = None
self.cspf_tiebreaker = None
self.path_computation_server = None
self.explicit_path_name = None
self.preference = None
self.setup_priority = None
self.hold_priority = None
self.retry_timer = None
self.associated_rsvp_session = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.State, ['name', 'path_computation_method', 'use_cspf', 'cspf_tiebreaker', 'path_computation_server', 'explicit_path_name', 'preference', 'setup_priority', 'hold_priority', 'retry_timer', 'associated_rsvp_session'], name, value)
class AdminGroups(_Entity_):
"""
Top\-level container for include/exclude constraints for
link affinities
.. attribute:: config
Configuration data
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.AdminGroups.Config>`
.. attribute:: state
Operational state data
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.AdminGroups.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.AdminGroups, self).__init__()
self.yang_name = "admin-groups"
self.yang_parent_name = "p2p-secondary-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.AdminGroups.Config)), ("state", ("state", Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.AdminGroups.State))])
self._leafs = OrderedDict()
self.config = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.AdminGroups.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.AdminGroups.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "admin-groups"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.AdminGroups, [], name, value)
class Config(_Entity_):
"""
Configuration data
.. attribute:: exclude_group
list of references to named admin\-groups to exclude in path calculation
**type**\: list of str
**refers to**\: :py:class:`admin_group_name <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup>`
.. attribute:: include_all_group
list of references to named admin\-groups of which all must be included
**type**\: list of str
**refers to**\: :py:class:`admin_group_name <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup>`
.. attribute:: include_any_group
list of references to named admin\-groups of which one must be included
**type**\: list of str
**refers to**\: :py:class:`admin_group_name <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.AdminGroups.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "admin-groups"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('exclude_group', (YLeafList(YType.str, 'exclude-group'), ['str'])),
('include_all_group', (YLeafList(YType.str, 'include-all-group'), ['str'])),
('include_any_group', (YLeafList(YType.str, 'include-any-group'), ['str'])),
])
self.exclude_group = []
self.include_all_group = []
self.include_any_group = []
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.AdminGroups.Config, ['exclude_group', 'include_all_group', 'include_any_group'], name, value)
class State(_Entity_):
"""
Operational state data
.. attribute:: exclude_group
list of references to named admin\-groups to exclude in path calculation
**type**\: list of str
**refers to**\: :py:class:`admin_group_name <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup>`
**config**\: False
.. attribute:: include_all_group
list of references to named admin\-groups of which all must be included
**type**\: list of str
**refers to**\: :py:class:`admin_group_name <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup>`
**config**\: False
.. attribute:: include_any_group
list of references to named admin\-groups of which one must be included
**type**\: list of str
**refers to**\: :py:class:`admin_group_name <ydk.models.openconfig.openconfig_mpls.Mpls.TeGlobalAttributes.MplsAdminGroups.AdminGroup>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.AdminGroups.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "admin-groups"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('exclude_group', (YLeafList(YType.str, 'exclude-group'), ['str'])),
('include_all_group', (YLeafList(YType.str, 'include-all-group'), ['str'])),
('include_any_group', (YLeafList(YType.str, 'include-any-group'), ['str'])),
])
self.exclude_group = []
self.include_all_group = []
self.include_any_group = []
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.ConstrainedPath.Tunnels.Tunnel.P2pTunnelAttributes.P2pSecondaryPaths.P2pSecondaryPath.AdminGroups.State, ['exclude_group', 'include_all_group', 'include_any_group'], name, value)
class UnconstrainedPath(_Entity_):
"""
LSPs that use the IGP\-determined path, i.e., non
traffic\-engineered, or non constrained\-path
.. attribute:: path_setup_protocol
select and configure the signaling method for the LSP
**type**\: :py:class:`PathSetupProtocol <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.UnconstrainedPath.PathSetupProtocol>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.UnconstrainedPath, self).__init__()
self.yang_name = "unconstrained-path"
self.yang_parent_name = "lsps"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("path-setup-protocol", ("path_setup_protocol", Mpls.Lsps.UnconstrainedPath.PathSetupProtocol))])
self._leafs = OrderedDict()
self.path_setup_protocol = Mpls.Lsps.UnconstrainedPath.PathSetupProtocol()
self.path_setup_protocol.parent = self
self._children_name_map["path_setup_protocol"] = "path-setup-protocol"
self._segment_path = lambda: "unconstrained-path"
self._absolute_path = lambda: "openconfig-mpls:mpls/lsps/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.UnconstrainedPath, [], name, value)
class PathSetupProtocol(_Entity_):
"""
select and configure the signaling method for
the LSP
.. attribute:: ldp
LDP signaling setup for IGP\-congruent LSPs
**type**\: :py:class:`Ldp <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.UnconstrainedPath.PathSetupProtocol.Ldp>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.UnconstrainedPath.PathSetupProtocol, self).__init__()
self.yang_name = "path-setup-protocol"
self.yang_parent_name = "unconstrained-path"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("ldp", ("ldp", Mpls.Lsps.UnconstrainedPath.PathSetupProtocol.Ldp))])
self._leafs = OrderedDict()
self.ldp = Mpls.Lsps.UnconstrainedPath.PathSetupProtocol.Ldp()
self.ldp.parent = self
self._children_name_map["ldp"] = "ldp"
self._segment_path = lambda: "path-setup-protocol"
self._absolute_path = lambda: "openconfig-mpls:mpls/lsps/unconstrained-path/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.UnconstrainedPath.PathSetupProtocol, [], name, value)
class Ldp(_Entity_):
"""
LDP signaling setup for IGP\-congruent LSPs
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.UnconstrainedPath.PathSetupProtocol.Ldp, self).__init__()
self.yang_name = "ldp"
self.yang_parent_name = "path-setup-protocol"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict()
self._segment_path = lambda: "ldp"
self._absolute_path = lambda: "openconfig-mpls:mpls/lsps/unconstrained-path/path-setup-protocol/%s" % self._segment_path()
self._is_frozen = True
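# Illustrative usage (a minimal sketch): the unconstrained-path subtree is a
# singleton under openconfig-mpls:mpls/lsps, and the generated Ldp container
# carries no leaves, so selecting LDP signalling only involves the
# pre-instantiated containers, e.g. when building a read filter. It is assumed
# here that the enclosing Lsps container exposes `unconstrained_path`.
#
#   mpls = Mpls()
#   ldp_filter = mpls.lsps.unconstrained_path.path_setup_protocol.ldp
#   # `ldp_filter` could then be passed to a CRUD/NETCONF read; the service and
#   # provider plumbing lives outside this module and is not shown here.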
class StaticLsps(_Entity_):
"""
statically configured LSPs, without dynamic
signaling
.. attribute:: static_lsp
list of defined static LSPs
**type**\: list of :py:class:`StaticLsp <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.StaticLsps.StaticLsp>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.StaticLsps, self).__init__()
self.yang_name = "static-lsps"
self.yang_parent_name = "lsps"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("static-lsp", ("static_lsp", Mpls.Lsps.StaticLsps.StaticLsp))])
self._leafs = OrderedDict()
self.static_lsp = YList(self)
self._segment_path = lambda: "static-lsps"
self._absolute_path = lambda: "openconfig-mpls:mpls/lsps/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.StaticLsps, [], name, value)
class StaticLsp(_Entity_):
"""
list of defined static LSPs
.. attribute:: name (key)
Reference to the name list key
**type**\: str
**refers to**\: :py:class:`name <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.StaticLsps.StaticLsp.Config>`
.. attribute:: config
Configuration data for the static lsp
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.StaticLsps.StaticLsp.Config>`
.. attribute:: state
Operational state data for the static lsp
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.StaticLsps.StaticLsp.State>`
**config**\: False
.. attribute:: ingress
Static LSPs for which the router is an ingress node
**type**\: :py:class:`Ingress <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.StaticLsps.StaticLsp.Ingress>`
.. attribute:: transit
Static LSPs for which the router is a transit node
**type**\: :py:class:`Transit <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.StaticLsps.StaticLsp.Transit>`
.. attribute:: egress
Static LSPs for which the router is an egress node
**type**\: :py:class:`Egress <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.StaticLsps.StaticLsp.Egress>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.StaticLsps.StaticLsp, self).__init__()
self.yang_name = "static-lsp"
self.yang_parent_name = "static-lsps"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['name']
self._child_classes = OrderedDict([("config", ("config", Mpls.Lsps.StaticLsps.StaticLsp.Config)), ("state", ("state", Mpls.Lsps.StaticLsps.StaticLsp.State)), ("ingress", ("ingress", Mpls.Lsps.StaticLsps.StaticLsp.Ingress)), ("transit", ("transit", Mpls.Lsps.StaticLsps.StaticLsp.Transit)), ("egress", ("egress", Mpls.Lsps.StaticLsps.StaticLsp.Egress))])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
])
self.name = None
self.config = Mpls.Lsps.StaticLsps.StaticLsp.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Lsps.StaticLsps.StaticLsp.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self.ingress = Mpls.Lsps.StaticLsps.StaticLsp.Ingress()
self.ingress.parent = self
self._children_name_map["ingress"] = "ingress"
self.transit = Mpls.Lsps.StaticLsps.StaticLsp.Transit()
self.transit.parent = self
self._children_name_map["transit"] = "transit"
self.egress = Mpls.Lsps.StaticLsps.StaticLsp.Egress()
self.egress.parent = self
self._children_name_map["egress"] = "egress"
self._segment_path = lambda: "static-lsp" + "[name='" + str(self.name) + "']"
self._absolute_path = lambda: "openconfig-mpls:mpls/lsps/static-lsps/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.StaticLsps.StaticLsp, ['name'], name, value)
class Config(_Entity_):
"""
Configuration data for the static lsp
.. attribute:: name
name to identify the LSP
**type**\: str
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.StaticLsps.StaticLsp.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "static-lsp"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
])
self.name = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.StaticLsps.StaticLsp.Config, ['name'], name, value)
class State(_Entity_):
"""
Operational state data for the static lsp
.. attribute:: name
name to identify the LSP
**type**\: str
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.StaticLsps.StaticLsp.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "static-lsp"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', (YLeaf(YType.str, 'name'), ['str'])),
])
self.name = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.StaticLsps.StaticLsp.State, ['name'], name, value)
class Ingress(_Entity_):
"""
Static LSPs for which the router is an
ingress node
.. attribute:: config
Configuration data for ingress LSPs
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.StaticLsps.StaticLsp.Ingress.Config>`
.. attribute:: state
Operational state data for ingress LSPs
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.StaticLsps.StaticLsp.Ingress.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.StaticLsps.StaticLsp.Ingress, self).__init__()
self.yang_name = "ingress"
self.yang_parent_name = "static-lsp"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.Lsps.StaticLsps.StaticLsp.Ingress.Config)), ("state", ("state", Mpls.Lsps.StaticLsps.StaticLsp.Ingress.State))])
self._leafs = OrderedDict()
self.config = Mpls.Lsps.StaticLsps.StaticLsp.Ingress.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Lsps.StaticLsps.StaticLsp.Ingress.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "ingress"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.StaticLsps.StaticLsp.Ingress, [], name, value)
class Config(_Entity_):
"""
Configuration data for ingress LSPs
.. attribute:: next_hop
next hop IP address for the LSP
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
.. attribute:: incoming_label
label value on the incoming packet
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
.. attribute:: push_label
label value to push at the current hop for the LSP
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.StaticLsps.StaticLsp.Ingress.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "ingress"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('next_hop', (YLeaf(YType.str, 'next-hop'), ['str','str'])),
('incoming_label', (YLeaf(YType.str, 'incoming-label'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
('push_label', (YLeaf(YType.str, 'push-label'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
])
self.next_hop = None
self.incoming_label = None
self.push_label = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.StaticLsps.StaticLsp.Ingress.Config, ['next_hop', 'incoming_label', 'push_label'], name, value)
class State(_Entity_):
"""
Operational state data for ingress LSPs
.. attribute:: next_hop
next hop IP address for the LSP
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
**config**\: False
.. attribute:: incoming_label
label value on the incoming packet
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
**config**\: False
.. attribute:: push_label
label value to push at the current hop for the LSP
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.StaticLsps.StaticLsp.Ingress.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "ingress"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('next_hop', (YLeaf(YType.str, 'next-hop'), ['str','str'])),
('incoming_label', (YLeaf(YType.str, 'incoming-label'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
('push_label', (YLeaf(YType.str, 'push-label'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
])
self.next_hop = None
self.incoming_label = None
self.push_label = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.StaticLsps.StaticLsp.Ingress.State, ['next_hop', 'incoming_label', 'push_label'], name, value)
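# Illustrative usage (a minimal sketch): an ingress entry pushes a label toward
# the configured next hop. Plain numeric labels must fall within 16..1048575;
# reserved values come from openconfig_segment_routing.MplsLabel. The entity
# `static_lsp` is hypothetical here (see the fuller sketch further below).
#
#   ing = static_lsp.ingress
#   ing.config.next_hop = "192.0.2.1"   # hypothetical IPv4 next hop
#   ing.config.push_label = 100004      # numeric label within the allowed range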
class Transit(_Entity_):
"""
Static LSPs for which the router is a
transit node
.. attribute:: config
Configuration data for transit LSPs
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.StaticLsps.StaticLsp.Transit.Config>`
.. attribute:: state
Operational state data for transit LSPs
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.StaticLsps.StaticLsp.Transit.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.StaticLsps.StaticLsp.Transit, self).__init__()
self.yang_name = "transit"
self.yang_parent_name = "static-lsp"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.Lsps.StaticLsps.StaticLsp.Transit.Config)), ("state", ("state", Mpls.Lsps.StaticLsps.StaticLsp.Transit.State))])
self._leafs = OrderedDict()
self.config = Mpls.Lsps.StaticLsps.StaticLsp.Transit.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Lsps.StaticLsps.StaticLsp.Transit.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "transit"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.StaticLsps.StaticLsp.Transit, [], name, value)
class Config(_Entity_):
"""
Configuration data for transit LSPs
.. attribute:: next_hop
next hop IP address for the LSP
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
.. attribute:: incoming_label
label value on the incoming packet
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
.. attribute:: push_label
label value to push at the current hop for the LSP
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.StaticLsps.StaticLsp.Transit.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "transit"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('next_hop', (YLeaf(YType.str, 'next-hop'), ['str','str'])),
('incoming_label', (YLeaf(YType.str, 'incoming-label'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
('push_label', (YLeaf(YType.str, 'push-label'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
])
self.next_hop = None
self.incoming_label = None
self.push_label = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.StaticLsps.StaticLsp.Transit.Config, ['next_hop', 'incoming_label', 'push_label'], name, value)
class State(_Entity_):
"""
Operational state data for transit LSPs
.. attribute:: next_hop
next hop IP address for the LSP
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
**config**\: False
.. attribute:: incoming_label
label value on the incoming packet
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
**config**\: False
.. attribute:: push_label
label value to push at the current hop for the LSP
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.StaticLsps.StaticLsp.Transit.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "transit"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('next_hop', (YLeaf(YType.str, 'next-hop'), ['str','str'])),
('incoming_label', (YLeaf(YType.str, 'incoming-label'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
('push_label', (YLeaf(YType.str, 'push-label'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
])
self.next_hop = None
self.incoming_label = None
self.push_label = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.StaticLsps.StaticLsp.Transit.State, ['next_hop', 'incoming_label', 'push_label'], name, value)
class Egress(_Entity_):
"""
Static LSPs for which the router is an
egress node
.. attribute:: config
Configuration data for egress LSPs
**type**\: :py:class:`Config <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.StaticLsps.StaticLsp.Egress.Config>`
.. attribute:: state
Operational state data for egress LSPs
**type**\: :py:class:`State <ydk.models.openconfig.openconfig_mpls.Mpls.Lsps.StaticLsps.StaticLsp.Egress.State>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.StaticLsps.StaticLsp.Egress, self).__init__()
self.yang_name = "egress"
self.yang_parent_name = "static-lsp"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("config", ("config", Mpls.Lsps.StaticLsps.StaticLsp.Egress.Config)), ("state", ("state", Mpls.Lsps.StaticLsps.StaticLsp.Egress.State))])
self._leafs = OrderedDict()
self.config = Mpls.Lsps.StaticLsps.StaticLsp.Egress.Config()
self.config.parent = self
self._children_name_map["config"] = "config"
self.state = Mpls.Lsps.StaticLsps.StaticLsp.Egress.State()
self.state.parent = self
self._children_name_map["state"] = "state"
self._segment_path = lambda: "egress"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.StaticLsps.StaticLsp.Egress, [], name, value)
class Config(_Entity_):
"""
Configuration data for egress LSPs
.. attribute:: next_hop
next hop IP address for the LSP
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
.. attribute:: incoming_label
label value on the incoming packet
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
.. attribute:: push_label
label value to push at the current hop for the LSP
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.StaticLsps.StaticLsp.Egress.Config, self).__init__()
self.yang_name = "config"
self.yang_parent_name = "egress"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('next_hop', (YLeaf(YType.str, 'next-hop'), ['str','str'])),
('incoming_label', (YLeaf(YType.str, 'incoming-label'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
('push_label', (YLeaf(YType.str, 'push-label'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
])
self.next_hop = None
self.incoming_label = None
self.push_label = None
self._segment_path = lambda: "config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.StaticLsps.StaticLsp.Egress.Config, ['next_hop', 'incoming_label', 'push_label'], name, value)
class State(_Entity_):
"""
Operational state data for egress LSPs
.. attribute:: next_hop
next hop IP address for the LSP
**type**\: union of the below types:
**type**\: str
**pattern:** ^(([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])$
**type**\: str
**pattern:** ^(([0\-9a\-fA\-F]{1,4}\:){7}[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,7}\:\|([0\-9a\-fA\-F]{1,4}\:){1,6}\:[0\-9a\-fA\-F]{1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,5}(\:[0\-9a\-fA\-F]{1,4}){1,2}\|([0\-9a\-fA\-F]{1,4}\:){1,4}(\:[0\-9a\-fA\-F]{1,4}){1,3}\|([0\-9a\-fA\-F]{1,4}\:){1,3}(\:[0\-9a\-fA\-F]{1,4}){1,4}\|([0\-9a\-fA\-F]{1,4}\:){1,2}(\:[0\-9a\-fA\-F]{1,4}){1,5}\|[0\-9a\-fA\-F]{1,4}\:((\:[0\-9a\-fA\-F]{1,4}){1,6})\|\:((\:[0\-9a\-fA\-F]{1,4}){1,7}\|\:))$
**config**\: False
.. attribute:: incoming_label
label value on the incoming packet
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
**config**\: False
.. attribute:: push_label
label value to push at the current hop for the LSP
**type**\: union of the below types:
**type**\: int
**range:** 16..1048575
**type**\: :py:class:`MplsLabel <ydk.models.openconfig.openconfig_segment_routing.MplsLabel>`
**config**\: False
"""
_prefix = 'oc-mpls'
_revision = '2017-03-22'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Mpls.Lsps.StaticLsps.StaticLsp.Egress.State, self).__init__()
self.yang_name = "state"
self.yang_parent_name = "egress"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('next_hop', (YLeaf(YType.str, 'next-hop'), ['str','str'])),
('incoming_label', (YLeaf(YType.str, 'incoming-label'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
('push_label', (YLeaf(YType.str, 'push-label'), ['int',('ydk.models.openconfig.openconfig_segment_routing', 'MplsLabel', '')])),
])
self.next_hop = None
self.incoming_label = None
self.push_label = None
self._segment_path = lambda: "state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Mpls.Lsps.StaticLsps.StaticLsp.Egress.State, ['next_hop', 'incoming_label', 'push_label'], name, value)
def clone_ptr(self):
self._top_entity = Mpls()
return self._top_entity
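# Usage sketch (session parameters below are hypothetical, not part of the
# generated bindings): the model can be read over NETCONF with YDK's CRUD
# service.
#
#     from ydk.services import CRUDService
#     from ydk.providers import NetconfServiceProvider
#     provider = NetconfServiceProvider(address='10.0.0.1',
#                                       username='admin', password='admin')
#     mpls = CRUDService().read(provider, Mpls())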
| 53.006741 | 921 | 0.427099 | 48,164 | 629,084 | 5.371003 | 0.017399 | 0.003618 | 0.008041 | 0.009649 | 0.929784 | 0.911492 | 0.885987 | 0.863736 | 0.839197 | 0.814948 | 0 | 0.024741 | 0.476095 | 629,084 | 11,867 | 922 | 53.011208 | 0.760164 | 0.265546 | 0 | 0.778352 | 0 | 0.001076 | 0.109019 | 0.034121 | 0 | 0 | 0 | 0.000169 | 0 | 1 | 0.07295 | false | 0.001291 | 0.001722 | 0 | 0.11728 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
96a3536131456cea7347608ae31f0cb4fcf5376c | 9,972 | py | Python | 2019/09/test_common.py | SabatierBoris/adventofcode | 19849a209e4e6d9d73ef5a5458c1831061a3ea42 | [
"MIT"
] | null | null | null | 2019/09/test_common.py | SabatierBoris/adventofcode | 19849a209e4e6d9d73ef5a5458c1831061a3ea42 | [
"MIT"
] | null | null | null | 2019/09/test_common.py | SabatierBoris/adventofcode | 19849a209e4e6d9d73ef5a5458c1831061a3ea42 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
import unittest
from common import Program
class TestGetMode(unittest.TestCase):
def test_default(self):
mode = Program.get_mode(1, 3)
self.assertEqual(mode, [0, 0, 0])
def test_setted(self):
mode = Program.get_mode(1001, 3)
self.assertEqual(mode, [0, 1, 0])
def test_full_setted(self):
mode = Program.get_mode(11101, 3)
self.assertEqual(mode, [1, 1, 1])
class TestErrors(unittest.TestCase):
def test_out_of_range(self):
p = Program("1, 0, 0, 100, 99")
with self.assertRaises(IndexError):
p.step()
def test_immediate_result_error(self):
p = Program("10001, 0, 0, 0, 99")
with self.assertRaises(AssertionError):
p.step()
    def test_unknown_opcode(self):
p = Program("98, 0, 0, 100, 99")
with self.assertRaises(AssertionError):
p.step()
class TestExecuteAdd(unittest.TestCase):
def test_simple(self):
p = Program("1,4,5,6,1,1,0")
p.step()
self.assertEqual(p._Program__pos, 4)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [1, 4, 5, 6, 1, 1, 2])
def test_inplace(self):
p = Program("1, 0, 0, 0, 99")
p.step()
self.assertEqual(p._Program__pos, 4)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [2, 0, 0, 0, 99])
def test_immediate_one(self):
p = Program("1001, 0, 4, 0, 99")
p.step()
self.assertEqual(p._Program__pos, 4)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [1005, 0, 4, 0, 99])
def test_immediate_two(self):
p = Program("1101, 5, 4, 0, 99")
p.step()
self.assertEqual(p._Program__pos, 4)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [9, 5, 4, 0, 99])
class TestExecuteMult(unittest.TestCase):
def test_simple(self):
p = Program("2, 4, 5, 6, 1, 1, 0")
p.step()
self.assertEqual(p._Program__pos, 4)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [2, 4, 5, 6, 1, 1, 1])
def test_inplace(self):
p = Program("2, 0, 0, 0, 99")
p.step()
self.assertEqual(p._Program__pos, 4)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [4, 0, 0, 0, 99])
def test_immediate_one(self):
p = Program("1002, 0, 4, 0, 99")
p.step()
self.assertEqual(p._Program__pos, 4)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [4008, 0, 4, 0, 99])
def test_immediate_two(self):
p = Program("1102, 5, 4, 0, 99")
p.step()
self.assertEqual(p._Program__pos, 4)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [20, 5, 4, 0, 99])
class TestInput(unittest.TestCase):
def test_simple(self):
p = Program("3, 1")
p.send_input(42)
p.step()
self.assertEqual(p._Program__pos, 2)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [3, 42])
class TestOutput(unittest.TestCase):
def test_simple(self):
p = Program("4, 0")
p.step()
self.assertEqual(p._Program__pos, 2)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [4, 0])
self.assertEqual(next(p), 4)
def test_direct(self):
p = Program("104, 0")
p.step()
self.assertEqual(p._Program__pos, 2)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [104, 0])
self.assertEqual(next(p), 0)
class TestJumpIfTrue(unittest.TestCase):
def test_simple_true(self):
p = Program("5, 0, 0")
p.step()
self.assertEqual(p._Program__pos, 5)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [5, 0, 0])
def test_direct_true(self):
p = Program("1105, 1, 0")
p.step()
self.assertEqual(p._Program__pos, 0)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [1105, 1, 0])
def test_simple_false(self):
p = Program("5, 2, 0")
p.step()
self.assertEqual(p._Program__pos, 3)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [5, 2, 0])
class TestJumpIfFalse(unittest.TestCase):
def test_simple_false(self):
p = Program("6, 2, 0")
p.step()
self.assertEqual(p._Program__pos, 6)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [6, 2, 0])
def test_direct_false(self):
p = Program("1106, 0, 0")
p.step()
self.assertEqual(p._Program__pos, 0)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [1106, 0, 0])
def test_simple_true(self):
p = Program("6, 1, 0")
p.step()
self.assertEqual(p._Program__pos, 3)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [6, 1, 0])
class TestLessThan(unittest.TestCase):
def test_true(self):
p = Program("7, 1, 0, 3")
p.step()
self.assertEqual(p._Program__pos, 4)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [7, 1, 0, 1])
def test_false(self):
p = Program("7, 0, 1, 3")
p.step()
self.assertEqual(p._Program__pos, 4)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [7, 0, 1, 0])
def test_direct_true(self):
p = Program("1107, 0, 1, 3")
p.step()
self.assertEqual(p._Program__pos, 4)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [1107, 0, 1, 1])
def test_direct_false(self):
p = Program("1107, 1, 0, 3")
p.step()
self.assertEqual(p._Program__pos, 4)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [1107, 1, 0, 0])
class TestEquals(unittest.TestCase):
def test_true(self):
p = Program("8, 0, 0, 3")
p.step()
self.assertEqual(p._Program__pos, 4)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [8, 0, 0, 1])
def test_false(self):
p = Program("8, 0, 1, 3")
p.step()
self.assertEqual(p._Program__pos, 4)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [8, 0, 1, 0])
def test_direct_true(self):
p = Program("1108, 1, 1, 3")
p.step()
self.assertEqual(p._Program__pos, 4)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [1108, 1, 1, 1])
def test_direct_false(self):
p = Program("1108, 1, 0, 3")
p.step()
self.assertEqual(p._Program__pos, 4)
self.assertEqual(p._Program__running, True)
self.assertEqual(p._Program__memory, [1108, 1, 0, 0])
class TestHalt(unittest.TestCase):
def test_simple(self):
p = Program("99")
p.step()
self.assertEqual(p._Program__pos, 1)
self.assertEqual(p._Program__running, False)
self.assertEqual(p._Program__memory, [99])
class TestFunctional(unittest.TestCase):
def test_day2_part1(self):
datas = [
("1,9,10,3,2,3,11,0,99,30,40,50", 3500),
]
for data, result in datas:
with self.subTest(data=data):
p = Program(data)
p.execute()
self.assertEqual(p._Program__memory[0], result)
def test_day5_part2(self):
datas = [
(
"3,21,1008,21,8,20,1005,20,22,107,8,21,20,1006,20,31,1106,0,36,98,0,0,1002,21,125,20,4,20,1105,1,46,104,999,1105,1,46,1101,1000,1,20,4,20,1105,1,46,98,99",
7,
999,
),
(
"3,21,1008,21,8,20,1005,20,22,107,8,21,20,1006,20,31,1106,0,36,98,0,0,1002,21,125,20,4,20,1105,1,46,104,999,1105,1,46,1101,1000,1,20,4,20,1105,1,46,98,99",
8,
1000,
),
(
"3,21,1008,21,8,20,1005,20,22,107,8,21,20,1006,20,31,1106,0,36,98,0,0,1002,21,125,20,4,20,1105,1,46,104,999,1105,1,46,1101,1000,1,20,4,20,1105,1,46,98,99",
9,
1001,
),
]
for data, data_input, result in datas:
with self.subTest(data=data):
p = Program(data)
p.send_input(data_input)
p.execute()
self.assertEqual(next(p), result)
def test_day9_part1(self):
datas = [
(
"109,1,204,-1,1001,100,1,100,1008,100,16,101,1006,101,0,99",
[
109,
1,
204,
-1,
1001,
100,
1,
100,
1008,
100,
16,
101,
1006,
101,
0,
99,
],
),
("1102,34915192,34915192,7,4,7,99,0", [1219070632396864]),
("104,1125899906842624,99", [1125899906842624]),
]
for data, result in datas:
with self.subTest(data=data):
p = Program(data)
p.execute()
self.assertEqual(list(p), result)
if __name__ == "__main__":
unittest.main()
| 31.657143 | 171 | 0.563277 | 1,341 | 9,972 | 3.956749 | 0.092468 | 0.167358 | 0.238221 | 0.342443 | 0.839427 | 0.791745 | 0.768564 | 0.719186 | 0.633245 | 0.607991 | 0 | 0.125393 | 0.297834 | 9,972 | 314 | 172 | 31.757962 | 0.632391 | 0.002106 | 0 | 0.523077 | 0 | 0.019231 | 0.094573 | 0.060101 | 0 | 0 | 0 | 0 | 0.342308 | 1 | 0.134615 | false | 0 | 0.007692 | 0 | 0.188462 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7365aec634051a38293a0565168219efb5353661 | 1,830 | py | Python | tests/module/test_login.py | onekiloparsec/arcsecond.python | e4b22bf055c7f089ca9f0d6c4bda6314350878e0 | [
"MIT"
] | 7 | 2018-08-29T15:31:25.000Z | 2022-01-08T14:08:39.000Z | tests/module/test_login.py | onekiloparsec/arcsecond-python | e4b22bf055c7f089ca9f0d6c4bda6314350878e0 | [
"MIT"
] | 2 | 2018-10-21T07:42:26.000Z | 2020-02-24T10:11:22.000Z | tests/module/test_login.py | onekiloparsec/arcsecond-python | e4b22bf055c7f089ca9f0d6c4bda6314350878e0 | [
"MIT"
] | null | null | null | import httpretty
from arcsecond import ArcsecondAPI, config
from tests.utils import TEST_API_KEY, TEST_LOGIN_PASSWORD, TEST_LOGIN_USERNAME, TEST_UPLOAD_KEY, clear_test_credentials, \
prepare_successful_login
@httpretty.activate
def test_login_basic():
clear_test_credentials()
assert config.config_file_read_api_key('test') is None
prepare_successful_login()
ArcsecondAPI.login(TEST_LOGIN_USERNAME, TEST_LOGIN_PASSWORD, debug=True, test=True)
assert config.config_file_read_api_key('test') is None
assert config.config_file_read_upload_key('test') is None
@httpretty.activate
def test_login_apikey():
clear_test_credentials()
assert config.config_file_read_api_key('test') is None
prepare_successful_login()
ArcsecondAPI.login(TEST_LOGIN_USERNAME, TEST_LOGIN_PASSWORD, api_key=True, debug=True, test=True)
assert config.config_file_read_api_key('test') == TEST_API_KEY
assert config.config_file_read_upload_key('test') is None
@httpretty.activate
def test_login_uploadkey():
clear_test_credentials()
assert config.config_file_read_api_key('test') is None
prepare_successful_login()
ArcsecondAPI.login(TEST_LOGIN_USERNAME, TEST_LOGIN_PASSWORD, upload_key=True, debug=True, test=True)
assert config.config_file_read_api_key('test') is None
assert config.config_file_read_upload_key('test') == TEST_UPLOAD_KEY
@httpretty.activate
def test_login_both_apikey_uploadkey():
clear_test_credentials()
assert config.config_file_read_api_key('test') is None
prepare_successful_login()
ArcsecondAPI.login(TEST_LOGIN_USERNAME, TEST_LOGIN_PASSWORD, api_key=True, upload_key=True, debug=True, test=True)
assert config.config_file_read_api_key('test') == TEST_API_KEY
assert config.config_file_read_upload_key('test') == TEST_UPLOAD_KEY
| 39.782609 | 122 | 0.8 | 263 | 1,830 | 5.13308 | 0.121673 | 0.093333 | 0.16 | 0.195556 | 0.857778 | 0.814815 | 0.814815 | 0.814815 | 0.814815 | 0.814815 | 0 | 0 | 0.11694 | 1,830 | 45 | 123 | 40.666667 | 0.835396 | 0 | 0 | 0.666667 | 0 | 0 | 0.02623 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.111111 | true | 0.138889 | 0.083333 | 0 | 0.194444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 9 |
738366e57c2fd4e89b45aa6c9effa2f7b6e0cfe6 | 150 | py | Python | crosshair/__init__.py | cclauss/CrossHair | 44441e95722fe122b91a6b96e49cb03a56d91cc3 | [
"MIT"
] | 1 | 2020-01-16T03:24:23.000Z | 2020-01-16T03:24:23.000Z | crosshair/__init__.py | cclauss/CrossHair | 44441e95722fe122b91a6b96e49cb03a56d91cc3 | [
"MIT"
] | null | null | null | crosshair/__init__.py | cclauss/CrossHair | 44441e95722fe122b91a6b96e49cb03a56d91cc3 | [
"MIT"
] | null | null | null | from crosshair.core import realize
from crosshair.core import register_type
from crosshair.util import IgnoreAttempt
from crosshair.util import debug
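# Suggested addition (not in the upstream file): make the re-exported public
# API explicit for linters and star-imports.
__all__ = ['realize', 'register_type', 'IgnoreAttempt', 'debug']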
| 30 | 40 | 0.866667 | 21 | 150 | 6.142857 | 0.47619 | 0.403101 | 0.263566 | 0.356589 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.106667 | 150 | 4 | 41 | 37.5 | 0.962687 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
7395a2a2ed758a31abd377f3c7dcf0c50c909661 | 3,188 | py | Python | ingestion/waxstream/proto/bstream/v1/bstream_pb2_grpc.py | mharrisb1/peered-in | 7b2f0dd984e2eb4dca7369b71d5a6868978aed68 | [
"MIT"
] | null | null | null | ingestion/waxstream/proto/bstream/v1/bstream_pb2_grpc.py | mharrisb1/peered-in | 7b2f0dd984e2eb4dca7369b71d5a6868978aed68 | [
"MIT"
] | null | null | null | ingestion/waxstream/proto/bstream/v1/bstream_pb2_grpc.py | mharrisb1/peered-in | 7b2f0dd984e2eb4dca7369b71d5a6868978aed68 | [
"MIT"
] | null | null | null | # Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
import grpc
from . import bstream_pb2 as dfuse_dot_bstream_dot_v1_dot_bstream__pb2
class BlockStreamStub(object):
# missing associated documentation comment in .proto file
pass
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.Blocks = channel.unary_stream(
"/dfuse.bstream.v1.BlockStream/Blocks",
request_serializer=dfuse_dot_bstream_dot_v1_dot_bstream__pb2.BlockRequest.SerializeToString,
response_deserializer=dfuse_dot_bstream_dot_v1_dot_bstream__pb2.Block.FromString,
)
class BlockStreamServicer(object):
# missing associated documentation comment in .proto file
pass
def Blocks(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details("Method not implemented!")
raise NotImplementedError("Method not implemented!")
def add_BlockStreamServicer_to_server(servicer, server):
rpc_method_handlers = {
"Blocks": grpc.unary_stream_rpc_method_handler(
servicer.Blocks,
request_deserializer=dfuse_dot_bstream_dot_v1_dot_bstream__pb2.BlockRequest.FromString,
response_serializer=dfuse_dot_bstream_dot_v1_dot_bstream__pb2.Block.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
"dfuse.bstream.v1.BlockStream", rpc_method_handlers
)
server.add_generic_rpc_handlers((generic_handler,))
class BlockStreamV2Stub(object):
# missing associated documentation comment in .proto file
pass
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.Blocks = channel.unary_stream(
"/dfuse.bstream.v1.BlockStreamV2/Blocks",
request_serializer=dfuse_dot_bstream_dot_v1_dot_bstream__pb2.BlocksRequestV2.SerializeToString,
response_deserializer=dfuse_dot_bstream_dot_v1_dot_bstream__pb2.BlockResponseV2.FromString,
)
class BlockStreamV2Servicer(object):
# missing associated documentation comment in .proto file
pass
def Blocks(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details("Method not implemented!")
raise NotImplementedError("Method not implemented!")
def add_BlockStreamV2Servicer_to_server(servicer, server):
rpc_method_handlers = {
"Blocks": grpc.unary_stream_rpc_method_handler(
servicer.Blocks,
request_deserializer=dfuse_dot_bstream_dot_v1_dot_bstream__pb2.BlocksRequestV2.FromString,
response_serializer=dfuse_dot_bstream_dot_v1_dot_bstream__pb2.BlockResponseV2.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
"dfuse.bstream.v1.BlockStreamV2", rpc_method_handlers
)
server.add_generic_rpc_handlers((generic_handler,))
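# Client-side usage sketch (the address is a placeholder assumption; the stub
# and request types are defined above / in the generated pb2 module):
#
#     channel = grpc.insecure_channel('localhost:13035')
#     stub = BlockStreamV2Stub(channel)
#     request = dfuse_dot_bstream_dot_v1_dot_bstream__pb2.BlocksRequestV2()
#     for block in stub.Blocks(request):
#         print(block)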
| 35.032967 | 108 | 0.724279 | 346 | 3,188 | 6.289017 | 0.210983 | 0.082721 | 0.06204 | 0.074449 | 0.868107 | 0.868107 | 0.868107 | 0.868107 | 0.852941 | 0.809743 | 0 | 0.012643 | 0.206085 | 3,188 | 90 | 109 | 35.422222 | 0.847096 | 0.156211 | 0 | 0.535714 | 1 | 0 | 0.089973 | 0.050324 | 0 | 0 | 0 | 0 | 0 | 1 | 0.107143 | false | 0.107143 | 0.035714 | 0 | 0.214286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
73a053023215657af248b723e7c38ef0f142bcd9 | 267 | py | Python | utils/constants.py | iBeCo/analytics | c71c80a7cacd55078c1a9dd463cb4e66aa868764 | [
"Apache-2.0"
] | null | null | null | utils/constants.py | iBeCo/analytics | c71c80a7cacd55078c1a9dd463cb4e66aa868764 | [
"Apache-2.0"
] | null | null | null | utils/constants.py | iBeCo/analytics | c71c80a7cacd55078c1a9dd463cb4e66aa868764 | [
"Apache-2.0"
] | null | null | null | BECO_ADMIN = 'admin'
BECO_CUSTOMER = 'customer'
BECO_RETAILER = 'retailer'
BECO_ASSOCIATE = 'associate'
USER_ROLES = (
(BECO_ADMIN, 'Admin'),
(BECO_CUSTOMER, 'Customer'),
(BECO_RETAILER, 'Retailer'),
(BECO_ASSOCIATE, 'Associate'),
)
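# Usage sketch (model and field names are hypothetical): the pairs above
# follow Django's ``choices`` convention, e.g.
#
#     role = models.CharField(max_length=16, choices=USER_ROLES,
#                             default=BECO_CUSTOMER)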
| 20.538462 | 38 | 0.636704 | 26 | 267 | 6.192308 | 0.269231 | 0.111801 | 0.173913 | 0.223602 | 0.944099 | 0.944099 | 0.944099 | 0.944099 | 0.944099 | 0.944099 | 0 | 0 | 0.217228 | 267 | 12 | 39 | 22.25 | 0.770335 | 0 | 0 | 0 | 0 | 0 | 0.224719 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
73f7ae6a8160f3f030d399470c41d86e35cdbce3 | 143 | py | Python | users/views.py | AndreMacedo88/VEnCode-Web | c4c760f4aaea213efcebbf8ab9277e1884aa85ec | [
"BSD-3-Clause"
] | null | null | null | users/views.py | AndreMacedo88/VEnCode-Web | c4c760f4aaea213efcebbf8ab9277e1884aa85ec | [
"BSD-3-Clause"
] | null | null | null | users/views.py | AndreMacedo88/VEnCode-Web | c4c760f4aaea213efcebbf8ab9277e1884aa85ec | [
"BSD-3-Clause"
] | null | null | null | from django.shortcuts import render
# Create your views here.
def get_user_profile(request):
return render(request, 'user_profile.html') | 20.428571 | 47 | 0.776224 | 20 | 143 | 5.4 | 0.8 | 0.203704 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.13986 | 143 | 7 | 47 | 20.428571 | 0.878049 | 0.160839 | 0 | 0 | 0 | 0 | 0.142857 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
fb40c52be54452976cc5f9806152a279784bcca5 | 661 | py | Python | utils/generators.py | sebastbk/algorithms | 08063d9bb29cfef21e19166bd69b3969e9f8fc14 | [
"MIT"
] | null | null | null | utils/generators.py | sebastbk/algorithms | 08063d9bb29cfef21e19166bd69b3969e9f8fc14 | [
"MIT"
] | null | null | null | utils/generators.py | sebastbk/algorithms | 08063d9bb29cfef21e19166bd69b3969e9f8fc14 | [
"MIT"
] | null | null | null | def len_lt(generator, n):
    # True when the generator yields fewer than n items; stops consuming
    # items as soon as the answer is known.
    count = 0
    for _ in generator:
        count += 1
        if count >= n:
            return False
    return count < n
def len_lte(generator, n):
    # True when the generator yields at most n items.
    count = 0
    for _ in generator:
        count += 1
        if count > n:
            return False
    return count <= n
def len_eq(generator, n):
    # True when the generator yields exactly n items.
    count = 0
    for _ in generator:
        count += 1
        if count > n:
            return False
    return count == n
def len_gt(generator, n):
    # True when the generator yields more than n items.
    count = 0
    for _ in generator:
        count += 1
        if count > n:
            return True
    return count > n
def len_gte(generator, n):
    # True when the generator yields at least n items.
    count = 0
    for _ in generator:
        count += 1
        if count >= n:
            return True
    return count >= n
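# Minimal self-check sketch (added as an illustration; not in the original
# module) exercising the helpers on small finite generators.
if __name__ == '__main__':
    assert len_eq(iter('abc'), 3)
    assert len_lt(iter('ab'), 3) and not len_lt(iter('abc'), 3)
    assert len_lte(iter('abc'), 3)
    assert len_gt(iter('abcd'), 3) and not len_gt(iter('abc'), 3)
    assert len_gte(iter(range(5)), 5)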
| 19.441176 | 38 | 0.521936 | 86 | 661 | 3.895349 | 0.197674 | 0.035821 | 0.19403 | 0.208955 | 0.904478 | 0.904478 | 0.904478 | 0.904478 | 0.904478 | 0.904478 | 0 | 0 | 0.397882 | 661 | 33 | 39 | 20.030303 | 0.841709 | 0 | 0 | 0.76 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
fb42ed842eb7d5efd90cf19605be493cc716e4f1 | 9,035 | py | Python | app/tests/test_anti_virus_check.py | ONSdigital/sdx-seft-consumer-service | 13f35143c290a3bf42a79a8127b3045035694c57 | [
"MIT"
] | 1 | 2018-03-06T12:35:30.000Z | 2018-03-06T12:35:30.000Z | app/tests/test_anti_virus_check.py | ONSdigital/sdx-seft-consumer-service | 13f35143c290a3bf42a79a8127b3045035694c57 | [
"MIT"
] | 80 | 2017-05-31T10:30:27.000Z | 2021-03-25T21:51:18.000Z | app/tests/test_anti_virus_check.py | ONSdigital/sdx-seft-consumer-service | 13f35143c290a3bf42a79a8127b3045035694c57 | [
"MIT"
] | 1 | 2021-04-11T07:50:10.000Z | 2021-04-11T07:50:10.000Z | import unittest
import requests
import responses
from sdc.rabbit.exceptions import QuarantinableError, RetryableError, BadMessageError
from app import settings
from app.anti_virus_check import AntiVirusCheck
from app.main import Payload
class AntiVirusCheckTests(unittest.TestCase):
@responses.activate
def test_api_key(self):
settings.ANTI_VIRUS_API_KEY = "test"
data_id = '123'
responses.add(responses.POST, settings.ANTI_VIRUS_BASE_URL, json={'data_id': data_id}, status=200)
responses.add(responses.GET, settings.ANTI_VIRUS_BASE_URL + "/" + data_id,
json={
'scan_results': {'scan_all_result_i': 0},
'process_info': {'progress_percentage': 100, 'result': 'Allowed'}
}, status=200)
anti_virus = AntiVirusCheck(tx_id=1)
payload = Payload(decoded_contents="test", file_name="test", case_id="1", survey_id="1")
self.assertTrue(anti_virus.send_for_av_scan(payload))
self.assertEqual(responses.calls[0].request.headers['apikey'], settings.ANTI_VIRUS_API_KEY)
@responses.activate
def test_send_for_av_scan_success(self):
data_id = '123'
responses.add(responses.POST, settings.ANTI_VIRUS_BASE_URL, json={'data_id': data_id}, status=200)
responses.add(responses.GET, settings.ANTI_VIRUS_BASE_URL + "/" + data_id,
json={
'scan_results': {'scan_all_result_i': 0},
'process_info': {'progress_percentage': 100, 'result': 'Allowed'}
}, status=200)
anti_virus = AntiVirusCheck(tx_id=1)
payload = Payload(decoded_contents="test", file_name="test", case_id="1", survey_id="1")
self.assertTrue(anti_virus.send_for_av_scan(payload))
@responses.activate
def test_send_for_av_scan_request_exception(self):
responses.add(responses.POST, settings.ANTI_VIRUS_BASE_URL, body=requests.RequestException())
anti_virus = AntiVirusCheck(tx_id=1)
payload = Payload(decoded_contents="test", file_name="test", case_id="1", survey_id="1")
with self.assertRaises(RetryableError):
anti_virus.send_for_av_scan(payload)
@responses.activate
def test_get_results_returns_request_exception(self):
data_id = '123'
responses.add(responses.POST, settings.ANTI_VIRUS_BASE_URL, json={'data_id': data_id}, status=200)
responses.add(responses.GET, settings.ANTI_VIRUS_BASE_URL + "/" + data_id, body=requests.RequestException())
anti_virus = AntiVirusCheck(tx_id=1)
payload = Payload(decoded_contents="test", file_name="test", case_id="1", survey_id="1")
with self.assertRaises(RetryableError):
self.assertTrue(anti_virus.send_for_av_scan(payload))
@responses.activate
def test_send_for_av_scan_json_decode_error(self):
        # the body is JSON null rather than the expected object, so this should throw a retryable error
responses.add(responses.POST, settings.ANTI_VIRUS_BASE_URL, json=None, status=200)
anti_virus = AntiVirusCheck(tx_id=1)
payload = Payload(decoded_contents="test", file_name="test", case_id="1", survey_id="1")
with self.assertRaises(RetryableError):
anti_virus.send_for_av_scan(payload)
@responses.activate
def test_send_for_av_scan_type_error(self):
# should be json not raw text so this should throw a retryable error
responses.add(responses.POST, settings.ANTI_VIRUS_BASE_URL, body='test', status=200)
anti_virus = AntiVirusCheck(tx_id=1)
payload = Payload(decoded_contents="test", file_name="test", case_id="1", survey_id="1")
with self.assertRaises(RetryableError):
anti_virus.send_for_av_scan(payload)
@responses.activate
def test_send_for_av_scan_failure(self):
data_id = '123'
responses.add(responses.POST, settings.ANTI_VIRUS_BASE_URL, json={'data_id': data_id}, status=200)
responses.add(responses.GET, settings.ANTI_VIRUS_BASE_URL + "/" + data_id,
json={
'scan_results': {'scan_all_result_i': 0},
'process_info': {'progress_percentage': 100, 'result': 'Blocked'}
}, status=200)
anti_virus = AntiVirusCheck(tx_id=1)
payload = Payload(decoded_contents="test", file_name="test", case_id="1", survey_id="1")
with self.assertRaises(QuarantinableError):
anti_virus.send_for_av_scan(payload)
@responses.activate
def test_send_for_av_scan_returns_err(self):
responses.add(responses.POST, settings.ANTI_VIRUS_BASE_URL, json={'err': 'unavailable'}, status=200)
anti_virus = AntiVirusCheck(tx_id=1)
payload = Payload(decoded_contents="test", file_name="test", case_id="1", survey_id="1")
with self.assertRaises(RetryableError):
anti_virus.send_for_av_scan(payload)
@responses.activate
def test_send_for_av_scan_not_ready_hits_max_attempts(self):
settings.ANTI_VIRUS_WAIT_TIME = 0.1
data_id = '123'
responses.add(responses.POST, settings.ANTI_VIRUS_BASE_URL, json={'data_id': data_id}, status=200)
responses.add(responses.GET, settings.ANTI_VIRUS_BASE_URL + "/" + data_id,
json={
'scan_results': {'scan_all_result_i': 0},
'process_info': {'progress_percentage': 50, 'result': 'Allowed'}
}, status=200)
anti_virus = AntiVirusCheck(tx_id=1)
payload = Payload(decoded_contents="test", file_name="test", case_id="1", survey_id="1")
with self.assertRaises(RetryableError):
self.assertTrue(anti_virus.send_for_av_scan(payload))
@responses.activate
def test_send_for_av_scan_forbidden_bad_api_key(self):
responses.add(responses.POST, settings.ANTI_VIRUS_BASE_URL, status=401)
anti_virus = AntiVirusCheck(tx_id=1)
payload = Payload(decoded_contents="test", file_name="test", case_id="1", survey_id="1")
with self.assertRaises(RetryableError):
anti_virus.send_for_av_scan(payload)
@responses.activate
def test_send_for_av_scan_bad_request(self):
responses.add(responses.POST, settings.ANTI_VIRUS_BASE_URL, status=400)
anti_virus = AntiVirusCheck(tx_id=1)
payload = Payload(decoded_contents="test", file_name="test", case_id="1", survey_id="1")
with self.assertRaises(BadMessageError):
anti_virus.send_for_av_scan(payload)
@responses.activate
def test_send_for_av_scan_not_found(self):
responses.add(responses.POST, settings.ANTI_VIRUS_BASE_URL, status=404)
anti_virus = AntiVirusCheck(tx_id=1)
payload = Payload(decoded_contents="test", file_name="test", case_id="1", survey_id="1")
with self.assertRaises(RetryableError):
anti_virus.send_for_av_scan(payload)
@responses.activate
def test_send_for_av_scan_forbidden(self):
responses.add(responses.POST, settings.ANTI_VIRUS_BASE_URL, status=403)
anti_virus = AntiVirusCheck(tx_id=1)
payload = Payload(decoded_contents="test", file_name="test", case_id="1", survey_id="1")
with self.assertRaises(RetryableError):
anti_virus.send_for_av_scan(payload)
@responses.activate
def test_send_for_av_scan_internal_server_error(self):
responses.add(responses.POST, settings.ANTI_VIRUS_BASE_URL, status=500)
anti_virus = AntiVirusCheck(tx_id=1)
payload = Payload(decoded_contents="test", file_name="test", case_id="1", survey_id="1")
with self.assertRaises(RetryableError):
anti_virus.send_for_av_scan(payload)
@responses.activate
def test_send_for_av_scan_service_unavailable(self):
responses.add(responses.POST, settings.ANTI_VIRUS_BASE_URL, status=503)
anti_virus = AntiVirusCheck(tx_id=1)
payload = Payload(decoded_contents="test", file_name="test", case_id="1", survey_id="1")
with self.assertRaises(RetryableError):
anti_virus.send_for_av_scan(payload)
@responses.activate
def test_send_for_av_scan_causes_type_error(self):
data_id = '123'
responses.add(responses.POST, settings.ANTI_VIRUS_BASE_URL, json={'data_id': data_id}, status=200)
responses.add(responses.GET, settings.ANTI_VIRUS_BASE_URL + "/" + data_id,
json={
'scan_results': {'scan_all_result_i': 0},
'process_info': {'progress_percentage': 'incorrect-value', 'result': 'Allowed'}
}, status=200)
anti_virus = AntiVirusCheck(tx_id=1)
payload = Payload(decoded_contents="test", file_name="test", case_id="1", survey_id="1")
with self.assertRaises(RetryableError):
anti_virus.send_for_av_scan(payload)
| 39.112554 | 116 | 0.670504 | 1,145 | 9,035 | 4.952838 | 0.100437 | 0.092047 | 0.047611 | 0.068771 | 0.886616 | 0.878505 | 0.878505 | 0.878505 | 0.87198 | 0.87198 | 0 | 0.020527 | 0.218152 | 9,035 | 230 | 117 | 39.282609 | 0.782276 | 0.014721 | 0 | 0.718954 | 0 | 0 | 0.071244 | 0 | 0 | 0 | 0 | 0 | 0.124183 | 1 | 0.104575 | false | 0 | 0.045752 | 0 | 0.156863 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
fba36ed6ee05d8835c8f389d92a1abd2e03f541c | 213 | py | Python | dlkit/runtime/handcar_configs.py | UOC/dlkit | a9d265db67e81b9e0f405457464e762e2c03f769 | [
"MIT"
] | 2 | 2018-02-23T12:16:11.000Z | 2020-10-08T17:54:24.000Z | dlkit/runtime/handcar_configs.py | UOC/dlkit | a9d265db67e81b9e0f405457464e762e2c03f769 | [
"MIT"
] | 87 | 2017-04-21T18:57:15.000Z | 2021-12-13T19:43:57.000Z | dlkit/runtime/handcar_configs.py | UOC/dlkit | a9d265db67e81b9e0f405457464e762e2c03f769 | [
"MIT"
] | 1 | 2018-03-01T16:44:25.000Z | 2018-03-01T16:44:25.000Z | # initialize with built-in configs
from dlkit.app_configs.handcar_configs import *
# override with project-level ones if provided
try:
from dlkit_configs.handcar_configs import *
except ImportError:
pass
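# A project-level override module (contents hypothetical) only needs to
# redefine the names it wants to change; everything else keeps the built-in
# value imported above, e.g. in dlkit_configs/handcar_configs.py:
#
#     SOME_SETTING = 'project-specific value'  # illustrative name only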
| 23.666667 | 47 | 0.793427 | 29 | 213 | 5.689655 | 0.689655 | 0.109091 | 0.254545 | 0.327273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15493 | 213 | 8 | 48 | 26.625 | 0.916667 | 0.361502 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.2 | 0.6 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 8 |
fbcd14eb321d9af9c3da9e373203d3225a038500 | 7,791 | py | Python | tests/test_renard.py | rob-smallshire/renard | 5d6432aeafa949c4b8009cd85d4acaeecab0e95c | [
"MIT"
] | 5 | 2018-01-11T18:58:52.000Z | 2020-01-26T01:41:15.000Z | tests/test_renard.py | rob-smallshire/renard | 5d6432aeafa949c4b8009cd85d4acaeecab0e95c | [
"MIT"
] | null | null | null | tests/test_renard.py | rob-smallshire/renard | 5d6432aeafa949c4b8009cd85d4acaeecab0e95c | [
"MIT"
] | null | null | null | import math
from hypothesis import given, assume
from hypothesis.strategies import sampled_from, floats, data, integers
from pytest import raises
from renard.renard import (RenardSeriesKey, series, rrange, find_less_than_or_equal, find_greater_than_or_equal,
find_nearest,
find_less_than, find_greater_than, find_nearest_few, open_rrange, R10, precision)
@given(series_key=sampled_from(RenardSeriesKey))
def test_series_cardinality(series_key):
assert len(series(series_key)) == series_key.cardinality
@given(series_key=sampled_from(RenardSeriesKey),
low=floats(min_value=1e-35, max_value=1e35, allow_nan=False, allow_infinity=False))
def test_rrange_cardinality_over_one_order_of_magnitude(series_key, low):
high = low * 10.0
assume(math.isfinite(high))
values = list(rrange(series_key, low, high))
include_end = bool(high in values)
cardinality = series_key.cardinality + include_end
assert len(values) == cardinality
@given(series_key=sampled_from(RenardSeriesKey),
low=floats(min_value=1e-35, max_value=1e35, allow_nan=False, allow_infinity=False),
high=floats(min_value=1e-35, max_value=1e35, allow_nan=False, allow_infinity=False))
def test_rrange_strictly_ordered(series_key, low, high):
assume(low < high)
values = list(rrange(series_key, low, high))
assert all(values[i] < values[i+1] for i in range(len(values)-1))
@given(series_key=sampled_from(RenardSeriesKey),
low=floats(min_value=1e-35, max_value=1e35, allow_nan=False, allow_infinity=False))
def test_open_rrange_cardinality_over_one_order_of_magnitude(series_key, low):
high = low * 10.0
assume(math.isfinite(high))
values = list(open_rrange(series_key, low, high))
cardinality = series_key.cardinality
assert len(values) == cardinality
@given(series_key=sampled_from(RenardSeriesKey),
low=floats(min_value=1e-35, max_value=1e35, allow_nan=False, allow_infinity=False),
high=floats(min_value=1e-35, max_value=1e35, allow_nan=False, allow_infinity=False))
def test_open_rrange_strictly_ordered(series_key, low, high):
assume(low < high)
values = list(open_rrange(series_key, low, high))
assert all(values[i] < values[i+1] for i in range(len(values)-1))
@given(series_key=sampled_from(RenardSeriesKey),
value=floats(min_value=1e-35, max_value=1e35, allow_nan=False, allow_infinity=False))
def test_less_than_or_equal(series_key, value):
assert find_less_than_or_equal(series_key, value) <= value
@given(data())
def test_less_than_or_equal_returns_value_from_series(data):
series_key = data.draw(sampled_from(RenardSeriesKey))
value = data.draw(sampled_from(series(series_key)))
assert find_less_than_or_equal(series_key, value) == value
@given(series_key=sampled_from(RenardSeriesKey),
value=floats(min_value=1e-35, max_value=1e35, allow_nan=False, allow_infinity=False))
def test_less_than(series_key, value):
assert find_less_than(series_key, value) < value
@given(series_key=sampled_from(RenardSeriesKey),
value=floats(min_value=1e-35, max_value=1e35, allow_nan=False, allow_infinity=False))
def test_greater_than_or_equal(series_key, value):
assert find_greater_than_or_equal(series_key, value) >= value
@given(data())
def test_greater_than_or_equal_returns_value_from_series(data):
series_key = data.draw(sampled_from(RenardSeriesKey))
value = data.draw(sampled_from(series(series_key)))
assert find_greater_than_or_equal(series_key, value) == value
@given(series_key=sampled_from(RenardSeriesKey),
value=floats(min_value=1e-35, max_value=1e35, allow_nan=False, allow_infinity=False))
def test_greater_than(series_key, value):
assert find_greater_than(series_key, value) > value
@given(series_key=sampled_from(RenardSeriesKey),
value=floats(min_value=1e-35, max_value=1e35, allow_nan=False, allow_infinity=False))
def test_find_nearest_in_range(series_key, value):
nearest = find_nearest(series_key, value)
assert find_less_than_or_equal(series_key, value) <= nearest <= find_greater_than_or_equal(series_key, value)
@given(series_key=sampled_from(RenardSeriesKey),
value=floats(min_value=1e-35, max_value=1e35, allow_nan=False, allow_infinity=False))
def test_find_nearest_is_nearest(series_key, value):
nearest = find_nearest(series_key, value)
lower = find_less_than_or_equal(series_key, value)
upper = find_greater_than_or_equal(series_key, value)
assert (((nearest == lower) and (nearest - lower <= upper - nearest))
or ((nearest == upper) and (upper - nearest <= nearest - lower)))
@given(data())
def test_nearest_returns_value_from_series(data):
series_key = data.draw(sampled_from(RenardSeriesKey))
value = data.draw(sampled_from(series(series_key)))
assert find_nearest(series_key, value) == value
@given(series_key=sampled_from(RenardSeriesKey),
value=floats(min_value=1e-35, max_value=1e35, allow_nan=False, allow_infinity=False),
num=sampled_from((1, 2, 3)))
def test_find_nearest_few_has_correct_cardinality(series_key, value, num):
assert len(find_nearest_few(series_key, value, num)) == num
@given(series_key=sampled_from(RenardSeriesKey),
value=floats(min_value=1e-35, max_value=1e35, allow_nan=False, allow_infinity=False),
num=integers())
def test_find_nearest_few_raises_error_with_num_out_of_range(series_key, value, num):
assume(num not in {1, 2, 3})
with raises(ValueError):
find_nearest_few(series_key, value, num)
@given(series_key=sampled_from(RenardSeriesKey),
value=floats(min_value=1e-35, max_value=1e35, allow_nan=False, allow_infinity=False))
def test_find_nearest_three_includes_at_least_one_less(series_key, value):
assert any(v < value for v in find_nearest_few(series_key, value))
@given(series_key=sampled_from(RenardSeriesKey),
value=floats(min_value=1e-35, max_value=1e35, allow_nan=False, allow_infinity=False))
def test_find_nearest_three_includes_at_least_one_greater(series_key, value):
assert any(v > value for v in find_nearest_few(series_key, value))
def test_rrange_start_infinite_raises_value_error():
with raises(ValueError):
inf = float("inf")
rrange(R10, inf, 10)
def test_rrange_stop_infinite_raises_value_error():
with raises(ValueError):
rrange(R10, 10, float("inf"))
def test_rrange_start_too_small_raises_value_error():
with raises(ValueError):
rrange(R10, 0, 10)
def test_rrange_stop_too_small_raises_value_error():
with raises(ValueError):
rrange(R10, 10, 0)
def test_rrange_start_stop_in_wrong_order_raises_value_error():
with raises(ValueError):
rrange(R10, 10, 8)
def test_open_rrange_start_infinite_raises_value_error():
with raises(ValueError):
inf = float("inf")
open_rrange(R10, inf, 10)
def test_open_rrange_stop_infinite_raises_value_error():
with raises(ValueError):
open_rrange(R10, 10, float("inf"))
def test_open_rrange_start_too_small_raises_value_error():
with raises(ValueError):
open_rrange(R10, 0, 10)
def test_open_rrange_stop_too_small_raises_value_error():
with raises(ValueError):
open_rrange(R10, 10, 0)
def test_open_rrange_start_stop_in_wrong_order_raises_value_error():
with raises(ValueError):
open_rrange(R10, 10, 8)
def test_illegal_series_key_raises_value_error():
with raises(ValueError):
series(13)
@given(series_key=sampled_from(RenardSeriesKey))
def test_series_precision_is_positive(series_key):
assert precision(series_key) > 0
def test_illegal_precision_series_key_raises_value_error():
with raises(ValueError):
precision(object())
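# Run with pytest; hypothesis generates the sampled inputs used above:
#     pytest tests/test_renard.py -q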
| 36.924171 | 113 | 0.758183 | 1,141 | 7,791 | 4.815075 | 0.090272 | 0.108118 | 0.068802 | 0.061158 | 0.853659 | 0.833819 | 0.813251 | 0.786312 | 0.743174 | 0.686749 | 0 | 0.024278 | 0.138236 | 7,791 | 210 | 114 | 37.1 | 0.794013 | 0 | 0 | 0.493151 | 0 | 0 | 0.001541 | 0 | 0 | 0 | 0 | 0 | 0.123288 | 1 | 0.212329 | false | 0 | 0.034247 | 0 | 0.246575 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
83a0a12d1e2d29ca2fdeacd8fb2d7f39ddcb0f7c | 6,439 | py | Python | mmdet/datasets/samplers/infinite_sampler.py | mrzhuzhe/mmdetection | c04ca2c2a65500bc248a5d2ab6ace5b15f00064d | [
"Apache-2.0"
] | null | null | null | mmdet/datasets/samplers/infinite_sampler.py | mrzhuzhe/mmdetection | c04ca2c2a65500bc248a5d2ab6ace5b15f00064d | [
"Apache-2.0"
] | null | null | null | mmdet/datasets/samplers/infinite_sampler.py | mrzhuzhe/mmdetection | c04ca2c2a65500bc248a5d2ab6ace5b15f00064d | [
"Apache-2.0"
] | null | null | null | # Copyright (c) OpenMMLab. All rights reserved.
import itertools
import numpy as np
import torch
from mmcv.runner import get_dist_info
from torch.utils.data.sampler import Sampler
class InfiniteGroupBatchSampler(Sampler):
"""Similar to `BatchSampler` warping a `GroupSampler. It is designed for
iteration-based runners like `IterBasedRunner` and yields a mini-batch
indices each time, all indices in a batch should be in the same group.
The implementation logic is referred to
https://github.com/facebookresearch/detectron2/blob/main/detectron2/data/samplers/grouped_batch_sampler.py
Args:
dataset (object): The dataset.
batch_size (int): When model is :obj:`DistributedDataParallel`,
it is the number of training samples on each GPU.
When model is :obj:`DataParallel`, it is
`num_gpus * samples_per_gpu`.
Default : 1.
world_size (int, optional): Number of processes participating in
distributed training. Default: None.
rank (int, optional): Rank of current process. Default: None.
seed (int): Random seed. Default: 0.
shuffle (bool): Whether shuffle the indices of a dummy `epoch`, it
should be noted that `shuffle` can not guarantee that you can
generate sequential indices because it need to ensure
that all indices in a batch is in a group. Default: True.
""" # noqa: W605
def __init__(self,
dataset,
batch_size=1,
world_size=None,
rank=None,
seed=0,
shuffle=True):
_rank, _world_size = get_dist_info()
if world_size is None:
world_size = _world_size
if rank is None:
rank = _rank
self.rank = rank
self.world_size = world_size
self.dataset = dataset
self.batch_size = batch_size
self.seed = seed if seed is not None else 0
self.shuffle = shuffle
assert hasattr(self.dataset, 'flag')
self.flag = self.dataset.flag
self.group_sizes = np.bincount(self.flag)
# buffer used to save indices of each group
self.buffer_per_group = {k: [] for k in range(len(self.group_sizes))}
self.size = len(dataset)
self.indices = self._indices_of_rank()
def _infinite_indices(self):
"""Infinitely yield a sequence of indices."""
g = torch.Generator()
g.manual_seed(self.seed)
while True:
if self.shuffle:
yield from torch.randperm(self.size, generator=g).tolist()
else:
yield from torch.arange(self.size).tolist()
def _indices_of_rank(self):
"""Slice the infinite indices by rank."""
yield from itertools.islice(self._infinite_indices(), self.rank, None,
self.world_size)
def __iter__(self):
# once batch size is reached, yield the indices
for idx in self.indices:
flag = self.flag[idx]
group_buffer = self.buffer_per_group[flag]
group_buffer.append(idx)
if len(group_buffer) == self.batch_size:
yield group_buffer[:]
del group_buffer[:]
def __len__(self):
"""Length of base dataset."""
return self.size
def set_epoch(self, epoch):
"""Not supported in `IterationBased` runner."""
raise NotImplementedError
class InfiniteBatchSampler(Sampler):
"""Similar to `BatchSampler` warping a `DistributedSampler. It is designed
iteration-based runners like `IterBasedRunner` and yields a mini-batch
indices each time.
The implementation logic is referred to
https://github.com/facebookresearch/detectron2/blob/main/detectron2/data/samplers/grouped_batch_sampler.py
Args:
dataset (object): The dataset.
batch_size (int): When model is :obj:`DistributedDataParallel`,
it is the number of training samples on each GPU,
When model is :obj:`DataParallel`, it is
`num_gpus * samples_per_gpu`.
Default : 1.
world_size (int, optional): Number of processes participating in
distributed training. Default: None.
rank (int, optional): Rank of current process. Default: None.
seed (int): Random seed. Default: 0.
shuffle (bool): Whether shuffle the dataset or not. Default: True.
""" # noqa: W605
def __init__(self,
dataset,
batch_size=1,
world_size=None,
rank=None,
seed=0,
shuffle=True):
_rank, _world_size = get_dist_info()
if world_size is None:
world_size = _world_size
if rank is None:
rank = _rank
self.rank = rank
self.world_size = world_size
self.dataset = dataset
self.batch_size = batch_size
self.seed = seed if seed is not None else 0
self.shuffle = shuffle
self.size = len(dataset)
self.indices = self._indices_of_rank()
def _infinite_indices(self):
"""Infinitely yield a sequence of indices."""
g = torch.Generator()
g.manual_seed(self.seed)
while True:
if self.shuffle:
yield from torch.randperm(self.size, generator=g).tolist()
else:
yield from torch.arange(self.size).tolist()
def _indices_of_rank(self):
"""Slice the infinite indices by rank."""
yield from itertools.islice(self._infinite_indices(), self.rank, None,
self.world_size)
def __iter__(self):
# once batch size is reached, yield the indices
batch_buffer = []
for idx in self.indices:
batch_buffer.append(idx)
if len(batch_buffer) == self.batch_size:
yield batch_buffer
batch_buffer = []
def __len__(self):
"""Length of base dataset."""
return self.size
def set_epoch(self, epoch):
"""Not supported in `IterationBased` runner."""
raise NotImplementedError
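# Usage sketch (dataset construction assumed elsewhere): either sampler is
# passed to a DataLoader as ``batch_sampler`` for iteration-based training.
#
#     sampler = InfiniteGroupBatchSampler(dataset, batch_size=2, seed=0)
#     loader = torch.utils.data.DataLoader(
#         dataset, batch_sampler=sampler, num_workers=2)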
| 37.219653 | 111 | 0.589688 | 756 | 6,439 | 4.869048 | 0.215608 | 0.04401 | 0.017387 | 0.015213 | 0.809563 | 0.765553 | 0.745993 | 0.745993 | 0.745993 | 0.745993 | 0 | 0.004637 | 0.330175 | 6,439 | 172 | 112 | 37.436047 | 0.848829 | 0.386706 | 0 | 0.791667 | 0 | 0 | 0.00112 | 0 | 0 | 0 | 0 | 0 | 0.010417 | 1 | 0.125 | false | 0 | 0.052083 | 0 | 0.21875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
83ea0152b25994525df46301d41b95415889052a | 44 | py | Python | app/blueprints/search/__init__.py | deb17/moneycare | 0f67142bd63079b473d80e26845341ef2763a283 | [
"MIT"
] | null | null | null | app/blueprints/search/__init__.py | deb17/moneycare | 0f67142bd63079b473d80e26845341ef2763a283 | [
"MIT"
] | null | null | null | app/blueprints/search/__init__.py | deb17/moneycare | 0f67142bd63079b473d80e26845341ef2763a283 | [
"MIT"
] | null | null | null | from app.blueprints.search.routes import bp
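# Typical registration in the application factory (names assumed from the
# import path above):
#     app.register_blueprint(bp, url_prefix='/search')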
| 22 | 43 | 0.840909 | 7 | 44 | 5.285714 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 44 | 1 | 44 | 44 | 0.925 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 7 |
83fca5bf2e298f0ff02a28250b181b4e5fafafa0 | 203 | py | Python | papermerge/core/apps.py | amo13/papermerge | d188acb01c7e2e7086d216cd496e65030d48ae52 | [
"Apache-2.0"
] | 1 | 2020-09-28T06:04:38.000Z | 2020-09-28T06:04:38.000Z | papermerge/core/apps.py | amo13/papermerge | d188acb01c7e2e7086d216cd496e65030d48ae52 | [
"Apache-2.0"
] | null | null | null | papermerge/core/apps.py | amo13/papermerge | d188acb01c7e2e7086d216cd496e65030d48ae52 | [
"Apache-2.0"
] | 1 | 2020-11-17T16:20:05.000Z | 2020-11-17T16:20:05.000Z | from django.apps import AppConfig
class CoreConfig(AppConfig):
name = 'papermerge.core'
    def ready(self):
        # imported for their side effects: these modules register
        # papermerge's signal handlers and Django system checks
        from papermerge.core import signals  # noqa: F401
        from papermerge.core import checks  # noqa: F401
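# Referenced from settings in the standard Django way (path assumed from the
# module location):
#     INSTALLED_APPS = [..., 'papermerge.core.apps.CoreConfig', ...]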
| 20.3 | 43 | 0.70936 | 24 | 203 | 6 | 0.625 | 0.291667 | 0.25 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.226601 | 203 | 9 | 44 | 22.555556 | 0.917197 | 0 | 0 | 0 | 0 | 0 | 0.073892 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.5 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
86073ac4ddc7a05e4b781f8b2a933c9fcd0ea779 | 6,308 | py | Python | maskrcnn_benchmark/layers/dcn/deform_pool_module.py | cxq1/paddle_VinVL | f9136871c43b033cd209ddc7579fa986208e37db | [
"MIT"
] | null | null | null | maskrcnn_benchmark/layers/dcn/deform_pool_module.py | cxq1/paddle_VinVL | f9136871c43b033cd209ddc7579fa986208e37db | [
"MIT"
] | null | null | null | maskrcnn_benchmark/layers/dcn/deform_pool_module.py | cxq1/paddle_VinVL | f9136871c43b033cd209ddc7579fa986208e37db | [
"MIT"
] | null | null | null | import paddle
from paddle import nn
from .deform_pool_func import deform_roi_pooling
# paddle's module base class is nn.Layer (nn.Module is the PyTorch name and
# does not exist in paddle.nn)
class DeformRoIPooling(nn.Layer):
def __init__(self,
spatial_scale,
out_size,
out_channels,
no_trans,
group_size=1,
part_size=None,
sample_per_part=4,
trans_std=.0):
super(DeformRoIPooling, self).__init__()
self.spatial_scale = spatial_scale
self.out_size = out_size
self.out_channels = out_channels
self.no_trans = no_trans
self.group_size = group_size
self.part_size = out_size if part_size is None else part_size
self.sample_per_part = sample_per_part
self.trans_std = trans_std
def forward(self, data, rois, offset):
if self.no_trans:
            offset = paddle.zeros([0], dtype=data.dtype)  # empty placeholder when no offset is used
return deform_roi_pooling(
data, rois, offset, self.spatial_scale, self.out_size,
self.out_channels, self.no_trans, self.group_size, self.part_size,
self.sample_per_part, self.trans_std)
class DeformRoIPoolingPack(DeformRoIPooling):
def __init__(self,
spatial_scale,
out_size,
out_channels,
no_trans,
group_size=1,
part_size=None,
sample_per_part=4,
trans_std=.0,
deform_fc_channels=1024):
super(DeformRoIPoolingPack,
self).__init__(spatial_scale, out_size, out_channels, no_trans,
group_size, part_size, sample_per_part, trans_std)
self.deform_fc_channels = deform_fc_channels
if not no_trans:
            # keep a named reference to the last layer so its parameters can
            # be zero-initialised (paddle parameters expose ``set_value``)
            last_fc = nn.Linear(self.deform_fc_channels,
                                self.out_size * self.out_size * 2)
            last_fc.weight.set_value(paddle.zeros_like(last_fc.weight))
            last_fc.bias.set_value(paddle.zeros_like(last_fc.bias))
            self.offset_fc = nn.Sequential(
                nn.Linear(self.out_size * self.out_size * self.out_channels,
                          self.deform_fc_channels),
                nn.ReLU(),
                nn.Linear(self.deform_fc_channels, self.deform_fc_channels),
                nn.ReLU(),
                last_fc)
    def forward(self, data, rois):
        assert data.shape[1] == self.out_channels
        if self.no_trans:
            offset = paddle.zeros([0], dtype=data.dtype)
            return deform_roi_pooling(
                data, rois, offset, self.spatial_scale, self.out_size,
                self.out_channels, self.no_trans, self.group_size,
                self.part_size, self.sample_per_part, self.trans_std)
        else:
            n = rois.shape[0]
            offset = paddle.zeros([0], dtype=data.dtype)
            x = deform_roi_pooling(data, rois, offset, self.spatial_scale,
                                   self.out_size, self.out_channels, True,
                                   self.group_size, self.part_size,
                                   self.sample_per_part, self.trans_std)
            offset = self.offset_fc(x.reshape([n, -1]))
            offset = offset.reshape([n, 2, self.out_size, self.out_size])
            return deform_roi_pooling(
                data, rois, offset, self.spatial_scale, self.out_size,
                self.out_channels, self.no_trans, self.group_size,
                self.part_size, self.sample_per_part, self.trans_std)
class ModulatedDeformRoIPoolingPack(DeformRoIPooling):
def __init__(self,
spatial_scale,
out_size,
out_channels,
no_trans,
group_size=1,
part_size=None,
sample_per_part=4,
trans_std=.0,
deform_fc_channels=1024):
super(ModulatedDeformRoIPoolingPack, self).__init__(
spatial_scale, out_size, out_channels, no_trans, group_size,
part_size, sample_per_part, trans_std)
self.deform_fc_channels = deform_fc_channels
if not no_trans:
            offset_last = nn.Linear(self.deform_fc_channels,
                                    self.out_size * self.out_size * 2)
            offset_last.weight.set_value(paddle.zeros_like(offset_last.weight))
            offset_last.bias.set_value(paddle.zeros_like(offset_last.bias))
            self.offset_fc = nn.Sequential(
                nn.Linear(self.out_size * self.out_size * self.out_channels,
                          self.deform_fc_channels),
                nn.ReLU(),
                nn.Linear(self.deform_fc_channels, self.deform_fc_channels),
                nn.ReLU(),
                offset_last)
            mask_last = nn.Linear(self.deform_fc_channels,
                                  self.out_size * self.out_size * 1)
            mask_last.weight.set_value(paddle.zeros_like(mask_last.weight))
            mask_last.bias.set_value(paddle.zeros_like(mask_last.bias))
            self.mask_fc = nn.Sequential(
                nn.Linear(self.out_size * self.out_size * self.out_channels,
                          self.deform_fc_channels),
                nn.ReLU(),
                mask_last,
                nn.Sigmoid())
    def forward(self, data, rois):
        assert data.shape[1] == self.out_channels
        if self.no_trans:
            offset = paddle.zeros([0], dtype=data.dtype)
            return deform_roi_pooling(
                data, rois, offset, self.spatial_scale, self.out_size,
                self.out_channels, self.no_trans, self.group_size,
                self.part_size, self.sample_per_part, self.trans_std)
        else:
            n = rois.shape[0]
            offset = paddle.zeros([0], dtype=data.dtype)
            x = deform_roi_pooling(data, rois, offset, self.spatial_scale,
                                   self.out_size, self.out_channels, True,
                                   self.group_size, self.part_size,
                                   self.sample_per_part, self.trans_std)
            offset = self.offset_fc(x.reshape([n, -1]))
            offset = offset.reshape([n, 2, self.out_size, self.out_size])
            mask = self.mask_fc(x.reshape([n, -1]))
            mask = mask.reshape([n, 1, self.out_size, self.out_size])
return deform_roi_pooling(
data, rois, offset, self.spatial_scale, self.out_size,
self.out_channels, self.no_trans, self.group_size,
self.part_size, self.sample_per_part, self.trans_std) * mask
| 41.774834 | 79 | 0.559765 | 758 | 6,308 | 4.327177 | 0.083113 | 0.083232 | 0.087195 | 0.085366 | 0.899695 | 0.84939 | 0.834146 | 0.834146 | 0.834146 | 0.834146 | 0 | 0.010056 | 0.353678 | 6,308 | 150 | 80 | 42.053333 | 0.794457 | 0 | 0 | 0.766917 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015038 | 1 | 0.045113 | false | 0 | 0.015038 | 0 | 0.120301 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f7b4b69162bfad9196f689e65663caa84060cd53 | 23,070 | py | Python | intern/service/boss/tests/test_baseversion.py | dxenes1/intern | d29754a6a1746ba4ee52ab875d46c76742afc7c2 | [
"Apache-2.0"
] | null | null | null | intern/service/boss/tests/test_baseversion.py | dxenes1/intern | d29754a6a1746ba4ee52ab875d46c76742afc7c2 | [
"Apache-2.0"
] | null | null | null | intern/service/boss/tests/test_baseversion.py | dxenes1/intern | d29754a6a1746ba4ee52ab875d46c76742afc7c2 | [
"Apache-2.0"
] | null | null | null | # Copyright 2016 The Johns Hopkins University Applied Physics Laboratory
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import unittest
from intern.service.boss.baseversion import BaseVersion
from intern.service.boss.v1.volume import CacheMode
from intern.resource.boss.resource import CollectionResource
from intern.resource.boss.resource import ChannelResource
import numpy
VER = 'v0.7'
class ProjectImpl(BaseVersion):
"""Create a concrete implementation of BaseVersion so it can be tested.
"""
@property
def version(self):
return VER
@property
def endpoint(self):
return 'collection'
class MetadataImpl(BaseVersion):
"""Create a concrete implementation of BaseVersion so it can be tested.
"""
@property
def version(self):
return VER
@property
def endpoint(self):
return 'meta'
class VolumeImpl(BaseVersion):
"""Create a concrete implementation of BaseVersion so it can be tested.
"""
@property
def version(self):
return VER
@property
def endpoint(self):
return 'cutout'
class BaseVersionTest(unittest.TestCase):
def setUp(self):
self.resource = CollectionResource('coll1')
self.chanResource = ChannelResource(
'chan1', 'coll1', 'exp1', 'image', 'null descr', 0, 'uint8', 0)
self.annoResource = ChannelResource(
'annChan', 'coll1', 'exp1', 'annotation', 'null descr',
0, 'uint64', 0, sources=['chan1'])
self.test_project = ProjectImpl()
self.test_meta = MetadataImpl()
self.test_volume = VolumeImpl()
self.url_prefix = 'https://api.theboss.io'
##
## Methods used for the project service.
##
def test_build_url_for_list(self):
"""A list operation's URL is different than any other operation. It
uses the plural form of the resource's type name rather than the
resource's name.
"""
actual = self.test_project.build_url(
self.resource, self.url_prefix, 'collection', req_type='list')
self.assertEqual(
self.url_prefix + '/' + self.test_project.version + '/' +
self.test_project.endpoint + '/',
actual)
def test_build_url_for_cutout(self):
"""Cutout URLs are also different than standard operations."""
actual = self.test_project.build_url(
self.chanResource, self.url_prefix, 'cutout', req_type='cutout')
coll = self.chanResource.coll_name
exp = self.chanResource.exp_name
chan = self.chanResource.name
self.assertEqual(
self.url_prefix + '/' + self.test_project.version + '/' +
'cutout/' + coll + '/' + exp + '/' + chan,
actual)
def test_build_url_normal(self):
"""Test standard use of BaseVersion.build_url().
"""
actual = self.test_project.build_url(
self.resource, self.url_prefix, 'collection', req_type='normal')
self.assertEqual(
self.url_prefix + '/' + self.test_project.version + '/' +
self.test_project.endpoint + '/' + self.resource.name,
actual)
def test_get_headers_gives_dict_with_content_type(self):
actual = self.test_project.get_headers('application/json', 'my_token')
self.assertTrue('Content-Type' in actual)
self.assertEqual('application/json', actual['Content-Type'])
def test_get_headers_gives_dict_with_authorization(self):
actual = self.test_project.get_headers('application/json', 'my_token')
self.assertTrue('Authorization' in actual)
self.assertEqual('Token my_token', actual['Authorization'])
def test_get_request(self):
url_prefix = 'https://api.theboss.io'
token = 'foobar'
actual = self.test_project.get_request(
self.resource, 'GET', 'application/json', url_prefix, token,
proj_list_req=False)
self.assertEqual(
'{}/{}/{}/{}'.format(url_prefix, self.test_project.version, self.test_project.endpoint, self.resource.name),
actual.url)
self.assertEqual('Token {}'.format(token), actual.headers['Authorization'])
self.assertEqual('application/json', actual.headers['Content-Type'])
def test_get_group_request(self):
url_prefix = 'https://api.theboss.io'
token = 'foobar'
grp_name = 'fire'
expected = '{}/{}/groups/{}/'.format(
url_prefix, self.test_project.version, grp_name)
actual = self.test_project.get_group_request(
'GET', 'application/json', url_prefix, token, grp_name)
self.assertEqual(expected, actual.url)
def test_get_permission_request(self):
url_prefix = 'https://api.theboss.io'
token = 'foobar'
grp_name = 'fire'
post_data = {"group": grp_name,
"permissions": ['update', 'add', 'delete'],
}
post_data.update(self.chanResource.get_dict_route())
expected = '{}/{}/permissions/'.format(url_prefix, self.test_project.version)
actual = self.test_project.get_permission_request(
'GET', 'application/json', url_prefix, token, post_data=post_data)
self.assertEqual(expected, actual.url)
def test_get_user_role_request(self):
url_prefix = 'https://api.theboss.io'
token = 'foobar'
user = 'fire'
role = 'admin'
expected = '{}/{}/sso/user-role/{}/{}'.format(
url_prefix, self.test_project.version, user, role)
actual = self.test_project.get_user_role_request(
'POST', 'application/json', url_prefix, token, user, role)
self.assertEqual(expected, actual.url)
def test_get_user_role_request_no_role(self):
url_prefix = 'https://api.theboss.io'
token = 'foobar'
user = 'fire'
expected = '{}/{}/sso/user-role/{}'.format(
url_prefix, self.test_project.version, user)
actual = self.test_project.get_user_role_request(
'POST', 'application/json', url_prefix, token, user)
self.assertEqual(expected, actual.url)
def test_get_user_request_just_username(self):
url_prefix = 'https://api.theboss.io'
token = 'foobar'
user = 'fire'
expected = '{}/{}/sso/user/{}'.format(
url_prefix, self.test_project.version, user)
actual = self.test_project.get_user_request(
'POST', 'application/json', url_prefix, token, user)
self.assertEqual(expected, actual.url)
def test_get_user_request_with_firstname(self):
url_prefix = 'https://api.theboss.io'
token = 'foobar'
user = 'fire'
first = 'Roger'
expected = '{}/{}/sso/user/{}'.format(
url_prefix, self.test_project.version, user)
expectedData = { 'first_name': first }
actual = self.test_project.get_user_request(
'POST', 'application/json', url_prefix, token, user, first)
self.assertEqual(expected, actual.url)
self.assertDictEqual(expectedData, actual.json)
def test_get_user_request_with_lastname(self):
url_prefix = 'https://api.theboss.io'
token = 'foobar'
user = 'fire'
last = 'Roger'
expected = '{}/{}/sso/user/{}'.format(
url_prefix, self.test_project.version, user)
expectedData = { 'last_name': last }
actual = self.test_project.get_user_request(
'POST', 'application/json', url_prefix, token, user, last_name=last)
self.assertEqual(expected, actual.url)
self.assertDictEqual(expectedData, actual.json)
def test_get_user_request_with_email(self):
url_prefix = 'https://api.theboss.io'
token = 'foobar'
user = 'fire'
email = 'Roger@me.com'
expected = '{}/{}/sso/user/{}'.format(
url_prefix, self.test_project.version, user)
expectedData = { 'email': email }
actual = self.test_project.get_user_request(
'POST', 'application/json', url_prefix, token, user, email=email)
self.assertEqual(expected, actual.url)
self.assertDictEqual(expectedData, actual.json)
def test_get_user_request_with_password(self):
url_prefix = 'https://api.theboss.io'
token = 'foobar'
user = 'fire'
password = 'password'
expected = '{}/{}/sso/user/{}'.format(
url_prefix, self.test_project.version, user)
expectedData = { 'password': password }
actual = self.test_project.get_user_request(
'POST', 'application/json', url_prefix, token, user, password=password)
self.assertEqual(expected, actual.url)
self.assertDictEqual(expectedData, actual.json)
def test_get_user_request_with_all_fields(self):
url_prefix = 'https://api.theboss.io'
token = 'foobar'
user = 'fire'
first = 'Roger'
last = 'Dodger'
email = 'Roger@me.com'
password = 'password'
expected = '{}/{}/sso/user/{}'.format(
url_prefix, self.test_project.version, user)
expectedData = {
'first_name': first, 'last_name': last, 'email': email,
'password': password }
actual = self.test_project.get_user_request(
'POST', 'application/json', url_prefix, token, user, first, last,
email, password)
self.assertEqual(expected, actual.url)
self.assertDictEqual(expectedData, actual.json)
##
## Methods used for the metadata service.
##
def test_build_metadata_url_no_value(self):
key = 'foo'
actual = self.test_meta.build_metadata_url(
self.resource, self.url_prefix, key)
self.assertEqual(
self.url_prefix + '/' + self.test_meta.version + '/' +
self.test_meta.endpoint + '/' + self.resource.name + '/?key=' + key,
actual)
def test_build_metadata_url_key_and_value(self):
key = 'foo'
value = 'bar'
actual = self.test_meta.build_metadata_url(
self.resource, self.url_prefix, key, value)
self.assertEqual(
self.url_prefix + '/' + self.test_meta.version + '/' +
self.test_meta.endpoint + '/' + self.resource.name + '/?key=' +
key + '&value=' + value,
actual)
def test_get_metadata_request(self):
url_prefix = 'https://api.theboss.io'
token = 'foobar'
key = 'version'
actual = self.test_meta.get_metadata_request(
self.resource, 'GET', 'application/json', url_prefix, token, key)
self.assertEqual(
'{}/{}/{}/{}/?key={}'.format(url_prefix, self.test_meta.version,
self.test_meta.endpoint, self.resource.name, key),
actual.url)
self.assertEqual('Token {}'.format(token), actual.headers['Authorization'])
self.assertEqual('application/json', actual.headers['Content-Type'])
##
## Methods used for the volume service.
##
def test_convert_int_list_range_to_str(self):
exp = '2:7'
actual = self.test_volume.convert_int_list_range_to_str([2,7])
self.assertEqual(exp, actual)
def test_convert_int_list_range_to_str_bad_range(self):
with self.assertRaises(RuntimeError):
self.test_volume.convert_int_list_range_to_str([7,5])
def test_convert_int_list_range_to_str_wrong_number_of_elements(self):
with self.assertRaises(RuntimeError):
self.test_volume.convert_int_list_range_to_str([5, 7, 9])
def test_convert_int_list_range_to_str_no_list(self):
with self.assertRaises(RuntimeError):
self.test_volume.convert_int_list_range_to_str('5, 7')
def test_build_cutout_url_no_time_range(self):
res = 0
x_rng_lst = [20, 40]
x_range = '20:40'
y_rng_lst = [50, 70]
y_range = '50:70'
z_rng_lst = [30, 50]
z_range = '30:50'
t_rng_lst = None
actual = self.test_volume.build_cutout_url(
self.chanResource, self.url_prefix,
res, x_rng_lst, y_rng_lst, z_rng_lst, t_rng_lst)
self.assertEqual(
self.url_prefix + '/' + self.test_volume.version + '/' + self.test_volume.endpoint +
'/' + self.chanResource.coll_name + '/' + self.chanResource.exp_name +
'/' + self.chanResource.name + '/' + str(res) + '/' + x_range + '/' +
y_range + '/' + z_range + '/',
actual)
def test_build_cutout_url_no_time_range_with_ids(self):
res = 0
x_rng_lst = [20, 40]
x_range = '20:40'
y_rng_lst = [50, 70]
y_range = '50:70'
z_rng_lst = [30, 50]
z_range = '30:50'
t_rng_lst = None
id_list = [2, 7]
id_list_str = '2,7'
actual = self.test_volume.build_cutout_url(
self.chanResource, self.url_prefix,
res, x_rng_lst, y_rng_lst, z_rng_lst, t_rng_lst, id_list)
self.assertEqual(
self.url_prefix + '/' + self.test_volume.version + '/' + self.test_volume.endpoint +
'/' + self.chanResource.coll_name + '/' + self.chanResource.exp_name +
'/' + self.chanResource.name + '/' + str(res) + '/' + x_range + '/' +
y_range + '/' + z_range + '/?filter=' + id_list_str,
actual)
def test_build_cutout_url_with_time_range(self):
res = 0
x_rng_lst = [20, 40]
x_range = '20:40'
y_rng_lst = [50, 70]
y_range = '50:70'
z_rng_lst = [30, 50]
z_range = '30:50'
t_rng_lst = [10, 25]
time_range = '10:25'
actual = self.test_volume.build_cutout_url(
self.chanResource, self.url_prefix,
res, x_rng_lst, y_rng_lst, z_rng_lst, t_rng_lst)
self.assertEqual(
self.url_prefix + '/' + self.test_volume.version + '/' + self.test_volume.endpoint +
'/' + self.chanResource.coll_name + '/' + self.chanResource.exp_name +
'/' + self.chanResource.name + '/' + str(res) + '/' + x_range + '/' +
y_range + '/' + z_range + '/' + time_range + '/',
actual)
def test_build_cutout_url_with_time_range_and_ids(self):
res = 0
x_rng_lst = [20, 40]
x_range = '20:40'
y_rng_lst = [50, 70]
y_range = '50:70'
z_rng_lst = [30, 50]
z_range = '30:50'
t_rng_lst = [10, 25]
time_range = '10:25'
id_list = [2, 7]
id_list_str = '2,7'
actual = self.test_volume.build_cutout_url(
self.chanResource, self.url_prefix,
res, x_rng_lst, y_rng_lst, z_rng_lst, t_rng_lst, id_list)
self.assertEqual(
self.url_prefix + '/' + self.test_volume.version + '/' + self.test_volume.endpoint +
'/' + self.chanResource.coll_name + '/' + self.chanResource.exp_name +
'/' + self.chanResource.name + '/' + str(res) + '/' + x_range + '/' +
y_range + '/' + z_range + '/' + time_range + '/?filter=' + id_list_str,
actual)
def test_get_cutout_request(self):
url_prefix = 'https://api.theboss.io'
token = 'foobar'
resolution = 0
x_rng_lst = [20, 40]
x_range = '20:40'
y_rng_lst = [50, 70]
y_range = '50:70'
z_rng_lst = [30, 50]
z_range = '30:50'
t_rng_lst = [10, 25]
time_range = '10:25'
data = numpy.random.randint(0, 3000, (15, 20, 20, 20), numpy.uint16)
actual = self.test_volume.get_cutout_request(
self.chanResource, 'GET', 'application/blosc-python', url_prefix, token,
resolution, x_rng_lst, y_rng_lst, z_rng_lst, t_rng_lst, data)
self.assertEqual(
'{}/{}/{}/{}/{}/{}/{}/{}/{}/{}/{}/'.format(url_prefix, self.test_volume.version,
self.test_volume.endpoint, self.chanResource.coll_name,
self.chanResource.exp_name, self.chanResource.name, resolution,
x_range, y_range, z_range, time_range),
actual.url)
self.assertEqual('Token {}'.format(token), actual.headers['Authorization'])
self.assertEqual('application/blosc-python', actual.headers['Content-Type'])
def test_get_cutout_request_with_ids(self):
"""Test request generated for a filtered cutout."""
url_prefix = 'https://api.theboss.io'
token = 'foobar'
resolution = 0
x_rng_lst = [20, 40]
x_range = '20:40'
y_rng_lst = [50, 70]
y_range = '50:70'
z_rng_lst = [30, 50]
z_range = '30:50'
t_rng_lst = [10, 25]
time_range = '10:25'
id_list = [10, 5]
id_list_str = '10,5'
data = numpy.random.randint(0, 3000, (15, 20, 20, 20), numpy.uint16)
actual = self.test_volume.get_cutout_request(
self.chanResource, 'GET', 'application/blosc-python', url_prefix, token,
resolution, x_rng_lst, y_rng_lst, z_rng_lst, t_rng_lst, id_list=id_list)
self.assertEqual(
'{}/{}/{}/{}/{}/{}/{}/{}/{}/{}/{}/?filter={}'.format(url_prefix, self.test_volume.version,
self.test_volume.endpoint, self.chanResource.coll_name,
self.chanResource.exp_name, self.chanResource.name, resolution,
x_range, y_range, z_range, time_range, id_list_str),
actual.url)
self.assertEqual('Token {}'.format(token), actual.headers['Authorization'])
self.assertEqual('application/blosc-python', actual.headers['Content-Type'])
def test_get_cutout_request_with_ids_and_access_mode(self):
"""Test request generated for a filtered cutout."""
url_prefix = 'https://api.theboss.io'
token = 'foobar'
resolution = 0
x_rng_lst = [20, 40]
x_range = '20:40'
y_rng_lst = [50, 70]
y_range = '50:70'
z_rng_lst = [30, 50]
z_range = '30:50'
t_rng_lst = [10, 25]
time_range = '10:25'
id_list = [10, 5]
id_list_str = '10,5'
data = numpy.random.randint(0, 3000, (15, 20, 20, 20), numpy.uint16)
actual = self.test_volume.get_cutout_request(
self.chanResource, 'GET', 'application/blosc-python', url_prefix, token,
resolution, x_rng_lst, y_rng_lst, z_rng_lst, t_rng_lst, id_list=id_list, access_mode=CacheMode.no_cache)
self.assertEqual(
'{}/{}/{}/{}/{}/{}/{}/{}/{}/{}/{}/?filter={}&access-mode=no-cache'.format(url_prefix, self.test_volume.version,
self.test_volume.endpoint, self.chanResource.coll_name,
self.chanResource.exp_name, self.chanResource.name, resolution,
x_range, y_range, z_range, time_range, id_list_str),
actual.url)
self.assertEqual('Token {}'.format(token), actual.headers['Authorization'])
self.assertEqual('application/blosc-python', actual.headers['Content-Type'])
def test_get_reserve_request(self):
url_prefix = 'https://api.theboss.io'
token = 'foobar'
num_ids = 20
actual = self.test_volume.get_reserve_request(
self.annoResource, 'GET', 'application/json', url_prefix, token,
num_ids)
expected = '{}/{}/reserve/{}/{}/{}/{}'.format(
url_prefix, self.test_volume.version, self.annoResource.coll_name,
self.annoResource.exp_name, self.annoResource.name, num_ids)
self.assertEqual(expected, actual.url)
def test_get_bounding_box_request_loose(self):
url_prefix = 'https://api.theboss.io'
token = 'foobar'
resolution = 0
bb_type = 'loose'
id = 55555
actual = self.test_volume.get_bounding_box_request(
self.annoResource, 'GET', 'application/json', url_prefix, token,
resolution, id, bb_type)
expected = '{}/{}/boundingbox/{}/{}/{}/{}/{}/?type={}'.format(
url_prefix, self.test_volume.version, self.annoResource.coll_name,
self.annoResource.exp_name, self.annoResource.name, resolution,
id, bb_type)
self.assertEqual(expected, actual.url)
def test_build_ids_url(self):
url_prefix = 'https://api.theboss.io'
resolution = 0
x_range = [0, 100]
x_range_str = '0:100'
y_range = [10, 50]
y_range_str = '10:50'
z_range = [20, 42]
z_range_str = '20:42'
t_range = [0, 1]
t_range_str = '0:1'
actual = self.test_volume.build_ids_url(
self.annoResource, url_prefix, resolution,
x_range, y_range, z_range, t_range)
expected = '{}/{}/ids/{}/{}/{}/{}/{}/{}/{}/{}/'.format(
url_prefix, self.test_volume.version, self.annoResource.coll_name,
self.annoResource.exp_name, self.annoResource.name,
resolution, x_range_str, y_range_str, z_range_str, t_range_str)
self.assertEqual(expected, actual)
def test_get_ids_request(self):
url_prefix = 'https://api.theboss.io'
token = 'foobar'
resolution = 0
x_range = [0, 100]
x_range_str = '0:100'
y_range = [10, 50]
y_range_str = '10:50'
z_range = [20, 42]
z_range_str = '20:42'
t_range = [0, 1]
t_range_str = '0:1'
actual = self.test_volume.get_ids_request(
self.annoResource, 'GET', 'application/json', url_prefix, token,
resolution, x_range, y_range, z_range, t_range)
expected = '{}/{}/ids/{}/{}/{}/{}/{}/{}/{}/{}/'.format(
url_prefix, self.test_volume.version, self.annoResource.coll_name,
self.annoResource.exp_name, self.annoResource.name,
resolution, x_range_str, y_range_str, z_range_str, t_range_str)
self.assertEqual(expected, actual.url)
def test_convert_int_list_to_comma_sep_str_1_ele(self):
"""Test with a list with one element."""
expected = '2'
actual = self.test_volume.convert_int_list_to_comma_sep_str([2])
self.assertEqual(expected, actual)
def test_convert_int_list_to_comma_sep_str_multi_ele(self):
"""Test with a list with multiple elements."""
expected = '2,6,9'
actual = self.test_volume.convert_int_list_to_comma_sep_str([2, 6, 9])
self.assertEqual(expected, actual)
if __name__ == '__main__':
unittest.main()
| 37.696078 | 123 | 0.602687 | 2,810 | 23,070 | 4.684342 | 0.097153 | 0.05166 | 0.039353 | 0.036162 | 0.804832 | 0.781661 | 0.770645 | 0.736306 | 0.712755 | 0.683583 | 0 | 0.023103 | 0.266407 | 23,070 | 611 | 124 | 37.757774 | 0.754609 | 0.030429 | 0 | 0.66167 | 0 | 0 | 0.114808 | 0.021518 | 0 | 0 | 0 | 0 | 0.109208 | 0 | null | null | 0.017131 | 0.012848 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f756704c04df19e2dcc32ed2d6398938bff2a5b2 | 11,819 | py | Python | tests/CLI/modules/report_tests.py | dvzrv/softlayer-python | 9a5f6c6981bcc370084537b4d1769383499ce90d | [
"MIT"
] | null | null | null | tests/CLI/modules/report_tests.py | dvzrv/softlayer-python | 9a5f6c6981bcc370084537b4d1769383499ce90d | [
"MIT"
] | null | null | null | tests/CLI/modules/report_tests.py | dvzrv/softlayer-python | 9a5f6c6981bcc370084537b4d1769383499ce90d | [
"MIT"
] | null | null | null | """
SoftLayer.tests.CLI.modules.report_tests
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
:license: MIT, see LICENSE for more details.
"""
from SoftLayer import testing
import json
class ReportTests(testing.TestCase):
def test_bandwidth_invalid_date(self):
result = self.run_command(
[
'report',
'bandwidth',
'--start=welp',
'--end=2016-01-01',
],
)
self.assertTrue('Invalid value for "--start"', result.output)
result = self.run_command(
[
'report',
'bandwidth',
'--start=2016-01-01',
'--end=welp',
],
)
self.assertTrue('Invalid value for "--end"', result.output)
def test_bandwidth_report(self):
racks = self.set_mock('SoftLayer_Account', 'getVirtualDedicatedRacks')
racks.return_value = [{
'id': 1,
'name': 'pool1',
'metricTrackingObjectId': 1,
}, {
'id': 2,
'name': 'pool2',
}, {
'id': 3,
'name': 'pool3',
'metricTrackingObjectId': 3,
}]
hardware = self.set_mock('SoftLayer_Account', 'getHardware')
hardware.return_value = [{
'id': 101,
'metricTrackingObject': {'id': 101},
'hostname': 'host1',
}, {
'id': 102,
'hostname': 'host2',
'virtualRack': {'id': 1, 'bandwidthAllotmentTypeId': 2},
}, {
'id': 103,
'metricTrackingObject': {'id': 103},
'hostname': 'host3',
'virtualRack': {'id': 1, 'bandwidthAllotmentTypeId': 2},
}]
guests = self.set_mock('SoftLayer_Account', 'getVirtualGuests')
guests.return_value = [{
'id': 201,
'metricTrackingObjectId': 201,
'hostname': 'host1',
}, {
'id': 202,
'hostname': 'host2',
'virtualRack': {'id': 2, 'bandwidthAllotmentTypeId': 2},
}, {
'id': 203,
'metricTrackingObjectId': 203,
'hostname': 'host3',
'virtualRack': {'id': 2, 'bandwidthAllotmentTypeId': 2},
}]
summary_data = self.set_mock('SoftLayer_Metric_Tracking_Object', 'getSummaryData')
summary_data.return_value = [
{'type': 'publicIn_net_octet', 'counter': 10},
{'type': 'publicOut_net_octet', 'counter': 20},
{'type': 'privateIn_net_octet', 'counter': 30},
{'type': 'privateOut_net_octet', 'counter': 40},
]
result = self.run_command([
'report',
'bandwidth',
'--start=2016-02-04',
'--end=2016-03-04 12:34:56',
])
self.assert_no_fail(result)
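# The CLI output can carry non-JSON text ahead of the payload; keep
# everything from the first '[' onward so json.loads() gets clean JSON.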
stripped_output = '[' + result.output.split('[', 1)[1]
json_output = json.loads(stripped_output)
self.assertEqual(json_output[0]['hostname'], 'pool1')
self.assertEqual(json_output[0]['private_in'], 30)
self.assertEqual(6, len(self.calls('SoftLayer_Metric_Tracking_Object', 'getSummaryData')))
self.assert_called_with('SoftLayer_Metric_Tracking_Object', 'getSummaryData', identifier=1)
self.assert_called_with('SoftLayer_Metric_Tracking_Object', 'getSummaryData', identifier=3)
self.assert_called_with('SoftLayer_Metric_Tracking_Object', 'getSummaryData', identifier=101)
self.assert_called_with('SoftLayer_Metric_Tracking_Object', 'getSummaryData', identifier=103)
self.assert_called_with('SoftLayer_Metric_Tracking_Object', 'getSummaryData', identifier=201)
self.assert_called_with('SoftLayer_Metric_Tracking_Object', 'getSummaryData', identifier=203)
call = self.calls('SoftLayer_Metric_Tracking_Object', 'getSummaryData', identifier=1)[0]
expected_args = ('2016-02-04 00:00:00 ', '2016-03-04 12:34:56 ',
[{
'keyName': 'PUBLICIN',
'name': 'publicIn',
'summaryType': 'sum',
}, {
'keyName': 'PUBLICOUT',
'name': 'publicOut',
'summaryType': 'sum',
}, {
'keyName': 'PRIVATEIN',
'name': 'privateIn',
'summaryType': 'sum',
}, {
'keyName': 'PRIVATEOUT',
'name': 'privateOut',
'summaryType': 'sum',
}],
300,
)
self.assertEqual(expected_args, call.args)
def test_virtual_bandwidth_report(self):
racks = self.set_mock('SoftLayer_Account', 'getVirtualDedicatedRacks')
racks.return_value = [{
'id': 1,
'name': 'pool1',
'metricTrackingObjectId': 1,
}, {
'id': 2,
'name': 'pool2',
}, {
'id': 3,
'name': 'pool3',
'metricTrackingObjectId': 3,
}]
guests = self.set_mock('SoftLayer_Account', 'getVirtualGuests')
guests.return_value = [{
'id': 201,
'metricTrackingObjectId': 201,
'hostname': 'host1',
}, {
'id': 202,
'hostname': 'host2',
'virtualRack': {'id': 2, 'bandwidthAllotmentTypeId': 2},
}, {
'id': 203,
'metricTrackingObjectId': 203,
'hostname': 'host3',
'virtualRack': {'id': 2, 'bandwidthAllotmentTypeId': 2},
}]
summary_data = self.set_mock('SoftLayer_Metric_Tracking_Object',
'getSummaryData')
summary_data.return_value = [
{'type': 'publicIn_net_octet', 'counter': 10},
{'type': 'publicOut_net_octet', 'counter': 20},
{'type': 'privateIn_net_octet', 'counter': 30},
{'type': 'privateOut_net_octet', 'counter': 40},
]
result = self.run_command([
'report',
'bandwidth',
'--start=2016-02-04',
'--end=2016-03-04 12:34:56',
'--virtual',
])
self.assert_no_fail(result)
stripped_output = '[' + result.output.split('[', 1)[1]
json_output = json.loads(stripped_output)
self.assertEqual(json_output[0]['hostname'], 'pool1')
self.assertEqual(json_output[1]['private_in'], 0)
self.assertEqual(json_output[2]['private_in'], 30)
self.assertEqual(json_output[3]['type'], 'virtual')
self.assertEqual(4, len(self.calls('SoftLayer_Metric_Tracking_Object', 'getSummaryData')))
self.assert_called_with('SoftLayer_Metric_Tracking_Object', 'getSummaryData', identifier=1)
self.assert_called_with('SoftLayer_Metric_Tracking_Object', 'getSummaryData', identifier=3)
self.assert_called_with('SoftLayer_Metric_Tracking_Object', 'getSummaryData', identifier=201)
self.assert_called_with('SoftLayer_Metric_Tracking_Object', 'getSummaryData', identifier=203)
call = self.calls('SoftLayer_Metric_Tracking_Object', 'getSummaryData', identifier=1)[0]
expected_args = ('2016-02-04 00:00:00 ', '2016-03-04 12:34:56 ',
[{
'keyName': 'PUBLICIN',
'name': 'publicIn',
'summaryType': 'sum',
}, {
'keyName': 'PUBLICOUT',
'name': 'publicOut',
'summaryType': 'sum',
}, {
'keyName': 'PRIVATEIN',
'name': 'privateIn',
'summaryType': 'sum',
}, {
'keyName': 'PRIVATEOUT',
'name': 'privateOut',
'summaryType': 'sum',
}],
300,
)
self.assertEqual(expected_args, call.args)
def test_server_bandwidth_report(self):
racks = self.set_mock('SoftLayer_Account', 'getVirtualDedicatedRacks')
racks.return_value = [{
'id': 1,
'name': 'pool1',
'metricTrackingObjectId': 1,
}, {
'id': 2,
'name': 'pool2',
}, {
'id': 3,
'name': 'pool3',
'metricTrackingObjectId': 3,
}]
hardware = self.set_mock('SoftLayer_Account', 'getHardware')
hardware.return_value = [{
'id': 101,
'metricTrackingObject': {'id': 101},
'hostname': 'host1',
}, {
'id': 102,
'hostname': 'host2',
'virtualRack': {'id': 1, 'bandwidthAllotmentTypeId': 2},
}, {
'id': 103,
'metricTrackingObject': {'id': 103},
'hostname': 'host3',
'virtualRack': {'id': 1, 'bandwidthAllotmentTypeId': 2},
}]
summary_data = self.set_mock('SoftLayer_Metric_Tracking_Object',
'getSummaryData')
summary_data.return_value = [
{'type': 'publicIn_net_octet', 'counter': 10},
{'type': 'publicOut_net_octet', 'counter': 20},
{'type': 'privateIn_net_octet', 'counter': 30},
{'type': 'privateOut_net_octet', 'counter': 40},
]
result = self.run_command([
'report',
'bandwidth',
'--start=2016-02-04',
'--end=2016-03-04 12:34:56',
'--server',
])
self.assert_no_fail(result)
stripped_output = '[' + result.output.split('[', 1)[1]
json_output = json.loads(stripped_output)
self.assertEqual(json_output[0]['hostname'], 'pool1')
self.assertEqual(json_output[1]['private_in'], 0)
self.assertEqual(json_output[2]['private_in'], 30)
self.assertEqual(json_output[3]['type'], 'hardware')
self.assertEqual(4, len(self.calls('SoftLayer_Metric_Tracking_Object', 'getSummaryData')))
self.assert_called_with('SoftLayer_Metric_Tracking_Object', 'getSummaryData', identifier=101)
self.assert_called_with('SoftLayer_Metric_Tracking_Object', 'getSummaryData', identifier=103)
call = self.calls('SoftLayer_Metric_Tracking_Object', 'getSummaryData', identifier=1)[0]
expected_args = ('2016-02-04 00:00:00 ', '2016-03-04 12:34:56 ',
[{
'keyName': 'PUBLICIN',
'name': 'publicIn',
'summaryType': 'sum',
}, {
'keyName': 'PUBLICOUT',
'name': 'publicOut',
'summaryType': 'sum',
}, {
'keyName': 'PRIVATEIN',
'name': 'privateIn',
'summaryType': 'sum',
}, {
'keyName': 'PRIVATEOUT',
'name': 'privateOut',
'summaryType': 'sum',
}],
300,
)
self.assertEqual(expected_args, call.args)
| 39.661074 | 101 | 0.482528 | 953 | 11,819 | 5.776495 | 0.128017 | 0.057221 | 0.087738 | 0.110627 | 0.940055 | 0.924614 | 0.924614 | 0.917348 | 0.909355 | 0.909355 | 0 | 0.050594 | 0.372874 | 11,819 | 297 | 102 | 39.794613 | 0.692121 | 0.010745 | 0 | 0.877778 | 0 | 0 | 0.305775 | 0.09904 | 0 | 0 | 0 | 0 | 0.122222 | 1 | 0.014815 | false | 0 | 0.011111 | 0 | 0.02963 | 0.007407 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f77faa2d4230b9a6536dc37f674be43020994936 | 4,743 | py | Python | Chapter05/RE_Search_Examples.py | frankethp/Hands-On-Enterprise-Automation-with-Python | 4d20dc5fda2265a2c3666770b8ad53e63c7ae07c | [
"MIT"
] | 51 | 2018-07-02T04:03:07.000Z | 2022-03-08T07:20:29.000Z | Chapter05/RE_Search_Examples.py | MindaugasVaitkus2/Hands-On-Enterprise-Automation-with-Python | 39471804525701e634bd35046d8db3c0bca51dd6 | [
"MIT"
] | 1 | 2018-08-06T10:13:15.000Z | 2020-10-08T12:27:17.000Z | Chapter05/RE_Search_Examples.py | MindaugasVaitkus2/Hands-On-Enterprise-Automation-with-Python | 39471804525701e634bd35046d8db3c0bca51dd6 | [
"MIT"
] | 43 | 2018-07-24T08:50:41.000Z | 2022-03-18T21:45:40.000Z | #!/usr/bin/python
__author__ = "Bassim Aly"
__email__ = "basim.alyy@gmail.com"
# Example 1
import re
intf_ip = 'Gi0/0/0.911 10.200.101.242 YES NVRAM up up'
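# Note: unescaped '.' matches any character; it works for this demo, but a
# strict literal match would escape the dots (r'10\.200\.101\.242').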
match = re.search('10.200.101.242', intf_ip)
if match:
print match.group()
# Example 2
import re
intf_ip = '''Gi0/0/0.705 10.103.17.5 YES NVRAM up up
Gi0/0/0.900 86.121.75.31 YES NVRAM up up
Gi0/0/0.911 10.200.101.242 YES NVRAM up up
Gi0/0/0.7000 unassigned YES unset up up '''
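# A loose IPv4 pattern: re.search() scans the whole multi-line string but
# returns only the first match (10.103.17.5 here).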
match = re.search("\d+\.\d+\.\d+\.\d+", intf_ip)
if match:
print match.group()
# Example 3
import re
log_msg = 'Dec 20 12:11:47.417: %LINK-3-UPDOWN: Interface GigabitEthernet0/0/4, changed state to down'
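# Four capture groups pull out the timestamp, the syslog mnemonic, the
# interface name, and the new state.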
match = re.search("(\w+\s\d+\s\S+):\s(\S+): Interface (\S+), changed state to (\S+)", log_msg)
if match:
print match.groups()
# Example 4: Named group
import re
log_msg = 'Dec 20 12:11:47.417: %LINK-3-UPDOWN: Interface GigabitEthernet0/0/4, changed state to down'
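# Same pattern, but (?P<NAME>...) names each group, so match.groupdict()
# would return the fields keyed by TIMESTAMP, EVENT, INTF and STATE.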
match = re.search(
"(?P<TIMESTAMP>\w+\s\d+\s\S+):\s(?P<EVENT>\S+): Interface (?P<INTF>\S+), changed state to (?P<STATE>\S+)", log_msg)
if match:
print match.groups()
# Example 5-1: Searching for multiple Lines using re.search()
import re
show_ip_int_br_full = """
GigabitEthernet0/0/0 110.110.110.1 YES NVRAM up up
GigabitEthernet0/0/1 107.107.107.1 YES NVRAM up up
GigabitEthernet0/0/2 108.108.108.1 YES NVRAM up up
GigabitEthernet0/0/3 109.109.109.1 YES NVRAM up up
GigabitEthernet0/0/4 unassigned YES NVRAM up up
GigabitEthernet0/0/5 10.131.71.1 YES NVRAM up up
GigabitEthernet0/0/6 10.37.102.225 YES NVRAM up up
GigabitEthernet0/1/0 unassigned YES unset up up
GigabitEthernet0/1/1 57.234.66.28 YES manual up up
GigabitEthernet0/1/2 10.10.99.70 YES manual up up
GigabitEthernet0/1/3 unassigned YES manual deleted down
GigabitEthernet0/1/4 192.168.200.1 YES manual up up
GigabitEthernet0/1/5 unassigned YES manual down down
GigabitEthernet0/1/6 10.20.20.1 YES manual down down
GigabitEthernet0/2/0 10.30.40.1 YES manual down down
GigabitEthernet0/2/1 57.20.20.1 YES manual down down
"""
for line in show_ip_int_br_full.split("\n"):
match = re.search(r"(?P<interface>\w+\d\/\d\/\d)\s+(?P<ip>\d+.\d+.\d+.\d+)", line)
if match:
intf_ip = match.groupdict()
if intf_ip["ip"].startswith("57"):
print "Subnet is configured on " + intf_ip["interface"] + " and ip is " + intf_ip["ip"]
# Example 5-2: Searching for multiple Lines using re.findall()
import re
from pprint import pprint
show_ip_int_br_full = """
GigabitEthernet0/0/0 110.110.110.1 YES NVRAM up up
GigabitEthernet0/0/1 107.107.107.1 YES NVRAM up up
GigabitEthernet0/0/2 108.108.108.1 YES NVRAM up up
GigabitEthernet0/0/3 109.109.109.1 YES NVRAM up up
GigabitEthernet0/0/4 unassigned YES NVRAM up up
GigabitEthernet0/0/5 10.131.71.1 YES NVRAM up up
GigabitEthernet0/0/6 10.37.102.225 YES NVRAM up up
GigabitEthernet0/1/0 unassigned YES unset up up
GigabitEthernet0/1/1 57.234.66.28 YES manual up up
GigabitEthernet0/1/2 10.10.99.70 YES manual up up
GigabitEthernet0/1/3 unassigned YES manual deleted down
GigabitEthernet0/1/4 192.168.200.1 YES manual up up
GigabitEthernet0/1/5 unassigned YES manual down down
GigabitEthernet0/1/6 10.20.20.1 YES manual down down
GigabitEthernet0/2/0 10.30.40.1 YES manual down down
GigabitEthernet0/2/1 57.20.20.1 YES manual down down
"""
intf_ip = re.findall(r"(?P<interface>\w+\d\/\d\/\d)\s+(?P<ip>57.\d+.\d+.\d+)", show_ip_int_br_full)
pprint(intf_ip)
| 47.909091 | 119 | 0.519502 | 656 | 4,743 | 3.698171 | 0.178354 | 0.044518 | 0.181369 | 0.089035 | 0.825639 | 0.793487 | 0.76216 | 0.734542 | 0.705688 | 0.675185 | 0 | 0.146102 | 0.378031 | 4,743 | 98 | 120 | 48.397959 | 0.676271 | 0.040059 | 0 | 0.726027 | 0 | 0.082192 | 0.823141 | 0.048174 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.09589 | null | null | 0.09589 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
e38f0fa17a6ac7b501f1592ecdf45509e145d610 | 39,595 | py | Python | tests/test_octodns_provider_constellix.py | PeterDaveHello/octodns | c3b68ce4c66d5a8319c6f998538e7e849aa2ae4e | [
"MIT"
] | null | null | null | tests/test_octodns_provider_constellix.py | PeterDaveHello/octodns | c3b68ce4c66d5a8319c6f998538e7e849aa2ae4e | [
"MIT"
] | 34 | 2020-12-01T21:24:10.000Z | 2021-09-20T21:12:48.000Z | tests/test_octodns_provider_constellix.py | PeterDaveHello/octodns | c3b68ce4c66d5a8319c6f998538e7e849aa2ae4e | [
"MIT"
] | 1 | 2021-08-10T16:54:50.000Z | 2021-08-10T16:54:50.000Z | #
#
#
from __future__ import absolute_import, division, print_function, \
unicode_literals
from mock import Mock, call
from os.path import dirname, join
from requests import HTTPError
from requests_mock import ANY, mock as requests_mock
from six import text_type
from unittest import TestCase
from octodns.record import Record
from octodns.provider.constellix import \
ConstellixProvider, ConstellixClientBadRequest
from octodns.provider.yaml import YamlProvider
from octodns.zone import Zone
class TestConstellixProvider(TestCase):
expected = Zone('unit.tests.', [])
source = YamlProvider('test', join(dirname(__file__), 'config'))
source.populate(expected)
# Our test suite differs a bit, add our NS and remove the simple one
expected.add_record(Record.new(expected, 'under', {
'ttl': 3600,
'type': 'NS',
'values': [
'ns1.unit.tests.',
'ns2.unit.tests.',
]
}))
# Add some ALIAS records
expected.add_record(Record.new(expected, '', {
'ttl': 1800,
'type': 'ALIAS',
'value': 'aname.unit.tests.'
}))
# Add a dynamic record
expected.add_record(Record.new(expected, 'www.dynamic', {
'ttl': 300,
'type': 'A',
'values': [
'1.2.3.4',
'1.2.3.5'
],
'dynamic': {
'pools': {
'two': {
'values': [{
'value': '1.2.3.4',
'weight': 1
}, {
'value': '1.2.3.5',
'weight': 1
}],
},
},
'rules': [{
'pool': 'two',
}],
},
}))
for record in list(expected.records):
if record.name == 'sub' and record._type == 'NS':
expected._remove_record(record)
break
expected_dynamic = Zone('unit.tests.', [])
source = YamlProvider('test', join(dirname(__file__), 'config'))
source.populate(expected_dynamic)
# Our test suite differs a bit, add our NS and remove the simple one
expected_dynamic.add_record(Record.new(expected_dynamic, 'under', {
'ttl': 3600,
'type': 'NS',
'values': [
'ns1.unit.tests.',
'ns2.unit.tests.',
]
}))
# Add some ALIAS records
expected_dynamic.add_record(Record.new(expected_dynamic, '', {
'ttl': 1800,
'type': 'ALIAS',
'value': 'aname.unit.tests.'
}))
# Add a dynamic record
expected_dynamic.add_record(Record.new(expected_dynamic, 'www.dynamic', {
'ttl': 300,
'type': 'A',
'values': [
'1.2.3.4',
'1.2.3.5'
],
'dynamic': {
'pools': {
'one': {
'fallback': 'two',
'values': [{
'value': '1.2.3.6',
'weight': 1
}, {
'value': '1.2.3.7',
'weight': 1
}],
},
'two': {
'values': [{
'value': '1.2.3.4',
'weight': 1
}, {
'value': '1.2.3.5',
'weight': 1
}],
},
},
'rules': [{
'geos': [
'AS',
'EU-ES',
'EU-UA',
'EU-SE',
'NA-CA-NL',
'OC'
],
'pool': 'one'
}, {
'pool': 'two',
}],
}
}))
for record in list(expected_dynamic.records):
if record.name == 'sub' and record._type == 'NS':
expected_dynamic._remove_record(record)
break
def test_populate(self):
provider = ConstellixProvider('test', 'api', 'secret')
# Bad auth
with requests_mock() as mock:
mock.get(ANY, status_code=401,
text='{"errors": ["Unable to authenticate token"]}')
with self.assertRaises(Exception) as ctx:
zone = Zone('unit.tests.', [])
provider.populate(zone)
self.assertEquals('Unauthorized', text_type(ctx.exception))
# Bad request
with requests_mock() as mock:
mock.get(ANY, status_code=400,
text='{"errors": ["\\"unittests\\" is not '
'a valid domain name"]}')
with self.assertRaises(Exception) as ctx:
zone = Zone('unit.tests.', [])
provider.populate(zone)
self.assertEquals('\n - "unittests" is not a valid domain name',
text_type(ctx.exception))
# General error
with requests_mock() as mock:
mock.get(ANY, status_code=502, text='Things caught fire')
with self.assertRaises(HTTPError) as ctx:
zone = Zone('unit.tests.', [])
provider.populate(zone)
self.assertEquals(502, ctx.exception.response.status_code)
# Non-existent zone doesn't populate anything
with requests_mock() as mock:
mock.get(ANY, status_code=404,
text='<html><head></head><body></body></html>')
zone = Zone('unit.tests.', [])
provider.populate(zone)
self.assertEquals(set(), zone.records)
# No diffs == no changes
with requests_mock() as mock:
base = 'https://api.dns.constellix.com/v1'
with open('tests/fixtures/constellix-domains.json') as fh:
mock.get(f'{base}/domains', text=fh.read())
with open('tests/fixtures/constellix-records.json') as fh:
mock.get(f'{base}/domains/123123/records', text=fh.read())
with open('tests/fixtures/constellix-pools.json') as fh:
mock.get(f'{base}/pools/A', text=fh.read())
with open('tests/fixtures/constellix-geofilters.json') as fh:
mock.get(f'{base}/geoFilters', text=fh.read())
zone = Zone('unit.tests.', [])
provider.populate(zone)
self.assertEquals(17, len(zone.records))
changes = self.expected_dynamic.changes(zone, provider)
self.assertEquals(0, len(changes))
# 2nd populate makes no network calls/all from cache
again = Zone('unit.tests.', [])
provider.populate(again)
self.assertEquals(17, len(again.records))
# bust the cache
del provider._zone_records[zone.name]
def test_apply(self):
provider = ConstellixProvider('test', 'api', 'secret')
resp = Mock()
resp.json = Mock()
provider._client._request = Mock(return_value=resp)
# non-existent domain, create everything
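# side_effect feeds one JSON payload per API call, in the order the
# provider makes them: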
resp.json.side_effect = [
[], # no domains returned during populate
[{
'id': 123123,
'name': 'unit.tests'
}], # domain created in apply
[], # No pools returned during populate
[{
"id": 1808520,
"name": "unit.tests.:www.dynamic:A:two",
}] # pool created in apply
]
plan = provider.plan(self.expected)
# No root NS, no ignored, no excluded, no unsupported
n = len(self.expected.records) - 8
self.assertEquals(n, len(plan.changes))
self.assertEquals(n, provider.apply(plan))
provider._client._request.assert_has_calls([
# get all domains to build the cache
call('GET', '/domains'),
# created the domain
call('POST', '/domains', data={'names': ['unit.tests']})
])
# Check we tried to get our pool
provider._client._request.assert_has_calls([
# get all pools to build the cache
call('GET', '/pools/A'),
# created the pool
call('POST', '/pools/A', data={
'name': 'unit.tests.:www.dynamic:A:two',
'type': 'A',
'numReturn': 1,
'minAvailableFailover': 1,
'ttl': 300,
'values': [{
"value": "1.2.3.4",
"weight": 1
}, {
"value": "1.2.3.5",
"weight": 1
}]
})
])
# These two checks are broken up so that ordering doesn't break things.
# Python3 doesn't make the calls in a consistent order so different
# things follow the GET / on different runs
provider._client._request.assert_has_calls([
call('POST', '/domains/123123/records/SRV', data={
'roundRobin': [{
'priority': 10,
'weight': 20,
'value': 'foo-1.unit.tests.',
'port': 30
}, {
'priority': 12,
'weight': 20,
'value': 'foo-2.unit.tests.',
'port': 30
}],
'name': '_srv._tcp',
'ttl': 600,
}),
])
self.assertEquals(22, provider._client._request.call_count)
provider._client._request.reset_mock()
provider._client.records = Mock(return_value=[
{
'id': 11189897,
'type': 'A',
'name': 'www',
'ttl': 300,
'recordOption': 'roundRobin',
'value': [
'1.2.3.4',
'2.2.3.4',
]
}, {
'id': 11189898,
'type': 'A',
'name': 'ttl',
'ttl': 600,
'recordOption': 'roundRobin',
'value': [
'3.2.3.4'
]
}, {
'id': 11189899,
'type': 'ALIAS',
'name': 'alias',
'ttl': 600,
'recordOption': 'roundRobin',
'value': [{
'value': 'aname.unit.tests.'
}]
}, {
"id": 1808520,
"type": "A",
"name": "www.dynamic",
"geolocation": None,
"recordOption": "pools",
"ttl": 300,
"value": [],
"pools": [
1808521
]
}
])
provider._client.pools = Mock(return_value=[{
"id": 1808521,
"name": "unit.tests.:www.dynamic:A:two",
"type": "A",
"values": [
{
"value": "1.2.3.4",
"weight": 1
},
{
"value": "1.2.3.5",
"weight": 1
}
]
}])
# Domain exists, we don't care about return
resp.json.side_effect = [
[], # no domains returned during populate
[{
'id': 123123,
'name': 'unit.tests'
}], # domain created in apply
[], # No pools returned during populate
[{
"id": 1808521,
"name": "unit.tests.:www.dynamic:A:one"
}] # pool created in apply
]
wanted = Zone('unit.tests.', [])
wanted.add_record(Record.new(wanted, 'ttl', {
'ttl': 300,
'type': 'A',
'value': '3.2.3.4'
}))
wanted.add_record(Record.new(wanted, 'www.dynamic', {
'ttl': 300,
'type': 'A',
'values': [
'1.2.3.4'
],
'dynamic': {
'pools': {
'two': {
'values': [{
'value': '1.2.3.4',
'weight': 1
}],
},
},
'rules': [{
'pool': 'two',
}],
},
}))
plan = provider.plan(wanted)
self.assertEquals(4, len(plan.changes))
self.assertEquals(4, provider.apply(plan))
# recreate for update, and deletes for the 2 parts of the other
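# (any_order=True below because the deletes and the pool update are not
# issued in a fixed sequence).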
provider._client._request.assert_has_calls([
call('POST', '/domains/123123/records/A', data={
'roundRobin': [{
'value': '3.2.3.4'
}],
'name': 'ttl',
'ttl': 300
}),
call('PUT', '/pools/A/1808521', data={
'name': 'unit.tests.:www.dynamic:A:two',
'type': 'A',
'numReturn': 1,
'minAvailableFailover': 1,
'ttl': 300,
'values': [{
"value": "1.2.3.4",
"weight": 1
}],
'id': 1808521,
'geofilter': 1
}),
call('DELETE', '/domains/123123/records/A/11189897'),
call('DELETE', '/domains/123123/records/A/11189898'),
call('DELETE', '/domains/123123/records/ANAME/11189899'),
], any_order=True)
def test_apply_dynamic(self):
provider = ConstellixProvider('test', 'api', 'secret')
resp = Mock()
resp.json = Mock()
provider._client._request = Mock(return_value=resp)
# non-existent domain, create everything
resp.json.side_effect = [
[], # no domains returned during populate
[{
'id': 123123,
'name': 'unit.tests'
}], # domain created in apply
[], # No pools returned during populate
[{
"id": 1808521,
"name": "unit.tests.:www.dynamic:A:one"
}], # pool created in apply
[], # no geofilters returned during populate
[{
"id": 5303,
"name": "unit.tests.:www.dynamic:A:one",
"filterRulesLimit": 100,
"geoipContinents": ["AS", "OC"],
"geoipCountries": ["ES", "SE", "UA"],
"regions": [
{
"continentCode": "NA",
"countryCode": "CA",
"regionCode": "NL"
}
]
}], # geofilters created in apply
[{
"id": 1808520,
"name": "unit.tests.:www.dynamic:A:two",
}], # pool created in apply
{
'id': 123123,
'name': 'unit.tests',
'hasGeoIP': False
}, # domain listed for enabling geo
[] # enabling geo
]
plan = provider.plan(self.expected_dynamic)
# No root NS, no ignored, no excluded, no unsupported
n = len(self.expected_dynamic.records) - 8
self.assertEquals(n, len(plan.changes))
self.assertEquals(n, provider.apply(plan))
provider._client._request.assert_has_calls([
# get all domains to build the cache
call('GET', '/domains'),
# created the domain
call('POST', '/domains', data={'names': ['unit.tests']})
])
#
# Check we tried to get our pool
provider._client._request.assert_has_calls([
call('GET', '/pools/A'),
call('POST', '/pools/A', data={
'name': 'unit.tests.:www.dynamic:A:one',
'type': 'A',
'numReturn': 1,
'minAvailableFailover': 1,
'ttl': 300,
'values': [{
'value': '1.2.3.6',
'weight': 1
}, {
'value': '1.2.3.7',
'weight': 1}]
}),
call('GET', '/geoFilters'),
call('POST', '/geoFilters', data={
'filterRulesLimit': 100,
'name': 'unit.tests.:www.dynamic:A:one',
'geoipContinents': ['AS', 'OC'],
'geoipCountries': ['ES', 'SE', 'UA'],
'regions': [{
'continentCode': 'NA',
'countryCode': 'CA',
'regionCode': 'NL'}]
}),
call('POST', '/pools/A', data={
'name': 'unit.tests.:www.dynamic:A:two',
'type': 'A',
'numReturn': 1,
'minAvailableFailover': 1,
'ttl': 300,
'values': [{
'value': '1.2.3.4',
'weight': 1
}, {
'value': '1.2.3.5',
'weight': 1}]
})
])
# These two checks are broken up so that ordering doesn't break things.
# Python3 doesn't make the calls in a consistent order so different
# things follow the GET / on different runs
provider._client._request.assert_has_calls([
call('POST', '/domains/123123/records/SRV', data={
'roundRobin': [{
'priority': 10,
'weight': 20,
'value': 'foo-1.unit.tests.',
'port': 30
}, {
'priority': 12,
'weight': 20,
'value': 'foo-2.unit.tests.',
'port': 30
}],
'name': '_srv._tcp',
'ttl': 600,
}),
])
self.assertEquals(28, provider._client._request.call_count)
provider._client._request.reset_mock()
provider._client.records = Mock(return_value=[
{
'id': 11189897,
'type': 'A',
'name': 'www',
'ttl': 300,
'recordOption': 'roundRobin',
'value': [
'1.2.3.4',
'2.2.3.4',
]
}, {
'id': 11189898,
'type': 'A',
'name': 'ttl',
'ttl': 600,
'recordOption': 'roundRobin',
'value': [
'3.2.3.4'
]
}, {
'id': 11189899,
'type': 'ALIAS',
'name': 'alias',
'ttl': 600,
'recordOption': 'roundRobin',
'value': [{
'value': 'aname.unit.tests.'
}]
}, {
"id": 1808520,
"type": "A",
"name": "www.dynamic",
"geolocation": {
"geoipFilter": 1
},
"recordOption": "pools",
"ttl": 300,
"value": [],
"pools": [
1808521
]
}, {
"id": 1808521,
"type": "A",
"name": "www.dynamic",
"geolocation": {
"geoipFilter": 5303
},
"recordOption": "pools",
"ttl": 300,
"value": [],
"pools": [
1808522
]
}
])
provider._client.pools = Mock(return_value=[
{
"id": 1808521,
"name": "unit.tests.:www.dynamic:A:two",
"type": "A",
"values": [
{
"value": "1.2.3.4",
"weight": 1
},
{
"value": "1.2.3.5",
"weight": 1
}
]
},
{
"id": 1808522,
"name": "unit.tests.:www.dynamic:A:one",
"type": "A",
"values": [
{
"value": "1.2.3.6",
"weight": 1
},
{
"value": "1.2.3.7",
"weight": 1
}
]
}
])
provider._client.geofilters = Mock(return_value=[
{
"id": 5303,
"name": "unit.tests.:www.dynamic:A:one",
"filterRulesLimit": 100,
"geoipContinents": ["AS", "OC"],
"geoipCountries": ["ES", "SE", "UA"],
"regions": [
{
"continentCode": "NA",
"countryCode": "CA",
"regionCode": "NL"
}
]
}
])
# Domain exists, we don't care about return
resp.json.side_effect = [
[],
[],
[],
[],
{
'id': 123123,
'name': 'unit.tests',
'hasGeoIP': True
} # domain listed for enabling geo
]
wanted = Zone('unit.tests.', [])
wanted.add_record(Record.new(wanted, 'ttl', {
'ttl': 300,
'type': 'A',
'value': '3.2.3.4'
}))
wanted.add_record(Record.new(wanted, 'www.dynamic', {
'ttl': 300,
'type': 'A',
'values': [
'1.2.3.4'
],
'dynamic': {
'pools': {
'one': {
'fallback': 'two',
'values': [{
'value': '1.2.3.6',
'weight': 1
}, {
'value': '1.2.3.7',
'weight': 1
}],
},
'two': {
'values': [{
'value': '1.2.3.4',
'weight': 1
}],
},
},
'rules': [{
'geos': [
'AS',
'EU-ES',
'EU-UA',
'EU-SE',
'NA-CA-NL',
'OC'
],
'pool': 'one'
}, {
'pool': 'two',
}],
},
}))
plan = provider.plan(wanted)
self.assertEquals(4, len(plan.changes))
self.assertEquals(4, provider.apply(plan))
# recreate for update, and deletes for the 2 parts of the other
provider._client._request.assert_has_calls([
call('POST', '/domains/123123/records/A', data={
'roundRobin': [{
'value': '3.2.3.4'
}],
'name': 'ttl',
'ttl': 300
}),
call('DELETE', '/domains/123123/records/A/1808521'),
call('DELETE', '/geoFilters/5303'),
call('DELETE', '/pools/A/1808522'),
call('DELETE', '/domains/123123/records/A/1808520'),
call('DELETE', '/pools/A/1808521'),
call('DELETE', '/domains/123123/records/ANAME/11189899'),
call('PUT', '/pools/A/1808522', data={
'name': 'unit.tests.:www.dynamic:A:one',
'type': 'A',
'numReturn': 1,
'minAvailableFailover': 1,
'ttl': 300,
'values': [
{'value': '1.2.3.6', 'weight': 1},
{'value': '1.2.3.7', 'weight': 1}],
'id': 1808522,
'geofilter': 5303
}),
call('PUT', '/geoFilters/5303', data={
'filterRulesLimit': 100,
'name': 'unit.tests.:www.dynamic:A:one',
'geoipContinents': ['AS', 'OC'],
'geoipCountries': ['ES', 'SE', 'UA'],
'regions': [{
'continentCode': 'NA',
'countryCode': 'CA',
'regionCode': 'NL'}],
'id': 5303
}),
call('PUT', '/pools/A/1808521', data={
'name': 'unit.tests.:www.dynamic:A:two',
'type': 'A',
'numReturn': 1,
'minAvailableFailover': 1,
'ttl': 300,
'values': [{'value': '1.2.3.4', 'weight': 1}],
'id': 1808521,
'geofilter': 1
}),
call('GET', '/domains/123123'),
call('POST', '/domains/123123/records/A', data={
'name': 'www.dynamic',
'ttl': 300,
'pools': [1808522],
'recordOption': 'pools',
'geolocation': {
'geoipUserRegion': [5303]
}
}),
call('POST', '/domains/123123/records/A', data={
'name': 'www.dynamic',
'ttl': 300,
'pools': [1808522],
'recordOption': 'pools',
'geolocation': {
'geoipUserRegion': [5303]
}
})
], any_order=True)
def test_dynamic_record_failures(self):
provider = ConstellixProvider('test', 'api', 'secret')
resp = Mock()
resp.json = Mock()
provider._client._request = Mock(return_value=resp)
# Let's handle some failures for pools - first if it's not a simple
# weighted pool - we'll be OK as we assume a weight of 1 for all
# entries
provider._client._request.reset_mock()
provider._client.records = Mock(return_value=[
{
"id": 1808520,
"type": "A",
"name": "www.dynamic",
"geolocation": None,
"recordOption": "pools",
"ttl": 300,
"value": [],
"pools": [
1808521
]
}
])
provider._client.pools = Mock(return_value=[{
"id": 1808521,
"name": "unit.tests.:www.dynamic:A:two",
"type": "A",
"values": [
{
"value": "1.2.3.4",
"weight": 1
}
]
}])
provider._client.geofilters = Mock(return_value=[])
wanted = Zone('unit.tests.', [])
resp.json.side_effect = [
['{}'],
['{}'],
]
wanted.add_record(Record.new(wanted, 'www.dynamic', {
'ttl': 300,
'type': 'A',
'values': [
'1.2.3.4'
],
'dynamic': {
'pools': {
'two': {
'values': [{
'value': '1.2.3.4'
}],
},
},
'rules': [{
'pool': 'two',
}],
},
}))
plan = provider.plan(wanted)
self.assertIsNone(plan)
def test_dynamic_record_updates(self):
provider = ConstellixProvider('test', 'api', 'secret')
# Constellix API can return an error if you try and update a pool and
# don't change anything, so let's test we handle it silently
provider._client.records = Mock(return_value=[
{
"id": 1808520,
"type": "A",
"name": "www.dynamic",
"geolocation": {
"geoipFilter": 1
},
"recordOption": "pools",
"ttl": 300,
"value": [],
"pools": [
1808521
]
}, {
"id": 1808521,
"type": "A",
"name": "www.dynamic",
"geolocation": {
"geoipFilter": 5303
},
"recordOption": "pools",
"ttl": 300,
"value": [],
"pools": [
1808522
]
}
])
provider._client.pools = Mock(return_value=[
{
"id": 1808521,
"name": "unit.tests.:www.dynamic:A:two",
"type": "A",
"values": [
{
"value": "1.2.3.4",
"weight": 1
},
{
"value": "1.2.3.5",
"weight": 1
}
]
},
{
"id": 1808522,
"name": "unit.tests.:www.dynamic:A:one",
"type": "A",
"values": [
{
"value": "1.2.3.6",
"weight": 1
},
{
"value": "1.2.3.7",
"weight": 1
}
]
}
])
provider._client.geofilters = Mock(return_value=[
{
"id": 6303,
"name": "some.other",
"filterRulesLimit": 100,
"createdTs": "2021-08-19T14:47:47Z",
"modifiedTs": "2021-08-19T14:47:47Z",
"geoipContinents": ["AS", "OC"],
"geoipCountries": ["ES", "SE", "UA"],
"regions": [
{
"continentCode": "NA",
"countryCode": "CA",
"regionCode": "NL"
}
]
}, {
"id": 5303,
"name": "unit.tests.:www.dynamic:A:one",
"filterRulesLimit": 100,
"geoipContinents": ["AS", "OC"],
"geoipCountries": ["ES", "SE", "UA"],
"regions": [
{
"continentCode": "NA",
"countryCode": "CA",
"regionCode": "NL"
}
]
}
])
wanted = Zone('unit.tests.', [])
wanted.add_record(Record.new(wanted, 'www.dynamic', {
'ttl': 300,
'type': 'A',
'values': [
'1.2.3.4'
],
'dynamic': {
'pools': {
'one': {
'fallback': 'two',
'values': [{
'value': '1.2.3.6',
'weight': 1
}, {
'value': '1.2.3.7',
'weight': 1
}],
},
'two': {
'values': [{
'value': '1.2.3.4',
'weight': 1
}],
},
},
'rules': [{
'geos': [
'AS',
'EU-ES',
'EU-UA',
'EU-SE',
'OC'
],
'pool': 'one'
}, {
'pool': 'two',
}],
},
}))
# Try an error we can handle
with requests_mock() as mock:
mock.get(
"https://api.dns.constellix.com/v1/domains",
status_code=200,
text='[{"id": 1234, "name": "unit.tests", "hasGeoIP": true}]')
mock.get(
"https://api.dns.constellix.com/v1/domains/1234",
status_code=200,
text='{"id": 1234, "name": "unit.tests", "hasGeoIP": true}')
mock.delete(ANY, status_code=200,
text='{}')
mock.put("https://api.dns.constellix.com/v1/pools/A/1808521",
status_code=400,
text='{"errors": [\"no changes to save\"]}')
mock.put("https://api.dns.constellix.com/v1/pools/A/1808522",
status_code=400,
text='{"errors": [\"no changes to save\"]}')
mock.put("https://api.dns.constellix.com/v1/geoFilters/5303",
status_code=400,
text='{"errors": [\"no changes to save\"]}')
mock.post(ANY, status_code=200,
text='[{"id": 1234}]')
plan = provider.plan(wanted)
self.assertEquals(1, len(plan.changes))
self.assertEquals(1, provider.apply(plan))
provider._client.geofilters = Mock(return_value=[
{
"id": 5303,
"name": "unit.tests.:www.dynamic:A:one",
"filterRulesLimit": 100,
"regions": [
{
"continentCode": "NA",
"countryCode": "CA",
"regionCode": "NL"
}
]
}
])
plan = provider.plan(wanted)
self.assertEquals(1, len(plan.changes))
self.assertEquals(1, provider.apply(plan))
provider._client.geofilters = Mock(return_value=[
{
"id": 5303,
"name": "unit.tests.:www.dynamic:A:one",
"filterRulesLimit": 100,
"geoipContinents": ["AS", "OC"],
}
])
plan = provider.plan(wanted)
self.assertEquals(1, len(plan.changes))
self.assertEquals(1, provider.apply(plan))
# Now what happens if an error happens that we can't handle
# geofilter case
with requests_mock() as mock:
mock.get(
"https://api.dns.constellix.com/v1/domains",
status_code=200,
text='[{"id": 1234, "name": "unit.tests", "hasGeoIP": true}]')
mock.get(
"https://api.dns.constellix.com/v1/domains/1234",
status_code=200,
text='{"id": 1234, "name": "unit.tests", "hasGeoIP": true}')
mock.delete(ANY, status_code=200,
text='{}')
mock.put("https://api.dns.constellix.com/v1/pools/A/1808521",
status_code=400,
text='{"errors": [\"no changes to save\"]}')
mock.put("https://api.dns.constellix.com/v1/pools/A/1808522",
status_code=400,
text='{"errors": [\"no changes to save\"]}')
mock.put("https://api.dns.constellix.com/v1/geoFilters/5303",
status_code=400,
text='{"errors": [\"generic error\"]}')
mock.post(ANY, status_code=200,
text='[{"id": 1234}]')
plan = provider.plan(wanted)
self.assertEquals(1, len(plan.changes))
with self.assertRaises(ConstellixClientBadRequest):
provider.apply(plan)
# Now what happens if an error happens that we can't handle
with requests_mock() as mock:
mock.get(
"https://api.dns.constellix.com/v1/domains",
status_code=200,
text='[{"id": 1234, "name": "unit.tests", "hasGeoIP": true}]')
mock.get(
"https://api.dns.constellix.com/v1/domains/1234",
status_code=200,
text='{"id": 1234, "name": "unit.tests", "hasGeoIP": true}')
mock.delete(ANY, status_code=200,
text='{}')
mock.put("https://api.dns.constellix.com/v1/pools/A/1808521",
status_code=400,
text='{"errors": [\"generic error\"]}')
mock.put("https://api.dns.constellix.com/v1/pools/A/1808522",
status_code=400,
text='{"errors": [\"generic error\"]}')
mock.put("https://api.dns.constellix.com/v1/geoFilters/5303",
status_code=400,
text='{"errors": [\"generic error\"]}')
mock.post(ANY, status_code=200,
text='[{"id": 1234}]')
plan = provider.plan(wanted)
self.assertEquals(1, len(plan.changes))
with self.assertRaises(ConstellixClientBadRequest):
provider.apply(plan)
def test_pools_that_are_notfound(self):
provider = ConstellixProvider('test', 'api', 'secret')
provider._client.pools = Mock(return_value=[{
"id": 1808521,
"name": "unit.tests.:www.dynamic:A:two",
"type": "A",
"values": [
{
"value": "1.2.3.4",
"weight": 1
}
]
}])
self.assertIsNone(provider._client.pool_by_id('A', 1))
self.assertIsNone(provider._client.pool('A', 'foobar'))
def test_pools_are_cached_correctly(self):
provider = ConstellixProvider('test', 'api', 'secret')
provider._client.pools = Mock(return_value=[{
"id": 1808521,
"name": "unit.tests.:www.dynamic:A:two",
"type": "A",
"values": [
{
"value": "1.2.3.4",
"weight": 1
}
]
}])
found = provider._client.pool('A', 'unit.tests.:www.dynamic:A:two')
self.assertIsNotNone(found)
not_found = provider._client.pool('AAAA',
'unit.tests.:www.dynamic:A:two')
self.assertIsNone(not_found)
provider._client.pools = Mock(return_value=[{
"id": 42,
"name": "unit.tests.:www.dynamic:A:two",
"type": "A",
"values": [
{
"value": "1.2.3.4",
"weight": 1
}
]
}, {
"id": 451,
"name": "unit.tests.:www.dynamic:A:two",
"type": "AAAA",
"values": [
{
"value": "1.2.3.4",
"weight": 1
}
]
}])
a_pool = provider._client.pool('A', 'unit.tests.:www.dynamic:A:two')
self.assertEquals(42, a_pool['id'])
aaaa_pool = provider._client.pool('AAAA',
'unit.tests.:www.dynamic:A:two')
self.assertEquals(451, aaaa_pool['id'])
| 32.886213 | 79 | 0.386513 | 3,295 | 39,595 | 4.578452 | 0.098027 | 0.040567 | 0.009744 | 0.021742 | 0.863847 | 0.844425 | 0.814265 | 0.792655 | 0.762362 | 0.755204 | 0 | 0.057968 | 0.470211 | 39,595 | 1,203 | 80 | 32.913549 | 0.6612 | 0.057836 | 0 | 0.770913 | 0 | 0 | 0.211531 | 0.039852 | 0 | 0 | 0 | 0 | 0.042776 | 1 | 0.006654 | false | 0 | 0.010456 | 0 | 0.021863 | 0.000951 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e3dfa5c11db43e7d056cca6f4cc2a99875eb2ada | 13,573 | py | Python | tests/test_instancemethod.py | ionelmc/wrapt | 4abbac872ccf0c253374277ce7c72f188b8469b7 | [
"BSD-2-Clause"
] | null | null | null | tests/test_instancemethod.py | ionelmc/wrapt | 4abbac872ccf0c253374277ce7c72f188b8469b7 | [
"BSD-2-Clause"
] | null | null | null | tests/test_instancemethod.py | ionelmc/wrapt | 4abbac872ccf0c253374277ce7c72f188b8469b7 | [
"BSD-2-Clause"
] | null | null | null | from __future__ import print_function
import unittest
import inspect
import imp
import wrapt
from wrapt import six
DECORATORS_CODE = """
import wrapt
@wrapt.decorator
def passthru_decorator(wrapped, instance, args, kwargs):
return wrapped(*args, **kwargs)
"""
decorators = imp.new_module('decorators')
six.exec_(DECORATORS_CODE, decorators.__dict__, decorators.__dict__)
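# Executing the decorator source in a synthetic module gives it a distinct
# __module__, so the tests below can verify that wrapping does not leak the
# decorator's own metadata onto the wrapped methods.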
class OldClass1():
def function(self, arg):
'''documentation'''
return arg
OldClass1o = OldClass1
class OldClass1():
@decorators.passthru_decorator
def function(self, arg):
'''documentation'''
return arg
OldClass1d = OldClass1
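# Naming convention: the 'o' suffix keeps the original (undecorated) class,
# 'd' the decorated one, so each test compares like for like.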
class TestNamingInstanceMethodOldStyle(unittest.TestCase):
def test_class_object_name(self):
# Test preservation of instance method __name__ attribute.
self.assertEqual(OldClass1d.function.__name__,
OldClass1o.function.__name__)
def test_instance_object_name(self):
# Test preservation of instance method __name__ attribute.
self.assertEqual(OldClass1d().function.__name__,
OldClass1o().function.__name__)
def test_class_object_qualname(self):
# Test preservation of instance method __qualname__ attribute.
try:
__qualname__ = OldClass1o.original.__qualname__
except AttributeError:
pass
else:
self.assertEqual(OldClass1d.function.__qualname__, __qualname__)
def test_instance_object_qualname(self):
# Test preservation of instance method __qualname__ attribute.
try:
__qualname__ = OldClass1o().original.__qualname__
except AttributeError:
pass
else:
self.assertEqual(OldClass1d().function.__qualname__, __qualname__)
def test_class_module_name(self):
# Test preservation of instance method __module__ attribute.
self.assertEqual(OldClass1d.function.__module__,
OldClass1o.function.__module__)
def test_instance_module_name(self):
# Test preservation of instance method __module__ attribute.
self.assertEqual(OldClass1d().function.__module__,
OldClass1o().function.__module__)
def test_class_doc_string(self):
# Test preservation of instance method __doc__ attribute.
self.assertEqual(OldClass1d.function.__doc__,
OldClass1o.function.__doc__)
def test_instance_doc_string(self):
# Test preservation of instance method __doc__ attribute.
self.assertEqual(OldClass1d().function.__doc__,
OldClass1o().function.__doc__)
def test_class_argspec(self):
# Test preservation of instance method argument specification.
original_argspec = inspect.getargspec(OldClass1o.function)
function_argspec = inspect.getargspec(OldClass1d.function)
self.assertEqual(original_argspec, function_argspec)
def test_instance_argspec(self):
# Test preservation of instance method argument specification.
original_argspec = inspect.getargspec(OldClass1o().function)
function_argspec = inspect.getargspec(OldClass1d().function)
self.assertEqual(original_argspec, function_argspec)
def test_class_isinstance(self):
# Test preservation of isinstance() checks.
self.assertTrue(isinstance(OldClass1d.function,
type(OldClass1o.function)))
def test_instance_isinstance(self):
# Test preservation of isinstance() checks.
self.assertTrue(isinstance(OldClass1d().function,
type(OldClass1o().function)))
class NewClass1(object):
def function(self, arg):
'''documentation'''
return arg
NewClass1o = NewClass1
class NewClass1(object):
@decorators.passthru_decorator
def function(self, arg):
'''documentation'''
return arg
NewClass1d = NewClass1
class TestNamingInstanceMethodNewStyle(unittest.TestCase):
def test_class_object_name(self):
# Test preservation of instance method __name__ attribute.
self.assertEqual(NewClass1d.function.__name__,
NewClass1o.function.__name__)
def test_instance_object_name(self):
# Test preservation of instance method __name__ attribute.
self.assertEqual(NewClass1d().function.__name__,
NewClass1o().function.__name__)
def test_class_object_qualname(self):
# Test preservation of instance method __qualname__ attribute.
try:
__qualname__ = NewClass1o.original.__qualname__
except AttributeError:
pass
else:
self.assertEqual(NewClass1d.function.__qualname__, __qualname__)
def test_instance_object_qualname(self):
# Test preservation of instance method __qualname__ attribute.
try:
__qualname__ = NewClass1o().original.__qualname__
except AttributeError:
pass
else:
self.assertEqual(NewClass1d().function.__qualname__, __qualname__)
def test_class_module_name(self):
# Test preservation of instance method __module__ attribute.
self.assertEqual(NewClass1d.function.__module__,
NewClass1o.function.__module__)
def test_instance_module_name(self):
# Test preservation of instance method __module__ attribute.
self.assertEqual(NewClass1d().function.__module__,
NewClass1o().function.__module__)
def test_class_doc_string(self):
# Test preservation of instance method __doc__ attribute.
self.assertEqual(NewClass1d.function.__doc__,
NewClass1o.function.__doc__)
def test_instance_doc_string(self):
# Test preservation of instance method __doc__ attribute.
self.assertEqual(NewClass1d().function.__doc__,
NewClass1o().function.__doc__)
def test_class_argspec(self):
# Test preservation of instance method argument specification.
original_argspec = inspect.getargspec(NewClass1o.function)
function_argspec = inspect.getargspec(NewClass1d.function)
self.assertEqual(original_argspec, function_argspec)
def test_instance_argspec(self):
# Test preservation of instance method argument specification.
original_argspec = inspect.getargspec(NewClass1o().function)
function_argspec = inspect.getargspec(NewClass1d().function)
self.assertEqual(original_argspec, function_argspec)
def test_class_isinstance(self):
# Test preservation of isinstance() checks.
self.assertTrue(isinstance(NewClass1d.function,
type(NewClass1o.function)))
def test_instance_isinstance(self):
# Test preservation of isinstance() checks.
self.assertTrue(isinstance(NewClass1d().function,
type(NewClass1o().function)))
class TestCallingInstanceMethodOldStyle(unittest.TestCase):
def test_class_call_function(self):
# Test calling instancemethod via class and passing in the class
# instance directly.
_args = (1, 2)
_kwargs = { 'one': 1, 'two': 2 }
@wrapt.decorator
def _decorator(wrapped, instance, args, kwargs):
self.assertNotEqual(instance, None)
self.assertEqual(args, _args)
self.assertEqual(kwargs, _kwargs)
return wrapped(*args, **kwargs)
@_decorator
def _function(*args, **kwargs):
return args, kwargs
class Class():
@_decorator
def _function(self, *args, **kwargs):
return (args, kwargs)
result = Class._function(*((Class(),)+_args), **_kwargs)
self.assertEqual(result, (_args, _kwargs))
def test_instance_call_function(self):
# Test calling instancemethod via class instance.
_args = (1, 2)
_kwargs = { 'one': 1, 'two': 2 }
@wrapt.decorator
def _decorator(wrapped, instance, args, kwargs):
self.assertNotEqual(instance, None)
self.assertEqual(args, _args)
self.assertEqual(kwargs, _kwargs)
return wrapped(*args, **kwargs)
@_decorator
def _function(*args, **kwargs):
return args, kwargs
class Class():
@_decorator
def _function(self, *args, **kwargs):
return (args, kwargs)
result = Class()._function(*_args, **_kwargs)
self.assertEqual(result, (_args, _kwargs))
def test_class_call_function_nested(self):
# Test calling instancemethod via class and passing in the class
# instance directly.
_args = (1, 2)
_kwargs = { 'one': 1, 'two': 2 }
@wrapt.decorator
def _decorator(wrapped, instance, args, kwargs):
self.assertNotEqual(instance, None)
self.assertEqual(args, _args)
self.assertEqual(kwargs, _kwargs)
return wrapped(*args, **kwargs)
@_decorator
def _function(*args, **kwargs):
return args, kwargs
class Class():
@_decorator
@_decorator
def _function(self, *args, **kwargs):
return (args, kwargs)
result = Class._function(*((Class(),)+_args), **_kwargs)
self.assertEqual(result, (_args, _kwargs))
def test_instance_call_function_nested(self):
# Test calling instancemethod via class instance.
_args = (1, 2)
_kwargs = { 'one': 1, 'two': 2 }
@wrapt.decorator
def _decorator(wrapped, instance, args, kwargs):
self.assertNotEqual(instance, None)
self.assertEqual(args, _args)
self.assertEqual(kwargs, _kwargs)
return wrapped(*args, **kwargs)
@_decorator
def _function(*args, **kwargs):
return args, kwargs
class Class():
@_decorator
@_decorator
def _function(self, *args, **kwargs):
return (args, kwargs)
result = Class()._function(*_args, **_kwargs)
self.assertEqual(result, (_args, _kwargs))
class TestCallingInstanceMethodNewStyle(unittest.TestCase):
def test_class_call_function(self):
# Test calling instancemethod via class and passing in the class
# instance directly.
_args = (1, 2)
_kwargs = { 'one': 1, 'two': 2 }
@wrapt.decorator
def _decorator(wrapped, instance, args, kwargs):
self.assertNotEqual(instance, None)
self.assertEqual(args, _args)
self.assertEqual(kwargs, _kwargs)
return wrapped(*args, **kwargs)
@_decorator
def _function(*args, **kwargs):
return args, kwargs
class Class(object):
@_decorator
def _function(self, *args, **kwargs):
return (args, kwargs)
result = Class._function(Class(), *_args, **_kwargs)
self.assertEqual(result, (_args, _kwargs))
def test_instance_call_function(self):
# Test calling instancemethod via class instance.
_args = (1, 2)
_kwargs = { 'one': 1, 'two': 2 }
@wrapt.decorator
def _decorator(wrapped, instance, args, kwargs):
self.assertNotEqual(instance, None)
self.assertEqual(args, _args)
self.assertEqual(kwargs, _kwargs)
return wrapped(*args, **kwargs)
@_decorator
def _function(*args, **kwargs):
return args, kwargs
class Class(object):
@_decorator
def _function(self, *args, **kwargs):
return (args, kwargs)
result = Class()._function(*_args, **_kwargs)
self.assertEqual(result, (_args, _kwargs))
def test_class_call_function_nested(self):
# Test calling instancemethod via class and passing in the class
# instance directly.
_args = (1, 2)
_kwargs = { 'one': 1, 'two': 2 }
@wrapt.decorator
def _decorator(wrapped, instance, args, kwargs):
self.assertNotEqual(instance, None)
self.assertEqual(args, _args)
self.assertEqual(kwargs, _kwargs)
return wrapped(*args, **kwargs)
@_decorator
def _function(*args, **kwargs):
return args, kwargs
class Class(object):
@_decorator
@_decorator
def _function(self, *args, **kwargs):
return (args, kwargs)
result = Class._function(Class(), *_args, **_kwargs)
self.assertEqual(result, (_args, _kwargs))
def test_instance_call_function_nested(self):
# Test calling instancemethod via class instance.
_args = (1, 2)
_kwargs = { 'one': 1, 'two': 2 }
@wrapt.decorator
def _decorator(wrapped, instance, args, kwargs):
self.assertNotEqual(instance, None)
self.assertEqual(args, _args)
self.assertEqual(kwargs, _kwargs)
return wrapped(*args, **kwargs)
@_decorator
def _function(*args, **kwargs):
return args, kwargs
class Class(object):
@_decorator
@_decorator
def _function(self, *args, **kwargs):
return (args, kwargs)
result = Class()._function(*_args, **_kwargs)
self.assertEqual(result, (_args, _kwargs))
if __name__ == '__main__':
unittest.main()
| 30.708145 | 78 | 0.6342 | 1,300 | 13,573 | 6.255385 | 0.063846 | 0.081161 | 0.059026 | 0.064929 | 0.933473 | 0.926463 | 0.926463 | 0.916626 | 0.916626 | 0.916626 | 0 | 0.009338 | 0.274147 | 13,573 | 441 | 79 | 30.777778 | 0.816078 | 0.141457 | 0 | 0.755396 | 0 | 0 | 0.01647 | 0.002328 | 0 | 0 | 0 | 0 | 0.201439 | 1 | 0.215827 | false | 0.02518 | 0.02518 | 0.057554 | 0.402878 | 0.003597 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
54381398741fc48765fe50145f15b5edcaaf3306 | 104 | py | Python | blitzdb/backends/mongo/__init__.py | jcollado/blitzdb | 88e1510fe555a0fe1cca15103bbef15e8caadf04 | [
"MIT"
] | null | null | null | blitzdb/backends/mongo/__init__.py | jcollado/blitzdb | 88e1510fe555a0fe1cca15103bbef15e8caadf04 | [
"MIT"
] | null | null | null | blitzdb/backends/mongo/__init__.py | jcollado/blitzdb | 88e1510fe555a0fe1cca15103bbef15e8caadf04 | [
"MIT"
] | null | null | null | from blitzdb.backends.mongo.backend import Backend
from blitzdb.backends.mongo.queryset import QuerySet
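# These re-exports flatten the package API so callers can write, for example,
#     from blitzdb.backends.mongo import Backend
# instead of importing from the backend/queryset submodules directly.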
| 34.666667 | 52 | 0.865385 | 14 | 104 | 6.428571 | 0.5 | 0.244444 | 0.422222 | 0.533333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 104 | 2 | 53 | 52 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
546c0b9cfe724754483d3ded1e481bb19a29e128 | 7,222 | py | Python | fireant/tests/slicer/query_builder/test_build_joins.py | vladaspasic/fireant | 2dbae6a97a927ef62fdcd5f37fcb51a7d6d55334 | [
"Apache-2.0"
] | null | null | null | fireant/tests/slicer/query_builder/test_build_joins.py | vladaspasic/fireant | 2dbae6a97a927ef62fdcd5f37fcb51a7d6d55334 | [
"Apache-2.0"
] | null | null | null | fireant/tests/slicer/query_builder/test_build_joins.py | vladaspasic/fireant | 2dbae6a97a927ef62fdcd5f37fcb51a7d6d55334 | [
"Apache-2.0"
] | null | null | null | from unittest import TestCase
import fireant as f
from ..mocks import slicer
# noinspection SqlDialectInspection,SqlNoDataSourceInspection
class QueryBuilderJoinTests(TestCase):
maxDiff = None
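    # Note on the expected SQL below (inferred from the assertions): column
    # aliases follow the slicer convention "$d$<name>" for dimensions and
    # "$m$<name>" for metrics, with a "_display" suffix for display fields
    # such as "$d$district_display".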
def test_dimension_with_join_includes_join_in_query(self):
queries = slicer.data \
.widget(f.DataTablesJS(slicer.metrics.votes)) \
.dimension(slicer.dimensions.timestamp) \
.dimension(slicer.dimensions.district) \
.queries
self.assertEqual(len(queries), 1)
self.assertEqual('SELECT '
'TRUNC("politician"."timestamp",\'DD\') "$d$timestamp",'
'"politician"."district_id" "$d$district",'
'"district"."district_name" "$d$district_display",'
'SUM("politician"."votes") "$m$votes" '
'FROM "politics"."politician" '
'OUTER JOIN "locations"."district" '
'ON "politician"."district_id"="district"."id" '
'GROUP BY "$d$timestamp","$d$district","$d$district_display" '
'ORDER BY "$d$timestamp","$d$district_display"', str(queries[0]))
def test_dimension_with_multiple_joins_includes_joins_ordered__in_query(self):
queries = slicer.data \
.widget(f.DataTablesJS(slicer.metrics.votes,
slicer.metrics.voters)) \
.dimension(slicer.dimensions.timestamp) \
.dimension(slicer.dimensions.district) \
.queries
self.assertEqual(len(queries), 1)
self.assertEqual('SELECT '
'TRUNC("politician"."timestamp",\'DD\') "$d$timestamp",'
'"politician"."district_id" "$d$district",'
'"district"."district_name" "$d$district_display",'
'SUM("politician"."votes") "$m$votes",'
'COUNT("voter"."id") "$m$voters" '
'FROM "politics"."politician" '
'JOIN "politics"."voter" '
'ON "politician"."id"="voter"."politician_id" '
'OUTER JOIN "locations"."district" '
'ON "politician"."district_id"="district"."id" '
'GROUP BY "$d$timestamp","$d$district","$d$district_display" '
'ORDER BY "$d$timestamp","$d$district_display"', str(queries[0]))
def test_dimension_with_recursive_join_joins_all_join_tables(self):
queries = slicer.data \
.widget(f.DataTablesJS(slicer.metrics.votes)) \
.dimension(slicer.dimensions.timestamp) \
.dimension(slicer.dimensions.state) \
.queries
self.assertEqual(len(queries), 1)
self.assertEqual('SELECT '
'TRUNC("politician"."timestamp",\'DD\') "$d$timestamp",'
'"district"."state_id" "$d$state",'
'"state"."state_name" "$d$state_display",'
'SUM("politician"."votes") "$m$votes" '
'FROM "politics"."politician" '
'OUTER JOIN "locations"."district" '
'ON "politician"."district_id"="district"."id" '
'JOIN "locations"."state" '
'ON "district"."state_id"="state"."id" '
'GROUP BY "$d$timestamp","$d$state","$d$state_display" '
'ORDER BY "$d$timestamp","$d$state_display"', str(queries[0]))
def test_metric_with_join_includes_join_in_query(self):
queries = slicer.data \
.widget(f.DataTablesJS(slicer.metrics.voters)) \
.dimension(slicer.dimensions.political_party) \
.queries
self.assertEqual(len(queries), 1)
self.assertEqual('SELECT '
'"politician"."political_party" "$d$political_party",'
'COUNT("voter"."id") "$m$voters" '
'FROM "politics"."politician" '
'JOIN "politics"."voter" '
'ON "politician"."id"="voter"."politician_id" '
'GROUP BY "$d$political_party" '
'ORDER BY "$d$political_party"', str(queries[0]))
def test_dimension_filter_with_join_on_display_definition_does_not_include_join_in_query(self):
queries = slicer.data \
.widget(f.DataTablesJS(slicer.metrics.votes)) \
.filter(slicer.dimensions.district.isin([1])) \
.queries
self.assertEqual(len(queries), 1)
self.assertEqual('SELECT '
'SUM("votes") "$m$votes" '
'FROM "politics"."politician" '
'WHERE "district_id" IN (1)', str(queries[0]))
def test_dimension_filter_display_field_with_join_includes_join_in_query(self):
queries = slicer.data \
.widget(f.DataTablesJS(slicer.metrics.votes)) \
.filter(slicer.dimensions.district.display.isin(['District 4'])) \
.queries
self.assertEqual(len(queries), 1)
self.assertEqual('SELECT '
'SUM("politician"."votes") "$m$votes" '
'FROM "politics"."politician" '
'OUTER JOIN "locations"."district" '
'ON "politician"."district_id"="district"."id" '
'WHERE "district"."district_name" IN (\'District 4\')', str(queries[0]))
def test_dimension_filter_with_recursive_join_includes_join_in_query(self):
queries = slicer.data \
.widget(f.DataTablesJS(slicer.metrics.votes)) \
.filter(slicer.dimensions.state.isin([1])) \
.queries
self.assertEqual(len(queries), 1)
self.assertEqual('SELECT '
'SUM("politician"."votes") "$m$votes" '
'FROM "politics"."politician" '
'OUTER JOIN "locations"."district" '
'ON "politician"."district_id"="district"."id" '
'WHERE "district"."state_id" IN (1)', str(queries[0]))
def test_dimension_filter_with_deep_recursive_join_includes_joins_in_query(self):
queries = slicer.data \
.widget(f.DataTablesJS(slicer.metrics.votes)) \
.filter(slicer.dimensions.deepjoin.isin([1])) \
.queries
self.assertEqual(len(queries), 1)
self.assertEqual('SELECT '
'SUM("politician"."votes") "$m$votes" '
'FROM "politics"."politician" '
'OUTER JOIN "locations"."district" '
'ON "politician"."district_id"="district"."id" '
'JOIN "locations"."state" '
'ON "district"."state_id"="state"."id" '
'JOIN "test"."deep" '
'ON "deep"."id"="state"."ref_id" '
'WHERE "deep"."id" IN (1)', str(queries[0]))
| 46.593548 | 99 | 0.51537 | 654 | 7,222 | 5.525994 | 0.123853 | 0.066408 | 0.037631 | 0.046486 | 0.833702 | 0.830935 | 0.784726 | 0.784726 | 0.763143 | 0.748201 | 0 | 0.005063 | 0.343672 | 7,222 | 154 | 100 | 46.896104 | 0.757384 | 0.008169 | 0 | 0.706349 | 0 | 0 | 0.327189 | 0.210585 | 0 | 0 | 0 | 0 | 0.126984 | 1 | 0.063492 | false | 0 | 0.02381 | 0 | 0.103175 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
54808d45f03ed01918c14bedef2dcea0b6797ccf | 9,324 | py | Python | tests/test_app_manager.py | jayvdb/django-test-tools | a832cc6acf8e45c8d6b0cd5e3c424b95595c1855 | [
"MIT"
] | 9 | 2017-04-29T20:21:07.000Z | 2021-11-16T07:00:01.000Z | tests/test_app_manager.py | jayvdb/django-test-tools | a832cc6acf8e45c8d6b0cd5e3c424b95595c1855 | [
"MIT"
] | 211 | 2017-11-21T00:23:03.000Z | 2022-03-28T02:06:25.000Z | tests/test_app_manager.py | jayvdb/django-test-tools | a832cc6acf8e45c8d6b0cd5e3c424b95595c1855 | [
"MIT"
] | 4 | 2017-11-21T18:19:53.000Z | 2021-05-24T06:34:16.000Z | from django.conf import settings
from django.test import TestCase
from django_test_tools.app_manager import DjangoAppManager
class TestDjangoAppManager(TestCase):
def test_installed_apps(self):
app_manager = DjangoAppManager()
self.assertEqual(9, len(app_manager.installed_apps))
def test_get_app(self):
app_manager = DjangoAppManager()
app = app_manager.get_app(settings.TEST_APP_SERVERS)
self.assertEqual(settings.TEST_APP_SERVERS, app.name)
self.assertEqual('example.servers', app.name)
self.assertEqual(app.models['server'].__name__, 'server')
self.assertEqual(len(app.models['server']._meta.fields), 11)
self.assertEqual(app.models['server']._meta.fields[0].name, 'id')
        self.assertEqual(type(app.models['server']._meta.fields[0]).__name__, 'AutoField')
def test_get_project_apps(self):
app_manager = DjangoAppManager()
app_module = settings.TEST_APP_SERVERS.split('.')[0]
apps = app_manager.get_project_apps(app_module)
self.assertEqual(2, len(apps))
apps = app_manager.get_project_apps('django')
self.assertEqual(6, len(apps))
    def test_get_app_data(self):
app_manager = DjangoAppManager()
app_dict = app_manager.get_app_data(settings.TEST_APP_PEOPLE)
# write_assertions(app_dict, 'app_dict')
# self.fail('Writing assertions')
self.assertEqual(app_dict['app_name'], 'example.people')
self.assertEqual(len(app_dict['models']['person']['fields']), 16)
self.assertEqual(app_dict['models']['person']['fields'][0]['editable'], True)
self.assertEqual(app_dict['models']['person']['fields'][0]['field_name'], 'id')
self.assertEqual(app_dict['models']['person']['fields'][0]['type'], 'AutoField')
self.assertEqual(app_dict['models']['person']['fields'][0]['unique'], True)
self.assertEqual(app_dict['models']['person']['fields'][1]['editable'], True)
self.assertEqual(app_dict['models']['person']['fields'][1]['field_name'], 'first_name')
self.assertEqual(app_dict['models']['person']['fields'][1]['max_length'], 60)
self.assertEqual(app_dict['models']['person']['fields'][1]['type'], 'CharField')
self.assertEqual(app_dict['models']['person']['fields'][1]['unique'], False)
self.assertEqual(app_dict['models']['person']['fields'][2]['editable'], True)
self.assertEqual(app_dict['models']['person']['fields'][2]['field_name'], 'middle_name')
self.assertEqual(app_dict['models']['person']['fields'][2]['max_length'], 60)
self.assertEqual(app_dict['models']['person']['fields'][2]['type'], 'CharField')
self.assertEqual(app_dict['models']['person']['fields'][2]['unique'], False)
self.assertEqual(app_dict['models']['person']['fields'][3]['editable'], True)
self.assertEqual(app_dict['models']['person']['fields'][3]['field_name'], 'last_name')
self.assertEqual(app_dict['models']['person']['fields'][3]['max_length'], 60)
self.assertEqual(app_dict['models']['person']['fields'][3]['type'], 'CharField')
self.assertEqual(app_dict['models']['person']['fields'][3]['unique'], False)
self.assertEqual(app_dict['models']['person']['fields'][4]['choices'], (('M', 'Male'), ('F', 'Female')))
self.assertEqual(app_dict['models']['person']['fields'][4]['choices_type'], 'tuple')
self.assertEqual(app_dict['models']['person']['fields'][4]['editable'], True)
self.assertEqual(app_dict['models']['person']['fields'][4]['field_name'], 'sex')
self.assertEqual(app_dict['models']['person']['fields'][4]['max_length'], 1)
self.assertEqual(app_dict['models']['person']['fields'][4]['type'], 'CharField')
self.assertEqual(app_dict['models']['person']['fields'][4]['unique'], False)
self.assertEqual(app_dict['models']['person']['fields'][5]['editable'], True)
self.assertEqual(app_dict['models']['person']['fields'][5]['field_name'], 'national_id')
self.assertEqual(app_dict['models']['person']['fields'][5]['max_length'], 50)
self.assertEqual(app_dict['models']['person']['fields'][5]['type'], 'CharField')
self.assertEqual(app_dict['models']['person']['fields'][5]['unique'], False)
self.assertEqual(app_dict['models']['person']['fields'][6]['choices'],
((1, 'National Id'), (2, 'Drivers License'), (3, 'Passport'), (4, 'Other')))
self.assertEqual(app_dict['models']['person']['fields'][6]['choices_type'], 'tuple')
self.assertEqual(app_dict['models']['person']['fields'][6]['editable'], True)
self.assertEqual(app_dict['models']['person']['fields'][6]['field_name'], 'national_id_type')
self.assertEqual(app_dict['models']['person']['fields'][6]['type'], 'IntegerField')
self.assertEqual(app_dict['models']['person']['fields'][6]['unique'], False)
self.assertEqual(app_dict['models']['person']['fields'][7]['choices_type'], 'Countries')
self.assertEqual(app_dict['models']['person']['fields'][7]['editable'], True)
self.assertEqual(app_dict['models']['person']['fields'][7]['field_name'], 'country_for_id')
self.assertEqual(app_dict['models']['person']['fields'][7]['max_length'], 2)
self.assertEqual(app_dict['models']['person']['fields'][7]['type'], 'CountryField')
self.assertEqual(app_dict['models']['person']['fields'][7]['unique'], False)
self.assertEqual(app_dict['models']['person']['fields'][8]['editable'], True)
self.assertEqual(app_dict['models']['person']['fields'][8]['field_name'], 'picture')
self.assertEqual(app_dict['models']['person']['fields'][8]['max_length'], 100)
self.assertEqual(app_dict['models']['person']['fields'][8]['type'], 'ImageField')
self.assertEqual(app_dict['models']['person']['fields'][8]['unique'], False)
self.assertEqual(app_dict['models']['person']['fields'][9]['editable'], True)
self.assertEqual(app_dict['models']['person']['fields'][9]['field_name'], 'date_of_birth')
self.assertEqual(app_dict['models']['person']['fields'][9]['type'], 'DateField')
self.assertEqual(app_dict['models']['person']['fields'][9]['unique'], False)
self.assertEqual(app_dict['models']['person']['fields'][10]['editable'], True)
self.assertEqual(app_dict['models']['person']['fields'][10]['field_name'], 'blood_type')
self.assertEqual(app_dict['models']['person']['fields'][10]['max_length'], 4)
self.assertEqual(app_dict['models']['person']['fields'][10]['type'], 'CharField')
self.assertEqual(app_dict['models']['person']['fields'][10]['unique'], False)
self.assertEqual(app_dict['models']['person']['fields'][11]['editable'], True)
self.assertEqual(app_dict['models']['person']['fields'][11]['field_name'], 'religion')
self.assertEqual(app_dict['models']['person']['fields'][11]['max_length'], 60)
self.assertEqual(app_dict['models']['person']['fields'][11]['type'], 'CharField')
self.assertEqual(app_dict['models']['person']['fields'][11]['unique'], False)
self.assertEqual(app_dict['models']['person']['fields'][12]['editable'], True)
self.assertEqual(app_dict['models']['person']['fields'][12]['field_name'], 'document')
self.assertEqual(app_dict['models']['person']['fields'][12]['max_length'], 100)
self.assertEqual(app_dict['models']['person']['fields'][12]['type'], 'FileField')
self.assertEqual(app_dict['models']['person']['fields'][12]['unique'], False)
self.assertEqual(app_dict['models']['person']['fields'][13]['choices_type'], 'list')
self.assertEqual(app_dict['models']['person']['fields'][13]['editable'], False)
self.assertEqual(app_dict['models']['person']['fields'][13]['field_name'], 'salary_currency')
self.assertEqual(app_dict['models']['person']['fields'][13]['max_length'], 3)
self.assertEqual(app_dict['models']['person']['fields'][13]['type'], 'CurrencyField')
self.assertEqual(app_dict['models']['person']['fields'][13]['unique'], False)
self.assertEqual(app_dict['models']['person']['fields'][14]['decimal_places'], 2)
self.assertEqual(app_dict['models']['person']['fields'][14]['editable'], True)
self.assertEqual(app_dict['models']['person']['fields'][14]['field_name'], 'salary')
self.assertEqual(app_dict['models']['person']['fields'][14]['max_digits'], 14)
self.assertEqual(app_dict['models']['person']['fields'][14]['type'], 'MoneyField')
self.assertEqual(app_dict['models']['person']['fields'][14]['unique'], False)
self.assertEqual(app_dict['models']['person']['fields'][15]['editable'], True)
self.assertEqual(app_dict['models']['person']['fields'][15]['field_name'], 'cell_phone')
self.assertEqual(app_dict['models']['person']['fields'][15]['max_length'], 16)
self.assertEqual(app_dict['models']['person']['fields'][15]['type'], 'CharField')
self.assertEqual(app_dict['models']['person']['fields'][15]['unique'], False)
self.assertEqual(app_dict['models']['person']['model_name'], 'Person')
self.assertEqual(app_dict['models']['person']['original_attrs']['abstract'], False)
| 75.193548 | 112 | 0.638996 | 1,112 | 9,324 | 5.185252 | 0.101619 | 0.252341 | 0.277836 | 0.331946 | 0.793097 | 0.756851 | 0.741242 | 0.730836 | 0.456296 | 0.090184 | 0 | 0.019014 | 0.125697 | 9,324 | 123 | 113 | 75.804878 | 0.688297 | 0.007508 | 0 | 0.052174 | 0 | 0 | 0.292617 | 0 | 0 | 0 | 0 | 0 | 0.843478 | 1 | 0.034783 | false | 0.008696 | 0.026087 | 0 | 0.069565 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
49ace69e501422d4db2ef082a4ea1c3590cfbc3e | 2,250 | py | Python | server/opendp_apps/dataverses/testing/__init__.py | mikephelan/opendp-ux | 80c65da0ed17adc01c69b05dbc9cbf3a5973a016 | [
"MIT"
] | 6 | 2021-05-25T18:50:58.000Z | 2022-03-23T19:52:15.000Z | server/opendp_apps/dataverses/testing/__init__.py | mikephelan/opendp-ux | 80c65da0ed17adc01c69b05dbc9cbf3a5973a016 | [
"MIT"
] | 298 | 2021-05-19T17:34:09.000Z | 2022-03-29T18:45:22.000Z | server/opendp_apps/dataverses/testing/__init__.py | opendp/dpcreator | 6ba3c58ecdcd81ca1f4533a14ce7604eccf6a646 | [
"MIT"
] | 2 | 2020-10-16T22:03:24.000Z | 2020-11-15T22:45:19.000Z | """
Running individual tests
python manage.py test opendp_apps.dataverses.testing.test_dataverse_handoff_view
python manage.py test opendp_apps.dataverses.testing.test_dv_user_handler
python manage.py test opendp_apps.dataverses.testing.test_endpoints.DataversePostTest
docker-compose run server python manage.py test opendp_apps.dataverses.testing.test_downloader_profiler.DownloadProfileTests.test_20_download_errors
docker-compose run server python manage.py test opendp_apps.dataverses.testing.test_downloader_handler.DownloadHandlerTests.test_80_direct_profile
python manage.py test opendp_apps.dataverses.testing.test_file_view.FileViewGetTest.test_10_successful_get
python manage.py test opendp_apps.dataverses.testing.test_endpoints.DataversePutTest.test_10_successful_creation
python manage.py test opendp_apps.dataverses.testing.test_endpoints.DataversePutTest.test_40_invalid_site_url
python manage.py test opendp_apps.dataverses.testing.test_dataverse_incoming
python manage.py test opendp_apps.dataverses.testing.test_dataverse_incoming.DataverseIncomingTest.test_010_dv_params
python manage.py test opendp_apps.dataverses.testing.test_dataverse_incoming.DataverseIncomingTest.test_020_check_dv_handler_directly
python manage.py test opendp_apps.dataverses.testing.test_dataverse_incoming.DataverseIncomingTest.test_030_dv_handler_bad_param
python manage.py test opendp_apps.dataverses.testing.test_dataverse_incoming.DataverseIncomingTest.test_100_check_dv_handler_via_url
docker-compose run server python manage.py test opendp_apps.dataverses.testing.test_downloader_handler.DownloadHandlerTests
#.test_100_check_dv_handler_via_url
docker-compose run server python manage.py test opendp_apps.dataverses.testing.dv_user_handler_test
docker-compose run server python manage.py test opendp_apps.dataverses.testing.test_endpoints
docker-compose run server python manage.py test opendp_apps.dataverses.testing.test_dataverse_incoming.DataverseIncomingTest.test_010_dv_params
docker-compose run server python manage.py test opendp_apps.dataverses.testing.test_dataverse_incoming.DataverseIncomingTest.test_020_check_dv_handler_directly
"""
| 56.25 | 159 | 0.896 | 313 | 2,250 | 6.108626 | 0.178914 | 0.119247 | 0.139121 | 0.17887 | 0.882845 | 0.882845 | 0.882845 | 0.882845 | 0.882845 | 0.83159 | 0 | 0.01454 | 0.052444 | 2,250 | 39 | 160 | 57.692308 | 0.88227 | 0.996 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
49fe24a9cbbee313b0374faf70c775ece8af04b7 | 125 | py | Python | main.py | TheJokersThief/Daft2BigQuery | fb81ea933645da9737a3fc83c02dd225eb517042 | [
"MIT"
] | 3 | 2021-02-19T20:02:10.000Z | 2022-03-12T15:01:58.000Z | main.py | TheJokersThief/Daft2BigQuery | fb81ea933645da9737a3fc83c02dd225eb517042 | [
"MIT"
] | null | null | null | main.py | TheJokersThief/Daft2BigQuery | fb81ea933645da9737a3fc83c02dd225eb517042 | [
"MIT"
] | null | null | null | from daft2bigquery import ingest_pubsub
def execute_daft2bigquery(event, context):
return ingest_pubsub(event, context)
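# A hedged local-testing sketch (the payload is illustrative only; the real
# message schema depends on what ingest_pubsub expects): Pub/Sub-triggered
# Cloud Functions receive an event dict with a base64-encoded "data" field
# plus a context object.
def _invoke_locally_sketch():
    import base64
    fake_event = {'data': base64.b64encode(b'{}').decode('utf-8')}
    return execute_daft2bigquery(fake_event, context=None)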
| 25 | 42 | 0.824 | 15 | 125 | 6.666667 | 0.666667 | 0.24 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018182 | 0.12 | 125 | 4 | 43 | 31.25 | 0.890909 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
3fd0f8ab29aa32c6acba0afca15f04f091effe8f | 4,969 | py | Python | lib/mesh_util.py | tamnguyenvan/AnchorUDF | 4a25540365d6c52a632f7c5dc7dbc0094cff20df | [
"MIT"
] | 28 | 2021-08-16T11:50:33.000Z | 2022-02-27T14:20:02.000Z | lib/mesh_util.py | osmr/AnchorUDF | e08705e5b7350367df3868432ddb9a3a32628f5a | [
"MIT"
] | 14 | 2021-09-06T06:49:00.000Z | 2022-03-31T07:23:44.000Z | lib/mesh_util.py | osmr/AnchorUDF | e08705e5b7350367df3868432ddb9a3a32628f5a | [
"MIT"
] | 8 | 2021-09-25T10:54:03.000Z | 2022-03-30T08:06:14.000Z | import numpy as np
import torch
from torch.nn import functional as F
def reconstruction(net, cuda, calib_tensor, b_min, b_max, max_dist=0.1, filter_val=0.006, num_steps=10):
length = b_max[0] - b_min[0]
sample_num = 200000
samples_cpu = np.zeros((0, 3))
samples = torch.rand(1, sample_num, 3).float().to(device=cuda) * length + b_min[0]
samples.requires_grad = True
num_points = 900000
i = 0
while len(samples_cpu) < num_points:
print('iteration', i)
for j in range(num_steps):
print('refinement', j)
net.query(torch.transpose(samples, 1, 2), calib_tensor)
pred = net.get_preds()
pred = pred.squeeze(1)
print(pred)
df_pred = torch.clamp(pred, max=max_dist)
df_pred.sum().backward()
gradient = samples.grad.detach()
samples = samples.detach()
df_pred = df_pred.detach()
samples = samples - F.normalize(gradient, dim=2) * df_pred.reshape(-1, 1) # better use Tensor.copy method?
samples = samples.detach()
samples.requires_grad = True
print('finished refinement')
if not i == 0:
samples_cpu = np.vstack((samples_cpu, samples[df_pred < filter_val].detach().cpu().numpy()))
samples = samples[df_pred < 0.03].unsqueeze(0)
indices = torch.randint(samples.shape[1], (1, sample_num))
samples = samples[[[0, ] * sample_num], indices]
samples += (max_dist / 3) * torch.randn(samples.shape).to(device=cuda) # 3 sigma rule
samples = samples.detach()
samples.requires_grad = True
i += 1
print(samples_cpu.shape)
return samples_cpu
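# Usage sketch (hedged -- `net` is assumed to expose the .query()/.get_preds()
# interface used above, and b_min/b_max are corners of the reconstruction
# volume in the network's input space):
#
#     device = torch.device('cuda:0')
#     points = reconstruction(net, device, calib_tensor,
#                             b_min=np.array([-1.0, -1.0, -1.0]),
#                             b_max=np.array([1.0, 1.0, 1.0]))
#     save_obj_mesh('cloud.obj', points, faces=[])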
def reconstruction_anchor(net, cuda, calib_tensor, b_min, b_max, max_dist=0.1, filter_val=0.006, num_steps=10, num_points=900000):
length = b_max[0] - b_min[0]
sample_num = 200000
samples_cpu = np.zeros((0, 3))
samples = torch.rand(1, sample_num, 3).float().to(device=cuda) * length + b_min[0]
samples.requires_grad = True
i = 0
while len(samples_cpu) < num_points:
print('iteration', i)
for j in range(num_steps):
print('refinement', j)
net.query(torch.transpose(samples, 1, 2), calib_tensor)
pred = net.get_preds()
pred = pred.squeeze(1)
print(pred)
df_pred = torch.clamp(pred, max=max_dist)
df_pred.sum().backward()
gradient = samples.grad.detach()
samples = samples.detach()
df_pred = df_pred.detach()
samples = samples - F.normalize(gradient, dim=2) * df_pred.reshape(-1, 1) # better use Tensor.copy method?
samples = samples.detach()
samples.requires_grad = True
print('finished refinement')
if not i == 0:
samples_cpu = np.vstack((samples_cpu, samples[df_pred < filter_val].detach().cpu().numpy()))
samples = samples[df_pred < 0.03].unsqueeze(0)
indices = torch.randint(samples.shape[1], (1, sample_num))
samples = samples[[[0, ] * sample_num], indices]
samples += (max_dist / 3) * torch.randn(samples.shape).to(device=cuda) # 3 sigma rule
samples = samples.detach()
samples.requires_grad = True
i += 1
print(samples_cpu.shape)
return samples_cpu
def create_grid_points_from_bounds(minimum, maximum, res):
    x = np.linspace(minimum, maximum, res)
X, Y, Z = np.meshgrid(x, x, x, indexing='ij')
X = X.reshape((np.prod(X.shape),))
Y = Y.reshape((np.prod(Y.shape),))
Z = Z.reshape((np.prod(Z.shape),))
points_list = np.column_stack((X, Y, Z))
del X, Y, Z, x
return points_list
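# Worked example: with res=2 the grid is just the cube corners.
#     pts = create_grid_points_from_bounds(-1.0, 1.0, 2)   # shape (8, 3)
# The 'ij' indexing means the first coordinate varies slowest in pts.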
def save_obj_mesh(mesh_path, verts, faces):
file = open(mesh_path, 'w')
for v in verts:
file.write('v %.4f %.4f %.4f\n' % (v[0], v[1], v[2]))
for f in faces:
f_plus = f + 1
file.write('f %d %d %d\n' % (f_plus[0], f_plus[2], f_plus[1]))
file.close()
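# A minimal sketch of the writer above: faces are 0-based here, and
# save_obj_mesh converts them to OBJ's 1-based indices (f + 1) while swapping
# the winding order (it writes vertices 0, 2, 1 of each face). The file name
# is illustrative.
def _write_triangle_sketch(path='triangle.obj'):
    verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
    faces = np.array([[0, 1, 2]])
    save_obj_mesh(path, verts, faces)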
def save_obj_mesh_with_color(mesh_path, verts, faces, colors):
file = open(mesh_path, 'w')
for idx, v in enumerate(verts):
c = colors[idx]
file.write('v %.4f %.4f %.4f %.4f %.4f %.4f\n' % (v[0], v[1], v[2], c[0], c[1], c[2]))
for f in faces:
f_plus = f + 1
file.write('f %d %d %d\n' % (f_plus[0], f_plus[2], f_plus[1]))
file.close()
def save_obj_mesh_with_uv(mesh_path, verts, faces, uvs):
file = open(mesh_path, 'w')
for idx, v in enumerate(verts):
vt = uvs[idx]
file.write('v %.4f %.4f %.4f\n' % (v[0], v[1], v[2]))
file.write('vt %.4f %.4f\n' % (vt[0], vt[1]))
for f in faces:
f_plus = f + 1
file.write('f %d/%d %d/%d %d/%d\n' % (f_plus[0], f_plus[0],
f_plus[2], f_plus[2],
f_plus[1], f_plus[1]))
file.close()
| 31.649682 | 130 | 0.569531 | 731 | 4,969 | 3.716826 | 0.177839 | 0.027604 | 0.041958 | 0.050791 | 0.829591 | 0.824071 | 0.814133 | 0.806036 | 0.801619 | 0.796099 | 0 | 0.037234 | 0.281143 | 4,969 | 156 | 131 | 31.852564 | 0.723404 | 0.017509 | 0 | 0.767857 | 0 | 0 | 0.042854 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.053571 | false | 0 | 0.026786 | 0 | 0.107143 | 0.089286 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b2053ce2a13fefe4c2437258b7a98aea9a768090 | 14,507 | py | Python | symnet/metric.py | XYPB/myMaskRCNN-mxnet | 88a626b783cee9d8c1b4a6d54a53b95a9ed4a2eb | [
"Apache-2.0"
] | 2 | 2019-10-28T10:10:22.000Z | 2020-05-22T03:23:04.000Z | symnet/metric.py | XYPB/myMaskRCNN-mxnet | 88a626b783cee9d8c1b4a6d54a53b95a9ed4a2eb | [
"Apache-2.0"
] | null | null | null | symnet/metric.py | XYPB/myMaskRCNN-mxnet | 88a626b783cee9d8c1b4a6d54a53b95a9ed4a2eb | [
"Apache-2.0"
] | null | null | null | import mxnet as mx
import numpy as np
RPN_FEAT_STRIDE = [64, 32, 16, 8, 4]
def get_names():
pred = ['rpn_cls_output64', 'rpn_cls_output32', 'rpn_cls_output16', 'rpn_cls_output8', 'rpn_cls_output4',
'rpn_bbox_loss64', 'rpn_bbox_loss32', 'rpn_bbox_loss16', 'rpn_bbox_loss8', 'rpn_bbox_loss4',
'rcnn_cls_prob', 'rcnn_bbox_loss', 'rcnn_label']
label = ['label_stride64', 'label_stride32', 'label_stride16', 'label_stride8', 'label_stride4',
'bbox_target_stride64', 'bbox_target_stride32', 'bbox_target_stride16', 'bbox_target_stride8', 'bbox_target_stride4',
'bbox_weight_stride64', 'bbox_weight_stride32', 'bbox_weight_stride16', 'bbox_weight_stride8', 'bbox_weight_stride4',]
return pred, label
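# Every metric below looks its inputs up positionally via
# self.pred.index(...) / self.label.index(...), so the executor's output and
# label order must match get_names() exactly. A small sanity check under that
# assumption:
def _check_names_sketch():
    pred, label = get_names()
    assert len(pred) == 13 and len(label) == 15
    assert pred.index('rcnn_cls_prob') == 10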
class RPNAccMetricS64(mx.metric.EvalMetric):
def __init__(self):
super(RPNAccMetricS64, self).__init__('RPNAcc_S64')
self.pred, self.label = get_names()
def update(self, labels, preds):
pred = preds[self.pred.index('rpn_cls_output64')]
label = labels[self.label.index('label_stride64')]
# pred (b, c, p) or (b, c, h, w)
pred_label = mx.ndarray.argmax_channel(pred).asnumpy().astype('int32')
pred_label = pred_label.reshape((pred_label.shape[0], -1))
# label (b, p)
label = label.asnumpy().astype('int32')
# filter with keep_inds
keep_inds = np.where(label != -1)
pred_label = pred_label[keep_inds]
label = label[keep_inds]
self.sum_metric += np.sum(pred_label.flat == label.flat)
self.num_inst += len(pred_label.flat)
class RPNAccMetricS32(mx.metric.EvalMetric):
def __init__(self):
super(RPNAccMetricS32, self).__init__('RPNAcc_S32')
self.pred, self.label = get_names()
def update(self, labels, preds):
pred = preds[self.pred.index('rpn_cls_output32')]
label = labels[self.label.index('label_stride32')]
# pred (b, c, p) or (b, c, h, w)
pred_label = mx.ndarray.argmax_channel(pred).asnumpy().astype('int32')
pred_label = pred_label.reshape((pred_label.shape[0], -1))
# label (b, p)
label = label.asnumpy().astype('int32')
# filter with keep_inds
keep_inds = np.where(label != -1)
pred_label = pred_label[keep_inds]
label = label[keep_inds]
self.sum_metric += np.sum(pred_label.flat == label.flat)
self.num_inst += len(pred_label.flat)
class RPNAccMetricS16(mx.metric.EvalMetric):
def __init__(self):
super(RPNAccMetricS16, self).__init__('RPNAcc_S16')
self.pred, self.label = get_names()
def update(self, labels, preds):
pred = preds[self.pred.index('rpn_cls_output16')]
label = labels[self.label.index('label_stride16')]
# pred (b, c, p) or (b, c, h, w)
pred_label = mx.ndarray.argmax_channel(pred).asnumpy().astype('int32')
pred_label = pred_label.reshape((pred_label.shape[0], -1))
# label (b, p)
label = label.asnumpy().astype('int32')
# filter with keep_inds
keep_inds = np.where(label != -1)
pred_label = pred_label[keep_inds]
label = label[keep_inds]
self.sum_metric += np.sum(pred_label.flat == label.flat)
self.num_inst += len(pred_label.flat)
class RPNAccMetricS8(mx.metric.EvalMetric):
def __init__(self):
super(RPNAccMetricS8, self).__init__('RPNAcc_S8')
self.pred, self.label = get_names()
def update(self, labels, preds):
pred = preds[self.pred.index('rpn_cls_output8')]
label = labels[self.label.index('label_stride8')]
# pred (b, c, p) or (b, c, h, w)
pred_label = mx.ndarray.argmax_channel(pred).asnumpy().astype('int32')
pred_label = pred_label.reshape((pred_label.shape[0], -1))
# label (b, p)
label = label.asnumpy().astype('int32')
# filter with keep_inds
keep_inds = np.where(label != -1)
pred_label = pred_label[keep_inds]
label = label[keep_inds]
self.sum_metric += np.sum(pred_label.flat == label.flat)
self.num_inst += len(pred_label.flat)
class RPNAccMetricS4(mx.metric.EvalMetric):
def __init__(self):
super(RPNAccMetricS4, self).__init__('RPNAcc_S4')
self.pred, self.label = get_names()
def update(self, labels, preds):
pred = preds[self.pred.index('rpn_cls_output4')]
label = labels[self.label.index('label_stride4')]
# pred (b, c, p) or (b, c, h, w)
pred_label = mx.ndarray.argmax_channel(pred).asnumpy().astype('int32')
pred_label = pred_label.reshape((pred_label.shape[0], -1))
# label (b, p)
label = label.asnumpy().astype('int32')
# filter with keep_inds
keep_inds = np.where(label != -1)
pred_label = pred_label[keep_inds]
label = label[keep_inds]
self.sum_metric += np.sum(pred_label.flat == label.flat)
self.num_inst += len(pred_label.flat)
class RCNNAccMetric(mx.metric.EvalMetric):
def __init__(self):
super(RCNNAccMetric, self).__init__('RCNNAcc')
self.pred, self.label = get_names()
def update(self, labels, preds):
pred = preds[self.pred.index('rcnn_cls_prob')]
label = preds[self.pred.index('rcnn_label')]
last_dim = pred.shape[-1]
pred_label = pred.asnumpy().reshape(-1, last_dim).argmax(axis=1).astype('int32')
label = label.asnumpy().reshape(-1,).astype('int32')
self.sum_metric += np.sum(pred_label.flat == label.flat)
self.num_inst += len(pred_label.flat)
class RPNLogLossMetricS64(mx.metric.EvalMetric):
def __init__(self):
super(RPNLogLossMetricS64, self).__init__('RPNLogLoss_S64')
self.pred, self.label = get_names()
def update(self, labels, preds):
pred = preds[self.pred.index('rpn_cls_output64')]
label = labels[self.label.index('label_stride64')]
# label (b, p)
label = label.asnumpy().astype('int32').reshape((-1))
# pred (b, c, p) or (b, c, h, w) --> (b, p, c) --> (b*p, c)
pred = pred.asnumpy().reshape((pred.shape[0], pred.shape[1], -1)).transpose((0, 2, 1))
pred = pred.reshape((label.shape[0], -1))
# filter with keep_inds
keep_inds = np.where(label != -1)[0]
label = label[keep_inds]
cls = pred[keep_inds, label]
cls += 1e-14
cls_loss = -1 * np.log(cls)
cls_loss = np.sum(cls_loss)
self.sum_metric += cls_loss
self.num_inst += label.shape[0]
class RPNLogLossMetricS32(mx.metric.EvalMetric):
def __init__(self):
super(RPNLogLossMetricS32, self).__init__('RPNLogLoss_S32')
self.pred, self.label = get_names()
def update(self, labels, preds):
pred = preds[self.pred.index('rpn_cls_output32')]
label = labels[self.label.index('label_stride32')]
# label (b, p)
label = label.asnumpy().astype('int32').reshape((-1))
# pred (b, c, p) or (b, c, h, w) --> (b, p, c) --> (b*p, c)
pred = pred.asnumpy().reshape((pred.shape[0], pred.shape[1], -1)).transpose((0, 2, 1))
pred = pred.reshape((label.shape[0], -1))
# filter with keep_inds
keep_inds = np.where(label != -1)[0]
label = label[keep_inds]
cls = pred[keep_inds, label]
cls += 1e-14
cls_loss = -1 * np.log(cls)
cls_loss = np.sum(cls_loss)
self.sum_metric += cls_loss
self.num_inst += label.shape[0]
class RPNLogLossMetricS16(mx.metric.EvalMetric):
def __init__(self):
super(RPNLogLossMetricS16, self).__init__('RPNLogLoss_S16')
self.pred, self.label = get_names()
def update(self, labels, preds):
pred = preds[self.pred.index('rpn_cls_output16')]
label = labels[self.label.index('label_stride16')]
# label (b, p)
label = label.asnumpy().astype('int32').reshape((-1))
# pred (b, c, p) or (b, c, h, w) --> (b, p, c) --> (b*p, c)
pred = pred.asnumpy().reshape((pred.shape[0], pred.shape[1], -1)).transpose((0, 2, 1))
pred = pred.reshape((label.shape[0], -1))
# filter with keep_inds
keep_inds = np.where(label != -1)[0]
label = label[keep_inds]
cls = pred[keep_inds, label]
cls += 1e-14
cls_loss = -1 * np.log(cls)
cls_loss = np.sum(cls_loss)
self.sum_metric += cls_loss
self.num_inst += label.shape[0]
class RPNLogLossMetricS8(mx.metric.EvalMetric):
def __init__(self):
super(RPNLogLossMetricS8, self).__init__('RPNLogLoss_S8')
self.pred, self.label = get_names()
def update(self, labels, preds):
pred = preds[self.pred.index('rpn_cls_output8')]
label = labels[self.label.index('label_stride8')]
# label (b, p)
label = label.asnumpy().astype('int32').reshape((-1))
# pred (b, c, p) or (b, c, h, w) --> (b, p, c) --> (b*p, c)
pred = pred.asnumpy().reshape((pred.shape[0], pred.shape[1], -1)).transpose((0, 2, 1))
pred = pred.reshape((label.shape[0], -1))
# filter with keep_inds
keep_inds = np.where(label != -1)[0]
label = label[keep_inds]
cls = pred[keep_inds, label]
cls += 1e-14
cls_loss = -1 * np.log(cls)
cls_loss = np.sum(cls_loss)
self.sum_metric += cls_loss
self.num_inst += label.shape[0]
class RPNLogLossMetricS4(mx.metric.EvalMetric):
def __init__(self):
super(RPNLogLossMetricS4, self).__init__('RPNLogLoss_S4')
self.pred, self.label = get_names()
def update(self, labels, preds):
pred = preds[self.pred.index('rpn_cls_output4')]
label = labels[self.label.index('label_stride4')]
# label (b, p)
label = label.asnumpy().astype('int32').reshape((-1))
# pred (b, c, p) or (b, c, h, w) --> (b, p, c) --> (b*p, c)
pred = pred.asnumpy().reshape((pred.shape[0], pred.shape[1], -1)).transpose((0, 2, 1))
pred = pred.reshape((label.shape[0], -1))
# filter with keep_inds
keep_inds = np.where(label != -1)[0]
label = label[keep_inds]
cls = pred[keep_inds, label]
cls += 1e-14
cls_loss = -1 * np.log(cls)
cls_loss = np.sum(cls_loss)
self.sum_metric += cls_loss
self.num_inst += label.shape[0]
class RCNNLogLossMetric(mx.metric.EvalMetric):
def __init__(self):
super(RCNNLogLossMetric, self).__init__('RCNNLogLoss')
self.pred, self.label = get_names()
def update(self, labels, preds):
pred = preds[self.pred.index('rcnn_cls_prob')]
label = preds[self.pred.index('rcnn_label')]
last_dim = pred.shape[-1]
pred = pred.asnumpy().reshape(-1, last_dim)
label = label.asnumpy().reshape(-1,).astype('int32')
# print(pred.shape)
# print(label.shape[0])
cls = pred[np.arange(label.shape[0]), label]
cls += 1e-14
cls_loss = -1 * np.log(cls)
cls_loss = np.sum(cls_loss)
self.sum_metric += cls_loss
self.num_inst += label.shape[0]
class RPNL1LossMetricS64(mx.metric.EvalMetric):
def __init__(self):
super(RPNL1LossMetricS64, self).__init__('RPNL1Loss_S64')
self.pred, self.label = get_names()
def update(self, labels, preds):
bbox_loss = preds[self.pred.index('rpn_bbox_loss64')].asnumpy()
bbox_weight = labels[self.label.index('bbox_weight_stride64')].asnumpy()
# calculate num_inst (average on those fg anchors)
num_inst = np.sum(bbox_weight > 0) / 4
self.sum_metric += np.sum(bbox_loss)
self.num_inst += num_inst
class RPNL1LossMetricS32(mx.metric.EvalMetric):
def __init__(self):
super(RPNL1LossMetricS32, self).__init__('RPNL1Loss_S32')
self.pred, self.label = get_names()
def update(self, labels, preds):
bbox_loss = preds[self.pred.index('rpn_bbox_loss32')].asnumpy()
bbox_weight = labels[self.label.index('bbox_weight_stride32')].asnumpy()
# calculate num_inst (average on those fg anchors)
num_inst = np.sum(bbox_weight > 0) / 4
self.sum_metric += np.sum(bbox_loss)
self.num_inst += num_inst
class RPNL1LossMetricS16(mx.metric.EvalMetric):
def __init__(self):
super(RPNL1LossMetricS16, self).__init__('RPNL1Loss_S16')
self.pred, self.label = get_names()
def update(self, labels, preds):
bbox_loss = preds[self.pred.index('rpn_bbox_loss16')].asnumpy()
bbox_weight = labels[self.label.index('bbox_weight_stride16')].asnumpy()
# calculate num_inst (average on those fg anchors)
num_inst = np.sum(bbox_weight > 0) / 4
self.sum_metric += np.sum(bbox_loss)
self.num_inst += num_inst
class RPNL1LossMetricS8(mx.metric.EvalMetric):
def __init__(self):
super(RPNL1LossMetricS8, self).__init__('RPNL1Loss_S8')
self.pred, self.label = get_names()
def update(self, labels, preds):
bbox_loss = preds[self.pred.index('rpn_bbox_loss8')].asnumpy()
bbox_weight = labels[self.label.index('bbox_weight_stride8')].asnumpy()
# calculate num_inst (average on those fg anchors)
num_inst = np.sum(bbox_weight > 0) / 4
self.sum_metric += np.sum(bbox_loss)
self.num_inst += num_inst
class RPNL1LossMetricS4(mx.metric.EvalMetric):
def __init__(self):
super(RPNL1LossMetricS4, self).__init__('RPNL1Loss_S4')
self.pred, self.label = get_names()
def update(self, labels, preds):
bbox_loss = preds[self.pred.index('rpn_bbox_loss4')].asnumpy()
bbox_weight = labels[self.label.index('bbox_weight_stride4')].asnumpy()
# calculate num_inst (average on those fg anchors)
num_inst = np.sum(bbox_weight > 0) / 4
self.sum_metric += np.sum(bbox_loss)
self.num_inst += num_inst
class RCNNL1LossMetric(mx.metric.EvalMetric):
def __init__(self):
super(RCNNL1LossMetric, self).__init__('RCNNL1Loss')
self.pred, self.label = get_names()
def update(self, labels, preds):
bbox_loss = preds[self.pred.index('rcnn_bbox_loss')].asnumpy()
label = preds[self.pred.index('rcnn_label')].asnumpy()
# calculate num_inst
keep_inds = np.where(label != 0)[0]
num_inst = len(keep_inds)
self.sum_metric += np.sum(bbox_loss)
self.num_inst += num_inst
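# Usage sketch (hedged -- the exact training wiring lives elsewhere in this
# repository): per-stride metrics are typically bundled into one composite
# metric before being handed to the fit loop.
def _build_composite_metric_sketch():
    composite = mx.metric.CompositeEvalMetric()
    for child in (RPNAccMetricS64(), RPNLogLossMetricS64(), RPNL1LossMetricS64(),
                  RCNNAccMetric(), RCNNLogLossMetric(), RCNNL1LossMetric()):
        composite.add(child)
    return composite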
| 34.705742 | 130 | 0.618184 | 1,951 | 14,507 | 4.349052 | 0.063045 | 0.046671 | 0.032174 | 0.044549 | 0.812728 | 0.8099 | 0.803771 | 0.718444 | 0.718444 | 0.690748 | 0 | 0.030837 | 0.233267 | 14,507 | 417 | 131 | 34.788969 | 0.731997 | 0.075688 | 0 | 0.758491 | 0 | 0 | 0.095886 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.139623 | false | 0 | 0.007547 | 0 | 0.218868 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b7566e4d9767bf6c63a03017d020301922b43136 | 1,411 | py | Python | fecs.py | xuexb/sublime-fecs | 0dc2f91d983d94d03f41b56c1a01ff1b09d52fcc | [
"MIT"
] | null | null | null | fecs.py | xuexb/sublime-fecs | 0dc2f91d983d94d03f41b56c1a01ff1b09d52fcc | [
"MIT"
] | null | null | null | fecs.py | xuexb/sublime-fecs | 0dc2f91d983d94d03f41b56c1a01ff1b09d52fcc | [
"MIT"
] | null | null | null | import sublime
import sublime_plugin
class fecsCheckCommand(sublime_plugin.TextCommand):
def run(self, edit):
filepath = self.view.file_name()
packages = sublime.packages_path()
args = {
"cmd": [
"fecs",
filepath,
"--reporter=baidu",
"--rule"
],
"file_regex": r"fecs: (.+)\]",
"line_regex": r"(\d+),(\d+): (.*)$"
}
if sublime.platform() == "windows":
args['cmd'][0] += ".cmd"
elif sublime.platform() == "osx":
args['path'] = "/usr/local/share/npm/bin:/usr/local/bin:/opt/local/bin"
self.view.window().run_command('exec', args)
class fecsFormatCommand(sublime_plugin.TextCommand):
def run(self, edit):
filepath = self.view.file_name()
packages = sublime.packages_path()
args = {
"cmd": [
"fecs",
"format",
filepath,
"--replace"
],
"file_regex": r"fecs: (.+)\]",
"line_regex": r"(\d+),(\d+): (.*)$"
}
if sublime.platform() == "windows":
args['cmd'][0] += ".cmd"
elif sublime.platform() == "osx":
args['path'] = "/usr/local/share/npm/bin:/usr/local/bin:/opt/local/bin"
self.view.window().run_command('exec', args)
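# Both commands shell out through Sublime's built-in `exec` target;
# file_regex/line_regex let the results panel jump to the reported position.
# Sublime derives command names from class names, so these register as
# "fecs_check" and "fecs_format". A hedged key-binding sketch (this goes in
# the user's .sublime-keymap, not in this file):
#     { "keys": ["ctrl+alt+f"], "command": "fecs_check" }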
| 28.795918 | 83 | 0.469171 | 136 | 1,411 | 4.772059 | 0.323529 | 0.049307 | 0.07396 | 0.083205 | 0.813559 | 0.813559 | 0.813559 | 0.813559 | 0.813559 | 0.813559 | 0 | 0.002176 | 0.348689 | 1,411 | 48 | 84 | 29.395833 | 0.704026 | 0 | 0 | 0.75 | 0 | 0.05 | 0.218994 | 0.076541 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05 | false | 0 | 0.05 | 0 | 0.15 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b774d00095d794b7fab2fb61dae3736c1d99a259 | 28,940 | py | Python | Context-Aggregator/parsing/legacy_sp_models.py | nadgeri14/KGPool | 2b0c0b71301a023b9a7c6dcba9932c6f37e60c8f | [
"MIT"
] | 33 | 2021-06-06T11:31:32.000Z | 2022-03-28T14:34:21.000Z | Context-Aggregator/parsing/legacy_sp_models.py | LiuChuang0059/KGPool | 77a6f78ac48884eb3e1a4568c9535b581eadf69d | [
"MIT"
] | 9 | 2021-09-23T09:47:21.000Z | 2022-03-17T10:10:52.000Z | Context-Aggregator/parsing/legacy_sp_models.py | LiuChuang0059/KGPool | 77a6f78ac48884eb3e1a4568c9535b581eadf69d | [
"MIT"
] | 8 | 2021-11-09T10:03:03.000Z | 2022-03-27T11:55:17.000Z | # coding: utf-8
# Copyright (C) 2016 UKP lab
#
# Author: Daniil Sorokin (ukp.tu-darmstadt.de/ukp-home/)
#
import itertools
import numpy as np
np.random.seed(1)
import tqdm
import sys
import pdb
#sys.path.insert(0, '..')
#sys.path.insert(0, '../..') # maybe troublesome when on windows
from utils import embedding_utils, graph
from semanticgraph import graph_utils
from utils.conversion_util import calculate_order_conversion
RESOURCES_FOLDER = "resources/"
property_blacklist = embedding_utils.load_blacklist(RESOURCES_FOLDER + "property_blacklist.txt")
def get_negative_edges(g, limit=1):
"""
    Generate a negative edge for every entity pair that has no relation between them. If the generated set is bigger than the limit, it is randomly subsampled to the limit.
    :param g: graph as a dictionary
    :return: a list of negative edges
>>> get_negative_edges({'edgeSet': [{'kbID': 'P397', 'left': [8], 'right': [23]}, \
{'kbID': 'P376', 'left': [80], 'right': [8]}], 'vertexSet': [{'tokenpositions': [8]}, {'tokenpositions': [23]}, {'tokenpositions': [80]}]}) \
== [{'left': [23], 'kbID': 'P0', 'right': [80]}]
True
"""
# get all combinations of vertex set
# combinations('ABCD', 2) => AB AC AD BC BD CD
vertex_pairs = itertools.combinations(g["vertexSet"], 2)
existing_edges = [p for e in g["edgeSet"] for p in [(e['left'], e['right']), (e['right'], e['left'])]]
negative_edges = []
for vertex_pair in vertex_pairs:
left_right = (vertex_pair[0]['tokenpositions'], vertex_pair[1]['tokenpositions'])
if left_right not in existing_edges:
negative_edges.append({'kbID': 'P0', 'left': left_right[0], 'right': left_right[1]})
if len(negative_edges) > limit:
negative_edges = np.random.choice(negative_edges, limit, replace=False)
return list(negative_edges)
def get_all_negative_edges(g, limit=100000):
"""
    Generate a negative edge for every entity pair that has no relation between them. If the generated set is bigger than the limit, it is randomly subsampled to the limit.
    :param g: graph as a dictionary
    :return: full list of negative edges
    >>> get_all_negative_edges({'edgeSet': [{'kbID': 'P397', 'left': [8], 'right': [23]}, \
{'kbID': 'P376', 'left': [80], 'right': [8]}], 'vertexSet': [{'tokenpositions': [8]}, {'tokenpositions': [23]}, {'tokenpositions': [80]}]}) \
== [{'left': [23], 'kbID': 'P0', 'right': [80]}]
True
"""
    # get all combinations of the vertex set
# combinations('ABC', 2) => AB AC BC
vertex_pairs = itertools.combinations(g["vertexSet"], 2)
existing_edges = [p for e in g["edgeSet"] for p in [(e['left'], e['right']), (e['right'], e['left'])]]
negative_edges = []
for vertex_pair in vertex_pairs:
left_right = (vertex_pair[0]['tokenpositions'], vertex_pair[1]['tokenpositions'])
if left_right not in existing_edges:
negative_edges.append({'kbID': 'P0', 'left': left_right[0], 'right': left_right[1]})
if len(negative_edges) > limit:
negative_edges = np.random.choice(negative_edges, limit, replace=False)
return list(negative_edges)
def to_indices(graphs, word2idx, property2idx, max_sent_len, replace_entities_with_unkown = False, mode='train', **kwargs):
"""
:param graphs:
:param word2idx:
:param property2idx:
:param max_sent_len:
:return:
"""
num_edges = len([e for g in graphs for e in g['edgeSet'] if e['kbID'] not in property_blacklist])
print("Dataset number of edges: {}".format(num_edges))
sentences_matrix = np.zeros((num_edges, max_sent_len), dtype="int32")
entity_matrix = np.zeros((num_edges, max_sent_len), dtype="int8")
y_matrix = np.zeros(num_edges, dtype="int16")
index = 0
for g in tqdm.tqdm(graphs, ascii=True):
token_ids = embedding_utils.get_idx_sequence(g["tokens"], word2idx)
if len(token_ids) > max_sent_len:
token_ids = token_ids[:max_sent_len]
for edge in g["edgeSet"]:
if edge['kbID'] not in property_blacklist:
sentences_matrix[index, :len(token_ids)] = \
[word2idx[embedding_utils.unknown] if i in edge["left"] + edge["right"] else t for i, t in enumerate(token_ids)] \
if replace_entities_with_unkown else token_ids
entity_matrix[index, :len(token_ids)] = \
[m for _, m in graph_utils.get_entity_indexed_vector(token_ids, edge, mode="mark-bi")]
if mode == "train":
_, property_kbid, _ = graph_utils.edge_to_kb_ids(edge, g)
property_kbid = property2idx.get(property_kbid, property2idx[embedding_utils.unknown])
y_matrix[index] = property_kbid
index += 1
return [sentences_matrix, entity_matrix, y_matrix]
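# Shape note: for N eligible edges, to_indices returns
#     sentences_matrix  (N, max_sent_len)  int32 word ids, zero-padded,
#     entity_matrix     (N, max_sent_len)  int8 "mark-bi" entity markers,
#     y_matrix          (N,)               int16 relation (property) ids.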
def to_indices_and_entity_pair(graphs, word2idx, property2idx, max_sent_len, replace_entities_with_unkown = False, mode='train', **kwargs):
"""
:param graphs:
:param word2idx:
:param property2idx:
:param max_sent_len:
:return:
"""
num_edges = len([e for g in graphs for e in g['edgeSet'] if e['kbID'] not in property_blacklist])
print("Dataset number of edges: {}".format(num_edges))
sentences_matrix = np.zeros((num_edges, max_sent_len), dtype="int32")
entity_matrix = np.zeros((num_edges, max_sent_len), dtype="int8")
y_matrix = np.zeros(num_edges, dtype="int16")
index = 0
entity_cnt = []
pos2id = dict()
entity_pair = []
for g in tqdm.tqdm(graphs, ascii=True):
token_ids = embedding_utils.get_idx_sequence(g["tokens"], word2idx)
try:
entity_cnt.append(len(g["vertexSet"]))
for i in g['vertexSet']:
pos2id[tuple(i['tokenpositions'])] = i['kbID']
        except KeyError:
            # skip graphs without a complete vertexSet annotation
            continue
if len(token_ids) > max_sent_len:
token_ids = token_ids[:max_sent_len]
for edge in g["edgeSet"]:
if edge['kbID'] not in property_blacklist:
                sentences_matrix[index, :len(token_ids)] = \
                    [word2idx[embedding_utils.unknown] if i in edge["left"] + edge["right"] else t for i, t in enumerate(token_ids)] \
                    if replace_entities_with_unknown else token_ids
entity_matrix[index, :len(token_ids)] = \
[m for _, m in graph_utils.get_entity_indexed_vector(token_ids, edge, mode="mark-bi")]
if mode == "train":
_, property_kbid, _ = graph_utils.edge_to_kb_ids(edge, g)
property_kbid = property2idx.get(property_kbid, property2idx[embedding_utils.unknown])
y_matrix[index] = property_kbid
entity_pair.append((pos2id[tuple(edge['left'])], pos2id[tuple(edge['right'])]))
index += 1
return [sentences_matrix, entity_matrix, y_matrix, entity_pair]
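# Upper bound on the number of edges encoded per graph; larger edge sets are
# split into chunks or discarded by the functions below.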
MAX_EDGES_PER_GRAPH = 72
def to_indices_with_real_entities(graphs, word2idx, property2idx, max_sent_len, mode='train', **kwargs):
"""
    Encode one row per graph (rather than one per edge); graphs with more than
    MAX_EDGES_PER_GRAPH edges are split into chunks.
    :param graphs: list of graph dictionaries with "tokens" and "edgeSet"
    :param word2idx: mapping from tokens to word indices
    :param property2idx: mapping from relation kbIDs to class indices
    :param max_sent_len: maximum sentence length; longer sentences are truncated
    :return: (sentences_matrix, entity_matrix, y_matrix)
"""
graphs_to_process = []
for g in graphs:
if len(g['edgeSet']) > 0:
if len(g['edgeSet']) <= MAX_EDGES_PER_GRAPH:
graphs_to_process.append(g)
else:
for i in range(0, len(g['edgeSet']), MAX_EDGES_PER_GRAPH):
graphs_to_process.append({"tokens": g["tokens"], "edgeSet": g["edgeSet"][i:i+ MAX_EDGES_PER_GRAPH]})
graphs = graphs_to_process
sentences_matrix = np.zeros((len(graphs), max_sent_len), dtype="int32")
entity_matrix = np.zeros((len(graphs), MAX_EDGES_PER_GRAPH, max_sent_len), dtype="int8")
y_matrix = np.zeros((len(graphs), MAX_EDGES_PER_GRAPH), dtype="int16")
for index, g in enumerate(tqdm.tqdm(graphs, ascii=True)):
token_ids = embedding_utils.get_idx_sequence(g["tokens"], word2idx)
if len(token_ids) > max_sent_len:
token_ids = token_ids[:max_sent_len]
sentences_matrix[index, :len(token_ids)] = token_ids
for j, edge in enumerate(g["edgeSet"][:MAX_EDGES_PER_GRAPH]):
entity_matrix[index, j, :len(token_ids)] = \
[m for _, m in graph_utils.get_entity_indexed_vector(token_ids, edge, mode="mark-bi")]
_, property_kbid, _ = graph_utils.edge_to_kb_ids(edge, g)
property_kbid = property2idx.get(property_kbid, property2idx[embedding_utils.unknown])
y_matrix[index, j] = property_kbid
return sentences_matrix, entity_matrix, y_matrix
def to_indices_with_real_entities_and_entity_nums(graphs, word2idx, property2idx, max_sent_len, mode='train', **kwargs):
"""
    Like to_indices_with_real_entities, but also returns the number of entities
    per graph; graphs with more than MAX_EDGES_PER_GRAPH edges are discarded
    instead of split.
    :param graphs: list of graph dictionaries with "tokens", "edgeSet" and "vertexSet"
    :param word2idx: mapping from tokens to word indices
    :param property2idx: mapping from relation kbIDs to class indices
    :param max_sent_len: maximum sentence length; longer sentences are truncated
    :return: (sentences_matrix, entity_matrix, y_matrix, entity_cnt)
"""
graphs_to_process = []
for g in graphs:
if len(g['edgeSet']) > 0:
if len(g['edgeSet']) <= MAX_EDGES_PER_GRAPH:
graphs_to_process.append(g)
            else:
                # discard graphs with more than MAX_EDGES_PER_GRAPH edges
                continue
graphs = graphs_to_process
sentences_matrix = np.zeros((len(graphs), max_sent_len), dtype="int32")
entity_matrix = np.zeros((len(graphs), MAX_EDGES_PER_GRAPH, max_sent_len), dtype="int8")
y_matrix = np.zeros((len(graphs), MAX_EDGES_PER_GRAPH), dtype="int16")
entity_cnt = []
for index, g in enumerate(tqdm.tqdm(graphs, ascii=True)):
try:
entity_cnt.append(len(g["vertexSet"]))
        except KeyError:
            continue
token_ids = embedding_utils.get_idx_sequence(g["tokens"], word2idx)
if len(token_ids) > max_sent_len:
token_ids = token_ids[:max_sent_len]
sentences_matrix[index, :len(token_ids)] = token_ids
for j, edge in enumerate(g["edgeSet"][:MAX_EDGES_PER_GRAPH]):
entity_matrix[index, j, :len(token_ids)] = \
[m for _, m in graph_utils.get_entity_indexed_vector(token_ids, edge, mode="mark-bi")]
_, property_kbid, _ = graph_utils.edge_to_kb_ids(edge, g)
property_kbid = property2idx.get(property_kbid, property2idx[embedding_utils.unknown])
y_matrix[index, j] = property_kbid
entity_cnt = np.array(entity_cnt, dtype=np.int32)
return sentences_matrix, entity_matrix, y_matrix, entity_cnt
def to_indices_with_real_entities_and_entity_nums_with_vertex_padding(graphs, word2idx, property2idx, max_sent_len, mode='train', **kwargs):
"""
    Variant of to_indices_with_real_entities_and_entity_nums that reorders the
    edge slots via calculate_order_conversion (vertex padding) and keeps the
    raw kbIDs for evaluation.
    :param graphs: list of graph dictionaries with "tokens", "edgeSet" and "vertexSet"
    :param word2idx: mapping from tokens to word indices
    :param property2idx: mapping from relation kbIDs to class indices
    :param max_sent_len: maximum sentence length; longer sentences are truncated
    :return: (sentences_matrix, entity_matrix, y_matrix, entity_cnt, kbID_matrix)
"""
graphs_to_process = []
for g in graphs:
if len(g['edgeSet']) > 0:
if len(g['edgeSet']) <= MAX_EDGES_PER_GRAPH:
graphs_to_process.append(g)
            else:
                # discard graphs with more than MAX_EDGES_PER_GRAPH edges
                continue
graphs = graphs_to_process
sentences_matrix = np.zeros((len(graphs), max_sent_len), dtype="int32")
entity_matrix = np.zeros((len(graphs), MAX_EDGES_PER_GRAPH, max_sent_len), dtype="int8")
y_matrix = np.zeros((len(graphs), MAX_EDGES_PER_GRAPH), dtype="int16")
kbID_matrix = np.empty((len(graphs), MAX_EDGES_PER_GRAPH), dtype=object)
entity_cnt = []
for index, g in enumerate(tqdm.tqdm(graphs, ascii=True)):
try:
entity_cnt.append(len(g["vertexSet"]))
        except KeyError:
            continue
token_ids = embedding_utils.get_idx_sequence(g["tokens"], word2idx)
if len(token_ids) > max_sent_len:
token_ids = token_ids[:max_sent_len]
sentences_matrix[index, :len(token_ids)] = token_ids
for j, edge in enumerate(g["edgeSet"][:MAX_EDGES_PER_GRAPH]):
new_j = calculate_order_conversion(j, len(g["vertexSet"]))
entity_matrix[index, new_j, :len(token_ids)] = \
[m for _, m in graph_utils.get_entity_indexed_vector(token_ids, edge, mode="mark-bi")]
left_entity, property_kbid, right_entity = graph_utils.edge_to_kb_ids(edge, g)
relationID = property_kbid
property_kbid = property2idx.get(property_kbid, property2idx[embedding_utils.unknown])
y_matrix[index, new_j] = property_kbid
kbID_matrix[index, new_j] = {"graph":g,"left_entity":left_entity,"right_entity":right_entity,"relation":relationID}
entity_cnt = np.array(entity_cnt, dtype=np.int32)
return sentences_matrix, entity_matrix, y_matrix, entity_cnt, kbID_matrix
def to_indices_with_real_entities_and_entity_nums_with_vertex_padding_and_entity_pair(graphs, word2idx, property2idx, max_sent_len, mode='train', **kwargs):
"""
    Variant with vertex padding that also collects the (left, right) entity
    kbID pairs per graph.
    :param graphs: list of graph dictionaries with "tokens", "edgeSet" and "vertexSet"
    :param word2idx: mapping from tokens to word indices
    :param property2idx: mapping from relation kbIDs to class indices
    :param max_sent_len: maximum sentence length; longer sentences are truncated
    :return: (sentences_matrix, entity_matrix, y_matrix, entity_cnt, entity_pair, kbID_matrix)
"""
graphs_to_process = []
for g in graphs:
if len(g['edgeSet']) > 0:
if len(g['edgeSet']) <= MAX_EDGES_PER_GRAPH:
graphs_to_process.append(g)
            else:
                # discard graphs with more than MAX_EDGES_PER_GRAPH edges
                continue
graphs = graphs_to_process
sentences_matrix = np.zeros((len(graphs), max_sent_len), dtype="int32")
entity_matrix = np.zeros((len(graphs), MAX_EDGES_PER_GRAPH, max_sent_len), dtype="int8")
y_matrix = np.zeros((len(graphs), MAX_EDGES_PER_GRAPH), dtype="int16")
entity_cnt = []
kbID_matrix = np.empty((len(graphs), MAX_EDGES_PER_GRAPH), dtype=object)
pos2id = dict()
entity_pair = []
for index, g in enumerate(tqdm.tqdm(graphs, ascii=True)):
try:
entity_cnt.append(len(g["vertexSet"]))
for i in g['vertexSet']:
pos2id[tuple(i['tokenpositions'])] = i['kbID']
        except KeyError:
            continue
token_ids = embedding_utils.get_idx_sequence(g["tokens"], word2idx)
if len(token_ids) > max_sent_len:
token_ids = token_ids[:max_sent_len]
sentences_matrix[index, :len(token_ids)] = token_ids
entity_pair_instance = []
for j, edge in enumerate(g["edgeSet"][:MAX_EDGES_PER_GRAPH]):
new_j = calculate_order_conversion(j, len(g["vertexSet"]))
entity_matrix[index, new_j, :len(token_ids)] = \
[m for _, m in graph_utils.get_entity_indexed_vector(token_ids, edge, mode="mark-bi")]
left_entity, property_kbid, right_entity = graph_utils.edge_to_kb_ids(edge, g)
relationID = property_kbid
property_kbid = property2idx.get(property_kbid, property2idx[embedding_utils.unknown])
y_matrix[index, new_j] = property_kbid
kbID_matrix[index, new_j] = {"graph":g,"left_entity":left_entity,"right_entity":right_entity,"relation":relationID}
entity_pair_instance.append((pos2id[tuple(edge['left'])], pos2id[tuple(edge['right'])]))
entity_pair.append(entity_pair_instance)
entity_cnt = np.array(entity_cnt, dtype=np.int32)
return sentences_matrix, entity_matrix, y_matrix, entity_cnt, entity_pair, kbID_matrix
def to_indices_with_real_entities_completely(graphs, word2idx, property2idx, max_sent_len, mode='train', **kwargs):
"""
    Intended for datasets where N/A (P0) relations have been added for all
    entity pairs without a relation (see makeup_missing_edges); the encoding
    itself matches to_indices_with_real_entities.
    :param graphs: list of graph dictionaries with "tokens" and "edgeSet"
    :param word2idx: mapping from tokens to word indices
    :param property2idx: mapping from relation kbIDs to class indices
    :param max_sent_len: maximum sentence length; longer sentences are truncated
    :return: (sentences_matrix, entity_matrix, y_matrix)
"""
graphs_to_process = []
for g in graphs:
if len(g['edgeSet']) > 0:
if len(g['edgeSet']) <= MAX_EDGES_PER_GRAPH:
graphs_to_process.append(g)
else:
for i in range(0, len(g['edgeSet']), MAX_EDGES_PER_GRAPH):
graphs_to_process.append({"tokens": g["tokens"], "edgeSet": g["edgeSet"][i:i+ MAX_EDGES_PER_GRAPH]})
graphs = graphs_to_process
sentences_matrix = np.zeros((len(graphs), max_sent_len), dtype="int32")
entity_matrix = np.zeros((len(graphs), MAX_EDGES_PER_GRAPH, max_sent_len), dtype="int8")
y_matrix = np.zeros((len(graphs), MAX_EDGES_PER_GRAPH), dtype="int16")
for index, g in enumerate(tqdm.tqdm(graphs, ascii=True)):
token_ids = embedding_utils.get_idx_sequence(g["tokens"], word2idx)
if len(token_ids) > max_sent_len:
token_ids = token_ids[:max_sent_len]
sentences_matrix[index, :len(token_ids)] = token_ids
for j, edge in enumerate(g["edgeSet"][:MAX_EDGES_PER_GRAPH]):
entity_matrix[index, j, :len(token_ids)] = \
[m for _, m in graph_utils.get_entity_indexed_vector(token_ids, edge, mode="mark-bi")]
_, property_kbid, _ = graph_utils.edge_to_kb_ids(edge, g)
property_kbid = property2idx.get(property_kbid, property2idx[embedding_utils.unknown])
y_matrix[index, j] = property_kbid
return sentences_matrix, entity_matrix, y_matrix
def graphs_for_evaluation(graphs, graphs_tagged):
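    """For every gold edge, build a single-edge graph whose vertexSet combines
    the entities of the corresponding tagged graph with the edge's own
    entities, padded with at most six negative (P0) edges."""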
for_evaluation = []
for i, g in enumerate(tqdm.tqdm(graphs, ascii=True, ncols=100)):
for edge in g["edgeSet"]:
new_g = {"edgeSet": [edge], "tokens": g['tokens']}
entities = [ne for ne, t in graph.extract_entities(graphs_tagged[i])]
entities += [edge['left'], edge['right']]
new_g['vertexSet'] = [{'tokenpositions': ne} for ne in entities]
new_g['edgeSet'].extend(get_negative_edges(new_g, limit=6))
for_evaluation.append(new_g)
return for_evaluation
def to_indices_with_ghost_entities(graphs, word2idx, property2idx, max_sent_len, embeddings, **kwargs):
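    """Encode as in to_indices, then append 'ghost' entity markers selected by
    embedding-dot-product attention (see create_ghost_edges) as an extra axis
    of the entity matrix."""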
sentences_matrix, entity_matrix, y_matrix = to_indices(graphs, word2idx, property2idx, max_sent_len, **kwargs)
ghost_entity_matrix = create_ghost_edges(sentences_matrix, entity_matrix, embeddings)
entity_matrix = entity_matrix.reshape((entity_matrix.shape[0], 1, entity_matrix.shape[1]))
entity_matrix = np.concatenate([entity_matrix, ghost_entity_matrix], axis = 1)
return [sentences_matrix, entity_matrix, y_matrix]
def to_indices_with_relative_positions(graphs, word2idx, property2idx, max_sent_len, position2idx, **kwargs):
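    """Encode each edge with two per-token relative-position channels (the
    distances to the two entities, clipped to +/-(max_sent_len - 1) and mapped
    through position2idx) instead of in-place entity markers."""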
num_edges = len([e for g in graphs for e in g['edgeSet']])
sentences_matrix = np.zeros((num_edges, max_sent_len), dtype="int32")
entity_matrix = np.zeros((num_edges, 2, max_sent_len), dtype="int8")
y_matrix = np.zeros(num_edges, dtype="int16")
index = 0
max_entity_index = max_sent_len - 1
for g in tqdm.tqdm(graphs, ascii=True):
token_ids = embedding_utils.get_idx_sequence(g["tokens"], word2idx)
if len(token_ids) > max_sent_len:
token_ids = token_ids[:max_sent_len]
for edge in g["edgeSet"]:
sentences_matrix[index, :len(token_ids)] = token_ids
_, property_kbid, _ = graph_utils.edge_to_kb_ids(edge, g)
try:
property_kbid = property2idx.get(property_kbid, property2idx[embedding_utils.unknown])
            except KeyError:
                pdb.set_trace()
entity_vector = graph_utils.get_entity_indexed_vector(token_ids, edge, mode="position")
entity_vector = [(-max_entity_index if m1 < -max_entity_index else max_entity_index if m1 > max_entity_index else m1,
-max_entity_index if m2 < -max_entity_index else max_entity_index if m2 > max_entity_index else m2) for _, m1,m2 in entity_vector]
entity_matrix[index, :, :len(token_ids)] = [[position2idx[m] for m,_ in entity_vector],[position2idx[m] for _, m in entity_vector]]
y_matrix[index] = property_kbid
index += 1
return [sentences_matrix, entity_matrix, y_matrix]
def to_indices_with_relative_positions_and_entity_pair(graphs, word2idx, property2idx, max_sent_len, position2idx, **kwargs):
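    """Relative-position encoding (see to_indices_with_relative_positions) that
    also collects the (left, right) entity kbID pairs."""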
num_edges = len([e for g in graphs for e in g['edgeSet']])
sentences_matrix = np.zeros((num_edges, max_sent_len), dtype="int32")
entity_matrix = np.zeros((num_edges, 2, max_sent_len), dtype="int8")
y_matrix = np.zeros(num_edges, dtype="int16")
index = 0
max_entity_index = max_sent_len - 1
entity_pair = []
pos2id = dict()
for g in tqdm.tqdm(graphs, ascii=True):
try:
for i in g['vertexSet']:
pos2id[tuple(i['tokenpositions'])] = i['kbID']
        except KeyError:
            continue
token_ids = embedding_utils.get_idx_sequence(g["tokens"], word2idx)
if len(token_ids) > max_sent_len:
token_ids = token_ids[:max_sent_len]
entity_pair_instance = []
for edge in g["edgeSet"]:
sentences_matrix[index, :len(token_ids)] = token_ids
_, property_kbid, _ = graph_utils.edge_to_kb_ids(edge, g)
try:
property_kbid = property2idx.get(property_kbid, property2idx[embedding_utils.unknown])
            except KeyError:
                pdb.set_trace()
entity_vector = graph_utils.get_entity_indexed_vector(token_ids, edge, mode="position")
entity_vector = [(-max_entity_index if m1 < -max_entity_index else max_entity_index if m1 > max_entity_index else m1,
-max_entity_index if m2 < -max_entity_index else max_entity_index if m2 > max_entity_index else m2) for _, m1,m2 in entity_vector]
entity_matrix[index, :, :len(token_ids)] = [[position2idx[m] for m,_ in entity_vector],[position2idx[m] for _, m in entity_vector]]
y_matrix[index] = property_kbid
index += 1
entity_pair_instance.append((pos2id[tuple(edge['left'])], pos2id[tuple(edge['right'])]))
entity_pair += entity_pair_instance
return [sentences_matrix, entity_matrix, y_matrix, entity_pair]
def to_indices_with_relative_positions_and_pcnn_mask_and_entity_pair(graphs, word2idx, property2idx, max_sent_len, position2idx, **kwargs):
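    """Relative-position encoding plus a three-piece PCNN mask (the sentence
    segmented around the two entities) and the (left, right) entity kbID pairs."""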
num_edges = len([e for g in graphs for e in g['edgeSet']])
sentences_matrix = np.zeros((num_edges, max_sent_len), dtype="int32")
entity_matrix = np.zeros((num_edges, 2, max_sent_len), dtype="int8")
pcnn_mask = np.zeros((num_edges, 3, max_sent_len), dtype="float32")
y_matrix = np.zeros(num_edges, dtype="int16")
index = 0
max_entity_index = max_sent_len - 1
entity_pair = []
pos2id = dict()
for g in tqdm.tqdm(graphs, ascii=True):
try:
for i in g['vertexSet']:
pos2id[tuple(i['tokenpositions'])] = i['kbID']
        except KeyError:
            continue
token_ids = embedding_utils.get_idx_sequence(g["tokens"], word2idx)
if len(token_ids) > max_sent_len:
token_ids = token_ids[:max_sent_len]
entity_pair_instance = []
for edge in g["edgeSet"]:
sentences_matrix[index, :len(token_ids)] = token_ids
_, property_kbid, _ = graph_utils.edge_to_kb_ids(edge, g)
try:
property_kbid = property2idx.get(property_kbid, property2idx[embedding_utils.unknown])
            except KeyError:
                pdb.set_trace()
entity_vector = graph_utils.get_entity_indexed_vector(token_ids, edge, mode="position")
entity_vector = [(-max_entity_index if m1 < -max_entity_index else max_entity_index if m1 > max_entity_index else m1,
-max_entity_index if m2 < -max_entity_index else max_entity_index if m2 > max_entity_index else m2) for _, m1,m2 in entity_vector]
entity_matrix[index, :, :len(token_ids)] = [[position2idx[m] for m,_ in entity_vector],[position2idx[m] for _, m in entity_vector]]
pcnn_mask[index, 0, :len(token_ids)], pcnn_mask[index, 1, :len(token_ids)], pcnn_mask[index, 2, :len(token_ids)] = graph_utils.get_pcnn_mask(token_ids, edge)
y_matrix[index] = property_kbid
index += 1
entity_pair_instance.append((pos2id[tuple(edge['left'])], pos2id[tuple(edge['right'])]))
entity_pair += entity_pair_instance
return [sentences_matrix, entity_matrix, y_matrix, pcnn_mask, entity_pair]
def softmax(x):
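    """Column-wise (axis 0) softmax. Illustrative doctest:

    >>> softmax(np.array([1.0, 1.0])).tolist()
    [0.5, 0.5]
    """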
    # subtract the per-column max for numerical stability; the result is unchanged
    e_x = np.exp(x - np.max(x, axis=0, keepdims=True))
    return e_x / e_x.sum(axis=0)
def create_ghost_edges(sentences_matrix, entity_matrix, embeddings):
ghost_matrix = np.zeros((entity_matrix.shape[0], 2, entity_matrix.shape[1]))
for i in range(sentences_matrix.shape[0]):
entity_vector = entity_matrix[i][entity_matrix[i].nonzero()]
sentence_vector = sentences_matrix[i][sentences_matrix[i].nonzero()]
e1_one_hot = entity_vector == 2
e2_one_hot = entity_vector == 3
entity_embs = np.dot(np.asarray([e1_one_hot,e2_one_hot]), embeddings[sentence_vector])
e1_index = np.nonzero(e1_one_hot)[0]
e2_index = np.nonzero(e2_one_hot)[0]
entity_attention = np.dot(entity_embs, embeddings[sentence_vector].T)
entity_attention = softmax(entity_attention.T).T
        entity_attention[:, np.concatenate([e1_index, e2_index])] = -np.inf
ghost_markers = np.tile(entity_vector, (2,1))
ghost_markers[0][e1_index] = 1
ghost_markers[1][e2_index] = 1
if entity_attention.shape[-1] > 0:
selected_entities = np.argmax(entity_attention, axis=-1)
ghost_markers[0][selected_entities[0]] = 2
ghost_markers[1][selected_entities[1]] = 3
ghost_matrix[i,:,:entity_vector.shape[0]] = ghost_markers
return ghost_matrix
def makeup_missing_edges(g):
'''
make up missing edges with N/A relations
============
Arguments:
- g: an instance with tokens, edgeSet, vertexSet
Returns:
- new_g: g with missing edges made up with N/A
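    Example (illustrative entities, following the doctest of get_all_negative_edges):

    >>> new_g = makeup_missing_edges({'edgeSet': [{'kbID': 'P397', 'left': [8], 'right': [23]}], 'vertexSet': [{'tokenpositions': [8]}, {'tokenpositions': [23]}, {'tokenpositions': [80]}]})
    >>> [(e['kbID'], e['left'], e['right']) for e in new_g['edgeSet']]
    [('P397', [8], [23]), ('P0', [8], [80]), ('P0', [23], [80])]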
'''
negedges = get_all_negative_edges(g)
full_edgeset = g['edgeSet'] + negedges
full_edgeset = sorted(full_edgeset, key = lambda x:(x['left'], x['right']))
new_g = g
new_g['edgeSet'] = full_edgeset
return new_g
def detect_bidirectional_edges(g):
'''
detect bidirectional edges in the data
==========
Arguments:
- g: an instance with tokens, edgeSet, vertexSet
Returns:
    - exist: True if this instance contains bidirectional or replicated edges, False otherwise
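    Example (illustrative edge values):

    >>> detect_bidirectional_edges({'edgeSet': [{'left': [1], 'right': [2]}, {'left': [2], 'right': [1]}]})
    True
    >>> detect_bidirectional_edges({'edgeSet': [{'left': [1], 'right': [2]}]})
    False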
'''
cache = set()
for i in g['edgeSet']:
if((tuple(i['left']), tuple(i['right'])) in cache):
return True
else:
cache.add((tuple(i['left']), tuple(i['right'])))
cache.add((tuple(i['right']), tuple(i['left'])))
return False
def remove_replicated_vertices(g):
'''
remove vertices with same tokenpos in the graph
===========
Arguments:
- g: an instance with tokens, edgeSet, vertexSet
Returns:
- new_g: a graph with no vertices of the same tokenpos
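    Example (illustrative duplicated vertex and self-loop edge):

    >>> g = remove_replicated_vertices({'tokens': [], 'vertexSet': [{'tokenpositions': [1]}, {'tokenpositions': [1]}], 'edgeSet': [{'left': [1], 'right': [1]}]})
    >>> len(g['vertexSet']), len(g['edgeSet'])
    (1, 0)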
'''
new_g = {}
new_g['tokens'] = g['tokens']
new_g['vertexSet'] = []
new_g['edgeSet'] = []
tokenposSet = set()
for i in g['vertexSet']:
if(not tuple(i['tokenpositions']) in tokenposSet):
tokenposSet.add(tuple(i['tokenpositions']))
new_g['vertexSet'].append(i)
tokenpospairSet = set()
for i in g['edgeSet']:
if(not (tuple(i['left']), tuple(i['right'])) in tokenpospairSet and not tuple(i['left']) == tuple(i['right'])):
tokenpospairSet.add((tuple(i['left']), tuple(i['right'])))
new_g['edgeSet'].append(i)
return new_g
def add_reverse_edge(g):
'''
    add, for every non-N/A edge, a reversed edge whose relation is "~" + kbID
    ===========
    Arguments:
    - g: an instance with tokens, edgeSet, vertexSet
    Returns:
    - new_g: a graph whose edgeSet also contains the reversed edges, sorted by (left, right)
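    Example (illustrative single edge):

    >>> g = add_reverse_edge({'tokens': [], 'vertexSet': [], 'edgeSet': [{'kbID': 'P397', 'left': [8], 'right': [23]}]})
    >>> [(e['kbID'], e['left'], e['right']) for e in g['edgeSet']]
    [('P397', [8], [23]), ('~P397', [23], [8])]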
'''
new_g = {}
new_g['tokens'] = g['tokens']
new_g['vertexSet'] = g['vertexSet']
new_g['edgeSet'] = []
tokenpospairSet = set()
for i in g['edgeSet']:
j = dict(i)
new_g['edgeSet'].append(i)
if(i['kbID'] != "P0"):
j['kbID'] = "~" + i['kbID']
            j['left'], j['right'] = j['right'], j['left']
new_g['edgeSet'].append(j)
new_g['edgeSet'] = sorted(new_g['edgeSet'], key=lambda x : (x['left'], x['right']))
return new_g
if __name__ == "__main__":
# Testing
import doctest
print(doctest.testmod())
| 48.153078 | 169 | 0.638908 | 3,875 | 28,940 | 4.487742 | 0.072516 | 0.039563 | 0.036228 | 0.030362 | 0.830017 | 0.820069 | 0.812191 | 0.795285 | 0.778436 | 0.764577 | 0 | 0.015038 | 0.232516 | 28,940 | 600 | 170 | 48.233333 | 0.767908 | 0.101417 | 0 | 0.757848 | 0 | 0 | 0.06206 | 0.000861 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047085 | false | 0 | 0.020179 | 0.002242 | 0.116592 | 0.006726 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b7d1039c9b2c7228ef752ab50285582233c6aae4 | 23,068 | py | Python | scripts/create_db.py | behATL/javaperks-aws-single-server | acf34a7e5020f2e606c38bbfd7f4ac009921b135 | [
"Apache-2.0"
] | null | null | null | scripts/create_db.py | behATL/javaperks-aws-single-server | acf34a7e5020f2e606c38bbfd7f4ac009921b135 | [
"Apache-2.0"
] | 1 | 2020-11-02T17:13:18.000Z | 2020-11-02T17:13:18.000Z | scripts/create_db.py | mocofound/javaperks-aws-single-server | 50ed01214a3d5da14d085bdc1a0f9f59ba681df4 | [
"Apache-2.0"
] | 1 | 2021-11-22T15:05:59.000Z | 2021-11-22T15:05:59.000Z | import MySQLdb # pylint: disable=import-error
import sys
import hvac
import base64
dbname = sys.argv[1]
username = sys.argv[2]
password = sys.argv[3]
roottoken = sys.argv[4]
region = sys.argv[5]
vault = hvac.Client(url="http://vault-main.service."+region+".consul:8200", token=roottoken)
db = MySQLdb.connect(host = dbname,
user = username,
password = password)
cursor = db.cursor()
def encrypt_acct(data):
retval = vault.secrets.transit.encrypt_data(
mount_point = 'transit',
name = 'account',
plaintext = base64.b64encode(data.encode()).decode('ascii')
)
return retval['data']['ciphertext']
def encrypt_cc(data):
retval = vault.secrets.transit.encrypt_data(
mount_point = 'transit',
name = 'payment',
plaintext = base64.b64encode(data.encode()).decode('ascii')
)
return retval['data']['ciphertext']
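# Illustrative inverse (an assumption; not used by this script): transit
# decryption returns the plaintext base64-encoded, so it is decoded after
# the call.
def decrypt_acct(data):
    retval = vault.secrets.transit.decrypt_data(
        mount_point = 'transit',
        name = 'account',
        ciphertext = data
    )
    return base64.b64decode(retval['data']['plaintext']).decode()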
sql = "create database if not exists javaperks"
x = cursor.execute(sql)
sql = "use javaperks"
x = cursor.execute(sql)
sql = """create table if not exists customer_main(
custid int auto_increment,
custno varchar(20) not null,
firstname varchar(50) not null,
lastname varchar(50) not null,
email varchar(255) not null,
dob varchar(255),
ssn varchar(255),
datecreated datetime,
primary key (custid),
index idx_custno (custno)
) engine=innodb
"""
x = cursor.execute(sql)
sql = """create table if not exists customer_addresses(
addrid int auto_increment,
custid int not null,
contact varchar(255) not null,
address1 varchar(150) not null,
address2 varchar(150),
city varchar(150) not null,
state varchar(2) not null,
zip varchar(20) not null,
phone varchar(35),
addrtype varchar(20),
primary key(addrid),
index idx_custid (custid),
constraint fk_custid_custid
foreign key (custid)
references customer_main (custid)
) engine=innodb
"""
x = cursor.execute(sql)
sql = """create table if not exists customer_payment(
payid int auto_increment,
custid int not null,
cardname varchar(255) not null,
cardnumber varchar(255) not null,
cardtype varchar(2),
cvv varchar(255) not null,
expmonth varchar(2) not null,
expyear varchar(4) not null,
primary key(payid),
index idx_pay_custid (custid)
) engine=innodb
"""
x = cursor.execute(sql)
sql = """create table if not exists customer_invoice(
invid int auto_increment,
invno varchar(30) not null,
custid int not null,
invdate datetime not null,
orderid varchar(30),
title varchar(255) not null,
amount decimal,
tax decimal,
shipping decimal,
total decimal,
datepaid datetime,
contact varchar(255) not null,
address1 varchar(150) not null,
address2 varchar(150),
city varchar(150) not null,
state varchar(2) not null,
zip varchar(20) not null,
phone varchar(35),
primary key(invid),
index idx_inv_custid (custid)
) engine=innodb
"""
x = cursor.execute(sql)
sql = """create table if not exists customer_invoice_item(
itemid int auto_increment,
invid int not null,
product varchar(255) not null,
description text,
amount decimal,
quantity int,
lineno int,
primary key(itemid),
index idx_invoice (invid)
) engine=innodb
"""
x = cursor.execute(sql)
##################################
# Add Customer 1 - Janice Thompson
##################################
sql = """insert into customer_main(
custno,
firstname,
lastname,
email,
dob,
ssn,
datecreated
) values (
'CS100312',
'Janice',
'Thompson',
'{email}',
'{dob}',
'{ssn}',
'2016-05-01'
)
""".format(
email = encrypt_acct('jthomp4423@example.com'),
dob = encrypt_acct('11/28/1983'),
ssn = encrypt_acct('027-40-7057')
)
x = cursor.execute(sql)
sql = "select last_insert_id()"
retval = cursor.execute(sql)
rset = cursor.fetchall()
nextid = rset[0][0]
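# Note: MySQLdb also exposes this value as cursor.lastrowid, which would avoid
# the extra "select last_insert_id()" round trip; the explicit query is kept
# throughout for symmetry with the other statements.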
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'Janice Thompson',
'3611 Farland Street',
'Brockton',
'MA',
'02401',
'774-240-5996',
'B'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'Janice Thompson',
'3611 Farland Street',
'Brockton',
'MA',
'02401',
'774-240-5996',
'S'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_payment(
custid,
cardname,
cardnumber,
cardtype,
cvv,
expmonth,
expyear
) values (
{id},
'Janice Thompson',
'{cardnum}',
'AX',
'{cvv}',
'08',
'2024'
)
""".format(
id = str(nextid),
cardnum = encrypt_cc('378282246310005'),
cvv = encrypt_cc('344')
)
x = cursor.execute(sql)
##################################
# Add Customer 2 - James Wilson
##################################
sql = """insert into customer_main(
custno,
firstname,
lastname,
email,
dob,
ssn,
datecreated
) values (
'CS106004',
'James',
'Wilson',
'{email}',
'{dob}',
'{ssn}',
'2013-07-06'
)
""".format(
email = encrypt_acct('wilson@example.com'),
dob = encrypt_acct('6/4/1974'),
ssn = encrypt_acct('309-64-5158')
)
x = cursor.execute(sql)
sql = "select last_insert_id()"
retval = cursor.execute(sql)
rset = cursor.fetchall()
nextid = rset[0][0]
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'James Wilson',
'1437 Capitol Avenue',
'Paragon',
'IN',
'46166',
'765-537-0152',
'B'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'James Wilson',
'1437 Capitol Avenue',
'Paragon',
'IN',
'46166',
'765-537-0152',
'S'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_payment(
custid,
cardname,
cardnumber,
cardtype,
cvv,
expmonth,
expyear
) values (
{id},
'James Wilson',
'{cardnum}',
'AX',
'{cvv}',
'08',
'2024'
)
""".format(
id = str(nextid),
cardnum = encrypt_cc('371449635398431'),
cvv = encrypt_cc('344')
)
x = cursor.execute(sql)
##################################
# Add Customer 3 - Tommy Ballinger
##################################
sql = """insert into customer_main(
custno,
firstname,
lastname,
email,
dob,
ssn,
datecreated
) values (
'CS101438',
'Tommy',
'Ballinger',
'{email}',
'{dob}',
'{ssn}',
'2016-12-28'
)
""".format(
email = encrypt_acct('tommy6677@example.com'),
dob = encrypt_acct('1/5/1984'),
ssn = encrypt_acct('530-02-6158')
)
x = cursor.execute(sql)
sql = "select last_insert_id()"
retval = cursor.execute(sql)
rset = cursor.fetchall()
nextid = rset[0][0]
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'Tommy Ballinger',
'2143 Wescam Court',
'Reno',
'NV',
'89502',
'775-856-9045',
'B'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'Tommy Ballinger',
'2143 Wescam Court',
'Reno',
'NV',
'89502',
'775-856-9045',
'S'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_payment(
custid,
cardname,
cardnumber,
cardtype,
cvv,
expmonth,
expyear
) values (
{id},
'Tommy Ballinger',
'{cardnum}',
'AX',
'{cvv}',
'08',
'2024'
)
""".format(
id = str(nextid),
cardnum = encrypt_cc('378734493671000'),
cvv = encrypt_cc('344')
)
x = cursor.execute(sql)
##################################
# Add Customer 4 - Mary McCann
##################################
sql = """insert into customer_main(
custno,
firstname,
lastname,
email,
dob,
ssn,
datecreated
) values (
'CS210895',
'Mary',
'McCann',
'{email}',
'{dob}',
'{ssn}',
'2018-05-24'
)
""".format(
email = encrypt_acct('mmccann1212@example.com'),
dob = encrypt_acct('9/4/1981'),
ssn = encrypt_acct('246-98-9817')
)
x = cursor.execute(sql)
sql = "select last_insert_id()"
retval = cursor.execute(sql)
rset = cursor.fetchall()
nextid = rset[0][0]
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'Mary McCann',
'4512 Layman Avenue',
'Robbins',
'NC',
'27325',
'910-948-3965',
'B'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'Mary McCann',
'4512 Layman Avenue',
'Robbins',
'NC',
'27325',
'910-948-3965',
'S'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_payment(
custid,
cardname,
cardnumber,
cardtype,
cvv,
expmonth,
expyear
) values (
{id},
'Mary McCann',
'{cardnum}',
'DI',
'{cvv}',
'08',
'2024'
)
""".format(
id = str(nextid),
cardnum = encrypt_cc('6011111111111117'),
cvv = encrypt_cc('344')
)
x = cursor.execute(sql)
##################################
# Add Customer 5 - Chris Peterson
##################################
sql = """insert into customer_main(
custno,
firstname,
lastname,
email,
dob,
ssn,
datecreated
) values (
'CS122955',
'Chris',
'Peterson',
'{email}',
'{dob}',
'{ssn}',
'2015-03-04'
)
""".format(
email = encrypt_acct('cjpcomp@example.com'),
dob = encrypt_acct('9/9/1975'),
ssn = encrypt_acct('019-26-9782')
)
x = cursor.execute(sql)
sql = "select last_insert_id()"
retval = cursor.execute(sql)
rset = cursor.fetchall()
nextid = rset[0][0]
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'Chris Peterson',
'2329 Joanne Lane',
'Newburyport',
'MA',
'01950',
'978-499-7306',
'B'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'Chris Peterson',
'2329 Joanne Lane',
'Newburyport',
'MA',
'01950',
'978-499-7306',
'S'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_payment(
custid,
cardname,
cardnumber,
cardtype,
cvv,
expmonth,
expyear
) values (
{id},
'Chris Peterson',
'{cardnum}',
'DI',
'{cvv}',
'08',
'2024'
)
""".format(
id = str(nextid),
cardnum = encrypt_cc('6011000990139424'),
cvv = encrypt_cc('344')
)
x = cursor.execute(sql)
##################################
# Add Customer 6 - Jennifer Jones
##################################
sql = """insert into customer_main(
custno,
firstname,
lastname,
email,
dob,
ssn,
datecreated
) values (
'CS602934',
'Jennifer',
'Jones',
'{email}',
'{dob}',
'{ssn}',
'2014-10-17'
)
""".format(
email = encrypt_acct('jjhome7823@example.com'),
dob = encrypt_acct('10/31/1983'),
ssn = encrypt_acct('209-62-4365')
)
x = cursor.execute(sql)
sql = "select last_insert_id()"
retval = cursor.execute(sql)
rset = cursor.fetchall()
nextid = rset[0][0]
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'Jennifer Jones',
'589 Hidden Valley Road',
'Lancaster',
'PA',
'17670',
'717-224-9902',
'B'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'Jennifer Jones',
'589 Hidden Valley Road',
'Lancaster',
'PA',
'17670',
'717-224-9902',
'S'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_payment(
custid,
cardname,
cardnumber,
cardtype,
cvv,
expmonth,
expyear
) values (
{id},
'Jennifer Jones',
'{cardnum}',
'MC',
'{cvv}',
'08',
'2024'
)
""".format(
id = str(nextid),
cardnum = encrypt_cc('5555555555554444'),
cvv = encrypt_cc('344')
)
x = cursor.execute(sql)
##################################
# Add Customer 7 - Clint Mason
##################################
sql = """insert into customer_main(
custno,
firstname,
lastname,
email,
dob,
ssn,
datecreated
) values (
'CS157843',
'Clint',
'Mason',
'{email}',
'{dob}',
'{ssn}',
'2014-08-23'
)
""".format(
email = encrypt_acct('clint.mason312@example.com'),
dob = encrypt_acct('10/7/1983'),
ssn = encrypt_acct('453-37-0205')
)
x = cursor.execute(sql)
sql = "select last_insert_id()"
retval = cursor.execute(sql)
rset = cursor.fetchall()
nextid = rset[0][0]
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'Clint Mason',
'3641 Alexander Drive',
'Denton',
'TX',
'76201',
'940-349-9386',
'B'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'Clint Mason',
'3641 Alexander Drive',
'Denton',
'TX',
'76201',
'940-349-9386',
'S'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_payment(
custid,
cardname,
cardnumber,
cardtype,
cvv,
expmonth,
expyear
) values (
{id},
'Clint Mason',
'{cardnum}',
'MC',
'{cvv}',
'08',
'2024'
)
""".format(
id = str(nextid),
cardnum = encrypt_cc('5105105105105100'),
cvv = encrypt_cc('344')
)
x = cursor.execute(sql)
##################################
# Add Customer 8 - Matt Grey
##################################
sql = """insert into customer_main(
custno,
firstname,
lastname,
email,
dob,
ssn,
datecreated
) values (
'CS523484',
'Matt',
'Grey',
'{email}',
'{dob}',
'{ssn}',
'2016-11-12'
)
""".format(
email = encrypt_acct('greystone89@example.com'),
dob = encrypt_acct('7/25/1963'),
ssn = encrypt_acct('184-36-8146')
)
x = cursor.execute(sql)
sql = "select last_insert_id()"
retval = cursor.execute(sql)
rset = cursor.fetchall()
nextid = rset[0][0]
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'Matt Grey',
'1320 Tree Top Lane',
'Wayne',
'PA',
'19087',
'610-225-6567',
'B'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'Matt Grey',
'1320 Tree Top Lane',
'Wayne',
'PA',
'19087',
'610-225-6567',
'S'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_payment(
custid,
cardname,
cardnumber,
cardtype,
cvv,
expmonth,
expyear
) values (
{id},
'Matt Grey',
'{cardnum}',
'VS',
'{cvv}',
'08',
'2024'
)
""".format(
id = str(nextid),
cardnum = encrypt_cc('4111111111111111'),
cvv = encrypt_cc('344')
)
x = cursor.execute(sql)
##################################
# Add Customer 9 - Howard Turner
##################################
sql = """insert into customer_main(
custno,
firstname,
lastname,
email,
dob,
ssn,
datecreated
) values (
'CS658871',
'Howard',
'Turner',
'{email}',
'{dob}',
'{ssn}',
'2014-03-03'
)
""".format(
email = encrypt_acct('runwayyourway@example.com'),
dob = encrypt_acct('6/29/1977'),
ssn = encrypt_acct('019-26-8577')
)
x = cursor.execute(sql)
sql = "select last_insert_id()"
retval = cursor.execute(sql)
rset = cursor.fetchall()
nextid = rset[0][0]
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'Howard Turner',
'1179 Lynn Street',
'Woburn',
'MA',
'01801',
'617-251-5420',
'B'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'Howard Turner',
'1179 Lynn Street',
'Woburn',
'MA',
'01801',
'617-251-5420',
'S'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_payment(
custid,
cardname,
cardnumber,
cardtype,
cvv,
expmonth,
expyear
) values (
{id},
'Howard Turner',
'{cardnum}',
'VS',
'{cvv}',
'08',
'2024'
)
""".format(
id = str(nextid),
cardnum = encrypt_cc('4012888888881881'),
cvv = encrypt_cc('344')
)
x = cursor.execute(sql)
##################################
# Add Customer 10 - Larry Olsen
##################################
sql = """insert into customer_main(
custno,
firstname,
lastname,
email,
dob,
ssn,
datecreated
) values (
'CS103393',
'Larry',
'Olsen',
'{email}',
'{dob}',
'{ssn}',
'2016-02-21'
)
""".format(
email = encrypt_acct('olsendog1979@example.com'),
dob = encrypt_acct('4/17/1992'),
ssn = encrypt_acct('285-70-8598')
)
x = cursor.execute(sql)
sql = "select last_insert_id()"
retval = cursor.execute(sql)
rset = cursor.fetchall()
nextid = rset[0][0]
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'Larry Olsen',
'2850 Still Street',
'Oregon',
'OH',
'43616',
'419-698-9890',
'B'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_addresses(
custid,
contact,
address1,
city,
state,
zip,
phone,
addrtype
) values (
{id},
'Larry Olsen',
'2850 Still Street',
'Oregon',
'OH',
'43616',
'419-698-9890',
'S'
)
""".format(
id = str(nextid)
)
x = cursor.execute(sql)
sql = """insert into customer_payment(
custid,
cardname,
cardnumber,
cardtype,
cvv,
expmonth,
expyear
) values (
{id},
'Larry Olsen',
'{cardnum}',
'VS',
'{cvv}',
'08',
'2024'
)
""".format(
id = str(nextid),
cardnum = encrypt_cc('4111111111111111'),
cvv = encrypt_cc('344')
)
x = cursor.execute(sql)
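# Refactor sketch (illustrative only, not executed): the ten customer blocks
# above are near-identical and could be driven from a data table with
# parameterized queries, e.g.:
#
# customers = [
#     ('CS100312', 'Janice', 'Thompson', 'jthomp4423@example.com',
#      '11/28/1983', '027-40-7057', '2016-05-01'),
#     # ... remaining customers ...
# ]
# for custno, first, last, email, dob, ssn, created in customers:
#     cursor.execute(
#         "insert into customer_main(custno, firstname, lastname, email, dob, ssn, datecreated) "
#         "values (%s, %s, %s, %s, %s, %s, %s)",
#         (custno, first, last, encrypt_acct(email), encrypt_acct(dob), encrypt_acct(ssn), created))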
db.commit()
db.close()
| 19.516074 | 92 | 0.472256 | 2,182 | 23,068 | 4.929881 | 0.162695 | 0.068885 | 0.084782 | 0.074277 | 0.762759 | 0.750302 | 0.728921 | 0.722971 | 0.722971 | 0.722971 | 0 | 0.075103 | 0.367956 | 23,068 | 1,181 | 93 | 19.532599 | 0.662689 | 0.014479 | 0 | 0.784787 | 0 | 0 | 0.730605 | 0.009437 | 0 | 0 | 0 | 0 | 0 | 1 | 0.001855 | false | 0.001855 | 0.003711 | 0 | 0.007421 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b7df78e9b1076f2be91ec46ba674e2b5057c3065 | 373 | py | Python | src/TER.py | bhavyajeet/Project-PreQL | 9a8fffe450a37b324f09b53fbc1bc762aa7cc556 | [
"MIT"
] | null | null | null | src/TER.py | bhavyajeet/Project-PreQL | 9a8fffe450a37b324f09b53fbc1bc762aa7cc556 | [
"MIT"
] | null | null | null | src/TER.py | bhavyajeet/Project-PreQL | 9a8fffe450a37b324f09b53fbc1bc762aa7cc556 | [
"MIT"
] | null | null | null | import random
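# Emit tot CSV rows: a row id, its mirror index (tot - i + 1), and eight
# random integers drawn uniformly from [1, tot + 1].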
tot = 400
for i in range(1, tot + 1):
    fields = [str(i), str(tot - i + 1)]
    fields += [str(random.randint(1, tot + 1)) for _ in range(8)]
    print(",".join(fields))
| 53.285714 | 296 | 0.61126 | 69 | 373 | 3.304348 | 0.188406 | 0.157895 | 0.197368 | 0.596491 | 0.741228 | 0.741228 | 0.741228 | 0.741228 | 0.741228 | 0.741228 | 0 | 0.071839 | 0.067024 | 373 | 6 | 297 | 62.166667 | 0.583333 | 0 | 0 | 0 | 0 | 0 | 0.024129 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0.166667 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
b7fbfcd119d3eae9dbb049c18cae2771d149655a | 10,419 | py | Python | nn_patterns/utils/tests/networks/base.py | pikinder/nn-patterns | 50c9d9d23512707d2adb2bd7b2cd528f5cb1aaff | [
"MIT"
] | 15 | 2017-09-15T10:04:54.000Z | 2020-07-08T09:16:37.000Z | nn_patterns/utils/tests/networks/base.py | pikinder/nn-patterns | 50c9d9d23512707d2adb2bd7b2cd528f5cb1aaff | [
"MIT"
] | 2 | 2018-03-28T19:45:53.000Z | 2018-08-13T08:00:53.000Z | nn_patterns/utils/tests/networks/base.py | pikinder/nn-patterns | 50c9d9d23512707d2adb2bd7b2cd528f5cb1aaff | [
"MIT"
] | 2 | 2018-03-07T08:09:13.000Z | 2020-06-19T14:54:04.000Z | # Begin: Python 2/3 compatibility header small
# Get Python 3 functionality:
from __future__ import\
absolute_import, print_function, division, unicode_literals
from future.utils import raise_with_traceback, raise_from
# catch exception with: except Exception as e
from builtins import range, map, zip, filter
from io import open
import six
# End: Python 2/3 compatibility header small
###############################################################################
###############################################################################
###############################################################################
import lasagne.init
import lasagne.layers
import lasagne.nonlinearities
__all__ = [
"log_reg",
"mlp_1dense",
"mlp_2dense",
"cnn_1convb_1dense",
"cnn_2convb_1dense",
"cnn_2convb_2dense",
"cnn_3convb_2dense",
]
###############################################################################
###############################################################################
###############################################################################
def input_layer(*args, **kwargs):
return lasagne.layers.InputLayer(*args, **kwargs)
def dense_layer(*args, **kwargs):
return lasagne.layers.DenseLayer(*args, **kwargs)
def conv_layer(*args, **kwargs):
return lasagne.layers.Conv2DLayer(*args, **kwargs)
def conv_pool(layer_in, n_conv, prefix, n_filter, **kwargs):
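    """Stack n_conv 3x3, stride-1, same-padded conv layers and a 2x2 max-pool.

    Returns a dict mapping layer names ("<prefix>_0", ..., "<prefix>_pool")
    to the created Lasagne layers.
    """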
conv_prefix = "%s_%%i" % prefix
ret = {}
current_layer = layer_in
for i in range(n_conv):
conv = conv_layer(current_layer, n_filter,
(3, 3), (1, 1), pad="same",
W=lasagne.init.GlorotUniform(), **kwargs)
current_layer = conv
ret[conv_prefix % i] = conv
ret["%s_pool" % prefix] = lasagne.layers.MaxPool2DLayer(current_layer,
(2, 2))
return ret
def dropout_layer(*args, **kwargs):
return lasagne.layers.DropoutLayer(*args, **kwargs)
###############################################################################
###############################################################################
###############################################################################
def log_reg(input_shape, output_n, nonlinearity=None):
if nonlinearity is None:
nonlinearity = lasagne.nonlinearities.rectify
net = {}
net["in"] = input_layer(shape=input_shape)
net["out"] = dense_layer(net["in"], num_units=output_n,
nonlinearity=lasagne.nonlinearities.softmax,
W=lasagne.init.GlorotUniform())
net.update({
"input_shape": input_shape,
"input_var": net["in"].input_var,
"output_n": output_n,
})
return net
###############################################################################
###############################################################################
###############################################################################
def mlp_1dense(input_shape, output_n, nonlinearity=None,
dense_units=512, dropout_rate=0.25):
if nonlinearity is None:
nonlinearity = lasagne.nonlinearities.rectify
net = {}
net["in"] = input_layer(shape=input_shape)
net["dense_1"] = dense_layer(net["in"], num_units=dense_units,
nonlinearity=nonlinearity,
W=lasagne.init.GlorotUniform())
net['dense_1_dropout'] = dropout_layer(net['dense_1'], p=dropout_rate)
net["out"] = dense_layer(net["dense_1_dropout"], num_units=output_n,
nonlinearity=lasagne.nonlinearities.softmax,
W=lasagne.init.GlorotUniform())
net.update({
"input_shape": input_shape,
"input_var": net["in"].input_var,
"output_n": output_n,
})
return net
def mlp_2dense(input_shape, output_n, nonlinearity=None,
dense_units=512, dropout_rate=0.25):
if nonlinearity is None:
nonlinearity = lasagne.nonlinearities.rectify
net = {}
net["in"] = input_layer(shape=input_shape)
net["dense_1"] = dense_layer(net["in"], num_units=dense_units,
nonlinearity=nonlinearity,
W=lasagne.init.GlorotUniform())
net['dense_1_dropout'] = dropout_layer(net['dense_1'], p=dropout_rate)
net["dense_2"] = dense_layer(net["dense_1_dropout"], num_units=dense_units,
nonlinearity=nonlinearity,
W=lasagne.init.GlorotUniform())
net['dense_2_dropout'] = dropout_layer(net['dense_2'], p=dropout_rate)
net["out"] = dense_layer(net["dense_2_dropout"], num_units=output_n,
nonlinearity=lasagne.nonlinearities.softmax,
W=lasagne.init.GlorotUniform())
net.update({
"input_shape": input_shape,
"input_var": net["in"].input_var,
"output_n": output_n,
})
return net
###############################################################################
###############################################################################
###############################################################################
def cnn_1convb_1dense(input_shape, output_n, nonlinearity=None,
dense_units=512, dropout_rate=0.25):
if nonlinearity is None:
nonlinearity = lasagne.nonlinearities.rectify
net = {}
net["in"] = input_layer(shape=input_shape)
net.update(conv_pool(net["in"], 2, "conv_1", 128,
nonlinearity=nonlinearity))
net["dense_1"] = dense_layer(net["conv_1_pool"], num_units=dense_units,
nonlinearity=nonlinearity,
W=lasagne.init.GlorotUniform())
net['dense_1_dropout'] = dropout_layer(net['dense_1'], p=dropout_rate)
net["out"] = dense_layer(net["dense_1_dropout"], num_units=output_n,
nonlinearity=lasagne.nonlinearities.softmax,
W=lasagne.init.GlorotUniform())
net.update({
"input_shape": input_shape,
"input_var": net["in"].input_var,
"output_n": output_n,
})
return net
def cnn_2convb_1dense(input_shape, output_n, nonlinearity=None,
dense_units=512, dropout_rate=0.25):
if nonlinearity is None:
nonlinearity = lasagne.nonlinearities.rectify
net = {}
net["in"] = input_layer(shape=input_shape)
net.update(conv_pool(net["in"], 2, "conv_1", 128,
nonlinearity=nonlinearity))
net.update(conv_pool(net["conv_1_pool"], 2, "conv_2", 128,
nonlinearity=nonlinearity))
net["dense_1"] = dense_layer(net["conv_2_pool"], num_units=dense_units,
nonlinearity=nonlinearity,
W=lasagne.init.GlorotUniform())
net['dense_1_dropout'] = dropout_layer(net['dense_1'], p=dropout_rate)
net["out"] = dense_layer(net["dense_1_dropout"], num_units=output_n,
nonlinearity=lasagne.nonlinearities.softmax,
W=lasagne.init.GlorotUniform())
net.update({
"input_shape": input_shape,
"input_var": net["in"].input_var,
"output_n": output_n,
})
return net
def cnn_2convb_2dense(input_shape, output_n, nonlinearity=None,
dense_units=512, dropout_rate=0.25):
if nonlinearity is None:
nonlinearity = lasagne.nonlinearities.rectify
net = {}
net["in"] = input_layer(shape=input_shape)
net.update(conv_pool(net["in"], 2, "conv_1", 128,
nonlinearity=nonlinearity))
net.update(conv_pool(net["conv_1_pool"], 2, "conv_2", 128,
nonlinearity=nonlinearity))
net["dense_1"] = dense_layer(net["conv_2_pool"], num_units=dense_units,
nonlinearity=nonlinearity,
W=lasagne.init.GlorotUniform())
net['dense_1_dropout'] = dropout_layer(net['dense_1'], p=dropout_rate)
net["dense_2"] = dense_layer(net["dense_1_dropout"], num_units=dense_units,
nonlinearity=nonlinearity,
W=lasagne.init.GlorotUniform())
net['dense_2_dropout'] = dropout_layer(net['dense_2'], p=dropout_rate)
net["out"] = dense_layer(net["dense_2_dropout"], num_units=output_n,
nonlinearity=lasagne.nonlinearities.softmax,
W=lasagne.init.GlorotUniform())
net.update({
"input_shape": input_shape,
"input_var": net["in"].input_var,
"output_n": output_n,
})
return net
def cnn_3convb_2dense(input_shape, output_n, nonlinearity=None,
dense_units=512, dropout_rate=0.25):
if nonlinearity is None:
nonlinearity = lasagne.nonlinearities.rectify
net = {}
net["in"] = input_layer(shape=input_shape)
net.update(conv_pool(net["in"], 2, "conv_1", 128,
nonlinearity=nonlinearity))
net.update(conv_pool(net["conv_1_pool"], 2, "conv_2", 128,
nonlinearity=nonlinearity))
net.update(conv_pool(net["conv_2_pool"], 2, "conv_3", 128,
nonlinearity=nonlinearity))
net["dense_1"] = dense_layer(net["conv_3_pool"], num_units=dense_units,
nonlinearity=nonlinearity,
W=lasagne.init.GlorotUniform())
net['dense_1_dropout'] = dropout_layer(net['dense_1'], p=dropout_rate)
net["dense_2"] = dense_layer(net["dense_1_dropout"], num_units=dense_units,
nonlinearity=nonlinearity,
W=lasagne.init.GlorotUniform())
net['dense_2_dropout'] = dropout_layer(net['dense_2'], p=dropout_rate)
net["out"] = dense_layer(net["dense_2_dropout"], num_units=output_n,
nonlinearity=lasagne.nonlinearities.softmax,
W=lasagne.init.GlorotUniform())
net.update({
"input_shape": input_shape,
"input_var": net["in"].input_var,
"output_n": output_n,
})
return net
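###############################################################################


# Illustrative usage (an assumption; shapes follow the common MNIST layout),
# kept as a comment so that importing this module has no side effects:
#
# net = cnn_1convb_1dense((None, 1, 28, 28), output_n=10)
# prediction = lasagne.layers.get_output(net["out"])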
| 37.478417 | 79 | 0.526826 | 1,059 | 10,419 | 4.909348 | 0.093484 | 0.055395 | 0.041546 | 0.081746 | 0.831698 | 0.831121 | 0.79573 | 0.79573 | 0.79573 | 0.791691 | 0 | 0.019681 | 0.253863 | 10,419 | 277 | 80 | 37.613718 | 0.649087 | 0.015261 | 0 | 0.74359 | 0 | 0 | 0.099559 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.061538 | false | 0 | 0.046154 | 0.020513 | 0.169231 | 0.005128 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4d1197fc5c7b806288598d9f7ebcd98a1b8e620f | 179,540 | py | Python | test/test_type_registry.py | OAK-Foundation/py-scale-codec | 9c8b3c5cd39e639fad1b5f420d914b5dd6b26ac0 | [
"Apache-2.0"
] | null | null | null | test/test_type_registry.py | OAK-Foundation/py-scale-codec | 9c8b3c5cd39e639fad1b5f420d914b5dd6b26ac0 | [
"Apache-2.0"
] | null | null | null | test/test_type_registry.py | OAK-Foundation/py-scale-codec | 9c8b3c5cd39e639fad1b5f420d914b5dd6b26ac0 | [
"Apache-2.0"
] | null | null | null | # Python SCALE Codec Library
#
# Copyright 2018-2020 Stichting Polkascan (Polkascan Foundation).
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import copy
import os
import unittest
from pathlib import Path
from scalecodec.block import EventsDecoder, ExtrinsicsDecoder
from scalecodec.metadata import MetadataDecoder
from scalecodec.base import RuntimeConfiguration, ScaleBytes
from scalecodec.type_registry import load_type_registry_preset
class TestScaleTypeEncoding(unittest.TestCase):
@classmethod
def setUpClass(cls):
RuntimeConfiguration().clear_type_registry()
RuntimeConfiguration().update_type_registry(load_type_registry_preset("default"))
RuntimeConfiguration().update_type_registry(load_type_registry_preset("kusama"))
metadata_v10_hex = "0x6d6574610a701853797374656d011853797374656d34304163636f756e744e6f6e636501010130543a3a4163636f756e74496420543a3a496e646578001000000000047c2045787472696e73696373206e6f6e636520666f72206163636f756e74732e3845787472696e736963436f756e7400000c753332040004b820546f74616c2065787472696e7369637320636f756e7420666f72207468652063757272656e7420626c6f636b2e4c416c6c45787472696e73696373576569676874000018576569676874040004150120546f74616c2077656967687420666f7220616c6c2065787472696e736963732070757420746f6765746865722c20666f72207468652063757272656e7420626c6f636b2e40416c6c45787472696e736963734c656e00000c753332040004410120546f74616c206c656e6774682028696e2062797465732920666f7220616c6c2065787472696e736963732070757420746f6765746865722c20666f72207468652063757272656e7420626c6f636b2e24426c6f636b4861736801010138543a3a426c6f636b4e756d6265721c543a3a48617368008000000000000000000000000000000000000000000000000000000000000000000498204d6170206f6620626c6f636b206e756d6265727320746f20626c6f636b206861736865732e3445787472696e736963446174610101010c7533321c5665633c75383e000400043d012045787472696e73696373206461746120666f72207468652063757272656e7420626c6f636b20286d61707320616e2065787472696e736963277320696e64657820746f206974732064617461292e184e756d626572010038543a3a426c6f636b4e756d6265721000000000040901205468652063757272656e7420626c6f636b206e756d626572206265696e672070726f6365737365642e205365742062792060657865637574655f626c6f636b602e28506172656e744861736801001c543a3a4861736880000000000000000000000000000000000000000000000000000000000000000004702048617368206f66207468652070726576696f757320626c6f636b2e3845787472696e73696373526f6f7401001c543a3a486173688000000000000000000000000000000000000000000000000000000000000000000415012045787472696e7369637320726f6f74206f66207468652063757272656e7420626c6f636b2c20616c736f2070617274206f662074686520626c6f636b206865616465722e1844696765737401002c4469676573744f663c543e040004f020446967657374206f66207468652063757272656e7420626c6f636b2c20616c736f2070617274206f662074686520626c6f636b206865616465722e184576656e747301008c5665633c4576656e745265636f72643c543a3a4576656e742c20543a3a486173683e3e040004a0204576656e7473206465706f736974656420666f72207468652063757272656e7420626c6f636b2e284576656e74436f756e740100284576656e74496e646578100000000004b820546865206e756d626572206f66206576656e747320696e2074686520604576656e74733c543e60206c6973742e2c4576656e74546f706963730101011c543a3a48617368845665633c28543a3a426c6f636b4e756d6265722c204576656e74496e646578293e000400282501204d617070696e67206265747765656e206120746f7069632028726570726573656e74656420627920543a3a486173682920616e64206120766563746f72206f6620696e646578657394206f66206576656e747320696e2074686520603c4576656e74733c543e3e60206c6973742e00510120416c6c20746f70696320766563746f727320686176652064657465726d696e69737469632073746f72616765206c6f636174696f6e7320646570656e64696e67206f6e2074686520746f7069632e2054686973450120616c6c6f7773206c696768742d636c69656e747320746f206c6576657261676520746865206368616e67657320747269652073746f7261676520747261636b696e67206d656368616e69736d20616e64e420696e2063617365206f66206368616e67657320666574636820746865206c697374206f66206576656e7473206f6620696e7465726573742e004d01205468652076616c756520686173207468652074797065206028543a3a426c6f636b4e756d6265722c204576656e74496e646578296020626563617573652069662077652075736564206f6e6c79206a7573744d012074686520604576656e74496e64657860207468656e20696e20636173652069662074686520746f70696320686173207468652073616d6520636f6e74656e7473206f6e20746865206e65787420626c6f636b0101206e6f206e6f74696669636174696f6e2077696c6c20626
520747269676765726564207468757320746865206576656e74206d69676874206265206c6f73742e011c2866696c6c5f626c6f636b0004210120412062696720646973706174636820746861742077696c6c20646973616c6c6f7720616e79206f74686572207472616e73616374696f6e20746f20626520696e636c756465642e1872656d61726b041c5f72656d61726b1c5665633c75383e046c204d616b6520736f6d65206f6e2d636861696e2072656d61726b2e387365745f686561705f7061676573041470616765730c75363404fc2053657420746865206e756d626572206f6620706167657320696e2074686520576562417373656d626c7920656e7669726f6e6d656e74277320686561702e207365745f636f6465040c6e65771c5665633c75383e04482053657420746865206e657720636f64652e2c7365745f73746f7261676504146974656d73345665633c4b657956616c75653e046c2053657420736f6d65206974656d73206f662073746f726167652e306b696c6c5f73746f7261676504106b657973205665633c4b65793e0478204b696c6c20736f6d65206974656d732066726f6d2073746f726167652e2c6b696c6c5f70726566697804187072656669780c4b6579041501204b696c6c20616c6c2073746f72616765206974656d7320776974682061206b657920746861742073746172747320776974682074686520676976656e207072656669782e01084045787472696e7369635375636365737304304469737061746368496e666f049420416e2065787472696e73696320636f6d706c65746564207375636365737366756c6c792e3c45787472696e7369634661696c6564083444697370617463684572726f72304469737061746368496e666f045420416e2065787472696e736963206661696c65642e00006052616e646f6d6e657373436f6c6c656374697665466c6970016052616e646f6d6e657373436f6c6c656374697665466c6970043852616e646f6d4d6174657269616c0100305665633c543a3a486173683e04000c610120536572696573206f6620626c6f636b20686561646572732066726f6d20746865206c61737420383120626c6f636b73207468617420616374732061732072616e646f6d2073656564206d6174657269616c2e2054686973610120697320617272616e67656420617320612072696e672062756666657220776974682060626c6f636b5f6e756d626572202520383160206265696e672074686520696e64657820696e746f20746865206056656360206f664420746865206f6c6465737420686173682e000000001042616265011042616265242845706f6368496e64657801000c75363420000000000000000004542043757272656e742065706f636820696e6465782e2c417574686f72697469657301009c5665633c28417574686f7269747949642c2042616265417574686f72697479576569676874293e0400046c2043757272656e742065706f636820617574686f7269746965732e2c47656e65736973536c6f7401000c75363420000000000000000008f82054686520736c6f74206174207768696368207468652066697273742065706f63682061637475616c6c7920737461727465642e205468697320697320309020756e74696c2074686520666972737420626c6f636b206f662074686520636861696e2e2c43757272656e74536c6f7401000c75363420000000000000000004542043757272656e7420736c6f74206e756d6265722e2852616e646f6d6e6573730100205b75383b2033325d80000000000000000000000000000000000000000000000000000000000000000028b8205468652065706f63682072616e646f6d6e65737320666f7220746865202a63757272656e742a2065706f63682e002c20232053656375726974790005012054686973204d555354204e4f54206265207573656420666f722067616d626c696e672c2061732069742063616e20626520696e666c75656e6365642062792061f8206d616c6963696f75732076616c696461746f7220696e207468652073686f7274207465726d2e204974204d4159206265207573656420696e206d616e7915012063727970746f677261706869632070726f746f636f6c732c20686f77657665722c20736f206c6f6e67206173206f6e652072656d656d6265727320746861742074686973150120286c696b652065766572797468696e6720656c7365206f6e2d636861696e29206974206973207075626c69632e20466f72206578616d706c652c2069742063616e206265050120757365642077686572652061206e756d626572206973206e656564656420746861742063616e6e6f742068617665206265656e2063686f73656e20627920616e0d01206164766572736172792c20666f7220707572706f73657320737563682061732070
75626c69632d636f696e207a65726f2d6b6e6f776c656467652070726f6f66732e384e65787452616e646f6d6e6573730100205b75383b2033325d800000000000000000000000000000000000000000000000000000000000000000045c204e6578742065706f63682072616e646f6d6e6573732e305365676d656e74496e64657801000c7533321000000000247c2052616e646f6d6e65737320756e64657220636f6e737472756374696f6e2e00f4205765206d616b6520612074726164656f6666206265747765656e2073746f7261676520616363657373657320616e64206c697374206c656e6774682e01012057652073746f72652074686520756e6465722d636f6e737472756374696f6e2072616e646f6d6e65737320696e207365676d656e7473206f6620757020746f942060554e4445525f434f4e535452554354494f4e5f5345474d454e545f4c454e475448602e00ec204f6e63652061207365676d656e7420726561636865732074686973206c656e6774682c20776520626567696e20746865206e657874206f6e652e090120576520726573657420616c6c207365676d656e747320616e642072657475726e20746f206030602061742074686520626567696e6e696e67206f662065766572791c2065706f63682e44556e646572436f6e737472756374696f6e0101010c753332345665633c5b75383b2033325d3e000400002c496e697469616c697a65640000204d6179626556726604000801012054656d706f726172792076616c75652028636c656172656420617420626c6f636b2066696e616c697a6174696f6e292077686963682069732060536f6d65601d01206966207065722d626c6f636b20696e697469616c697a6174696f6e2068617320616c7265616479206265656e2063616c6c656420666f722063757272656e7420626c6f636b2e010000083445706f63684475726174696f6e0c753634205802000000000000080d0120546865206e756d626572206f66202a2a736c6f74732a2a207468617420616e2065706f63682074616b65732e20576520636f75706c652073657373696f6e7320746ffc2065706f6368732c20692e652e2077652073746172742061206e65772073657373696f6e206f6e636520746865206e65772065706f636820626567696e732e444578706563746564426c6f636b54696d6524543a3a4d6f6d656e7420701700000000000014050120546865206578706563746564206176657261676520626c6f636b2074696d6520617420776869636820424142452073686f756c64206265206372656174696e67110120626c6f636b732e2053696e636520424142452069732070726f626162696c6973746963206974206973206e6f74207472697669616c20746f20666967757265206f75740501207768617420746865206578706563746564206176657261676520626c6f636b2074696d652073686f756c64206265206261736564206f6e2074686520736c6f740901206475726174696f6e20616e642074686520736563757269747920706172616d657465722060636020287768657265206031202d20636020726570726573656e7473a0207468652070726f626162696c697479206f66206120736c6f74206265696e6720656d707479292e002454696d657374616d70012454696d657374616d70080c4e6f77010024543a3a4d6f6d656e7420000000000000000004902043757272656e742074696d6520666f72207468652063757272656e7420626c6f636b2e24446964557064617465010010626f6f6c040004b420446964207468652074696d657374616d7020676574207570646174656420696e207468697320626c6f636b3f01040c736574040c6e6f7748436f6d706163743c543a3a4d6f6d656e743e245820536574207468652063757272656e742074696d652e00590120546869732063616c6c2073686f756c6420626520696e766f6b65642065786163746c79206f6e63652070657220626c6f636b2e2049742077696c6c2070616e6963206174207468652066696e616c697a6174696f6ed82070686173652c20696620746869732063616c6c206861736e2774206265656e20696e766f6b656420627920746861742074696d652e004501205468652074696d657374616d702073686f756c642062652067726561746572207468616e207468652070726576696f7573206f6e652062792074686520616d6f756e74207370656369666965642062794420604d696e696d756d506572696f64602e00d820546865206469737061746368206f726967696e20666f7220746869732063616c6c206d7573742062652060496e686572656e74602e0004344d696e696d756d506572696f6424543a3a4d6f6d656e7420b80b00000000000010690120546865206d696e696d756d20706572696f64206265747765656e20626
c6f636b732e204265776172652074686174207468697320697320646966666572656e7420746f20746865202a65787065637465642a20706572696f64690120746861742074686520626c6f636b2070726f64756374696f6e206170706172617475732070726f76696465732e20596f75722063686f73656e20636f6e73656e7375732073797374656d2077696c6c2067656e6572616c6c79650120776f726b2077697468207468697320746f2064657465726d696e6520612073656e7369626c6520626c6f636b2074696d652e20652e672e20466f7220417572612c2069742077696c6c20626520646f75626c6520746869737020706572696f64206f6e2064656661756c742073657474696e67732e001c496e6469636573011c496e6469636573082c4e657874456e756d53657401003c543a3a4163636f756e74496e6465781000000000047c20546865206e657874206672656520656e756d65726174696f6e207365742e1c456e756d5365740101013c543a3a4163636f756e74496e646578445665633c543a3a4163636f756e7449643e00040004582054686520656e756d65726174696f6e20736574732e010001043c4e65774163636f756e74496e64657808244163636f756e744964304163636f756e74496e64657810882041206e6577206163636f756e7420696e646578207761732061737369676e65642e0005012054686973206576656e74206973206e6f7420747269676765726564207768656e20616e206578697374696e6720696e64657820697320726561737369676e65646020746f20616e6f7468657220604163636f756e744964602e00002042616c616e636573012042616c616e6365731434546f74616c49737375616e6365010028543a3a42616c616e6365400000000000000000000000000000000004982054686520746f74616c20756e6974732069737375656420696e207468652073797374656d2e1c56657374696e6700010130543a3a4163636f756e744964ac56657374696e675363686564756c653c543a3a42616c616e63652c20543a3a426c6f636b4e756d6265723e00040004d820496e666f726d6174696f6e20726567617264696e67207468652076657374696e67206f66206120676976656e206163636f756e742e2c4672656542616c616e636501010130543a3a4163636f756e74496428543a3a42616c616e63650040000000000000000000000000000000002c9c20546865202766726565272062616c616e6365206f66206120676976656e206163636f756e742e004101205468697320697320746865206f6e6c792062616c616e63652074686174206d61747465727320696e207465726d73206f66206d6f7374206f7065726174696f6e73206f6e20746f6b656e732e204974750120616c6f6e65206973207573656420746f2064657465726d696e65207468652062616c616e6365207768656e20696e2074686520636f6e747261637420657865637574696f6e20656e7669726f6e6d656e742e205768656e207468697355012062616c616e63652066616c6c732062656c6f77207468652076616c7565206f6620604578697374656e7469616c4465706f736974602c207468656e20746865202763757272656e74206163636f756e74272069733d012064656c657465643a207370656369666963616c6c7920604672656542616c616e6365602e20467572746865722c2074686520604f6e4672656542616c616e63655a65726f602063616c6c6261636b450120697320696e766f6b65642c20676976696e672061206368616e636520746f2065787465726e616c206d6f64756c657320746f20636c65616e2075702064617461206173736f636961746564207769746854207468652064656c65746564206163636f756e742e00750120606672616d655f73797374656d3a3a4163636f756e744e6f6e63656020697320616c736f2064656c657465642069662060526573657276656442616c616e63656020697320616c736f207a65726f2028697420616c736f2067657473150120636f6c6c617073656420746f207a65726f2069662069742065766572206265636f6d6573206c657373207468616e20604578697374656e7469616c4465706f736974602e3c526573657276656442616c616e636501010130543a3a4163636f756e74496428543a3a42616c616e63650040000000000000000000000000000000002c75012054686520616d6f756e74206f66207468652062616c616e6365206f66206120676976656e206163636f756e7420746861742069732065787465726e616c6c792072657365727665643b20746869732063616e207374696c6c206765749c20736c61736865642c20627574206765747320736c6173686564206c617374206f6620616c6c2e006d0120546869732062616c616e63652069732061202772657365
727665272062616c616e63652074686174206f746865722073756273797374656d732075736520696e206f7264657220746f2073657420617369646520746f6b656e732501207468617420617265207374696c6c20276f776e65642720627920746865206163636f756e7420686f6c6465722c20627574207768696368206172652073757370656e6461626c652e007501205768656e20746869732062616c616e63652066616c6c732062656c6f77207468652076616c7565206f6620604578697374656e7469616c4465706f736974602c207468656e2074686973202772657365727665206163636f756e7427b42069732064656c657465643a207370656369666963616c6c792c2060526573657276656442616c616e6365602e00650120606672616d655f73797374656d3a3a4163636f756e744e6f6e63656020697320616c736f2064656c6574656420696620604672656542616c616e63656020697320616c736f207a65726f2028697420616c736f2067657473190120636f6c6c617073656420746f207a65726f2069662069742065766572206265636f6d6573206c657373207468616e20604578697374656e7469616c4465706f736974602e29144c6f636b7301010130543a3a4163636f756e744964b05665633c42616c616e63654c6f636b3c543a3a42616c616e63652c20543a3a426c6f636b4e756d6265723e3e00040004b820416e79206c6971756964697479206c6f636b73206f6e20736f6d65206163636f756e742062616c616e6365732e0110207472616e736665720810646573748c3c543a3a4c6f6f6b7570206173205374617469634c6f6f6b75703e3a3a536f757263651476616c75654c436f6d706163743c543a3a42616c616e63653e64d8205472616e7366657220736f6d65206c697175696420667265652062616c616e636520746f20616e6f74686572206163636f756e742e00090120607472616e73666572602077696c6c207365742074686520604672656542616c616e636560206f66207468652073656e64657220616e642072656365697665722e21012049742077696c6c2064656372656173652074686520746f74616c2069737375616e6365206f66207468652073797374656d2062792074686520605472616e73666572466565602e1501204966207468652073656e6465722773206163636f756e742069732062656c6f7720746865206578697374656e7469616c206465706f736974206173206120726573756c74b4206f6620746865207472616e736665722c20746865206163636f756e742077696c6c206265207265617065642e00190120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d75737420626520605369676e65646020627920746865207472616e736163746f722e002c2023203c7765696768743e3101202d20446570656e64656e74206f6e20617267756d656e747320627574206e6f7420637269746963616c2c20676976656e2070726f70657220696d706c656d656e746174696f6e7320666f72cc202020696e70757420636f6e6669672074797065732e205365652072656c617465642066756e6374696f6e732062656c6f772e6901202d20497420636f6e7461696e732061206c696d69746564206e756d626572206f6620726561647320616e642077726974657320696e7465726e616c6c7920616e64206e6f20636f6d706c657820636f6d7075746174696f6e2e004c2052656c617465642066756e6374696f6e733a0051012020202d2060656e737572655f63616e5f77697468647261776020697320616c776179732063616c6c656420696e7465726e616c6c792062757420686173206120626f756e64656420636f6d706c65786974792e2d012020202d205472616e7366657272696e672062616c616e63657320746f206163636f756e7473207468617420646964206e6f74206578697374206265666f72652077696c6c206361757365d420202020202060543a3a4f6e4e65774163636f756e743a3a6f6e5f6e65775f6163636f756e746020746f2062652063616c6c65642edc2020202d2052656d6f76696e6720656e6f7567682066756e64732066726f6d20616e206163636f756e742077696c6c20747269676765725901202020202060543a3a4475737452656d6f76616c3a3a6f6e5f756e62616c616e6365646020616e642060543a3a4f6e4672656542616c616e63655a65726f3a3a6f6e5f667265655f62616c616e63655f7a65726f602e49012020202d20607472616e736665725f6b6565705f616c6976656020776f726b73207468652073616d652077617920617320607472616e73666572602c206275742068617320616e206164646974696f6e616cf82020202020636865636b207468617420746865207472616e736665722077696c6c206e6f74206b696
c6c20746865206f726967696e206163636f756e742e00302023203c2f7765696768743e2c7365745f62616c616e63650c0c77686f8c3c543a3a4c6f6f6b7570206173205374617469634c6f6f6b75703e3a3a536f75726365206e65775f667265654c436f6d706163743c543a3a42616c616e63653e306e65775f72657365727665644c436f6d706163743c543a3a42616c616e63653e349420536574207468652062616c616e636573206f66206120676976656e206163636f756e742e00210120546869732077696c6c20616c74657220604672656542616c616e63656020616e642060526573657276656442616c616e63656020696e2073746f726167652e2069742077696c6c090120616c736f2064656372656173652074686520746f74616c2069737375616e6365206f66207468652073797374656d202860546f74616c49737375616e636560292e190120496620746865206e65772066726565206f722072657365727665642062616c616e63652069732062656c6f7720746865206578697374656e7469616c206465706f7369742c01012069742077696c6c20726573657420746865206163636f756e74206e6f6e63652028606672616d655f73797374656d3a3a4163636f756e744e6f6e636560292e00b420546865206469737061746368206f726967696e20666f7220746869732063616c6c2069732060726f6f74602e002c2023203c7765696768743e80202d20496e646570656e64656e74206f662074686520617267756d656e74732ec4202d20436f6e7461696e732061206c696d69746564206e756d626572206f6620726561647320616e64207772697465732e302023203c2f7765696768743e38666f7263655f7472616e736665720c18736f757263658c3c543a3a4c6f6f6b7570206173205374617469634c6f6f6b75703e3a3a536f7572636510646573748c3c543a3a4c6f6f6b7570206173205374617469634c6f6f6b75703e3a3a536f757263651476616c75654c436f6d706163743c543a3a42616c616e63653e0851012045786163746c7920617320607472616e73666572602c2065786365707420746865206f726967696e206d75737420626520726f6f7420616e642074686520736f75726365206163636f756e74206d61792062652c207370656369666965642e4c7472616e736665725f6b6565705f616c6976650810646573748c3c543a3a4c6f6f6b7570206173205374617469634c6f6f6b75703e3a3a536f757263651476616c75654c436f6d706163743c543a3a42616c616e63653e1851012053616d6520617320746865205b607472616e73666572605d2063616c6c2c206275742077697468206120636865636b207468617420746865207472616e736665722077696c6c206e6f74206b696c6c2074686540206f726967696e206163636f756e742e00bc20393925206f66207468652074696d6520796f752077616e74205b607472616e73666572605d20696e73746561642e00c4205b607472616e73666572605d3a207374727563742e4d6f64756c652e68746d6c236d6574686f642e7472616e736665720114284e65774163636f756e7408244163636f756e7449641c42616c616e6365046c2041206e6577206163636f756e742077617320637265617465642e345265617065644163636f756e7408244163636f756e7449641c42616c616e6365045c20416e206163636f756e7420776173207265617065642e205472616e7366657210244163636f756e744964244163636f756e7449641c42616c616e63651c42616c616e636504b0205472616e7366657220737563636565646564202866726f6d2c20746f2c2076616c75652c2066656573292e2842616c616e63655365740c244163636f756e7449641c42616c616e63651c42616c616e636504c420412062616c616e6365207761732073657420627920726f6f74202877686f2c20667265652c207265736572766564292e1c4465706f73697408244163636f756e7449641c42616c616e636504dc20536f6d6520616d6f756e7420776173206465706f73697465642028652e672e20666f72207472616e73616374696f6e2066656573292e0c484578697374656e7469616c4465706f73697428543a3a42616c616e63654000e40b5402000000000000000000000004d420546865206d696e696d756d20616d6f756e7420726571756972656420746f206b65657020616e206163636f756e74206f70656e2e2c5472616e7366657246656528543a3a42616c616e63654000e40b540200000000000000000000000494205468652066656520726571756972656420746f206d616b652061207472616e736665722e2c4372656174696f6e46656528543a3a42616c616e63654000e40b54020000000000000000000000049c205468652066656520726571756972656420746f206372656174
6520616e206163636f756e742e203856657374696e6742616c616e6365049c2056657374696e672062616c616e636520746f6f206869676820746f2073656e642076616c7565544c69717569646974795265737472696374696f6e7304c8204163636f756e74206c6971756964697479207265737472696374696f6e732070726576656e74207769746864726177616c204f766572666c6f77047420476f7420616e206f766572666c6f7720616674657220616464696e674c496e73756666696369656e7442616c616e636504782042616c616e636520746f6f206c6f7720746f2073656e642076616c7565484578697374656e7469616c4465706f73697404ec2056616c756520746f6f206c6f7720746f20637265617465206163636f756e742064756520746f206578697374656e7469616c206465706f736974244b656570416c6976650490205472616e736665722f7061796d656e7420776f756c64206b696c6c206163636f756e745c4578697374696e6756657374696e675363686564756c6504cc20412076657374696e67207363686564756c6520616c72656164792065786973747320666f722074686973206163636f756e742c446561644163636f756e74048c2042656e6566696369617279206163636f756e74206d757374207072652d6578697374485472616e73616374696f6e5061796d656e74012042616c616e63657304444e6578744665654d756c7469706c6965720100284d756c7469706c69657220000000000000000000000008485472616e73616374696f6e426173654665653042616c616e63654f663c543e4000e40b5402000000000000000000000004dc205468652066656520746f206265207061696420666f72206d616b696e672061207472616e73616374696f6e3b2074686520626173652e485472616e73616374696f6e427974654665653042616c616e63654f663c543e4000e1f505000000000000000000000000040d01205468652066656520746f206265207061696420666f72206d616b696e672061207472616e73616374696f6e3b20746865207065722d6279746520706f7274696f6e2e0028417574686f72736869700128417574686f72736869700c18556e636c65730100e85665633c556e636c65456e7472794974656d3c543a3a426c6f636b4e756d6265722c20543a3a486173682c20543a3a4163636f756e7449643e3e0400041c20556e636c657318417574686f72000030543a3a4163636f756e7449640400046420417574686f72206f662063757272656e7420626c6f636b2e30446964536574556e636c6573010010626f6f6c040004bc205768657468657220756e636c6573207765726520616c72656164792073657420696e207468697320626c6f636b2e0104287365745f756e636c657304286e65775f756e636c6573385665633c543a3a4865616465723e04642050726f76696465206120736574206f6620756e636c65732e00001c48496e76616c6964556e636c65506172656e74048c2054686520756e636c6520706172656e74206e6f7420696e2074686520636861696e2e40556e636c6573416c7265616479536574048420556e636c657320616c72656164792073657420696e2074686520626c6f636b2e34546f6f4d616e79556e636c6573044420546f6f206d616e7920756e636c65732e3047656e65736973556e636c6504582054686520756e636c652069732067656e657369732e30546f6f48696768556e636c6504802054686520756e636c6520697320746f6f206869676820696e20636861696e2e50556e636c65416c7265616479496e636c75646564047c2054686520756e636c6520697320616c726561647920696e636c756465642e204f6c64556e636c6504b82054686520756e636c652069736e277420726563656e7420656e6f75676820746f20626520696e636c756465642e1c5374616b696e67011c5374616b696e67683856616c696461746f72436f756e7401000c753332100000000004a82054686520696465616c206e756d626572206f66207374616b696e67207061727469636970616e74732e544d696e696d756d56616c696461746f72436f756e7401000c7533321004000000044101204d696e696d756d206e756d626572206f66207374616b696e67207061727469636970616e7473206265666f726520656d657267656e637920636f6e646974696f6e732061726520696d706f7365642e34496e76756c6e657261626c65730100445665633c543a3a4163636f756e7449643e04000c590120416e792076616c696461746f72732074686174206d6179206e6576657220626520736c6173686564206f7220666f726369626c79206b69636b65642e20497427732061205665632073696e636520746865792772654d01206561737920746f20696e697469616c697a6520616e6420746
86520706572666f726d616e636520686974206973206d696e696d616c2028776520657870656374206e6f206d6f7265207468616e20666f7572ac20696e76756c6e657261626c65732920616e64207265737472696374656420746f20746573746e6574732e18426f6e64656400010130543a3a4163636f756e74496430543a3a4163636f756e744964000400040101204d61702066726f6d20616c6c206c6f636b65642022737461736822206163636f756e747320746f2074686520636f6e74726f6c6c6572206163636f756e742e184c656467657200010130543a3a4163636f756e744964a45374616b696e674c65646765723c543a3a4163636f756e7449642c2042616c616e63654f663c543e3e000400044501204d61702066726f6d20616c6c2028756e6c6f636b6564292022636f6e74726f6c6c657222206163636f756e747320746f2074686520696e666f20726567617264696e6720746865207374616b696e672e14506179656501010130543a3a4163636f756e7449644452657761726444657374696e6174696f6e00040004e42057686572652074686520726577617264207061796d656e742073686f756c64206265206d6164652e204b657965642062792073746173682e2856616c696461746f727301010130543a3a4163636f756e7449643856616c696461746f72507265667301040004450120546865206d61702066726f6d202877616e6e616265292076616c696461746f72207374617368206b657920746f2074686520707265666572656e636573206f6620746861742076616c696461746f722e284e6f6d696e61746f727300010130543a3a4163636f756e744964644e6f6d696e6174696f6e733c543a3a4163636f756e7449643e01040010650120546865206d61702066726f6d206e6f6d696e61746f72207374617368206b657920746f2074686520736574206f66207374617368206b657973206f6620616c6c2076616c696461746f727320746f206e6f6d696e6174652e003501204e4f54453a206973207072697661746520736f20746861742077652063616e20656e73757265207570677261646564206265666f726520616c6c207479706963616c2061636365737365732ed8204469726563742073746f7261676520415049732063616e207374696c6c2062797061737320746869732070726f74656374696f6e2e1c5374616b65727301010130543a3a4163636f756e744964904578706f737572653c543a3a4163636f756e7449642c2042616c616e63654f663c543e3e000c000000104d01204e6f6d696e61746f727320666f72206120706172746963756c6172206163636f756e74207468617420697320696e20616374696f6e207269676874206e6f772e20596f752063616e277420697465726174651901207468726f7567682076616c696461746f727320686572652c2062757420796f752063616e2066696e64207468656d20696e207468652053657373696f6e206d6f64756c652e00902054686973206973206b6579656420627920746865207374617368206163636f756e742e3843757272656e74456c65637465640100445665633c543a3a4163636f756e7449643e040004fc205468652063757272656e746c7920656c65637465642076616c696461746f7220736574206b65796564206279207374617368206163636f756e742049442e2843757272656e74457261010020457261496e6465781000000000045c205468652063757272656e742065726120696e6465782e3c43757272656e74457261537461727401002c4d6f6d656e744f663c543e200000000000000000047820546865207374617274206f66207468652063757272656e74206572612e6c43757272656e74457261537461727453657373696f6e496e64657801003053657373696f6e496e646578100000000004d0205468652073657373696f6e20696e646578206174207768696368207468652063757272656e742065726120737461727465642e5843757272656e74457261506f696e74734561726e6564010024457261506f696e7473140000000000040d01205265776172647320666f72207468652063757272656e74206572612e205573696e6720696e6469636573206f662063757272656e7420656c6563746564207365742e24536c6f745374616b6501003042616c616e63654f663c543e40000000000000000000000000000000000c31012054686520616d6f756e74206f662062616c616e6365206163746976656c79206174207374616b6520666f7220656163682076616c696461746f7220736c6f742c2063757272656e746c792e00c02054686973206973207573656420746f20646572697665207265776172647320616e642070756e6973686d656e74732e20466f72636545726101001c466f7263696e670400041d01205472756520696620
746865206e6578742073657373696f6e206368616e67652077696c6c2062652061206e657720657261207265676172646c657373206f6620696e6465782e4c536c6173685265776172644672616374696f6e01001c50657262696c6c10000000000cf8205468652070657263656e74616765206f662074686520736c617368207468617420697320646973747269627574656420746f207265706f72746572732e00e4205468652072657374206f662074686520736c61736865642076616c75652069732068616e646c6564206279207468652060536c617368602e4c43616e63656c6564536c6173685061796f757401003042616c616e63654f663c543e40000000000000000000000000000000000815012054686520616d6f756e74206f662063757272656e637920676976656e20746f207265706f7274657273206f66206120736c617368206576656e7420776869636820776173ec2063616e63656c65642062792065787472616f7264696e6172792063697263756d7374616e6365732028652e672e20676f7665726e616e6365292e40556e6170706c696564536c617368657301010120457261496e646578bc5665633c556e6170706c696564536c6173683c543a3a4163636f756e7449642c2042616c616e63654f663c543e3e3e00040004c420416c6c20756e6170706c69656420736c61736865732074686174206172652071756575656420666f72206c617465722e28426f6e646564457261730100745665633c28457261496e6465782c2053657373696f6e496e646578293e04000425012041206d617070696e672066726f6d207374696c6c2d626f6e646564206572617320746f207468652066697273742073657373696f6e20696e646578206f662074686174206572612e4c56616c696461746f72536c617368496e45726100020120457261496e64657830543a3a4163636f756e7449645c2850657262696c6c2c2042616c616e63654f663c543e2903040008450120416c6c20736c617368696e67206576656e7473206f6e2076616c696461746f72732c206d61707065642062792065726120746f20746865206869676865737420736c6173682070726f706f7274696f6e7020616e6420736c6173682076616c7565206f6620746865206572612e4c4e6f6d696e61746f72536c617368496e45726100020120457261496e64657830543a3a4163636f756e7449643042616c616e63654f663c543e03040004610120416c6c20736c617368696e67206576656e7473206f6e206e6f6d696e61746f72732c206d61707065642062792065726120746f20746865206869676865737420736c6173682076616c7565206f6620746865206572612e34536c617368696e675370616e7300010130543a3a4163636f756e7449645c736c617368696e673a3a536c617368696e675370616e73000400048c20536c617368696e67207370616e7320666f72207374617368206163636f756e74732e245370616e536c6173680101018c28543a3a4163636f756e7449642c20736c617368696e673a3a5370616e496e6465782988736c617368696e673a3a5370616e5265636f72643c42616c616e63654f663c543e3e00800000000000000000000000000000000000000000000000000000000000000000083d01205265636f72647320696e666f726d6174696f6e2061626f757420746865206d6178696d756d20736c617368206f6620612073746173682077697468696e206120736c617368696e67207370616e2cb82061732077656c6c20617320686f77206d7563682072657761726420686173206265656e2070616964206f75742e584561726c69657374556e6170706c696564536c617368000020457261496e646578040004fc20546865206561726c696573742065726120666f72207768696368207765206861766520612070656e64696e672c20756e6170706c69656420736c6173682e3853746f7261676556657273696f6e01000c75333210000000000490205468652076657273696f6e206f662073746f7261676520666f7220757067726164652e014410626f6e640c28636f6e74726f6c6c65728c3c543a3a4c6f6f6b7570206173205374617469634c6f6f6b75703e3a3a536f757263651476616c756554436f6d706163743c42616c616e63654f663c543e3e1470617965654452657761726444657374696e6174696f6e3c65012054616b6520746865206f726967696e206163636f756e74206173206120737461736820616e64206c6f636b207570206076616c756560206f66206974732062616c616e63652e2060636f6e74726f6c6c6572602077696c6c8420626520746865206163636f756e74207468617420636f6e74726f6c732069742e003101206076616c756560206d757374206265206d6f7265207468616e2074686520606d696e696d756d5f62616
c616e636560207370656369666965642062792060543a3a43757272656e6379602e00250120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f20627920746865207374617368206163636f756e742e002c2023203c7765696768743ed4202d20496e646570656e64656e74206f662074686520617267756d656e74732e204d6f64657261746520636f6d706c65786974792e20202d204f2831292e68202d20546872656520657874726120444220656e74726965732e006d01204e4f54453a2054776f206f66207468652073746f726167652077726974657320286053656c663a3a626f6e646564602c206053656c663a3a7061796565602920617265205f6e657665725f20636c65616e656420756e6c65737325012074686520606f726967696e602066616c6c732062656c6f77205f6578697374656e7469616c206465706f7369745f20616e6420676574732072656d6f76656420617320647573742e302023203c2f7765696768743e28626f6e645f657874726104386d61785f6164646974696f6e616c54436f6d706163743c42616c616e63654f663c543e3e3865012041646420736f6d6520657874726120616d6f756e742074686174206861766520617070656172656420696e207468652073746173682060667265655f62616c616e63656020696e746f207468652062616c616e63652075703420666f72207374616b696e672e00510120557365207468697320696620746865726520617265206164646974696f6e616c2066756e647320696e20796f7572207374617368206163636f756e74207468617420796f75207769736820746f20626f6e642e650120556e6c696b65205b60626f6e64605d206f72205b60756e626f6e64605d20746869732066756e6374696f6e20646f6573206e6f7420696d706f736520616e79206c696d69746174696f6e206f6e2074686520616d6f756e744c20746861742063616e2062652061646465642e00550120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f206279207468652073746173682c206e6f742074686520636f6e74726f6c6c65722e002c2023203c7765696768743ee8202d20496e646570656e64656e74206f662074686520617267756d656e74732e20496e7369676e69666963616e7420636f6d706c65786974792e20202d204f2831292e40202d204f6e6520444220656e7472792e302023203c2f7765696768743e18756e626f6e64041476616c756554436f6d706163743c42616c616e63654f663c543e3e5c5501205363686564756c65206120706f7274696f6e206f662074686520737461736820746f20626520756e6c6f636b656420726561647920666f72207472616e73666572206f75742061667465722074686520626f6e64010120706572696f6420656e64732e2049662074686973206c656176657320616e20616d6f756e74206163746976656c7920626f6e646564206c657373207468616e250120543a3a43757272656e63793a3a6d696e696d756d5f62616c616e636528292c207468656e20697420697320696e6372656173656420746f207468652066756c6c20616d6f756e742e004901204f6e63652074686520756e6c6f636b20706572696f6420697320646f6e652c20796f752063616e2063616c6c206077697468647261775f756e626f6e6465646020746f2061637475616c6c79206d6f7665c0207468652066756e6473206f7574206f66206d616e6167656d656e7420726561647920666f72207472616e736665722e003d01204e6f206d6f7265207468616e2061206c696d69746564206e756d626572206f6620756e6c6f636b696e67206368756e6b73202873656520604d41585f554e4c4f434b494e475f4348554e4b5360293d012063616e20636f2d657869737473206174207468652073616d652074696d652e20496e207468617420636173652c205b6043616c6c3a3a77697468647261775f756e626f6e646564605d206e656564fc20746f2062652063616c6c656420666972737420746f2072656d6f766520736f6d65206f6620746865206368756e6b732028696620706f737369626c65292e00550120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f2062792074686520636f6e74726f6c6c65722c206e6f74207468652073746173682e00982053656520616c736f205b6043616c6c3a3a77697468647261775f756e626f6e646564605d2e002c2023203c7765696768743e4101202d20496e646570656e64656e74206f662074686520617267756d656e74732e204c696d697465642062757420706f74656e7469616c6c79206578706c6f
697461626c6520636f6d706c65786974792e98202d20436f6e7461696e732061206c696d69746564206e756d626572206f662072656164732e6501202d20456163682063616c6c20287265717569726573207468652072656d61696e646572206f662074686520626f6e6465642062616c616e636520746f2062652061626f766520606d696e696d756d5f62616c616e63656029710120202077696c6c2063617573652061206e657720656e74727920746f20626520696e73657274656420696e746f206120766563746f722028604c65646765722e756e6c6f636b696e676029206b65707420696e2073746f726167652ea501202020546865206f6e6c792077617920746f20636c65616e207468652061666f72656d656e74696f6e65642073746f72616765206974656d20697320616c736f20757365722d636f6e74726f6c6c656420766961206077697468647261775f756e626f6e646564602e40202d204f6e6520444220656e7472792e28203c2f7765696768743e4477697468647261775f756e626f6e64656400402d012052656d6f766520616e7920756e6c6f636b6564206368756e6b732066726f6d207468652060756e6c6f636b696e67602071756575652066726f6d206f7572206d616e6167656d656e742e003501205468697320657373656e7469616c6c7920667265657320757020746861742062616c616e636520746f206265207573656420627920746865207374617368206163636f756e7420746f20646f4c2077686174657665722069742077616e74732e00550120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f2062792074686520636f6e74726f6c6c65722c206e6f74207468652073746173682e006c2053656520616c736f205b6043616c6c3a3a756e626f6e64605d2e002c2023203c7765696768743e5501202d20436f756c6420626520646570656e64656e74206f6e2074686520606f726967696e6020617267756d656e7420616e6420686f77206d7563682060756e6c6f636b696e6760206368756e6b732065786973742e45012020497420696d706c6965732060636f6e736f6c69646174655f756e6c6f636b656460207768696368206c6f6f7073206f76657220604c65646765722e756e6c6f636b696e67602c207768696368206973f42020696e6469726563746c7920757365722d636f6e74726f6c6c65642e20536565205b60756e626f6e64605d20666f72206d6f72652064657461696c2e7901202d20436f6e7461696e732061206c696d69746564206e756d626572206f662072656164732c20796574207468652073697a65206f6620776869636820636f756c64206265206c61726765206261736564206f6e20606c6564676572602ec8202d2057726974657320617265206c696d6974656420746f2074686520606f726967696e60206163636f756e74206b65792e302023203c2f7765696768743e2076616c6964617465041470726566733856616c696461746f7250726566732ce8204465636c617265207468652064657369726520746f2076616c696461746520666f7220746865206f726967696e20636f6e74726f6c6c65722e00dc20456666656374732077696c6c2062652066656c742061742074686520626567696e6e696e67206f6620746865206e657874206572612e00550120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f2062792074686520636f6e74726f6c6c65722c206e6f74207468652073746173682e002c2023203c7765696768743ee8202d20496e646570656e64656e74206f662074686520617267756d656e74732e20496e7369676e69666963616e7420636f6d706c65786974792e98202d20436f6e7461696e732061206c696d69746564206e756d626572206f662072656164732ec8202d2057726974657320617265206c696d6974656420746f2074686520606f726967696e60206163636f756e74206b65792e302023203c2f7765696768743e206e6f6d696e617465041c74617267657473a05665633c3c543a3a4c6f6f6b7570206173205374617469634c6f6f6b75703e3a3a536f757263653e2c1101204465636c617265207468652064657369726520746f206e6f6d696e6174652060746172676574736020666f7220746865206f726967696e20636f6e74726f6c6c65722e00dc20456666656374732077696c6c2062652066656c742061742074686520626567696e6e696e67206f6620746865206e657874206572612e00550120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f2062792074686520636f6e74726f6c6c65722c206e6f742074686520737
46173682e002c2023203c7765696768743e2501202d20546865207472616e73616374696f6e277320636f6d706c65786974792069732070726f706f7274696f6e616c20746f207468652073697a65206f66206074617267657473602c982077686963682069732063617070656420617420604d41585f4e4f4d494e4154494f4e53602ed8202d20426f74682074686520726561647320616e642077726974657320666f6c6c6f7720612073696d696c6172207061747465726e2e302023203c2f7765696768743e146368696c6c002cc8204465636c617265206e6f2064657369726520746f206569746865722076616c6964617465206f72206e6f6d696e6174652e00dc20456666656374732077696c6c2062652066656c742061742074686520626567696e6e696e67206f6620746865206e657874206572612e00550120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f2062792074686520636f6e74726f6c6c65722c206e6f74207468652073746173682e002c2023203c7765696768743ee8202d20496e646570656e64656e74206f662074686520617267756d656e74732e20496e7369676e69666963616e7420636f6d706c65786974792e54202d20436f6e7461696e73206f6e6520726561642ec8202d2057726974657320617265206c696d6974656420746f2074686520606f726967696e60206163636f756e74206b65792e302023203c2f7765696768743e247365745f7061796565041470617965654452657761726444657374696e6174696f6e2cb8202852652d2973657420746865207061796d656e742074617267657420666f72206120636f6e74726f6c6c65722e00dc20456666656374732077696c6c2062652066656c742061742074686520626567696e6e696e67206f6620746865206e657874206572612e00550120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f2062792074686520636f6e74726f6c6c65722c206e6f74207468652073746173682e002c2023203c7765696768743ee8202d20496e646570656e64656e74206f662074686520617267756d656e74732e20496e7369676e69666963616e7420636f6d706c65786974792e98202d20436f6e7461696e732061206c696d69746564206e756d626572206f662072656164732ec8202d2057726974657320617265206c696d6974656420746f2074686520606f726967696e60206163636f756e74206b65792e302023203c2f7765696768743e387365745f636f6e74726f6c6c65720428636f6e74726f6c6c65728c3c543a3a4c6f6f6b7570206173205374617469634c6f6f6b75703e3a3a536f757263652c90202852652d297365742074686520636f6e74726f6c6c6572206f6620612073746173682e00dc20456666656374732077696c6c2062652066656c742061742074686520626567696e6e696e67206f6620746865206e657874206572612e00550120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f206279207468652073746173682c206e6f742074686520636f6e74726f6c6c65722e002c2023203c7765696768743ee8202d20496e646570656e64656e74206f662074686520617267756d656e74732e20496e7369676e69666963616e7420636f6d706c65786974792e98202d20436f6e7461696e732061206c696d69746564206e756d626572206f662072656164732ec8202d2057726974657320617265206c696d6974656420746f2074686520606f726967696e60206163636f756e74206b65792e302023203c2f7765696768743e4c7365745f76616c696461746f725f636f756e74040c6e657730436f6d706163743c7533323e04802054686520696465616c206e756d626572206f662076616c696461746f72732e34666f7263655f6e6f5f657261730014b020466f72636520746865726520746f206265206e6f206e6577206572617320696e646566696e6974656c792e002c2023203c7765696768743e40202d204e6f20617267756d656e74732e302023203c2f7765696768743e34666f7263655f6e65775f65726100184d0120466f72636520746865726520746f2062652061206e6577206572612061742074686520656e64206f6620746865206e6578742073657373696f6e2e20416674657220746869732c2069742077696c6c206265a020726573657420746f206e6f726d616c20286e6f6e2d666f7263656429206265686176696f75722e002c2023203c7765696768743e40202d204e6f20617267756d656e74732e302023203c2f7765696768743e447365745f696e76756c6e657261626c6573042876616c696461746f727344
5665633c543a3a4163636f756e7449643e04cc20536574207468652076616c696461746f72732077686f2063616e6e6f7420626520736c61736865642028696620616e79292e34666f7263655f756e7374616b650414737461736830543a3a4163636f756e744964040d0120466f72636520612063757272656e74207374616b657220746f206265636f6d6520636f6d706c6574656c7920756e7374616b65642c20696d6d6564696174656c792e50666f7263655f6e65775f6572615f616c776179730014050120466f72636520746865726520746f2062652061206e6577206572612061742074686520656e64206f662073657373696f6e7320696e646566696e6974656c792e002c2023203c7765696768743e50202d204f6e652073746f72616765207772697465302023203c2f7765696768743e5463616e63656c5f64656665727265645f736c617368080c65726120457261496e64657834736c6173685f696e6469636573205665633c7533323e1c45012043616e63656c20656e6163746d656e74206f66206120646566657272656420736c6173682e2043616e2062652063616c6c6564206279206569746865722074686520726f6f74206f726967696e206f7270207468652060543a3a536c61736843616e63656c4f726967696e602e05012070617373696e67207468652065726120616e6420696e6469636573206f662074686520736c617368657320666f7220746861742065726120746f206b696c6c2e002c2023203c7765696768743e54202d204f6e652073746f726167652077726974652e302023203c2f7765696768743e187265626f6e64041476616c756554436f6d706163743c42616c616e63654f663c543e3e18e0205265626f6e64206120706f7274696f6e206f6620746865207374617368207363686564756c656420746f20626520756e6c6f636b65642e002c2023203c7765696768743ef0202d2054696d6520636f6d706c65786974793a204f2831292e20426f756e64656420627920604d41585f554e4c4f434b494e475f4348554e4b53602ef4202d2053746f72616765206368616e6765733a2043616e277420696e6372656173652073746f726167652c206f6e6c792064656372656173652069742e302023203c2f7765696768743e010c18526577617264081c42616c616e63651c42616c616e636508510120416c6c2076616c696461746f72732068617665206265656e207265776172646564206279207468652066697273742062616c616e63653b20746865207365636f6e64206973207468652072656d61696e6465728c2066726f6d20746865206d6178696d756d20616d6f756e74206f66207265776172642e14536c61736808244163636f756e7449641c42616c616e6365042501204f6e652076616c696461746f722028616e6420697473206e6f6d696e61746f72732920686173206265656e20736c61736865642062792074686520676976656e20616d6f756e742e684f6c64536c617368696e675265706f7274446973636172646564043053657373696f6e496e646578081d0120416e206f6c6420736c617368696e67207265706f72742066726f6d2061207072696f72206572612077617320646973636172646564206265636175736520697420636f756c6448206e6f742062652070726f6365737365642e083853657373696f6e735065724572613053657373696f6e496e64657810060000000470204e756d626572206f662073657373696f6e7320706572206572612e3c426f6e64696e674475726174696f6e20457261496e646578101c00000004e4204e756d626572206f6620657261732074686174207374616b65642066756e6473206d7573742072656d61696e20626f6e64656420666f722e28344e6f74436f6e74726f6c6c65720468204e6f74206120636f6e74726f6c6c6572206163636f756e742e204e6f7453746173680454204e6f742061207374617368206163636f756e742e34416c7265616479426f6e646564046420537461736820697320616c726561647920626f6e6465642e34416c7265616479506169726564047820436f6e74726f6c6c657220697320616c7265616479207061697265642e30456d70747954617267657473046420546172676574732063616e6e6f7420626520656d7074792e384475706c6963617465496e6465780444204475706c696361746520696e6465782e44496e76616c6964536c617368496e646578048820536c617368207265636f726420696e646578206f7574206f6620626f756e64732e44496e73756666696369656e7456616c756504cc2043616e206e6f7420626f6e6420776974682076616c7565206c657373207468616e206d696e696d756d2062616c616e63652e304e6f4d6f72654368756e6b7304942043616e206e6f74207363686564756c65206d6f7265207
56e6c6f636b206368756e6b732e344e6f556e6c6f636b4368756e6b04a42043616e206e6f74207265626f6e6420776974686f757420756e6c6f636b696e67206368756e6b732e204f6666656e63657301204f6666656e6365730c1c5265706f727473000101345265706f727449644f663c543ed04f6666656e636544657461696c733c543a3a4163636f756e7449642c20543a3a4964656e74696669636174696f6e5475706c653e00040004490120546865207072696d61727920737472756374757265207468617420686f6c647320616c6c206f6666656e6365207265636f726473206b65796564206279207265706f7274206964656e746966696572732e58436f6e63757272656e745265706f727473496e646578010201104b696e64384f706171756554696d65536c6f74485665633c5265706f727449644f663c543e3e010400042901204120766563746f72206f66207265706f727473206f66207468652073616d65206b696e6420746861742068617070656e6564206174207468652073616d652074696d6520736c6f742e485265706f72747342794b696e64496e646578010101104b696e641c5665633c75383e00040018110120456e756d65726174657320616c6c207265706f727473206f662061206b696e6420616c6f6e672077697468207468652074696d6520746865792068617070656e65642e00bc20416c6c207265706f7274732061726520736f72746564206279207468652074696d65206f66206f6666656e63652e004901204e6f74652074686174207468652061637475616c2074797065206f662074686973206d617070696e6720697320605665633c75383e602c207468697320697320626563617573652076616c756573206f66690120646966666572656e7420747970657320617265206e6f7420737570706f7274656420617420746865206d6f6d656e7420736f2077652061726520646f696e6720746865206d616e75616c2073657269616c697a6174696f6e2e010001041c4f6666656e636508104b696e64384f706171756554696d65536c6f7408550120546865726520697320616e206f6666656e6365207265706f72746564206f662074686520676976656e20606b696e64602068617070656e656420617420746865206073657373696f6e5f696e6465786020616e64390120286b696e642d7370656369666963292074696d6520736c6f742e2054686973206576656e74206973206e6f74206465706f736974656420666f72206475706c696361746520736c61736865732e00001c53657373696f6e011c53657373696f6e1c2856616c696461746f727301004c5665633c543a3a56616c696461746f7249643e0400047c205468652063757272656e7420736574206f662076616c696461746f72732e3043757272656e74496e64657801003053657373696f6e496e646578100000000004782043757272656e7420696e646578206f66207468652073657373696f6e2e345175657565644368616e676564010010626f6f6c040008390120547275652069662074686520756e6465726c79696e672065636f6e6f6d6963206964656e746974696573206f7220776569676874696e6720626568696e64207468652076616c696461746f7273a420686173206368616e67656420696e20746865207175657565642076616c696461746f72207365742e285175657565644b6579730100785665633c28543a3a56616c696461746f7249642c20543a3a4b657973293e0400083d012054686520717565756564206b65797320666f7220746865206e6578742073657373696f6e2e205768656e20746865206e6578742073657373696f6e20626567696e732c207468657365206b657973e02077696c6c206265207573656420746f2064657465726d696e65207468652076616c696461746f7227732073657373696f6e206b6579732e4844697361626c656456616c696461746f72730100205665633c7533323e04000c8020496e6469636573206f662064697361626c65642076616c696461746f72732e003501205468652073657420697320636c6561726564207768656e20606f6e5f73657373696f6e5f656e64696e67602072657475726e732061206e657720736574206f66206964656e7469746965732e204e6578744b6579730002051c5665633c75383e38543a3a56616c696461746f7249641c543a3a4b657973010400109c20546865206e6578742073657373696f6e206b65797320666f7220612076616c696461746f722e00590120546865206669727374206b657920697320616c77617973206044454455505f4b45595f5052454649586020746f206861766520616c6c20746865206461746120696e207468652073616d65206272616e6368206f6661012074686520747269652e20486176696e6720616c6c206461746120696e207468
652073616d65206272616e63682073686f756c642070726576656e7420736c6f77696e6720646f776e206f7468657220717565726965732e204b65794f776e65720002051c5665633c75383e50284b65795479706549642c205665633c75383e2938543a3a56616c696461746f72496401040010250120546865206f776e6572206f662061206b65792e20546865207365636f6e64206b65792069732074686520604b657954797065496460202b2074686520656e636f646564206b65792e00590120546865206669727374206b657920697320616c77617973206044454455505f4b45595f5052454649586020746f206861766520616c6c20746865206461746120696e207468652073616d65206272616e6368206f6661012074686520747269652e20486176696e6720616c6c206461746120696e207468652073616d65206272616e63682073686f756c642070726576656e7420736c6f77696e6720646f776e206f7468657220717565726965732e0104207365745f6b65797308106b6579731c543a3a4b6579731470726f6f661c5665633c75383e28e42053657473207468652073657373696f6e206b6579287329206f66207468652066756e6374696f6e2063616c6c657220746f20606b6579602e210120416c6c6f777320616e206163636f756e7420746f20736574206974732073657373696f6e206b6579207072696f7220746f206265636f6d696e6720612076616c696461746f722ec4205468697320646f65736e27742074616b652065666665637420756e74696c20746865206e6578742073657373696f6e2e00d420546865206469737061746368206f726967696e206f6620746869732066756e6374696f6e206d757374206265207369676e65642e002c2023203c7765696768743e88202d204f286c6f67206e2920696e206e756d626572206f66206163636f756e74732e58202d204f6e6520657874726120444220656e7472792e302023203c2f7765696768743e0104284e657753657373696f6e043053657373696f6e496e646578085501204e65772073657373696f6e206861732068617070656e65642e204e6f746520746861742074686520617267756d656e74206973207468652073657373696f6e20696e6465782c206e6f742074686520626c6f636b88206e756d626572206173207468652074797065206d6967687420737567676573742e044044454455505f4b45595f50524546495814265b75385d38343a73657373696f6e3a6b6579730865012055736564206173206669727374206b657920666f7220604e6578744b6579736020616e6420604b65794f776e65726020746f2070757420616c6c20746865206461746120696e746f207468652073616d65206272616e636834206f662074686520747269652e0c30496e76616c696450726f6f66046420496e76616c6964206f776e6572736869702070726f6f662e5c4e6f4173736f63696174656456616c696461746f72496404a0204e6f206173736f6369617465642076616c696461746f7220494420666f72206163636f756e742e344475706c6963617465644b657904682052656769737465726564206475706c6963617465206b65792e3c46696e616c697479547261636b65720001042866696e616c5f68696e74041068696e745c436f6d706163743c543a3a426c6f636b4e756d6265723e08f42048696e7420746861742074686520617574686f72206f66207468697320626c6f636b207468696e6b732074686520626573742066696e616c697a65646c20626c6f636b2069732074686520676976656e206e756d6265722e00082857696e646f7753697a6538543a3a426c6f636b4e756d626572106500000004190120546865206e756d626572206f6620726563656e742073616d706c657320746f206b6565702066726f6d207468697320636861696e2e2044656661756c74206973203130312e345265706f72744c6174656e637938543a3a426c6f636b4e756d62657210e8030000041d01205468652064656c617920616674657220776869636820706f696e74207468696e6773206265636f6d6520737573706963696f75732e2044656661756c7420697320313030302e0838416c72656164795570646174656404c82046696e616c2068696e74206d7573742062652075706461746564206f6e6c79206f6e636520696e2074686520626c6f636b1c42616448696e7404902046696e616c697a6564206865696768742061626f766520626c6f636b206e756d6265721c4772616e647061013c4772616e64706146696e616c6974791c2c417574686f726974696573010034417574686f726974794c6973740400102c20444550524543415445440061012054686973207573656420746f2073746f7265207468652063757272656e7420617574686f72697479207365742c20776869636
820686173206265656e206d6967726174656420746f207468652077656c6c2d6b6e6f776e94204752414e4450415f415554484f52495445535f4b455920756e686173686564206b65792e14537461746501006c53746f72656453746174653c543a3a426c6f636b4e756d6265723e04000490205374617465206f66207468652063757272656e7420617574686f72697479207365742e3450656e64696e674368616e676500008c53746f72656450656e64696e674368616e67653c543a3a426c6f636b4e756d6265723e040004c42050656e64696e67206368616e67653a20287369676e616c65642061742c207363686564756c6564206368616e6765292e284e657874466f72636564000038543a3a426c6f636b4e756d626572040004bc206e65787420626c6f636b206e756d6265722077686572652077652063616e20666f7263652061206368616e67652e1c5374616c6c656400008028543a3a426c6f636b4e756d6265722c20543a3a426c6f636b4e756d626572290400049020607472756560206966207765206172652063757272656e746c79207374616c6c65642e3043757272656e7453657449640100145365744964200000000000000000085d0120546865206e756d626572206f66206368616e6765732028626f746820696e207465726d73206f66206b65797320616e6420756e6465726c79696e672065636f6e6f6d696320726573706f6e736962696c697469657329c420696e20746865202273657422206f66204772616e6470612076616c696461746f72732066726f6d2067656e657369732e30536574496453657373696f6e0001011453657449643053657373696f6e496e64657800040004c1012041206d617070696e672066726f6d206772616e6470612073657420494420746f2074686520696e646578206f6620746865202a6d6f737420726563656e742a2073657373696f6e20666f7220776869636820697473206d656d62657273207765726520726573706f6e7369626c652e0104487265706f72745f6d69736265686176696f72041c5f7265706f72741c5665633c75383e0464205265706f727420736f6d65206d69736265686176696f722e010c384e6577417574686f7269746965730434417574686f726974794c6973740490204e657720617574686f726974792073657420686173206265656e206170706c6965642e1850617573656400049c2043757272656e7420617574686f726974792073657420686173206265656e207061757365642e1c526573756d65640004a02043757272656e7420617574686f726974792073657420686173206265656e20726573756d65642e00102c50617573654661696c656408090120417474656d707420746f207369676e616c204752414e445041207061757365207768656e2074686520617574686f72697479207365742069736e2774206c697665a8202865697468657220706175736564206f7220616c72656164792070656e64696e67207061757365292e30526573756d654661696c656408150120417474656d707420746f207369676e616c204752414e44504120726573756d65207768656e2074686520617574686f72697479207365742069736e277420706175736564a42028656974686572206c697665206f7220616c72656164792070656e64696e6720726573756d65292e344368616e676550656e64696e6704ec20417474656d707420746f207369676e616c204752414e445041206368616e67652077697468206f6e6520616c72656164792070656e64696e672e1c546f6f536f6f6e04c02043616e6e6f74207369676e616c20666f72636564206368616e676520736f20736f6f6e206166746572206c6173742e20496d4f6e6c696e650120496d4f6e6c696e651020476f737369704174010038543a3a426c6f636b4e756d626572100000000004a02054686520626c6f636b206e756d626572207768656e2077652073686f756c6420676f737369702e104b65797301004c5665633c543a3a417574686f7269747949643e040004d0205468652063757272656e7420736574206f66206b6579732074686174206d61792069737375652061206865617274626561742e485265636569766564486561727462656174730002013053657373696f6e496e6465782441757468496e6465781c5665633c75383e01040008e420466f7220656163682073657373696f6e20696e6465782c207765206b6565702061206d617070696e67206f66206041757468496e646578608c20746f20606f6666636861696e3a3a4f70617175654e6574776f726b5374617465602e38417574686f726564426c6f636b730102013053657373696f6e496e64657838543a3a56616c696461746f7249640c75333201100000000008150120466f7220656163682073657373696f6e20696e6465782c207765206b65
65702061206d617070696e67206f662060543a3a56616c696461746f7249646020746f20746865c8206e756d626572206f6620626c6f636b7320617574686f7265642062792074686520676976656e20617574686f726974792e0104246865617274626561740824686561727462656174644865617274626561743c543a3a426c6f636b4e756d6265723e285f7369676e6174757265bc3c543a3a417574686f7269747949642061732052756e74696d654170705075626c69633e3a3a5369676e617475726500010c444865617274626561745265636569766564042c417574686f72697479496404c02041206e657720686561727462656174207761732072656365697665642066726f6d2060417574686f726974794964601c416c6c476f6f640004d42041742074686520656e64206f66207468652073657373696f6e2c206e6f206f6666656e63652077617320636f6d6d69747465642e2c536f6d654f66666c696e6504605665633c4964656e74696669636174696f6e5475706c653e0431012041742074686520656e64206f66207468652073657373696f6e2c206174206c65617374206f6e63652076616c696461746f722077617320666f756e6420746f206265206f66666c696e652e000828496e76616c69644b65790464204e6f6e206578697374656e74207075626c6963206b65792e4c4475706c6963617465644865617274626561740458204475706c696361746564206865617274626561742e48417574686f72697479446973636f766572790001000000002444656d6f6372616379012444656d6f6372616379403c5075626c696350726f70436f756e7401002450726f70496e646578100000000004f420546865206e756d626572206f6620287075626c6963292070726f706f73616c7320746861742068617665206265656e206d61646520736f206661722e2c5075626c696350726f707301009c5665633c2850726f70496e6465782c20543a3a486173682c20543a3a4163636f756e744964293e040004210120546865207075626c69632070726f706f73616c732e20556e736f727465642e20546865207365636f6e64206974656d206973207468652070726f706f73616c277320686173682e24507265696d616765730001011c543a3a48617368d4285665633c75383e2c20543a3a4163636f756e7449642c2042616c616e63654f663c543e2c20543a3a426c6f636b4e756d62657229000400086101204d6170206f662068617368657320746f207468652070726f706f73616c20707265696d6167652c20616c6f6e6720776974682077686f207265676973746572656420697420616e64207468656972206465706f7369742ee42054686520626c6f636b206e756d6265722069732074686520626c6f636b20617420776869636820697420776173206465706f73697465642e244465706f7369744f660001012450726f70496e646578842842616c616e63654f663c543e2c205665633c543a3a4163636f756e7449643e2900040004842054686f73652077686f2068617665206c6f636b65642061206465706f7369742e3c5265666572656e64756d436f756e7401003c5265666572656e64756d496e646578100000000004310120546865206e6578742066726565207265666572656e64756d20696e6465782c20616b6120746865206e756d626572206f66207265666572656e6461207374617274656420736f206661722e344c6f77657374556e62616b656401003c5265666572656e64756d496e646578100000000008250120546865206c6f77657374207265666572656e64756d20696e64657820726570726573656e74696e6720616e20756e62616b6564207265666572656e64756d2e20457175616c20746fdc20605265666572656e64756d436f756e74602069662074686572652069736e2774206120756e62616b6564207265666572656e64756d2e405265666572656e64756d496e666f4f660001013c5265666572656e64756d496e6465789c5265666572656e64756d496e666f3c543a3a426c6f636b4e756d6265722c20543a3a486173683e00040004b420496e666f726d6174696f6e20636f6e6365726e696e6720616e7920676976656e207265666572656e64756d2e34446973706174636851756575650100bc5665633c28543a3a426c6f636b4e756d6265722c20543a3a486173682c205265666572656e64756d496e646578293e0400044101205175657565206f66207375636365737366756c207265666572656e646120746f20626520646973706174636865642e2053746f726564206f72646572656420627920626c6f636b206e756d6265722e24566f74657273466f720101013c5265666572656e64756d496e646578445665633c543a3a4163636f756e7449643e00040004a4204765742074686520766f7465727320666f72207
468652063757272656e742070726f706f73616c2e18566f74654f660101017c285265666572656e64756d496e6465782c20543a3a4163636f756e7449642910566f7465000400106101204765742074686520766f746520696e206120676976656e207265666572656e64756d206f66206120706172746963756c617220766f7465722e2054686520726573756c74206973206d65616e696e6766756c206f6e6c794d012069662060766f746572735f666f726020696e636c756465732074686520766f746572207768656e2063616c6c6564207769746820746865207265666572656e64756d2028796f75276c6c20676574207468655d012064656661756c742060566f7465602076616c7565206f7468657277697365292e20496620796f7520646f6e27742077616e7420746f20636865636b2060766f746572735f666f72602c207468656e20796f752063616ef420616c736f20636865636b20666f722073696d706c65206578697374656e636520776974682060566f74654f663a3a657869737473602066697273742e1450726f787900010130543a3a4163636f756e74496430543a3a4163636f756e7449640004000831012057686f2069732061626c6520746f20766f746520666f722077686f6d2e2056616c7565206973207468652066756e642d686f6c64696e67206163636f756e742c206b6579206973207468658820766f74652d7472616e73616374696f6e2d73656e64696e67206163636f756e742e2c44656c65676174696f6e7301010130543a3a4163636f756e7449646828543a3a4163636f756e7449642c20436f6e76696374696f6e2901840000000000000000000000000000000000000000000000000000000000000000000441012047657420746865206163636f756e742028616e64206c6f636b20706572696f64732920746f20776869636820616e6f74686572206163636f756e742069732064656c65676174696e6720766f74652e544c6173745461626c656457617345787465726e616c010010626f6f6c0400085901205472756520696620746865206c617374207265666572656e64756d207461626c656420776173207375626d69747465642065787465726e616c6c792e2046616c7365206966206974207761732061207075626c6963282070726f706f73616c2e304e65787445787465726e616c00006028543a3a486173682c20566f74655468726573686f6c6429040010590120546865207265666572656e64756d20746f206265207461626c6564207768656e6576657220697420776f756c642062652076616c696420746f207461626c6520616e2065787465726e616c2070726f706f73616c2e550120546869732068617070656e73207768656e2061207265666572656e64756d206e6565647320746f206265207461626c656420616e64206f6e65206f662074776f20636f6e646974696f6e7320617265206d65743aa4202d20604c6173745461626c656457617345787465726e616c60206973206066616c7365603b206f7268202d20605075626c696350726f70736020697320656d7074792e24426c61636b6c6973740001011c543a3a486173688c28543a3a426c6f636b4e756d6265722c205665633c543a3a4163636f756e7449643e290004000851012041207265636f7264206f662077686f207665746f656420776861742e204d6170732070726f706f73616c206861736820746f206120706f737369626c65206578697374656e7420626c6f636b206e756d626572e82028756e74696c207768656e206974206d6179206e6f742062652072657375626d69747465642920616e642077686f207665746f65642069742e3443616e63656c6c6174696f6e730101011c543a3a4861736810626f6f6c000400042901205265636f7264206f6620616c6c2070726f706f73616c7320746861742068617665206265656e207375626a65637420746f20656d657267656e63792063616e63656c6c6174696f6e2e01541c70726f706f7365083470726f706f73616c5f686173681c543a3a486173681476616c756554436f6d706163743c42616c616e63654f663c543e3e18a02050726f706f736520612073656e73697469766520616374696f6e20746f2062652074616b656e2e002c2023203c7765696768743e20202d204f2831292e80202d2054776f204442206368616e6765732c206f6e6520444220656e7472792e302023203c2f7765696768743e187365636f6e64042070726f706f73616c48436f6d706163743c50726f70496e6465783e18a02050726f706f736520612073656e73697469766520616374696f6e20746f2062652074616b656e2e002c2023203c7765696768743e20202d204f2831292e40202d204f6e6520444220656e7472792e302023203c2f7765696768743e10766f746508247265665f696e64657860436f6d
706163743c5265666572656e64756d496e6465783e10766f746510566f74651c350120566f746520696e2061207265666572656e64756d2e2049662060766f74652e69735f6179652829602c2074686520766f746520697320746f20656e616374207468652070726f706f73616c3bbc206f7468657277697365206974206973206120766f746520746f206b65657020746865207374617475732071756f2e002c2023203c7765696768743e20202d204f2831292e7c202d204f6e65204442206368616e67652c206f6e6520444220656e7472792e302023203c2f7765696768743e2870726f78795f766f746508247265665f696e64657860436f6d706163743c5265666572656e64756d496e6465783e10766f746510566f74651c510120566f746520696e2061207265666572656e64756d206f6e20626568616c66206f6620612073746173682e2049662060766f74652e69735f6179652829602c2074686520766f746520697320746f20656e616374f8207468652070726f706f73616c3b20206f7468657277697365206974206973206120766f746520746f206b65657020746865207374617475732071756f2e002c2023203c7765696768743e20202d204f2831292e7c202d204f6e65204442206368616e67652c206f6e6520444220656e7472792e302023203c2f7765696768743e40656d657267656e63795f63616e63656c04247265665f696e6465783c5265666572656e64756d496e646578085101205363686564756c6520616e20656d657267656e63792063616e63656c6c6174696f6e206f662061207265666572656e64756d2e2043616e6e6f742068617070656e20747769636520746f207468652073616d6530207265666572656e64756d2e4065787465726e616c5f70726f706f7365043470726f706f73616c5f686173681c543a3a48617368083101205363686564756c652061207265666572656e64756d20746f206265207461626c6564206f6e6365206974206973206c6567616c20746f207363686564756c6520616e2065787465726e616c30207265666572656e64756d2e6465787465726e616c5f70726f706f73655f6d616a6f72697479043470726f706f73616c5f686173681c543a3a48617368145901205363686564756c652061206d616a6f726974792d63617272696573207265666572656e64756d20746f206265207461626c6564206e657874206f6e6365206974206973206c6567616c20746f207363686564756c656020616e2065787465726e616c207265666572656e64756d2e004d0120556e6c696b65206065787465726e616c5f70726f706f7365602c20626c61636b6c697374696e6720686173206e6f20656666656374206f6e207468697320616e64206974206d6179207265706c61636520619c207072652d7363686564756c6564206065787465726e616c5f70726f706f7365602063616c6c2e6065787465726e616c5f70726f706f73655f64656661756c74043470726f706f73616c5f686173681c543a3a48617368144901205363686564756c652061206e656761746976652d7475726e6f75742d62696173207265666572656e64756d20746f206265207461626c6564206e657874206f6e6365206974206973206c6567616c20746f84207363686564756c6520616e2065787465726e616c207265666572656e64756d2e004d0120556e6c696b65206065787465726e616c5f70726f706f7365602c20626c61636b6c697374696e6720686173206e6f20656666656374206f6e207468697320616e64206974206d6179207265706c61636520619c207072652d7363686564756c6564206065787465726e616c5f70726f706f7365602063616c6c2e28666173745f747261636b0c3470726f706f73616c5f686173681c543a3a4861736834766f74696e675f706572696f6438543a3a426c6f636b4e756d6265721464656c617938543a3a426c6f636b4e756d626572245101205363686564756c65207468652063757272656e746c792065787465726e616c6c792d70726f706f736564206d616a6f726974792d63617272696573207265666572656e64756d20746f206265207461626c6564650120696d6d6564696174656c792e204966207468657265206973206e6f2065787465726e616c6c792d70726f706f736564207265666572656e64756d2063757272656e746c792c206f72206966207468657265206973206f6e65ec20627574206974206973206e6f742061206d616a6f726974792d63617272696573207265666572656e64756d207468656e206974206661696c732e00f8202d206070726f706f73616c5f68617368603a205468652068617368206f66207468652063757272656e742065787465726e616c2070726f706f73616c2e6101202d2060766f74696e675f706572696f64603a2054686520706572696f64207
468617420697320616c6c6f77656420666f7220766f74696e67206f6e20746869732070726f706f73616c2e20496e6372656173656420746f9820202060456d657267656e6379566f74696e67506572696f646020696620746f6f206c6f772e5501202d206064656c6179603a20546865206e756d626572206f6620626c6f636b20616674657220766f74696e672068617320656e64656420696e20617070726f76616c20616e6420746869732073686f756c64206265bc202020656e61637465642e205468697320646f65736e277420686176652061206d696e696d756d20616d6f756e742e347665746f5f65787465726e616c043470726f706f73616c5f686173681c543a3a4861736804bc205665746f20616e6420626c61636b6c697374207468652065787465726e616c2070726f706f73616c20686173682e4463616e63656c5f7265666572656e64756d04247265665f696e64657860436f6d706163743c5265666572656e64756d496e6465783e04542052656d6f76652061207265666572656e64756d2e3463616e63656c5f717565756564041477686963683c5265666572656e64756d496e64657804a02043616e63656c20612070726f706f73616c2071756575656420666f7220656e6163746d656e742e247365745f70726f7879041470726f787930543a3a4163636f756e7449641498205370656369667920612070726f78792e2043616c6c6564206279207468652073746173682e002c2023203c7765696768743e58202d204f6e6520657874726120444220656e7472792e302023203c2f7765696768743e3072657369676e5f70726f787900149820436c656172207468652070726f78792e2043616c6c6564206279207468652070726f78792e002c2023203c7765696768743e40202d204f6e6520444220636c6561722e302023203c2f7765696768743e3072656d6f76655f70726f7879041470726f787930543a3a4163636f756e744964149820436c656172207468652070726f78792e2043616c6c6564206279207468652073746173682e002c2023203c7765696768743e40202d204f6e6520444220636c6561722e302023203c2f7765696768743e2064656c65676174650808746f30543a3a4163636f756e74496428636f6e76696374696f6e28436f6e76696374696f6e143c2044656c656761746520766f74652e002c2023203c7765696768743e58202d204f6e6520657874726120444220656e7472792e302023203c2f7765696768743e28756e64656c656761746500144420556e64656c656761746520766f74652e002c2023203c7765696768743e20202d204f2831292e302023203c2f7765696768743e58636c6561725f7075626c69635f70726f706f73616c7300040101205665746f20616e6420626c61636b6c697374207468652070726f706f73616c20686173682e204d7573742062652066726f6d20526f6f74206f726967696e2e346e6f74655f707265696d6167650440656e636f6465645f70726f706f73616c1c5665633c75383e0861012052656769737465722074686520707265696d61676520666f7220616e207570636f6d696e672070726f706f73616c2e205468697320646f65736e27742072657175697265207468652070726f706f73616c20746f206265250120696e207468652064697370617463682071756575652062757420646f657320726571756972652061206465706f7369742c2072657475726e6564206f6e636520656e61637465642e586e6f74655f696d6d696e656e745f707265696d6167650440656e636f6465645f70726f706f73616c1c5665633c75383e0845012052656769737465722074686520707265696d61676520666f7220616e207570636f6d696e672070726f706f73616c2e2054686973207265717569726573207468652070726f706f73616c20746f206265b420696e207468652064697370617463682071756575652e204e6f206465706f736974206973206e65656465642e34726561705f707265696d616765043470726f706f73616c5f686173681c543a3a4861736814f42052656d6f766520616e20657870697265642070726f706f73616c20707265696d61676520616e6420636f6c6c65637420746865206465706f7369742e00510120546869732077696c6c206f6e6c7920776f726b2061667465722060566f74696e67506572696f646020626c6f636b732066726f6d207468652074696d6520746861742074686520707265696d616765207761735d01206e6f7465642c2069662069742773207468652073616d65206163636f756e7420646f696e672069742e2049662069742773206120646966666572656e74206163636f756e742c207468656e206974276c6c206f6e6c79b020776f726b20616e206164646974696f6e616c2060456e6163746d656e74506572696f6460206c61
7465722e01402050726f706f736564082450726f70496e6465781c42616c616e636504c02041206d6f74696f6e20686173206265656e2070726f706f7365642062792061207075626c6963206163636f756e742e185461626c65640c2450726f70496e6465781c42616c616e6365385665633c4163636f756e7449643e04dc2041207075626c69632070726f706f73616c20686173206265656e207461626c656420666f72207265666572656e64756d20766f74652e3845787465726e616c5461626c656400049820416e2065787465726e616c2070726f706f73616c20686173206265656e207461626c65642e1c53746172746564083c5265666572656e64756d496e64657834566f74655468726573686f6c6404602041207265666572656e64756d2068617320626567756e2e18506173736564043c5265666572656e64756d496e64657804b020412070726f706f73616c20686173206265656e20617070726f766564206279207265666572656e64756d2e244e6f74506173736564043c5265666572656e64756d496e64657804b020412070726f706f73616c20686173206265656e2072656a6563746564206279207265666572656e64756d2e2443616e63656c6c6564043c5265666572656e64756d496e64657804842041207265666572656e64756d20686173206265656e2063616e63656c6c65642e204578656375746564083c5265666572656e64756d496e64657810626f6f6c047420412070726f706f73616c20686173206265656e20656e61637465642e2444656c65676174656408244163636f756e744964244163636f756e74496404e020416e206163636f756e74206861732064656c65676174656420746865697220766f746520746f20616e6f74686572206163636f756e742e2c556e64656c65676174656404244163636f756e74496404e820416e206163636f756e74206861732063616e63656c6c656420612070726576696f75732064656c65676174696f6e206f7065726174696f6e2e185665746f65640c244163636f756e74496410486173682c426c6f636b4e756d626572049820416e2065787465726e616c2070726f706f73616c20686173206265656e207665746f65642e34507265696d6167654e6f7465640c1048617368244163636f756e7449641c42616c616e636504e020412070726f706f73616c277320707265696d61676520776173206e6f7465642c20616e6420746865206465706f7369742074616b656e2e30507265696d616765557365640c1048617368244163636f756e7449641c42616c616e636504150120412070726f706f73616c20707265696d616765207761732072656d6f76656420616e6420757365642028746865206465706f736974207761732072657475726e6564292e3c507265696d616765496e76616c69640810486173683c5265666572656e64756d496e646578040d0120412070726f706f73616c20636f756c64206e6f7420626520657865637574656420626563617573652069747320707265696d6167652077617320696e76616c69642e3c507265696d6167654d697373696e670810486173683c5265666572656e64756d496e646578040d0120412070726f706f73616c20636f756c64206e6f7420626520657865637574656420626563617573652069747320707265696d61676520776173206d697373696e672e38507265696d616765526561706564101048617368244163636f756e7449641c42616c616e6365244163636f756e744964045d012041207265676973746572656420707265696d616765207761732072656d6f76656420616e6420746865206465706f73697420636f6c6c6563746564206279207468652072656170657220286c617374206974656d292e1c3c456e6163746d656e74506572696f6438543a3a426c6f636b4e756d6265721000c2010014710120546865206d696e696d756d20706572696f64206f66206c6f636b696e6720616e642074686520706572696f64206265747765656e20612070726f706f73616c206265696e6720617070726f76656420616e6420656e61637465642e0031012049742073686f756c642067656e6572616c6c792062652061206c6974746c65206d6f7265207468616e2074686520756e7374616b6520706572696f6420746f20656e737572652074686174690120766f74696e67207374616b657273206861766520616e206f70706f7274756e69747920746f2072656d6f7665207468656d73656c7665732066726f6d207468652073797374656d20696e2074686520636173652077686572659c207468657920617265206f6e20746865206c6f73696e672073696465206f66206120766f74652e304c61756e6368506572696f6438543a3a426c6f636b4e756d62657210c089010004e420486f77206f6674656e2028696e20626c6f636b73292
06e6577207075626c6963207265666572656e646120617265206c61756e636865642e30566f74696e67506572696f6438543a3a426c6f636b4e756d62657210c089010004b820486f77206f6674656e2028696e20626c6f636b732920746f20636865636b20666f72206e657720766f7465732e384d696e696d756d4465706f7369743042616c616e63654f663c543e400010a5d4e8000000000000000000000004350120546865206d696e696d756d20616d6f756e7420746f20626520757365642061732061206465706f73697420666f722061207075626c6963207265666572656e64756d2070726f706f73616c2e54456d657267656e6379566f74696e67506572696f6438543a3a426c6f636b4e756d626572100807000004ec204d696e696d756d20766f74696e6720706572696f6420616c6c6f77656420666f7220616e20656d657267656e6379207265666572656e64756d2e34436f6f6c6f6666506572696f6438543a3a426c6f636b4e756d62657210c089010004610120506572696f6420696e20626c6f636b7320776865726520616e2065787465726e616c2070726f706f73616c206d6179206e6f742062652072652d7375626d6974746564206166746572206265696e67207665746f65642e4c507265696d616765427974654465706f7369743042616c616e63654f663c543e4000e1f5050000000000000000000000000429012054686520616d6f756e74206f662062616c616e63652074686174206d757374206265206465706f7369746564207065722062797465206f6620707265696d6167652073746f7265642e582056616c75654c6f7704382056616c756520746f6f206c6f773c50726f706f73616c4d697373696e6704602050726f706f73616c20646f6573206e6f74206578697374204e6f7450726f78790430204e6f7420612070726f787920426164496e646578043820556e6b6e6f776e20696e6465783c416c726561647943616e63656c656404982043616e6e6f742063616e63656c207468652073616d652070726f706f73616c207477696365444475706c696361746550726f706f73616c04582050726f706f73616c20616c7265616479206d6164654c50726f706f73616c426c61636b6c6973746564046c2050726f706f73616c207374696c6c20626c61636b6c6973746564444e6f7453696d706c654d616a6f7269747904ac204e6578742065787465726e616c2070726f706f73616c206e6f742073696d706c65206d616a6f726974792c496e76616c696448617368043420496e76616c69642068617368284e6f50726f706f73616c0454204e6f2065787465726e616c2070726f706f73616c34416c72656164795665746f6564049c204964656e74697479206d6179206e6f74207665746f20612070726f706f73616c20747769636530416c726561647950726f7879044020416c726561647920612070726f78792857726f6e6750726f787904302057726f6e672070726f7879304e6f7444656c6567617465640438204e6f742064656c656761746564444475706c6963617465507265696d616765045c20507265696d61676520616c7265616479206e6f7465642c4e6f74496d6d696e656e740434204e6f7420696d6d696e656e74144561726c79042820546f6f206561726c7920496d6d696e656e74042420496d6d696e656e743c507265696d6167654d697373696e67044c20507265696d616765206e6f7420666f756e64445265666572656e64756d496e76616c6964048820566f746520676976656e20666f7220696e76616c6964207265666572656e64756d3c507265696d616765496e76616c6964044420496e76616c696420707265696d6167652c4e6f6e6557616974696e670454204e6f2070726f706f73616c732077616974696e671c436f756e63696c014c496e7374616e636531436f6c6c656374697665142450726f706f73616c730100305665633c543a3a486173683e040004902054686520686173686573206f6620746865206163746976652070726f706f73616c732e2850726f706f73616c4f660001011c543a3a48617368643c542061732054726169743c493e3e3a3a50726f706f73616c00040004cc2041637475616c2070726f706f73616c20666f72206120676976656e20686173682c20696620697427732063757272656e742e18566f74696e670001011c543a3a486173684c566f7465733c543a3a4163636f756e7449643e00040004b420566f746573206f6e206120676976656e2070726f706f73616c2c206966206974206973206f6e676f696e672e3450726f706f73616c436f756e7401000c753332100000000004482050726f706f73616c7320736f206661722e1c4d656d626572730100445665633c543a3a4163636f756e7449643e0400043901205468652063757272656e74206d656d6265727320
6f662074686520636f6c6c6563746976652e20546869732069732073746f72656420736f7274656420286a7573742062792076616c7565292e01102c7365745f6d656d62657273042c6e65775f6d656d62657273445665633c543a3a4163636f756e7449643e105101205365742074686520636f6c6c6563746976652773206d656d62657273686970206d616e75616c6c7920746f20606e65775f6d656d62657273602e204265206e69636520746f2074686520636861696e20616e645c2070726f76696465206974207072652d736f727465642e005820526571756972657320726f6f74206f726967696e2e1c65786563757465042070726f706f73616c78426f783c3c542061732054726169743c493e3e3a3a50726f706f73616c3e0cf420446973706174636820612070726f706f73616c2066726f6d2061206d656d626572207573696e672074686520604d656d62657260206f726967696e2e00ac204f726967696e206d7573742062652061206d656d626572206f662074686520636f6c6c6563746976652e1c70726f706f736508247468726573686f6c6450436f6d706163743c4d656d626572436f756e743e2070726f706f73616c78426f783c3c542061732054726169743c493e3e3a3a50726f706f73616c3e102c2023203c7765696768743e90202d20426f756e6465642073746f7261676520726561647320616e64207772697465732eb8202d20417267756d656e7420607468726573686f6c6460206861732062656172696e67206f6e207765696768742e302023203c2f7765696768743e10766f74650c2070726f706f73616c1c543a3a4861736814696e64657858436f6d706163743c50726f706f73616c496e6465783e1c617070726f766510626f6f6c102c2023203c7765696768743e8c202d20426f756e6465642073746f72616765207265616420616e64207772697465732e5501202d2057696c6c20626520736c696768746c792068656176696572206966207468652070726f706f73616c20697320617070726f766564202f20646973617070726f7665642061667465722074686520766f74652e302023203c2f7765696768743e01182050726f706f73656410244163636f756e7449643450726f706f73616c496e64657810486173682c4d656d626572436f756e74084d012041206d6f74696f6e2028676976656e20686173682920686173206265656e2070726f706f7365642028627920676976656e206163636f756e742920776974682061207468726573686f6c642028676976656e4020604d656d626572436f756e7460292e14566f74656414244163636f756e744964104861736810626f6f6c2c4d656d626572436f756e742c4d656d626572436f756e740809012041206d6f74696f6e2028676976656e20686173682920686173206265656e20766f746564206f6e20627920676976656e206163636f756e742c206c656176696e67190120612074616c6c79202879657320766f74657320616e64206e6f20766f74657320676976656e20726573706563746976656c7920617320604d656d626572436f756e7460292e20417070726f76656404104861736804c42041206d6f74696f6e2077617320617070726f76656420627920746865207265717569726564207468726573686f6c642e2c446973617070726f76656404104861736804d42041206d6f74696f6e20776173206e6f7420617070726f76656420627920746865207265717569726564207468726573686f6c642e20457865637574656408104861736810626f6f6c0405012041206d6f74696f6e207761732065786563757465643b2060626f6f6c6020697320747275652069662072657475726e656420776974686f7574206572726f722e384d656d626572457865637574656408104861736810626f6f6c042d0120412073696e676c65206d656d6265722064696420736f6d6520616374696f6e3b2060626f6f6c6020697320747275652069662072657475726e656420776974686f7574206572726f722e0018244e6f744d656d6265720460204163636f756e74206973206e6f742061206d656d626572444475706c696361746550726f706f73616c0480204475706c69636174652070726f706f73616c73206e6f7420616c6c6f7765643c50726f706f73616c4d697373696e6704502050726f706f73616c206d7573742065786973742857726f6e67496e6465780444204d69736d61746368656420696e646578344475706c6963617465566f7465045c204475706c696361746520766f74652069676e6f72656448416c7265616479496e697469616c697a65640484204d656d626572732061726520616c726561647920696e697469616c697a65642148546563686e6963616c436f6d6d6974746565014c496e7374616e636532436f6c6c656374697665142450726f706f73616
c730100305665633c543a3a486173683e040004902054686520686173686573206f6620746865206163746976652070726f706f73616c732e2850726f706f73616c4f660001011c543a3a48617368643c542061732054726169743c493e3e3a3a50726f706f73616c00040004cc2041637475616c2070726f706f73616c20666f72206120676976656e20686173682c20696620697427732063757272656e742e18566f74696e670001011c543a3a486173684c566f7465733c543a3a4163636f756e7449643e00040004b420566f746573206f6e206120676976656e2070726f706f73616c2c206966206974206973206f6e676f696e672e3450726f706f73616c436f756e7401000c753332100000000004482050726f706f73616c7320736f206661722e1c4d656d626572730100445665633c543a3a4163636f756e7449643e0400043901205468652063757272656e74206d656d62657273206f662074686520636f6c6c6563746976652e20546869732069732073746f72656420736f7274656420286a7573742062792076616c7565292e01102c7365745f6d656d62657273042c6e65775f6d656d62657273445665633c543a3a4163636f756e7449643e105101205365742074686520636f6c6c6563746976652773206d656d62657273686970206d616e75616c6c7920746f20606e65775f6d656d62657273602e204265206e69636520746f2074686520636861696e20616e645c2070726f76696465206974207072652d736f727465642e005820526571756972657320726f6f74206f726967696e2e1c65786563757465042070726f706f73616c78426f783c3c542061732054726169743c493e3e3a3a50726f706f73616c3e0cf420446973706174636820612070726f706f73616c2066726f6d2061206d656d626572207573696e672074686520604d656d62657260206f726967696e2e00ac204f726967696e206d7573742062652061206d656d626572206f662074686520636f6c6c6563746976652e1c70726f706f736508247468726573686f6c6450436f6d706163743c4d656d626572436f756e743e2070726f706f73616c78426f783c3c542061732054726169743c493e3e3a3a50726f706f73616c3e102c2023203c7765696768743e90202d20426f756e6465642073746f7261676520726561647320616e64207772697465732eb8202d20417267756d656e7420607468726573686f6c6460206861732062656172696e67206f6e207765696768742e302023203c2f7765696768743e10766f74650c2070726f706f73616c1c543a3a4861736814696e64657858436f6d706163743c50726f706f73616c496e6465783e1c617070726f766510626f6f6c102c2023203c7765696768743e8c202d20426f756e6465642073746f72616765207265616420616e64207772697465732e5501202d2057696c6c20626520736c696768746c792068656176696572206966207468652070726f706f73616c20697320617070726f766564202f20646973617070726f7665642061667465722074686520766f74652e302023203c2f7765696768743e01182050726f706f73656410244163636f756e7449643450726f706f73616c496e64657810486173682c4d656d626572436f756e74084d012041206d6f74696f6e2028676976656e20686173682920686173206265656e2070726f706f7365642028627920676976656e206163636f756e742920776974682061207468726573686f6c642028676976656e4020604d656d626572436f756e7460292e14566f74656414244163636f756e744964104861736810626f6f6c2c4d656d626572436f756e742c4d656d626572436f756e740809012041206d6f74696f6e2028676976656e20686173682920686173206265656e20766f746564206f6e20627920676976656e206163636f756e742c206c656176696e67190120612074616c6c79202879657320766f74657320616e64206e6f20766f74657320676976656e20726573706563746976656c7920617320604d656d626572436f756e7460292e20417070726f76656404104861736804c42041206d6f74696f6e2077617320617070726f76656420627920746865207265717569726564207468726573686f6c642e2c446973617070726f76656404104861736804d42041206d6f74696f6e20776173206e6f7420617070726f76656420627920746865207265717569726564207468726573686f6c642e20457865637574656408104861736810626f6f6c0405012041206d6f74696f6e207761732065786563757465643b2060626f6f6c6020697320747275652069662072657475726e656420776974686f7574206572726f722e384d656d626572457865637574656408104861736810626f6f6c042d0120412073696e676c65206d656d6265722064696420736f6d65206163
74696f6e3b2060626f6f6c6020697320747275652069662072657475726e656420776974686f7574206572726f722e0018244e6f744d656d6265720460204163636f756e74206973206e6f742061206d656d626572444475706c696361746550726f706f73616c0480204475706c69636174652070726f706f73616c73206e6f7420616c6c6f7765643c50726f706f73616c4d697373696e6704502050726f706f73616c206d7573742065786973742857726f6e67496e6465780444204d69736d61746368656420696e646578344475706c6963617465566f7465045c204475706c696361746520766f74652069676e6f72656448416c7265616479496e697469616c697a65640484204d656d626572732061726520616c726561647920696e697469616c697a65642144456c656374696f6e7350687261676d656e014050687261676d656e456c656374696f6e181c4d656d626572730100845665633c28543a3a4163636f756e7449642c2042616c616e63654f663c543e293e040004f0205468652063757272656e7420656c6563746564206d656d626572736869702e20536f72746564206261736564206f6e206163636f756e742069642e2452756e6e65727355700100845665633c28543a3a4163636f756e7449642c2042616c616e63654f663c543e293e0400044901205468652063757272656e742072756e6e6572735f75702e20536f72746564206261736564206f6e206c6f7720746f2068696768206d657269742028776f72736520746f20626573742072756e6e6572292e38456c656374696f6e526f756e647301000c75333210000000000441012054686520746f74616c206e756d626572206f6620766f746520726f756e6473207468617420686176652068617070656e65642c206578636c7564696e6720746865207570636f6d696e67206f6e652e1c566f7465734f6601010130543a3a4163636f756e744964445665633c543a3a4163636f756e7449643e01040004010120566f746573206f66206120706172746963756c617220766f7465722c20776974682074686520726f756e6420696e646578206f662074686520766f7465732e1c5374616b654f6601010130543a3a4163636f756e7449643042616c616e63654f663c543e0040000000000000000000000000000000000464204c6f636b6564207374616b65206f66206120766f7465722e2843616e646964617465730100445665633c543a3a4163636f756e7449643e0400086501205468652070726573656e742063616e646964617465206c6973742e20536f72746564206261736564206f6e206163636f756e742d69642e20412063757272656e74206d656d626572206f7220612072756e6e65722063616e3101206e6576657220656e746572207468697320766563746f7220616e6420697320616c7761797320696d706c696369746c7920617373756d656420746f20626520612063616e6469646174652e011810766f74650814766f746573445665633c543a3a4163636f756e7449643e1476616c756554436f6d706163743c42616c616e63654f663c543e3e3c050120566f746520666f72206120736574206f662063616e6469646174657320666f7220746865207570636f6d696e6720726f756e64206f6620656c656374696f6e2e0050205468652060766f746573602073686f756c643a482020202d206e6f7420626520656d7074792eac2020202d206265206c657373207468616e20746865206e756d626572206f662063616e646964617465732e005d012055706f6e20766f74696e672c206076616c75656020756e697473206f66206077686f6027732062616c616e6365206973206c6f636b656420616e64206120626f6e6420616d6f756e742069732072657365727665642e5d012049742069732074686520726573706f6e736962696c697479206f66207468652063616c6c657220746f206e6f7420706c61636520616c6c206f662074686569722062616c616e636520696e746f20746865206c6f636ba020616e64206b65657020736f6d6520666f722066757274686572207472616e73616374696f6e732e002c2023203c7765696768743e2c2023232323205374617465302052656164733a204f283129c8205772697465733a204f28562920676976656e2060566020766f7465732e205620697320626f756e6465642062792031362e302023203c2f7765696768743e3072656d6f76655f766f746572001c21012052656d6f766520606f726967696e60206173206120766f7465722e20546869732072656d6f76657320746865206c6f636b20616e642072657475726e732074686520626f6e642e002c2023203c7765696768743e2c2023232323205374617465302052656164733a204f28312934205772697465733a204f283129302023203c2f7765696768743e507265706f727
45f646566756e63745f766f74657204187461726765748c3c543a3a4c6f6f6b7570206173205374617469634c6f6f6b75703e3a3a536f75726365345d01205265706f727420607461726765746020666f72206265696e6720616e20646566756e637420766f7465722e20496e2063617365206f6620612076616c6964207265706f72742c20746865207265706f727465722069735d012072657761726465642062792074686520626f6e6420616d6f756e74206f662060746172676574602e204f74686572776973652c20746865207265706f7274657220697473656c662069732072656d6f76656420616e645c20746865697220626f6e6420697320736c61736865642e0088204120646566756e637420766f74657220697320646566696e656420746f2062653a4d012020202d206120766f7465722077686f73652063757272656e74207375626d697474656420766f7465732061726520616c6c20696e76616c69642e20692e652e20616c6c206f66207468656d20617265206e6fb420202020206c6f6e67657220612063616e646964617465206e6f7220616e20616374697665206d656d6265722e002c2023203c7765696768743e2c202323232320537461746515012052656164733a204f284e4c6f674d2920676976656e204d2063757272656e742063616e6469646174657320616e64204e20766f74657320666f722060746172676574602e34205772697465733a204f283129302023203c2f7765696768743e407375626d69745f63616e646964616379003478205375626d6974206f6e6573656c6620666f722063616e6469646163792e006420412063616e6469646174652077696c6c206569746865723aec2020202d204c6f73652061742074686520656e64206f6620746865207465726d20616e6420666f7266656974207468656972206465706f7369742e2d012020202d2057696e20616e64206265636f6d652061206d656d6265722e204d656d626572732077696c6c206576656e7475616c6c7920676574207468656972207374617368206261636b2e55012020202d204265636f6d6520612072756e6e65722d75702e2052756e6e6572732d75707320617265207265736572766564206d656d6265727320696e2063617365206f6e65206765747320666f72636566756c6c7934202020202072656d6f7665642e002c2023203c7765696768743e2c20232323232053746174658c2052656164733a204f284c6f674e2920476976656e204e2063616e646964617465732e34205772697465733a204f283129302023203c2f7765696768743e4872656e6f756e63655f63616e646964616379002451012052656e6f756e6365206f6e65277320696e74656e74696f6e20746f20626520612063616e64696461746520666f7220746865206e65787420656c656374696f6e20726f756e642e203320706f74656e7469616c40206f7574636f6d65732065786973743a4101202d20606f726967696e6020697320612063616e64696461746520616e64206e6f7420656c656374656420696e20616e79207365742e20496e207468697320636173652c2074686520626f6e64206973f4202020756e72657365727665642c2072657475726e656420616e64206f726967696e2069732072656d6f76656420617320612063616e6469646174652e5901202d20606f726967696e6020697320612063757272656e742072756e6e65722075702e20496e207468697320636173652c2074686520626f6e6420697320756e72657365727665642c2072657475726e656420616e64842020206f726967696e2069732072656d6f76656420617320612072756e6e65722e4d01202d20606f726967696e6020697320612063757272656e74206d656d6265722e20496e207468697320636173652c2074686520626f6e6420697320756e726573657276656420616e64206f726967696e206973590120202072656d6f7665642061732061206d656d6265722c20636f6e73657175656e746c79206e6f74206265696e6720612063616e64696461746520666f7220746865206e65787420726f756e6420616e796d6f72652e650120202053696d696c617220746f205b6072656d6f76655f766f746572605d2c206966207265706c6163656d656e742072756e6e657273206578697374732c20746865792061726520696d6d6564696174656c7920757365642e3472656d6f76655f6d656d626572040c77686f8c3c543a3a4c6f6f6b7570206173205374617469634c6f6f6b75703e3a3a536f75726365345d012052656d6f7665206120706172746963756c6172206d656d6265722066726f6d20746865207365742e20546869732069732065666665637469766520696d6d6564696174656c7920616e642074686520626f6e64206f668020746865206f7574676f696e67206d656d626572206973
20736c61736865642e00590120496620612072756e6e65722d757020697320617661696c61626c652c207468656e2074686520626573742072756e6e65722d75702077696c6c2062652072656d6f76656420616e64207265706c6163657320746865f4206f7574676f696e67206d656d6265722e204f74686572776973652c2061206e65772070687261676d656e20726f756e6420697320737461727465642e004501204e6f74652074686174207468697320646f6573206e6f7420616666656374207468652064657369676e6174656420626c6f636b206e756d626572206f6620746865206e65787420656c656374696f6e2e002c2023203c7765696768743e2c2023232323205374617465582052656164733a204f28646f5f70687261676d656e295c205772697465733a204f28646f5f70687261676d656e29302023203c2f7765696768743e01141c4e65775465726d04645665633c284163636f756e7449642c2042616c616e6365293e0855012041206e6577207465726d2077697468206e6577206d656d626572732e205468697320696e64696361746573207468617420656e6f7567682063616e6469646174657320657869737465642c206e6f742074686174450120656e6f756768206861766520686173206265656e20656c65637465642e2054686520696e6e65722076616c7565206d757374206265206578616d696e656420666f72207468697320707572706f73652e24456d7074795465726d0004d8204e6f20286f72206e6f7420656e6f756768292063616e64696461746573206578697374656420666f72207468697320726f756e642e304d656d6265724b69636b656404244163636f756e7449640845012041206d656d62657220686173206265656e2072656d6f7665642e20546869732073686f756c6420616c7761797320626520666f6c6c6f7765642062792065697468657220604e65775465726d60206f74342060456d7074795465726d602e3c4d656d62657252656e6f756e63656404244163636f756e74496404a02041206d656d626572206861732072656e6f756e6365642074686569722063616e6469646163792e34566f7465725265706f727465640c244163636f756e744964244163636f756e74496410626f6f6c086101204120766f7465722028666972737420656c656d656e742920776173207265706f72746564202862797420746865207365636f6e6420656c656d656e742920776974682074686520746865207265706f7274206265696e678c207375636365737366756c206f72206e6f742028746869726420656c656d656e74292e143443616e646964616379426f6e643042616c616e63654f663c543e400010a5d4e800000000000000000000000028566f74696e67426f6e643042616c616e63654f663c543e4000743ba40b00000000000000000000000038446573697265644d656d626572730c753332100d00000000404465736972656452756e6e65727355700c753332100700000000305465726d4475726174696f6e38543a3a426c6f636b4e756d6265721040380000003830556e61626c65546f566f746504c42043616e6e6f7420766f7465207768656e206e6f2063616e64696461746573206f72206d656d626572732065786973742e1c4e6f566f7465730498204d75737420766f746520666f72206174206c65617374206f6e652063616e6469646174652e30546f6f4d616e79566f74657304882043616e6e6f7420766f7465206d6f7265207468616e2063616e646964617465732e504d6178696d756d566f7465734578636565646564049c2043616e6e6f7420766f7465206d6f7265207468616e206d6178696d756d20616c6c6f7765642e284c6f7742616c616e636504c82043616e6e6f7420766f74652077697468207374616b65206c657373207468616e206d696e696d756d2062616c616e63652e3c556e61626c65546f506179426f6e64047c20566f7465722063616e206e6f742070617920766f74696e6720626f6e642e2c4d7573744265566f7465720444204d757374206265206120766f7465722e285265706f727453656c6604502043616e6e6f74207265706f72742073656c662e4c4475706c69636174656443616e6469646174650484204475706c6963617465642063616e646964617465207375626d697373696f6e2e304d656d6265725375626d6974048c204d656d6265722063616e6e6f742072652d7375626d69742063616e6469646163792e3052756e6e65725375626d6974048c2052756e6e65722063616e6e6f742072652d7375626d69742063616e6469646163792e68496e73756666696369656e7443616e64696461746546756e647304982043616e64696461746520646f6573206e6f74206861766520656e6f7567682066756e64732e34496e76616c69644f726967696e04c8204f726
967696e206973206e6f7420612063616e6469646174652c206d656d626572206f7220612072756e6e65722075702e244e6f744d656d6265720438204e6f742061206d656d6265722e4c546563686e6963616c4d656d62657273686970014c496e7374616e6365314d656d62657273686970041c4d656d626572730100445665633c543a3a4163636f756e7449643e040004c8205468652063757272656e74206d656d626572736869702c2073746f72656420617320616e206f726465726564205665632e0114286164645f6d656d626572040c77686f30543a3a4163636f756e7449640c7c204164642061206d656d626572206077686f6020746f20746865207365742e00b4204d6179206f6e6c792062652063616c6c65642066726f6d20604164644f726967696e60206f7220726f6f742e3472656d6f76655f6d656d626572040c77686f30543a3a4163636f756e7449640c902052656d6f76652061206d656d626572206077686f602066726f6d20746865207365742e00c0204d6179206f6e6c792062652063616c6c65642066726f6d206052656d6f76654f726967696e60206f7220726f6f742e2c737761705f6d656d626572081872656d6f766530543a3a4163636f756e7449640c61646430543a3a4163636f756e7449640cc02053776170206f7574206f6e65206d656d626572206072656d6f76656020666f7220616e6f746865722060616464602e00b8204d6179206f6e6c792062652063616c6c65642066726f6d2060537761704f726967696e60206f7220726f6f742e3472657365745f6d656d62657273041c6d656d62657273445665633c543a3a4163636f756e7449643e105901204368616e676520746865206d656d6265727368697020746f2061206e6577207365742c20646973726567617264696e6720746865206578697374696e67206d656d626572736869702e204265206e69636520616e646c207061737320606d656d6265727360207072652d736f727465642e00bc204d6179206f6e6c792062652063616c6c65642066726f6d206052657365744f726967696e60206f7220726f6f742e286368616e67655f6b6579040c6e657730543a3a4163636f756e7449640cd82053776170206f7574207468652073656e64696e67206d656d62657220666f7220736f6d65206f74686572206b657920606e6577602e00f4204d6179206f6e6c792062652063616c6c65642066726f6d20605369676e656460206f726967696e206f6620612063757272656e74206d656d6265722e01182c4d656d62657241646465640004e42054686520676976656e206d656d626572207761732061646465643b2073656520746865207472616e73616374696f6e20666f722077686f2e344d656d62657252656d6f7665640004ec2054686520676976656e206d656d626572207761732072656d6f7665643b2073656520746865207472616e73616374696f6e20666f722077686f2e384d656d62657273537761707065640004dc2054776f206d656d62657273207765726520737761707065643b2073656520746865207472616e73616374696f6e20666f722077686f2e304d656d6265727352657365740004190120546865206d656d62657273686970207761732072657365743b2073656520746865207472616e73616374696f6e20666f722077686f20746865206e6577207365742069732e284b65794368616e676564000488204f6e65206f6620746865206d656d6265727327206b657973206368616e6765642e1444756d6d7904bc73705f7374643a3a6d61726b65723a3a5068616e746f6d446174613c284163636f756e7449642c204576656e74293e0470205068616e746f6d206d656d6265722c206e6576657220757365642e000020547265617375727901205472656173757279143450726f706f73616c436f756e7401003450726f706f73616c496e646578100000000004a4204e756d626572206f662070726f706f73616c7320746861742068617665206265656e206d6164652e2450726f706f73616c730001013450726f706f73616c496e6465789050726f706f73616c3c543a3a4163636f756e7449642c2042616c616e63654f663c543e3e000400047c2050726f706f73616c7320746861742068617665206265656e206d6164652e24417070726f76616c730100485665633c50726f706f73616c496e6465783e040004f82050726f706f73616c20696e646963657320746861742068617665206265656e20617070726f76656420627574206e6f742079657420617761726465642e10546970730001051c543a3a48617368f04f70656e5469703c543a3a4163636f756e7449642c2042616c616e63654f663c543e2c20543a3a426c6f636b4e756d6265722c20543a3a486173683e0004000c59012054697073207468617420617265206e6f74207965742063
6f6d706c657465642e204b65796564206279207468652068617368206f66206028726561736f6e2c2077686f29602066726f6d207468652076616c75652e3d012054686973206861732074686520696e73656375726520656e756d657261626c6520686173682066756e6374696f6e2073696e636520746865206b657920697473656c6620697320616c7265616479802067756172616e7465656420746f20626520612073656375726520686173682e1c526561736f6e730001051c543a3a486173681c5665633c75383e0004000849012053696d706c6520707265696d616765206c6f6f6b75702066726f6d2074686520726561736f6e2773206861736820746f20746865206f726967696e616c20646174612e20416761696e2c2068617320616e610120696e73656375726520656e756d657261626c6520686173682073696e636520746865206b65792069732067756172616e7465656420746f2062652074686520726573756c74206f6620612073656375726520686173682e01203470726f706f73655f7370656e64081476616c756554436f6d706163743c42616c616e63654f663c543e3e2c62656e65666963696172798c3c543a3a4c6f6f6b7570206173205374617469634c6f6f6b75703e3a3a536f75726365242d012050757420666f727761726420612073756767657374696f6e20666f72207370656e64696e672e2041206465706f7369742070726f706f7274696f6e616c20746f207468652076616c7565350120697320726573657276656420616e6420736c6173686564206966207468652070726f706f73616c2069732072656a65637465642e2049742069732072657475726e6564206f6e636520746865542070726f706f73616c20697320617761726465642e002c2023203c7765696768743e20202d204f2831292e64202d204c696d697465642073746f726167652072656164732e94202d204f6e65204442206368616e67652c206f6e6520657874726120444220656e7472792e302023203c2f7765696768743e3c72656a6563745f70726f706f73616c042c70726f706f73616c5f696458436f6d706163743c50726f706f73616c496e6465783e1cfc2052656a65637420612070726f706f736564207370656e642e20546865206f726967696e616c206465706f7369742077696c6c20626520736c61736865642e002c2023203c7765696768743e20202d204f2831292e64202d204c696d697465642073746f726167652072656164732e40202d204f6e6520444220636c6561722e302023203c2f7765696768743e40617070726f76655f70726f706f73616c042c70726f706f73616c5f696458436f6d706163743c50726f706f73616c496e6465783e205d0120417070726f766520612070726f706f73616c2e2041742061206c617465722074696d652c207468652070726f706f73616c2077696c6c20626520616c6c6f636174656420746f207468652062656e6566696369617279ac20616e6420746865206f726967696e616c206465706f7369742077696c6c2062652072657475726e65642e002c2023203c7765696768743e20202d204f2831292e64202d204c696d697465642073746f726167652072656164732e44202d204f6e65204442206368616e67652e302023203c2f7765696768743e387265706f72745f617765736f6d650818726561736f6e1c5665633c75383e0c77686f30543a3a4163636f756e7449644c5d01205265706f727420736f6d657468696e672060726561736f6e60207468617420646573657276657320612074697020616e6420636c61696d20616e79206576656e7475616c207468652066696e6465722773206665652e00d020546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f2e005501205061796d656e743a20605469705265706f72744465706f73697442617365602077696c6c2062652072657365727665642066726f6d20746865206f726967696e206163636f756e742c2061732077656c6c206173d420605469705265706f72744465706f736974506572427974656020666f722065616368206279746520696e2060726561736f6e602e006101202d2060726561736f6e603a2054686520726561736f6e20666f722c206f7220746865207468696e6720746861742064657365727665732c20746865207469703b2067656e6572616c6c7920746869732077696c6c2062655c20202061205554462d382d656e636f6465642055524c2eec202d206077686f603a20546865206163636f756e742077686963682073686f756c6420626520637265646974656420666f7220746865207469702e007820456d69747320604e657754697060206966207375636365737366756c2e002c2023203c7765696768743e9c202d20604f285229602
0776865726520605260206c656e677468206f662060726561736f6e602e64202d204f6e652062616c616e6365206f7065726174696f6e2e9c202d204f6e652073746f72616765206d75746174696f6e2028636f64656320604f28522960292e34202d204f6e65206576656e742e302023203c2f7765696768743e2c726574726163745f7469700410686173681c543a3a486173684c550120526574726163742061207072696f72207469702d7265706f72742066726f6d20607265706f72745f617765736f6d65602c20616e642063616e63656c207468652070726f63657373206f662074697070696e672e00e0204966207375636365737366756c2c20746865206f726967696e616c206465706f7369742077696c6c20626520756e72657365727665642e00510120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f20616e642074686520746970206964656e746966696564206279206068617368604501206d7573742068617665206265656e207265706f7274656420627920746865207369676e696e67206163636f756e74207468726f75676820607265706f72745f617765736f6d65602028616e64206e6f7450207468726f75676820607469705f6e657760292e006501202d206068617368603a20546865206964656e74697479206f6620746865206f70656e2074697020666f722077686963682061207469702076616c7565206973206465636c617265642e205468697320697320666f726d656461012020206173207468652068617368206f6620746865207475706c65206f6620746865206f726967696e616c207469702060726561736f6e6020616e64207468652062656e6566696369617279206163636f756e742049442e009020456d697473206054697052657472616374656460206966207375636365737366756c2e002c2023203c7765696768743e24202d20604f2854296064202d204f6e652062616c616e6365206f7065726174696f6e2ec4202d2054776f2073746f726167652072656d6f76616c7320286f6e6520726561642c20636f64656320604f28542960292e34202d204f6e65206576656e742e302023203c2f7765696768743e1c7469705f6e65770c18726561736f6e1c5665633c75383e0c77686f30543a3a4163636f756e744964247469705f76616c75653042616c616e63654f663c543e4cf4204769766520612074697020666f7220736f6d657468696e67206e65773b206e6f2066696e6465722773206665652077696c6c2062652074616b656e2e00550120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f20616e6420746865207369676e696e67206163636f756e74206d757374206265206174206d656d626572206f662074686520605469707065727360207365742e006101202d2060726561736f6e603a2054686520726561736f6e20666f722c206f7220746865207468696e6720746861742064657365727665732c20746865207469703b2067656e6572616c6c7920746869732077696c6c2062655c20202061205554462d382d656e636f6465642055524c2eec202d206077686f603a20546865206163636f756e742077686963682073686f756c6420626520637265646974656420666f7220746865207469702e5101202d20607469705f76616c7565603a2054686520616d6f756e74206f66207469702074686174207468652073656e64657220776f756c64206c696b6520746f20676976652e20546865206d656469616e20746970d820202076616c7565206f662061637469766520746970706572732077696c6c20626520676976656e20746f20746865206077686f602e007820456d69747320604e657754697060206966207375636365737366756c2e002c2023203c7765696768743e4101202d20604f2852202b2054296020776865726520605260206c656e677468206f662060726561736f6e602c2060546020697320746865206e756d626572206f6620746970706572732e2060546020697345012020206e61747572616c6c79206361707065642061732061206d656d62657273686970207365742c20605260206973206c696d69746564207468726f756768207472616e73616374696f6e2d73697a652e0d01202d2054776f2073746f7261676520696e73657274696f6e732028636f6465637320604f285229602c20604f28542960292c206f6e65207265616420604f283129602e34202d204f6e65206576656e742e302023203c2f7765696768743e0c7469700810686173681c543a3a48617368247469705f76616c75653042616c616e63654f663c543e4cb4204465636c6172652061207469702076616c756520666f7220616e20616c72656164792d
6f70656e207469702e00550120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f20616e6420746865207369676e696e67206163636f756e74206d757374206265206174206d656d626572206f662074686520605469707065727360207365742e006501202d206068617368603a20546865206964656e74697479206f6620746865206f70656e2074697020666f722077686963682061207469702076616c7565206973206465636c617265642e205468697320697320666f726d656461012020206173207468652068617368206f6620746865207475706c65206f66207468652068617368206f6620746865206f726967696e616c207469702060726561736f6e6020616e64207468652062656e6566696369617279382020206163636f756e742049442e5101202d20607469705f76616c7565603a2054686520616d6f756e74206f66207469702074686174207468652073656e64657220776f756c64206c696b6520746f20676976652e20546865206d656469616e20746970d820202076616c7565206f662061637469766520746970706572732077696c6c20626520676976656e20746f20746865206077686f602e00650120456d6974732060546970436c6f73696e676020696620746865207468726573686f6c64206f66207469707065727320686173206265656e207265616368656420616e642074686520636f756e74646f776e20706572696f64342068617320737461727465642e002c2023203c7765696768743e24202d20604f285429600101202d204f6e652073746f72616765206d75746174696f6e2028636f64656320604f28542960292c206f6e652073746f72616765207265616420604f283129602e4c202d20557020746f206f6e65206576656e742e302023203c2f7765696768743e24636c6f73655f7469700410686173681c543a3a48617368386020436c6f736520616e64207061796f75742061207469702e00d020546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f2e0019012054686520746970206964656e74696669656420627920606861736860206d75737420686176652066696e69736865642069747320636f756e74646f776e20706572696f642e006501202d206068617368603a20546865206964656e74697479206f6620746865206f70656e2074697020666f722077686963682061207469702076616c7565206973206465636c617265642e205468697320697320666f726d656461012020206173207468652068617368206f6620746865207475706c65206f6620746865206f726967696e616c207469702060726561736f6e6020616e64207468652062656e6566696369617279206163636f756e742049442e002c2023203c7765696768743e24202d20604f28542960e4202d204f6e652073746f726167652072657472696576616c2028636f64656320604f285429602920616e642074776f2072656d6f76616c732e88202d20557020746f2074687265652062616c616e6365206f7065726174696f6e732e302023203c2f7765696768743e012c2050726f706f736564043450726f706f73616c496e6465780438204e65772070726f706f73616c2e205370656e64696e67041c42616c616e636504e8205765206861766520656e6465642061207370656e6420706572696f6420616e642077696c6c206e6f7720616c6c6f636174652066756e64732e1c417761726465640c3450726f706f73616c496e6465781c42616c616e6365244163636f756e744964048020536f6d652066756e64732068617665206265656e20616c6c6f63617465642e2052656a6563746564083450726f706f73616c496e6465781c42616c616e636504b420412070726f706f73616c207761732072656a65637465643b2066756e6473207765726520736c61736865642e144275726e74041c42616c616e6365048c20536f6d65206f66206f75722066756e64732068617665206265656e206275726e742e20526f6c6c6f766572041c42616c616e6365043101205370656e64696e67206861732066696e69736865643b20746869732069732074686520616d6f756e74207468617420726f6c6c73206f76657220756e74696c206e657874207370656e642e1c4465706f736974041c42616c616e6365048020536f6d652066756e64732068617665206265656e206465706f73697465642e184e657754697004104861736804982041206e6577207469702073756767657374696f6e20686173206265656e206f70656e65642e28546970436c6f73696e6704104861736804dc2041207469702073756767657374696f6e206861732072656163686564207468726573686f6c6420616e6420697320636c6f7
3696e672e24546970436c6f7365640c1048617368244163636f756e7449641c42616c616e636504882041207469702073756767657374696f6e20686173206265656e20636c6f7365642e3054697052657472616374656404104861736804942041207469702073756767657374696f6e20686173206265656e207265747261637465642e203050726f706f73616c426f6e641c5065726d696c6c1050c30000085501204672616374696f6e206f6620612070726f706f73616c27732076616c756520746861742073686f756c6420626520626f6e64656420696e206f7264657220746f20706c616365207468652070726f706f73616c2e110120416e2061636365707465642070726f706f73616c2067657473207468657365206261636b2e20412072656a65637465642070726f706f73616c20646f6573206e6f742e4c50726f706f73616c426f6e644d696e696d756d3042616c616e63654f663c543e400040e59c301200000000000000000000044901204d696e696d756d20616d6f756e74206f662066756e647320746861742073686f756c6420626520706c6163656420696e2061206465706f73697420666f72206d616b696e6720612070726f706f73616c2e2c5370656e64506572696f6438543a3a426c6f636b4e756d6265721080510100048820506572696f64206265747765656e2073756363657373697665207370656e64732e104275726e1c5065726d696c6c10000000000411012050657263656e74616765206f662073706172652066756e64732028696620616e7929207468617420617265206275726e7420706572207370656e6420706572696f642e30546970436f756e74646f776e38543a3a426c6f636b4e756d62657210403800000445012054686520706572696f6420666f722077686963682061207469702072656d61696e73206f70656e20616674657220697320686173206163686965766564207468726573686f6c6420746970706572732e3454697046696e646572734665651c50657263656e7404140431012054686520616d6f756e74206f66207468652066696e616c2074697020776869636820676f657320746f20746865206f726967696e616c207265706f72746572206f6620746865207469702e505469705265706f72744465706f736974426173653042616c616e63654f663c543e400010a5d4e8000000000000000000000004d42054686520616d6f756e742068656c64206f6e206465706f73697420666f7220706c6163696e67206120746970207265706f72742e5c5469705265706f72744465706f736974506572427974653042616c616e63654f663c543e4000e40b540200000000000000000000000409012054686520616d6f756e742068656c64206f6e206465706f7369742070657220627974652077697468696e2074686520746970207265706f727420726561736f6e2e2070496e73756666696369656e7450726f706f7365727342616c616e6365047c2050726f706f73657227732062616c616e636520697320746f6f206c6f772e50496e76616c696450726f706f73616c496e646578046c204e6f2070726f706f73616c206174207468617420696e6465782e30526561736f6e546f6f42696704882054686520726561736f6e20676976656e206973206a75737420746f6f206269672e30416c72656164794b6e6f776e048c20546865207469702077617320616c726561647920666f756e642f737461727465642e28556e6b6e6f776e54697004642054686520746970206861736820697320756e6b6e6f776e2e244e6f7446696e64657204210120546865206163636f756e7420617474656d7074696e6720746f20726574726163742074686520746970206973206e6f74207468652066696e646572206f6620746865207469702e245374696c6c4f70656e042d0120546865207469702063616e6e6f7420626520636c61696d65642f636c6f736564206265636175736520746865726520617265206e6f7420656e6f7567682074697070657273207965742e245072656d617475726504350120546865207469702063616e6e6f7420626520636c61696d65642f636c6f73656420626563617573652069742773207374696c6c20696e2074686520636f756e74646f776e20706572696f642e18436c61696d730118436c61696d730c18436c61696d730001013c457468657265756d416464726573733042616c616e63654f663c543e0004000014546f74616c01003042616c616e63654f663c543e4000000000000000000000000000000000001c56657374696e670001013c457468657265756d41646472657373b02842616c616e63654f663c543e2c2042616c616e63654f663c543e2c20543a3a426c6f636b4e756d6265722900040010782056657374696e67207363686564756c6520666f72206120636c61696d2e0d
012046697273742062616c616e63652069732074686520746f74616c20616d6f756e7420746861742073686f756c642062652068656c6420666f722076657374696e672ee4205365636f6e642062616c616e636520697320686f77206d7563682073686f756c6420626520756e6c6f636b65642070657220626c6f636b2ecc2054686520626c6f636b206e756d626572206973207768656e207468652076657374696e672073686f756c642073746172742e010814636c61696d08106465737430543a3a4163636f756e74496448657468657265756d5f7369676e61747572653845636473615369676e61747572650438204d616b65206120636c61696d2e286d696e745f636c61696d0c0c77686f3c457468657265756d416464726573731476616c75653042616c616e63654f663c543e4076657374696e675f7363686564756c65d04f7074696f6e3c2842616c616e63654f663c543e2c2042616c616e63654f663c543e2c20543a3a426c6f636b4e756d626572293e0488204164642061206e657720636c61696d2c20696620796f752061726520726f6f742e01041c436c61696d65640c244163636f756e7449643c457468657265756d416464726573731c42616c616e6365046c20536f6d656f6e6520636c61696d656420736f6d6520444f54732e041850726566697814265b75385d807c506179204b534d7320746f20746865204b7573616d61206163636f756e743a04150120546865205072656669782074686174206973207573656420696e207369676e656420457468657265756d206d6573736167657320666f722074686973206e6574776f726b002850617261636861696e73012850617261636861696e73242c417574686f7269746965730100405665633c56616c696461746f7249643e0400049420416c6c20617574686f72697469657327206b65797320617420746865206d6f6d656e742e10436f6465000101185061726149641c5665633c75383e0004000498205468652070617261636861696e7320726567697374657265642061742070726573656e742e144865616473000101185061726149641c5665633c75383e00040004cc20546865206865616473206f66207468652070617261636861696e7320726567697374657265642061742070726573656e742e2857617465726d61726b730001011850617261496438543a3a426c6f636b4e756d6265720004000cfc205468652077617465726d61726b2068656967687473206f66207468652070617261636861696e7320726567697374657265642061742070726573656e742e410120466f722065766572792070617261636861696e2c20746869732069732074686520626c6f636b206865696768742066726f6d20776869636820616c6c206d6573736167657320746172676574696e675d0120746861742070617261636861696e2068617665206265656e2070726f6365737365642e2043616e20626520604e6f6e6560206f6e6c79206966207468652070617261636861696e20646f65736e27742065786973742e3c556e726f75746564496e67726573730001016028543a3a426c6f636b4e756d6265722c20506172614964294c5665633c285061726149642c2048617368293e00040010550120556e726f7574656420696e67726573732e204d6170732028426c6f636b4e756d6265722c20746f5f636861696e2920706169727320746f205b2866726f6d5f636861696e2c206567726573735f726f6f74295d2e004d01205468657265206d617920626520616e20656e74727920756e6465722028692c20702920696e2074686973206d617020666f722065766572792069206265747765656e207468652070617261636861696e2773842077617465726d61726b20616e64207468652063757272656e7420626c6f636b2e4852656c61794469737061746368517565756501010118506172614964485665633c5570776172644d6573736167653e000400081d01204d6573736167657320726561647920746f2062652064697370617463686564206f6e746f207468652072656c617920636861696e2e204974206973207375626a65637420746fc820604d41585f4d4553534147455f434f554e546020616e64206057415445524d41524b5f4d4553534147455f53495a45602e5852656c61794469737061746368517565756553697a650101011850617261496428287533322c2075333229002000000000000000000c45012053697a65206f6620746865206469737061746368207175657565732e205365706172617465642066726f6d2061637475616c206461746120696e206f7264657220746f2061766f696420636f73746c795901206465636f64696e67207768656e20636865636b696e6720726563656970742076616c69646974792e204669727374206974656d20696e207475706c6520697
32074686520636f756e74206f66206d65737361676573fc097365636f6e642069662074686520746f74616c206c656e6774682028696e20627974657329206f6620746865206d657373616765207061796c6f6164732e344e65656473446973706174636801002c5665633c5061726149643e040004110120546865206f726465726564206c697374206f662050617261496473207468617420686176652061206052656c6179446973706174636851756575656020656e7472792e2444696455706461746500002c5665633c5061726149643e040010650120536f6d65206966207468652070617261636861696e20686561647320676574207570646174656420696e207468697320626c6f636b2c20616c6f6e672077697468207468652070617261636861696e204944732074686174350120646964207570646174652e204f72646572656420696e207468652073616d652077617920617320607265676973747261723a3a416374697665602028692e652e20627920506172614964292e0064204e6f6e65206966206e6f742079657420757064617465642e0104247365745f686561647304146865616473585665633c417474657374656443616e6469646174653e0415012050726f766964652063616e64696461746520726563656970747320666f722070617261636861696e732c20696e20617363656e64696e67206f726465722062792069642e000000304174746573746174696f6e7301304174746573746174696f6e730c40526563656e7450617261426c6f636b7300010138543a3a426c6f636b4e756d62657244496e636c75646564426c6f636b733c543e00040008f02041206d617070696e672066726f6d206d6f64756c617220626c6f636b206e756d62657220286e2025204174746573746174696f6e506572696f6429cc20746f2073657373696f6e20696e64657820616e6420746865206c697374206f662063616e646964617465206861736865732e5450617261426c6f636b4174746573746174696f6e7300020138543a3a426c6f636b4e756d626572104861736850426c6f636b4174746573746174696f6e733c543e00040004a8204174746573746174696f6e73206f6e206120726563656e742070617261636861696e20626c6f636b2e24446964557064617465010010626f6f6c0400000104446d6f72655f6174746573746174696f6e7304145f6d6f7265404d6f72654174746573746174696f6e730415012050726f766964652063616e64696461746520726563656970747320666f722070617261636861696e732c20696e20617363656e64696e67206f726465722062792069642e00000014536c6f74730114536c6f7473243841756374696f6e436f756e74657201003041756374696f6e496e646578100000000004d820546865206e756d626572206f662061756374696f6e7320746861742068617665206265656e207374617274656420736f206661722e284d616e6167656449647301002c5665633c5061726149643e0400084d01204f726465726564206c697374206f6620616c6c2060506172614964602076616c756573207468617420617265206d616e616765642062792074686973206d6f64756c652e205468697320696e636c75646573290120636861696e73207468617420617265206e6f7420796574206465706c6f7965642028627574206861766520776f6e20616e2061756374696f6e20696e2074686520667574757265292e204465706f7369747301010118506172614964445665633c42616c616e63654f663c543e3e000400345d0120566172696f757320616d6f756e7473206f6e206465706f73697420666f7220656163682070617261636861696e2e20416e20656e74727920696e20604d616e616765644964736020696d706c6965732061206e6f6e2d502064656661756c7420656e74727920686572652e006501205468652061637475616c20616d6f756e74206c6f636b6564206f6e2069747320626568616c6620617420616e792074696d6520697320746865206d6178696d756d206974656d20696e2074686973206c6973742e205468655101206669727374206974656d20696e20746865206c6973742069732074686520616d6f756e74206c6f636b656420666f72207468652063757272656e74204c6561736520506572696f642e20466f6c6c6f77696e67b0206974656d732061726520666f72207468652073756273657175656e74206c6561736520706572696f64732e006101205468652064656661756c742076616c75652028616e20656d707479206c6973742920696d706c6965732074686174207468652070617261636861696e206e6f206c6f6e6765722065786973747320286f72206e65766572b4206578697374656429206173206661722061732074686973206d6f64756c6520697320636f6e63
65726e65642e00510120496620612070617261636861696e20646f65736e2774206578697374202a7965742a20627574206973207363686564756c656420746f20657869737420696e20746865206675747572652c207468656e2069745d012077696c6c206265206c6566742d7061646465642077697468206f6e65206f72206d6f7265207a65726f657320746f2064656e6f74652074686520666163742074686174206e6f7468696e672069732068656c64206f6e5d01206465706f73697420666f7220746865206e6f6e2d6578697374656e7420636861696e2063757272656e746c792c206275742069732068656c6420617420736f6d6520706f696e7420696e20746865206675747572652e2c41756374696f6e496e666f000088284c65617365506572696f644f663c543e2c20543a3a426c6f636b4e756d62657229040014f820496e666f726d6174696f6e2072656c6174696e6720746f207468652063757272656e742061756374696f6e2c206966207468657265206973206f6e652e00450120546865206669727374206974656d20696e20746865207475706c6520697320746865206c6561736520706572696f6420696e646578207468617420746865206669727374206f662074686520666f7572510120636f6e746967756f7573206c6561736520706572696f6473206f6e2061756374696f6e20697320666f722e20546865207365636f6e642069732074686520626c6f636b206e756d626572207768656e207468655d012061756374696f6e2077696c6c2022626567696e20746f20656e64222c20692e652e2074686520666972737420626c6f636b206f662074686520456e64696e6720506572696f64206f66207468652061756374696f6e2e1c57696e6e696e6700010138543a3a426c6f636b4e756d6265723857696e6e696e67446174613c543e0004000c5d01205468652077696e6e696e67206269647320666f722065616368206f66207468652031302072616e676573206174206561636820626c6f636b20696e207468652066696e616c20456e64696e6720506572696f64206f665101207468652063757272656e742061756374696f6e2e20546865206d61702773206b65792069732074686520302d626173656420696e64657820696e746f2074686520456e64696e6720506572696f642e205468651d0120666972737420626c6f636b206f662074686520656e64696e6720706572696f6420697320303b20746865206c6173742069732060456e64696e67506572696f64202d2031602e3c5265736572766564416d6f756e7473000101504269646465723c543a3a4163636f756e7449643e3042616c616e63654f663c543e00040008310120416d6f756e74732063757272656e746c7920726573657276656420696e20746865206163636f756e7473206f662074686520626964646572732063757272656e746c792077696e6e696e673820287375622d2972616e6765732e304f6e626f6172645175657565010101404c65617365506572696f644f663c543e2c5665633c5061726149643e0004000865012054686520736574206f662050617261204944732074686174206861766520776f6e20616e64206e65656420746f206265206f6e2d626f617264656420617420616e207570636f6d696e67206c656173652d706572696f642ef0205468697320697320636c6561726564206f7574206f6e2074686520666972737420626c6f636b206f6620746865206c6561736520706572696f642e284f6e626f617264696e6700010118506172614964f0284c65617365506572696f644f663c543e2c20496e636f6d696e6750617261636861696e3c543a3a4163636f756e7449642c20543a3a486173683e29000400104d01205468652061637475616c206f6e2d626f617264696e6720696e666f726d6174696f6e2e204f6e6c7920657869737473207768656e206f6e65206f662074686520666f6c6c6f77696e6720697320747275653a2501202d204974206973206265666f726520746865206c6561736520706572696f642074686174207468652070617261636861696e2073686f756c64206265206f6e2d626f61726465642e5901202d205468652066756c6c206f6e2d626f617264696e6720696e666f726d6174696f6e20686173206e6f7420796574206265656e2070726f766964656420616e64207468652070617261636861696e206973206e6f746c207965742064756520746f206265206f66662d626f61726465642e2c4f6666626f617264696e670101011850617261496430543a3a4163636f756e74496400800000000000000000000000000000000000000000000000000000000000000000086501204f66662d626f617264696e67206163636f756e743b2063757272656e63792068656c64206f6e206465706f73697420666f722074686
52070617261636861696e206765747320706c6163656420686572652069662074686539012070617261636861696e2067657473206f66662d626f61726465643b20692e652e20697473206c6561736520706572696f6420697320757020616e642069742069736e27742072656e657765642e01182c6e65775f61756374696f6e08206475726174696f6e5c436f6d706163743c543a3a426c6f636b4e756d6265723e486c656173655f706572696f645f696e64657864436f6d706163743c4c65617365506572696f644f663c543e3e1458204372656174652061206e65772061756374696f6e2e00550120546869732063616e206f6e6c792068617070656e207768656e2074686572652069736e277420616c726561647920616e2061756374696f6e20696e2070726f677265737320616e64206d6179206f6e6c7920626529012063616c6c65642062792074686520726f6f74206f726967696e2e20416363657074732074686520606475726174696f6e60206f6620746869732061756374696f6e20616e64207468655d0120606c656173655f706572696f645f696e64657860206f662074686520696e697469616c206c6561736520706572696f64206f662074686520666f757220746861742061726520746f2062652061756374696f6e65642e0c626964140c73756238436f6d706163743c53756249643e3461756374696f6e5f696e64657854436f6d706163743c41756374696f6e496e6465783e2866697273745f736c6f7464436f6d706163743c4c65617365506572696f644f663c543e3e246c6173745f736c6f7464436f6d706163743c4c65617365506572696f644f663c543e3e18616d6f756e7454436f6d706163743c42616c616e63654f663c543e3e404d01204d616b652061206e6577206269642066726f6d20616e206163636f756e742028696e636c7564696e6720612070617261636861696e206163636f756e742920666f72206465706c6f79696e672061206e65772c2070617261636861696e2e005d01204d756c7469706c652073696d756c74616e656f757320626964732066726f6d207468652073616d65206269646465722061726520616c6c6f776564206f6e6c79206173206c6f6e6720617320616c6c2061637469766541012062696473206f7665726c61702065616368206f746865722028692e652e20617265206d757475616c6c79206578636c7573697665292e20426964732063616e6e6f742062652072656461637465642e005901202d20607375626020697320746865207375622d6269646465722049442c20616c6c6f77696e6720666f72206d756c7469706c6520636f6d706574696e67206269647320746f206265206d6164652062792028616e64742066756e64656420627929207468652073616d65206163636f756e742e5101202d206061756374696f6e5f696e646578602069732074686520696e646578206f66207468652061756374696f6e20746f20626964206f6e2e2053686f756c64206a757374206265207468652070726573656e746c2076616c7565206f66206041756374696f6e436f756e746572602e4d01202d206066697273745f736c6f746020697320746865206669727374206c6561736520706572696f6420696e646578206f66207468652072616e676520746f20626964206f6e2e2054686973206973207468650d01206162736f6c757465206c6561736520706572696f6420696e6465782076616c75652c206e6f7420616e2061756374696f6e2d7370656369666963206f66667365742e4501202d20606c6173745f736c6f746020697320746865206c617374206c6561736520706572696f6420696e646578206f66207468652072616e676520746f20626964206f6e2e2054686973206973207468650d01206162736f6c757465206c6561736520706572696f6420696e6465782076616c75652c206e6f7420616e2061756374696f6e2d7370656369666963206f66667365742e4d01202d2060616d6f756e74602069732074686520616d6f756e7420746f2062696420746f2062652068656c64206173206465706f73697420666f72207468652070617261636861696e2073686f756c6420746865cc206269642077696e2e205468697320616d6f756e742069732068656c64207468726f7567686f7574207468652072616e67652e246269645f72656e6577103461756374696f6e5f696e64657854436f6d706163743c41756374696f6e496e6465783e2866697273745f736c6f7464436f6d706163743c4c65617365506572696f644f663c543e3e246c6173745f736c6f7464436f6d706163743c4c65617365506572696f644f663c543e3e18616d6f756e7454436f6d706163743c42616c616e63654f663c543e3e3c5101204d616b652061206e6577206269642066726f6d20612070617261636861696e20
6163636f756e7420666f722072656e6577696e67207468617420287072652d6578697374696e67292070617261636861696e2e00a820546865206f726967696e202a6d7573742a20626520612070617261636861696e206163636f756e742e005d01204d756c7469706c652073696d756c74616e656f757320626964732066726f6d207468652073616d65206269646465722061726520616c6c6f776564206f6e6c79206173206c6f6e6720617320616c6c2061637469766541012062696473206f7665726c61702065616368206f746865722028692e652e20617265206d757475616c6c79206578636c7573697665292e20426964732063616e6e6f742062652072656461637465642e005101202d206061756374696f6e5f696e646578602069732074686520696e646578206f66207468652061756374696f6e20746f20626964206f6e2e2053686f756c64206a757374206265207468652070726573656e746c2076616c7565206f66206041756374696f6e436f756e746572602e4d01202d206066697273745f736c6f746020697320746865206669727374206c6561736520706572696f6420696e646578206f66207468652072616e676520746f20626964206f6e2e2054686973206973207468650d01206162736f6c757465206c6561736520706572696f6420696e6465782076616c75652c206e6f7420616e2061756374696f6e2d7370656369666963206f66667365742e4501202d20606c6173745f736c6f746020697320746865206c617374206c6561736520706572696f6420696e646578206f66207468652072616e676520746f20626964206f6e2e2054686973206973207468650d01206162736f6c757465206c6561736520706572696f6420696e6465782076616c75652c206e6f7420616e2061756374696f6e2d7370656369666963206f66667365742e4d01202d2060616d6f756e74602069732074686520616d6f756e7420746f2062696420746f2062652068656c64206173206465706f73697420666f72207468652070617261636861696e2073686f756c6420746865cc206269642077696e2e205468697320616d6f756e742069732068656c64207468726f7567686f7574207468652072616e67652e3c7365745f6f6666626f617264696e670410646573748c3c543a3a4c6f6f6b7570206173205374617469634c6f6f6b75703e3a3a536f7572636514c82053657420746865206f66662d626f617264696e6720696e666f726d6174696f6e20666f7220612070617261636861696e2e00a820546865206f726967696e202a6d7573742a20626520612070617261636861696e206163636f756e742e002101202d20606465737460206973207468652064657374696e6174696f6e206163636f756e7420746f2072656365697665207468652070617261636861696e2773206465706f7369742e3c6669785f6465706c6f795f64617461100c73756238436f6d706163743c53756249643e1c706172615f69643c436f6d706163743c5061726149643e24636f64655f686173681c543a3a4861736844696e697469616c5f686561645f646174611c5665633c75383e1c2d012053657420746865206465706c6f7920696e666f726d6174696f6e20666f722061207375636365737366756c2062696420746f206465706c6f792061206e65772070617261636861696e2e00c8202d20606f726967696e60206d75737420626520746865207375636365737366756c20626964646572206163636f756e742eb0202d20607375626020697320746865207375622d626964646572204944206f6620746865206269646465722e0101202d2060706172615f696460206973207468652070617261636861696e20494420616c6c6f7474656420746f207468652077696e6e696e67206269646465722e1d01202d2060636f64655f6861736860206973207468652068617368206f66207468652070617261636861696e2773205761736d2076616c69646174696f6e2066756e6374696f6e2ef0202d2060696e697469616c5f686561645f6461746160206973207468652070617261636861696e277320696e697469616c206865616420646174612e54656c61626f726174655f6465706c6f795f64617461081c706172615f69643c436f6d706163743c5061726149643e10636f64651c5665633c75383e3074204e6f74652061206e65772070617261636861696e277320636f64652e004d012054686973206d7573742062652063616c6c656420616674657220606669785f6465706c6f795f646174616020616e642060636f646560206d7573742062652074686520707265696d616765206f6620746865c42060636f64655f68617368602070617373656420746865726520666f72207468652073616d652060706172615f6964602e0061012054686973206d617920626520636
16c6c6564206265666f7265206f722061667465722074686520626567696e6e696e67206f66207468652070617261636861696e2773206669727374206c6561736520706572696f642e45012049662063616c6c6564206265666f7265207468656e207468652070617261636861696e2077696c6c206265636f6d65206163746976652061742074686520666972737420626c6f636b206f66206974736501207374617274696e67206c6561736520706572696f642e2049662061667465722c207468656e2069742077696c6c206265636f6d652061637469766520696d6d6564696174656c7920616674657220746869732063616c6c2e006c202d20605f6f726967696e6020697320697272656c6576616e742efc202d2060706172615f696460206973207468652070617261636861696e2049442077686f736520636f64652077696c6c20626520656c61626f72617465642e1501202d2060636f6465602069732074686520707265696d616765206f662074686520726567697374657265642060636f64655f6861736860206f662060706172615f6964602e011c384e65774c65617365506572696f64042c4c65617365506572696f6404842041206e6577206c6561736520706572696f6420697320626567696e6e696e672e3841756374696f6e537461727465640c3041756374696f6e496e6465782c4c65617365506572696f642c426c6f636b4e756d626572084d0120416e2061756374696f6e20737461727465642e2050726f76696465732069747320696e64657820616e642074686520626c6f636b206e756d6265722077686572652069742077696c6c20626567696e20746f190120636c6f736520616e6420746865206669727374206c6561736520706572696f64206f662074686520717561647275706c657420746861742069732061756374696f6e65642e3441756374696f6e436c6f736564043041756374696f6e496e64657804bc20416e2061756374696f6e20656e6465642e20416c6c2066756e6473206265636f6d6520756e72657365727665642e24576f6e4465706c6f7910504e65774269646465723c4163636f756e7449643e24536c6f7452616e6765185061726149641c42616c616e636504550120536f6d656f6e6520776f6e2074686520726967687420746f206465706c6f7920612070617261636861696e2e2042616c616e636520616d6f756e7420697320646564756374656420666f72206465706f7369742e28576f6e52656e6577616c101850617261496424536c6f7452616e67651c42616c616e63651c42616c616e636508c420416e206578697374696e672070617261636861696e20776f6e2074686520726967687420746f20636f6e74696e75652e41012046697273742062616c616e63652069732074686520657874726120616d6f756e7420726573657665642e205365636f6e642069732074686520746f74616c20616d6f756e742072657365727665642e2052657365727665640c244163636f756e7449641c42616c616e63651c42616c616e6365084d012046756e6473207765726520726573657276656420666f7220612077696e6e696e67206269642e2046697273742062616c616e63652069732074686520657874726120616d6f756e742072657365727665642e54205365636f6e642069732074686520746f74616c2e28556e726573657276656408244163636f756e7449641c42616c616e636504e02046756e6473207765726520756e72657365727665642073696e636520626964646572206973206e6f206c6f6e676572206163746976652e0000245265676973747261720124526567697374726172242850617261636861696e7301002c5665633c5061726149643e0400002c546872656164436f756e7401000c753332100000000004b420546865206e756d626572206f66207468726561647320746f207363686564756c652070657220626c6f636b2e3c53656c6563746564546872656164730100785665633c5665633c285061726149642c20436f6c6c61746f724964293e3e040008510120416e206172726179206f6620746865207175657565206f6620736574206f662074687265616473207363686564756c656420666f722074686520636f6d696e6720626c6f636b733b206f726465726564206279310120617363656e64696e6720706172612049442e2054686572652063616e206265206e6f206475706c696361746573206f66207061726120494420696e2065616368206c697374206974656d2e184163746976650100b85665633c285061726149642c204f7074696f6e3c28436f6c6c61746f7249642c20526574726961626c65293e293e0400185d012050617261746872656164732f636861696e73207363686564756c656420666f7220657865637574696f6e207468697320626c6f636b2e20496620
74686520636f6c6c61746f72204944206973207365742c207468656e6101206120706172746963756c617220636f6c6c61746f722068617320616c7265616479206265656e2063686f73656e20666f7220746865206e65787420626c6f636b2c20616e64206e6f206f7468657220636f6c6c61746f725901206d61792070726f766964652074686520626c6f636b2e20496e2074686973206361736520776520616c6c6f772074686520706f73736962696c697479206f662074686520636f6d62696e6174696f6e206265696e67d0207265747269656420696e2061206c6174657220626c6f636b2c206578707265737365642062792060526574726961626c65602e004c204f726465726564206279205061726149642e284e65787446726565496401001850617261496410e8030000083d0120546865206e65787420756e75736564205061726149642076616c75652e2053746172742074686973206869676820696e206f7264657220746f206b656570206c6f77206e756d6265727320666f72542073797374656d2d6c6576656c20636861696e732e2c50656e64696e6753776170000101185061726149641850617261496400040004642050656e64696e672073776170206f7065726174696f6e732e145061726173000101185061726149642050617261496e666f00040004a8204d6170206f6620616c6c20726567697374657265642070617261746872656164732f636861696e732e28526574727951756575650100785665633c5665633c285061726149642c20436f6c6c61746f724964293e3e040004e8205468652063757272656e7420717565756520666f7220706172617468726561647320746861742073686f756c6420626520726574726965642e1c446562746f72730101011850617261496430543a3a4163636f756e7449640080000000000000000000000000000000000000000000000000000000000000000004ac2055736572732077686f20686176652070616964206120706172617468726561642773206465706f736974011c3472656769737465725f70617261100869643c436f6d706163743c5061726149643e10696e666f2050617261496e666f10636f64651c5665633c75383e44696e697469616c5f686561645f646174611c5665633c75383e089820526567697374657220612070617261636861696e207769746820676976656e20636f64652e8c204661696c7320696620676976656e20494420697320616c726561647920757365642e3c646572656769737465725f70617261040869643c436f6d706163743c5061726149643e0494204465726567697374657220612070617261636861696e207769746820676976656e206964407365745f7468726561645f636f756e740414636f756e740c75333214410120526573657420746865206e756d626572206f6620706172617468726561647320746861742063616e2070617920746f206265207363686564756c656420696e20612073696e676c6520626c6f636b2e0098202d2060636f756e74603a20546865206e756d626572206f662070617261746872656164732e0084204d7573742062652063616c6c65642066726f6d20526f6f74206f726967696e2e4c72656769737465725f706172617468726561640810636f64651c5665633c75383e44696e697469616c5f686561645f646174611c5665633c75383e10a42052656769737465722061207061726174687265616420666f7220696d6d656469617465207573652e004d01204d7573742062652073656e742066726f6d2061205369676e6564206f726967696e20746861742069732061626c6520746f206861766520506172617468726561644465706f7369742072657365727665642e39012060636f64656020616e642060696e697469616c5f686561645f646174616020617265207573656420746f20696e697469616c697a6520746865207061726174687265616427732073746174652e4473656c6563745f706172617468726561640c0c5f69643c436f6d706163743c5061726149643e245f636f6c6c61746f7228436f6c6c61746f724964285f686561645f686173681c543a3a4861736814050120506c61636520612062696420666f722061207061726174687265616420746f2062652070726f6772657373656420696e20746865206e65787420626c6f636b2e00410120546869732069732061206b696e64206f66207370656369616c207472616e73616374696f6e20746861742073686f756c642062652068656176696c79207072696f726974697a656420696e207468655d01207472616e73616374696f6e20706f6f6c206163636f7264696e6720746f20746865206076616c7565603b206f6e6c792060546872656164436f756e7460206f66207468656d206d61792062652070726573656e7465645420696e20616e7
92073696e676c6520626c6f636b2e54646572656769737465725f70617261746872656164001cc820446572656769737465722061207061726174687265616420616e6420726574726965766520746865206465706f7369742e002101204d7573742062652073656e742066726f6d2061206050617261636861696e60206f726967696e2077686963682069732063757272656e746c79206120706172617468726561642e00590120456e737572652074686174206265666f72652063616c6c696e672074686973207468617420616e792066756e647320796f752077616e7420656d70746965642066726f6d20746865207061726174687265616427734501206163636f756e74206973206d6f766564206f75743b20616674657220746869732069742077696c6c20626520696d706f737369626c6520746f207265747269657665207468656d2028776974686f75746820676f7665726e616e636520696e74657276656e74696f6e292e107377617004146f746865723c436f6d706163743c5061726149643e206501205377617020612070617261636861696e207769746820616e6f746865722070617261636861696e206f7220706172617468726561642e20546865206f726967696e206d7573742062652061206050617261636861696e602e65012054686520737761702077696c6c2068617070656e206f6e6c7920696620746865726520697320616c726561647920616e206f70706f7369746520737761702070656e64696e672e204966207468657265206973206e6f742c5d012074686520737761702077696c6c2062652073746f72656420696e207468652070656e64696e67207377617073206d61702c20726561647920666f722061206c6174657220636f6e6669726d61746f727920737761702e00610120546865206050617261496460732072656d61696e206d617070656420746f207468652073616d652068656164206461746120616e6420636f646520736f2065787465726e616c20636f64652063616e2072656c79206f6e410120605061726149646020746f2062652061206c6f6e672d7465726d206964656e746966696572206f662061206e6f74696f6e616c202270617261636861696e222e20486f77657665722c2074686569725901207363686564756c696e6720696e666f2028692e652e2077686574686572207468657927726520612070617261746872656164206f722070617261636861696e292c2061756374696f6e20696e666f726d6174696f6e9820616e64207468652061756374696f6e206465706f736974206172652073776974636865642e0108505061726174687265616452656769737465726564041850617261496404d4204120706172617468726561642077617320726567697374657265643b20697473206e657720494420697320737570706c6965642e5850617261746872656164446572656769737465726564041850617261496404d4205468652070617261746872656164206f662074686520737570706c696564204944207761732064652d726567697374657265642e00001c5574696c697479011c5574696c69747904244d756c74697369677300020530543a3a4163636f756e744964205b75383b2033325dd04d756c74697369673c543a3a426c6f636b4e756d6265722c2042616c616e63654f663c543e2c20543a3a4163636f756e7449643e02040004942054686520736574206f66206f70656e206d756c7469736967206f7065726174696f6e732e0114146261746368041463616c6c735c5665633c3c542061732054726169743e3a3a43616c6c3e48802053656e642061206261746368206f662064697370617463682063616c6c732e00ec20546869732077696c6c206578656375746520756e74696c20746865206669727374206f6e65206661696c7320616e64207468656e2073746f702e007c204d61792062652063616c6c65642066726f6d20616e79206f726967696e2e00f0202d206063616c6c73603a205468652063616c6c7320746f20626520646973706174636865642066726f6d207468652073616d65206f726967696e2e002c2023203c7765696768743ea4202d205468652073756d206f66207468652077656967687473206f6620746865206063616c6c73602e34202d204f6e65206576656e742e302023203c2f7765696768743e00590120546869732077696c6c2072657475726e20604f6b6020696e20616c6c2063697263756d7374616e6365732e20546f2064657465726d696e65207468652073756363657373206f66207468652062617463682c20616e3501206576656e74206973206465706f73697465642e20496620612063616c6c206661696c656420616e64207468652062617463682077617320696e7465727275707465642c207468656e20746865590120604261746368496e7465
7272757074656460206576656e74206973206465706f73697465642c20616c6f6e67207769746820746865206e756d626572206f66207375636365737366756c2063616c6c73206d616465510120616e6420746865206572726f72206f6620746865206661696c65642063616c6c2e20496620616c6c2077657265207375636365737366756c2c207468656e2074686520604261746368436f6d706c657465646050206576656e74206973206465706f73697465642e1861735f7375620814696e6465780c7531361063616c6c5c426f783c3c542061732054726169743e3a3a43616c6c3e1ce02053656e6420612063616c6c207468726f75676820616e20696e64657865642070736575646f6e796d206f66207468652073656e6465722e00d020546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f2e002c2023203c7765696768743e70202d2054686520776569676874206f6620746865206063616c6c602e302023203c2f7765696768743e2061735f6d756c746910247468726573686f6c640c753136446f746865725f7369676e61746f72696573445665633c543a3a4163636f756e7449643e3c6d617962655f74696d65706f696e74844f7074696f6e3c54696d65706f696e743c543a3a426c6f636b4e756d6265723e3e1063616c6c5c426f783c3c542061732054726169743e3a3a43616c6c3ea4590120526567697374657220617070726f76616c20666f72206120646973706174636820746f206265206d6164652066726f6d20612064657465726d696e697374696320636f6d706f73697465206163636f756e74206966fc20617070726f766564206279206120746f74616c206f6620607468726573686f6c64202d203160206f6620606f746865725f7369676e61746f72696573602e00b42049662074686572652061726520656e6f7567682c207468656e206469737061746368207468652063616c6c2e005101205061796d656e743a20604d756c74697369674465706f73697442617365602077696c6c20626520726573657276656420696620746869732069732074686520666972737420617070726f76616c2c20706c7573610120607468726573686f6c64602074696d657320604d756c74697369674465706f736974466163746f72602e2049742069732072657475726e6564206f6e636520746869732064697370617463682068617070656e73206f72382069732063616e63656c6c65642e00d020546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f2e005901202d20607468726573686f6c64603a2054686520746f74616c206e756d626572206f6620617070726f76616c7320666f722074686973206469737061746368206265666f72652069742069732065786563757465642e4501202d20606f746865725f7369676e61746f72696573603a20546865206163636f756e747320286f74686572207468616e207468652073656e646572292077686f2063616e20617070726f76652074686973702064697370617463682e204d6179206e6f7420626520656d7074792e5d01202d20606d617962655f74696d65706f696e74603a20496620746869732069732074686520666972737420617070726f76616c2c207468656e2074686973206d75737420626520604e6f6e65602e2049662069742069735501206e6f742074686520666972737420617070726f76616c2c207468656e206974206d7573742062652060536f6d65602c2077697468207468652074696d65706f696e742028626c6f636b206e756d62657220616e64d8207472616e73616374696f6e20696e64657829206f662074686520666972737420617070726f76616c207472616e73616374696f6e2e8c202d206063616c6c603a205468652063616c6c20746f2062652065786563757465642e002101204e4f54453a20556e6c6573732074686973206973207468652066696e616c20617070726f76616c2c20796f752077696c6c2067656e6572616c6c792077616e7420746f207573651d012060617070726f76655f61735f6d756c74696020696e73746561642c2073696e6365206974206f6e6c7920726571756972657320612068617368206f66207468652063616c6c2e005d0120526573756c74206973206571756976616c656e7420746f20746865206469737061746368656420726573756c7420696620607468726573686f6c64602069732065786163746c79206031602e204f74686572776973655901206f6e20737563636573732c20726573756c7420697320604f6b6020616e642074686520726573756c742066726f6d2074686520696e746572696f722063616c6c2c206966206974207761732065786563757465642ce0206
d617920626520666f756e6420696e20746865206465706f736974656420604d756c7469736967457865637574656460206576656e742e002c2023203c7765696768743e54202d20604f2853202b205a202b2043616c6c29602ed0202d20557020746f206f6e652062616c616e63652d72657365727665206f7220756e72657365727665206f7065726174696f6e2e4101202d204f6e6520706173737468726f756768206f7065726174696f6e2c206f6e6520696e736572742c20626f746820604f285329602077686572652060536020697320746865206e756d626572206f6649012020207369676e61746f726965732e206053602069732063617070656420627920604d61785369676e61746f72696573602c207769746820776569676874206265696e672070726f706f7274696f6e616c2e2501202d204f6e652063616c6c20656e636f6465202620686173682c20626f7468206f6620636f6d706c657869747920604f285a296020776865726520605a602069732074782d6c656e2ec0202d204f6e6520656e636f6465202620686173682c20626f7468206f6620636f6d706c657869747920604f285329602ed8202d20557020746f206f6e652062696e6172792073656172636820616e6420696e736572742028604f286c6f6753202b20532960292efc202d20492f4f3a2031207265616420604f285329602c20757020746f2031206d757461746520604f285329602e20557020746f206f6e652072656d6f76652e34202d204f6e65206576656e742e70202d2054686520776569676874206f6620746865206063616c6c602e3101202d2053746f726167653a20696e7365727473206f6e65206974656d2c2076616c75652073697a6520626f756e64656420627920604d61785369676e61746f72696573602c20776974682061902020206465706f7369742074616b656e20666f7220697473206c69666574696d65206f66f4202020604d756c74697369674465706f73697442617365202b207468726573686f6c64202a204d756c74697369674465706f736974466163746f72602e302023203c2f7765696768743e40617070726f76655f61735f6d756c746910247468726573686f6c640c753136446f746865725f7369676e61746f72696573445665633c543a3a4163636f756e7449643e3c6d617962655f74696d65706f696e74844f7074696f6e3c54696d65706f696e743c543a3a426c6f636b4e756d6265723e3e2463616c6c5f68617368205b75383b2033325d80590120526567697374657220617070726f76616c20666f72206120646973706174636820746f206265206d6164652066726f6d20612064657465726d696e697374696320636f6d706f73697465206163636f756e74206966fc20617070726f766564206279206120746f74616c206f6620607468726573686f6c64202d203160206f6620606f746865725f7369676e61746f72696573602e005101205061796d656e743a20604d756c74697369674465706f73697442617365602077696c6c20626520726573657276656420696620746869732069732074686520666972737420617070726f76616c2c20706c7573610120607468726573686f6c64602074696d657320604d756c74697369674465706f736974466163746f72602e2049742069732072657475726e6564206f6e636520746869732064697370617463682068617070656e73206f72382069732063616e63656c6c65642e00d020546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f2e005901202d20607468726573686f6c64603a2054686520746f74616c206e756d626572206f6620617070726f76616c7320666f722074686973206469737061746368206265666f72652069742069732065786563757465642e4501202d20606f746865725f7369676e61746f72696573603a20546865206163636f756e747320286f74686572207468616e207468652073656e646572292077686f2063616e20617070726f76652074686973702064697370617463682e204d6179206e6f7420626520656d7074792e5d01202d20606d617962655f74696d65706f696e74603a20496620746869732069732074686520666972737420617070726f76616c2c207468656e2074686973206d75737420626520604e6f6e65602e2049662069742069735501206e6f742074686520666972737420617070726f76616c2c207468656e206974206d7573742062652060536f6d65602c2077697468207468652074696d65706f696e742028626c6f636b206e756d62657220616e64d8207472616e73616374696f6e20696e64657829206f662074686520666972737420617070726f76616c207472616e73616374696f6e2ed0202d206063616c6c5f68617368603a205468652068617368
206f66207468652063616c6c20746f2062652065786563757465642e003901204e4f54453a2049662074686973206973207468652066696e616c20617070726f76616c2c20796f752077696c6c2077616e7420746f20757365206061735f6d756c74696020696e73746561642e002c2023203c7765696768743e28202d20604f285329602ed0202d20557020746f206f6e652062616c616e63652d72657365727665206f7220756e72657365727665206f7065726174696f6e2e4101202d204f6e6520706173737468726f756768206f7065726174696f6e2c206f6e6520696e736572742c20626f746820604f285329602077686572652060536020697320746865206e756d626572206f6649012020207369676e61746f726965732e206053602069732063617070656420627920604d61785369676e61746f72696573602c207769746820776569676874206265696e672070726f706f7274696f6e616c2ec0202d204f6e6520656e636f6465202620686173682c20626f7468206f6620636f6d706c657869747920604f285329602ed8202d20557020746f206f6e652062696e6172792073656172636820616e6420696e736572742028604f286c6f6753202b20532960292efc202d20492f4f3a2031207265616420604f285329602c20757020746f2031206d757461746520604f285329602e20557020746f206f6e652072656d6f76652e34202d204f6e65206576656e742e3101202d2053746f726167653a20696e7365727473206f6e65206974656d2c2076616c75652073697a6520626f756e64656420627920604d61785369676e61746f72696573602c20776974682061902020206465706f7369742074616b656e20666f7220697473206c69666574696d65206f66f4202020604d756c74697369674465706f73697442617365202b207468726573686f6c64202a204d756c74697369674465706f736974466163746f72602e302023203c2f7765696768743e3c63616e63656c5f61735f6d756c746910247468726573686f6c640c753136446f746865725f7369676e61746f72696573445665633c543a3a4163636f756e7449643e2474696d65706f696e746454696d65706f696e743c543a3a426c6f636b4e756d6265723e2463616c6c5f68617368205b75383b2033325d5859012043616e63656c2061207072652d6578697374696e672c206f6e2d676f696e67206d756c7469736967207472616e73616374696f6e2e20416e79206465706f7369742072657365727665642070726576696f75736c79c820666f722074686973206f7065726174696f6e2077696c6c20626520756e7265736572766564206f6e20737563636573732e00d020546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f2e005901202d20607468726573686f6c64603a2054686520746f74616c206e756d626572206f6620617070726f76616c7320666f722074686973206469737061746368206265666f72652069742069732065786563757465642e4501202d20606f746865725f7369676e61746f72696573603a20546865206163636f756e747320286f74686572207468616e207468652073656e646572292077686f2063616e20617070726f76652074686973702064697370617463682e204d6179206e6f7420626520656d7074792e6101202d206074696d65706f696e74603a205468652074696d65706f696e742028626c6f636b206e756d62657220616e64207472616e73616374696f6e20696e64657829206f662074686520666972737420617070726f76616c7c207472616e73616374696f6e20666f7220746869732064697370617463682ed0202d206063616c6c5f68617368603a205468652068617368206f66207468652063616c6c20746f2062652065786563757465642e002c2023203c7765696768743e28202d20604f285329602ed0202d20557020746f206f6e652062616c616e63652d72657365727665206f7220756e72657365727665206f7065726174696f6e2e4101202d204f6e6520706173737468726f756768206f7065726174696f6e2c206f6e6520696e736572742c20626f746820604f285329602077686572652060536020697320746865206e756d626572206f6649012020207369676e61746f726965732e206053602069732063617070656420627920604d61785369676e61746f72696573602c207769746820776569676874206265696e672070726f706f7274696f6e616c2ec0202d204f6e6520656e636f6465202620686173682c20626f7468206f6620636f6d706c657869747920604f285329602e34202d204f6e65206576656e742e88202d20492f4f3a2031207265616420604f285329602c206f6e652072656d6f76652e74202d2053746f726167653a2072656d6f7665732
06f6e65206974656d2e302023203c2f7765696768743e0118404261746368496e746572727570746564080c7533323444697370617463684572726f72085901204261746368206f66206469737061746368657320646964206e6f7420636f6d706c6574652066756c6c792e20496e646578206f66206669727374206661696c696e6720646973706174636820676976656e2c2061734c2077656c6c20617320746865206572726f722e384261746368436f6d706c657465640004cc204261746368206f66206469737061746368657320636f6d706c657465642066756c6c792077697468206e6f206572726f722e2c4e65774d756c746973696708244163636f756e744964244163636f756e7449640849012041206e6577206d756c7469736967206f7065726174696f6e2068617320626567756e2e20466972737420706172616d20697320746865206163636f756e74207468617420697320617070726f76696e672c80207365636f6e6420697320746865206d756c7469736967206163636f756e742e404d756c7469736967417070726f76616c0c244163636f756e7449645854696d65706f696e743c426c6f636b4e756d6265723e244163636f756e7449640859012041206d756c7469736967206f7065726174696f6e20686173206265656e20617070726f76656420627920736f6d656f6e652e20466972737420706172616d20697320746865206163636f756e742074686174206973a820617070726f76696e672c20746869726420697320746865206d756c7469736967206163636f756e742e404d756c7469736967457865637574656410244163636f756e7449645854696d65706f696e743c426c6f636b4e756d6265723e244163636f756e744964384469737061746368526573756c74082d012041206d756c7469736967206f7065726174696f6e20686173206265656e2065786563757465642e20466972737420706172616d20697320746865206163636f756e742074686174206973a820617070726f76696e672c20746869726420697320746865206d756c7469736967206163636f756e742e444d756c746973696743616e63656c6c65640c244163636f756e7449645854696d65706f696e743c426c6f636b4e756d6265723e244163636f756e7449640831012041206d756c7469736967206f7065726174696f6e20686173206265656e2063616e63656c6c65642e20466972737420706172616d20697320746865206163636f756e742074686174206973ac2063616e63656c6c696e672c20746869726420697320746865206d756c7469736967206163636f756e742e0000204964656e7469747901105375646f10284964656e746974794f6600010130543a3a4163636f756e74496468526567697374726174696f6e3c42616c616e63654f663c543e3e00040004210120496e666f726d6174696f6e20746861742069732070657274696e656e7420746f206964656e746966792074686520656e7469747920626568696e6420616e206163636f756e742e1c53757065724f6600010130543a3a4163636f756e7449645028543a3a4163636f756e7449642c204461746129000400086101205468652073757065722d6964656e74697479206f6620616e20616c7465726e6174697665202273756222206964656e7469747920746f676574686572207769746820697473206e616d652c2077697468696e2074686174510120636f6e746578742e20496620746865206163636f756e74206973206e6f7420736f6d65206f74686572206163636f756e742773207375622d6964656e746974792c207468656e206a75737420604e6f6e65602e18537562734f6601010130543a3a4163636f756e744964842842616c616e63654f663c543e2c205665633c543a3a4163636f756e7449643e29004400000000000000000000000000000000000cb820416c7465726e6174697665202273756222206964656e746974696573206f662074686973206163636f756e742e001d0120546865206669727374206974656d20697320746865206465706f7369742c20746865207365636f6e64206973206120766563746f72206f6620746865206163636f756e74732e28526567697374726172730100d85665633c4f7074696f6e3c526567697374726172496e666f3c42616c616e63654f663c543e2c20543a3a4163636f756e7449643e3e3e0400104d012054686520736574206f6620726567697374726172732e204e6f7420657870656374656420746f206765742076657279206269672061732063616e206f6e6c79206265206164646564207468726f7567682061a8207370656369616c206f726967696e20286c696b656c79206120636f756e63696c206d6f74696f6e292e0029012054686520696e64657820696e746f20746869732063616e206265206361737420746f206052
6567697374726172496e6465786020746f2067657420612076616c69642076616c75652e012c346164645f726567697374726172041c6163636f756e7430543a3a4163636f756e744964347c2041646420612072656769737472617220746f207468652073797374656d2e001d0120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d75737420626520605265676973747261724f726967696e60206f722060526f6f74602e00ac202d20606163636f756e74603a20746865206163636f756e74206f6620746865207265676973747261722e009820456d6974732060526567697374726172416464656460206966207375636365737366756c2e002c2023203c7765696768743ee4202d20604f2852296020776865726520605260207265676973747261722d636f756e742028676f7665726e616e63652d626f756e646564292e9c202d204f6e652073746f72616765206d75746174696f6e2028636f64656320604f28522960292e34202d204f6e65206576656e742e302023203c2f7765696768743e307365745f6964656e746974790410696e666f304964656e74697479496e666f482d012053657420616e206163636f756e742773206964656e7469747920696e666f726d6174696f6e20616e6420726573657276652074686520617070726f707269617465206465706f7369742e00590120496620746865206163636f756e7420616c726561647920686173206964656e7469747920696e666f726d6174696f6e2c20746865206465706f7369742069732074616b656e2061732070617274207061796d656e745420666f7220746865206e6577206465706f7369742e00650120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f20616e64207468652073656e646572206d75737420686176652061207265676973746572656428206964656e746974792e0090202d2060696e666f603a20546865206964656e7469747920696e666f726d6174696f6e2e008c20456d69747320604964656e7469747953657460206966207375636365737366756c2e002c2023203c7765696768743e0501202d20604f2858202b2052296020776865726520605860206164646974696f6e616c2d6669656c642d636f756e7420286465706f7369742d626f756e646564292e88202d204174206d6f73742074776f2062616c616e6365206f7065726174696f6e732eac202d204f6e652073746f72616765206d75746174696f6e2028636f64656320604f2858202b20522960292e34202d204f6e65206576656e742e302023203c2f7765696768743e207365745f73756273041073756273645665633c28543a3a4163636f756e7449642c2044617461293e40902053657420746865207375622d6163636f756e7473206f66207468652073656e6465722e005901205061796d656e743a20416e79206167677265676174652062616c616e63652072657365727665642062792070726576696f757320607365745f73756273602063616c6c732077696c6c2062652072657475726e6564310120616e6420616e20616d6f756e7420605375624163636f756e744465706f736974602077696c6c20626520726573657276656420666f722065616368206974656d20696e206073756273602e00650120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f20616e64207468652073656e646572206d75737420686176652061207265676973746572656428206964656e746974792e009c202d206073756273603a20546865206964656e746974792773207375622d6163636f756e74732e002c2023203c7765696768743eec202d20604f285329602077686572652060536020737562732d636f756e742028686172642d20616e64206465706f7369742d626f756e646564292e88202d204174206d6f73742074776f2062616c616e6365206f7065726174696f6e732e4101202d204174206d6f7374204f2832202a2053202b2031292073746f72616765206d75746174696f6e733b20636f64656320636f6d706c657869747920604f2831202a2053202b2053202a20312960293b582020206f6e652073746f726167652d6578697374732e302023203c2f7765696768743e38636c6561725f6964656e74697479003c390120436c65617220616e206163636f756e742773206964656e7469747920696e666f20616e6420616c6c207375622d6163636f756e7420616e642072657475726e20616c6c206465706f736974732e00f0205061796d656e743a20416c6c2072657365727665642062616c616e636573206f6e20746865206163636f756e74206172652072657475726e65642e006501205468652064697370617
46368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f20616e64207468652073656e646572206d75737420686176652061207265676973746572656428206964656e746974792e009c20456d69747320604964656e74697479436c656172656460206966207375636365737366756c2e002c2023203c7765696768743e48202d20604f2852202b2053202b205829602e84202d204f6e652062616c616e63652d72657365727665206f7065726174696f6e2e74202d206053202b2032602073746f726167652064656c6574696f6e732e34202d204f6e65206576656e742e302023203c2f7765696768743e44726571756573745f6a756467656d656e7408247265675f696e6465785c436f6d706163743c526567697374726172496e6465783e1c6d61785f66656554436f6d706163743c42616c616e63654f663c543e3e5c9820526571756573742061206a756467656d656e742066726f6d2061207265676973747261722e005901205061796d656e743a204174206d6f737420606d61785f666565602077696c6c20626520726573657276656420666f72207061796d656e7420746f2074686520726567697374726172206966206a756467656d656e741c20676976656e2e00390120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f20616e64207468652073656e646572206d75737420686176652061542072656769737465726564206964656e746974792e002101202d20607265675f696e646578603a2054686520696e646578206f6620746865207265676973747261722077686f7365206a756467656d656e74206973207265717565737465642e5901202d20606d61785f666565603a20546865206d6178696d756d206665652074686174206d617920626520706169642e20546869732073686f756c64206a757374206265206175746f2d706f70756c617465642061733a0034206060606e6f636f6d70696c65a42053656c663a3a72656769737472617273287265675f696e646578292e75776e72617028292e666565102060606000a820456d69747320604a756467656d656e7452657175657374656460206966207375636365737366756c2e002c2023203c7765696768743e38202d20604f2852202b205829602e84202d204f6e652062616c616e63652d72657365727665206f7065726174696f6e2ebc202d2053746f726167653a2031207265616420604f285229602c2031206d757461746520604f2858202b205229602e34202d204f6e65206576656e742e302023203c2f7765696768743e3863616e63656c5f7265717565737404247265675f696e64657838526567697374726172496e646578446c2043616e63656c20612070726576696f757320726571756573742e00fc205061796d656e743a20412070726576696f75736c79207265736572766564206465706f7369742069732072657475726e6564206f6e20737563636573732e00390120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f20616e64207468652073656e646572206d75737420686176652061542072656769737465726564206964656e746974792e004901202d20607265675f696e646578603a2054686520696e646578206f6620746865207265676973747261722077686f7365206a756467656d656e74206973206e6f206c6f6e676572207265717565737465642e00b020456d69747320604a756467656d656e74556e72657175657374656460206966207375636365737366756c2e002c2023203c7765696768743e38202d20604f2852202b205829602e84202d204f6e652062616c616e63652d72657365727665206f7065726174696f6e2e8c202d204f6e652073746f72616765206d75746174696f6e20604f2852202b205829602e34202d204f6e65206576656e742e302023203c2f7765696768743e1c7365745f6665650814696e6465785c436f6d706163743c526567697374726172496e6465783e0c66656554436f6d706163743c42616c616e63654f663c543e3e301d0120536574207468652066656520726571756972656420666f722061206a756467656d656e7420746f206265207265717565737465642066726f6d2061207265676973747261722e00590120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f20616e64207468652073656e646572206d75737420626520746865206163636f756e74a4206f6620746865207265676973747261722077686f736520696e6465782069732060696e646578602e00f8202d2060696e646578603a2074686520696e646578206f662074686520
7265676973747261722077686f73652066656520697320746f206265207365742e58202d2060666565603a20746865206e6577206665652e002c2023203c7765696768743e28202d20604f285229602e7c202d204f6e652073746f72616765206d75746174696f6e20604f285229602e302023203c2f7765696768743e387365745f6163636f756e745f69640814696e6465785c436f6d706163743c526567697374726172496e6465783e0c6e657730543a3a4163636f756e74496430c0204368616e676520746865206163636f756e74206173736f63696174656420776974682061207265676973747261722e00590120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f20616e64207468652073656e646572206d75737420626520746865206163636f756e74a4206f6620746865207265676973747261722077686f736520696e6465782069732060696e646578602e00f8202d2060696e646578603a2074686520696e646578206f6620746865207265676973747261722077686f73652066656520697320746f206265207365742e74202d20606e6577603a20746865206e6577206163636f756e742049442e002c2023203c7765696768743e28202d20604f285229602e7c202d204f6e652073746f72616765206d75746174696f6e20604f285229602e302023203c2f7765696768743e287365745f6669656c64730814696e6465785c436f6d706163743c526567697374726172496e6465783e186669656c6473384964656e746974794669656c647330ac2053657420746865206669656c6420696e666f726d6174696f6e20666f722061207265676973747261722e00590120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f20616e64207468652073656e646572206d75737420626520746865206163636f756e74a4206f6620746865207265676973747261722077686f736520696e6465782069732060696e646578602e00f8202d2060696e646578603a2074686520696e646578206f6620746865207265676973747261722077686f73652066656520697320746f206265207365742e1101202d20606669656c6473603a20746865206669656c64732074686174207468652072656769737472617220636f6e6365726e73207468656d73656c76657320776974682e002c2023203c7765696768743e28202d20604f285229602e7c202d204f6e652073746f72616765206d75746174696f6e20604f285229602e302023203c2f7765696768743e4470726f766964655f6a756467656d656e740c247265675f696e6465785c436f6d706163743c526567697374726172496e6465783e187461726765748c3c543a3a4c6f6f6b7570206173205374617469634c6f6f6b75703e3a3a536f75726365246a756467656d656e745c4a756467656d656e743c42616c616e63654f663c543e3e4cbc2050726f766964652061206a756467656d656e7420666f7220616e206163636f756e742773206964656e746974792e00590120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f5369676e65645f20616e64207468652073656e646572206d75737420626520746865206163636f756e74b4206f6620746865207265676973747261722077686f736520696e64657820697320607265675f696e646578602e002501202d20607265675f696e646578603a2074686520696e646578206f6620746865207265676973747261722077686f7365206a756467656d656e74206973206265696e67206d6164652e5901202d2060746172676574603a20746865206163636f756e742077686f7365206964656e7469747920746865206a756467656d656e742069732075706f6e2e2054686973206d75737420626520616e206163636f756e74782020207769746820612072656769737465726564206964656e746974792e4d01202d20606a756467656d656e74603a20746865206a756467656d656e74206f662074686520726567697374726172206f6620696e64657820607265675f696e646578602061626f75742060746172676574602e009820456d69747320604a756467656d656e74476976656e60206966207375636365737366756c2e002c2023203c7765696768743e38202d20604f2852202b205829602e88202d204f6e652062616c616e63652d7472616e73666572206f7065726174696f6e2e98202d20557020746f206f6e65206163636f756e742d6c6f6f6b7570206f7065726174696f6e2ebc202d2053746f726167653a2031207265616420604f285229602c2031206d757461746520604f2852202b205829602e34202d204f6e65206576656e742e302023203
c2f7765696768743e346b696c6c5f6964656e7469747904187461726765748c3c543a3a4c6f6f6b7570206173205374617469634c6f6f6b75703e3a3a536f757263654c45012052656d6f766520616e206163636f756e742773206964656e7469747920616e64207375622d6163636f756e7420696e666f726d6174696f6e20616e6420736c61736820746865206465706f736974732e006501205061796d656e743a2052657365727665642062616c616e6365732066726f6d20607365745f737562736020616e6420607365745f6964656e74697479602061726520736c617368656420616e642068616e646c656420627949012060536c617368602e20566572696669636174696f6e2072657175657374206465706f7369747320617265206e6f742072657475726e65643b20746865792073686f756c642062652063616e63656c6c656484206d616e75616c6c79207573696e67206063616e63656c5f72657175657374602e00310120546865206469737061746368206f726967696e20666f7220746869732063616c6c206d757374206265205f526f6f745f206f72206d617463682060543a3a466f7263654f726967696e602e005901202d2060746172676574603a20746865206163636f756e742077686f7365206964656e7469747920746865206a756467656d656e742069732075706f6e2e2054686973206d75737420626520616e206163636f756e74782020207769746820612072656769737465726564206964656e746974792e009820456d69747320604964656e746974794b696c6c656460206966207375636365737366756c2e002c2023203c7765696768743e48202d20604f2852202b2053202b205829602e84202d204f6e652062616c616e63652d72657365727665206f7065726174696f6e2e74202d206053202b2032602073746f72616765206d75746174696f6e732e34202d204f6e65206576656e742e302023203c2f7765696768743e011c2c4964656e7469747953657404244163636f756e74496404f02041206e616d652077617320736574206f72207265736574202877686963682077696c6c2072656d6f766520616c6c206a756467656d656e7473292e3c4964656e74697479436c656172656408244163636f756e7449641c42616c616e636504d02041206e616d652077617320636c65617265642c20616e642074686520676976656e2062616c616e63652072657475726e65642e384964656e746974794b696c6c656408244163636f756e7449641c42616c616e636504c82041206e616d65207761732072656d6f76656420616e642074686520676976656e2062616c616e636520736c61736865642e484a756467656d656e7452657175657374656408244163636f756e74496438526567697374726172496e64657804a02041206a756467656d656e74207761732061736b65642066726f6d2061207265676973747261722e504a756467656d656e74556e72657175657374656408244163636f756e74496438526567697374726172496e646578048c2041206a756467656d656e74207265717565737420776173207265747261637465642e384a756467656d656e74476976656e08244163636f756e74496438526567697374726172496e64657804982041206a756467656d656e742077617320676976656e2062792061207265676973747261722e3852656769737472617241646465640438526567697374726172496e646578045c204120726567697374726172207761732061646465642e002c48546f6f4d616e795375624163636f756e7473046020546f6f206d616e7920737562732d6163636f756e74732e204e6f74466f756e640454204163636f756e742069736e277420666f756e642e204e6f744e616d65640454204163636f756e742069736e2774206e616d65642e28456d707479496e646578043420456d70747920696e6465782e284665654368616e676564044020466565206973206368616e6765642e284e6f4964656e74697479044c204e6f206964656e7469747920666f756e642e3c537469636b794a756467656d656e74044820537469636b79206a756467656d656e742e384a756467656d656e74476976656e0444204a756467656d656e7420676976656e2e40496e76616c69644a756467656d656e74044c20496e76616c6964206a756467656d656e742e30496e76616c6964496e64657804582054686520696e64657820697320696e76616c69642e34496e76616c6964546172676574045c205468652074617267657420697320696e76616c69642e"
        cls.metadata_decoder = MetadataDecoder(ScaleBytes(metadata_v10_hex))
        cls.metadata_decoder.decode()
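
    # The hex blob above is a v10 runtime metadata snapshot (modules with
    # their storage items, calls, events and errors, as visible in the
    # decoded ASCII: Slots, Registrar, Utility, Identity, ...); decoding it
    # once in the class fixture lets the tests below resolve event and
    # extrinsic types against a fixed runtime.
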
    def test_type_registry_versioning(self):
        # Event payload using the old definition of DispatchError, which
        # changed as of runtime version 1032
        RuntimeConfiguration().set_active_spec_version_id(1032)

        events_payload_1020 = '0x14000000000000001027000001010000010000000000102700000001000002000000000040420f0000010000030000000d05e8f6971c000000000000000000000000000003000000000101060020a10700000100'
        # events_payload_1022 = '0x14000000000000001027000001010000010000000000102700000001000002000000000040420f0000010000030000000d054cb927160000000000000000000000000000030000000001011000a0860100000100'

        events_decoder = EventsDecoder(
            data=ScaleBytes(events_payload_1020),
            metadata=self.metadata_decoder
        )

        # Should fail with the current runtime version
        self.assertRaises(ValueError, events_decoder.decode)

        # Change the runtime version id
        RuntimeConfiguration().set_active_spec_version_id(1020)

        events_decoder = EventsDecoder(
            data=ScaleBytes(events_payload_1020),
            metadata=self.metadata_decoder
        )

        # Now it should succeed
        events_decoder.decode()

        self.assertEqual(len(events_decoder.value), 5)
        self.assertEqual(events_decoder.value[4]['event_id'], "ExtrinsicFailed")
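
    # A minimal sketch of the versioned-decoding pattern exercised above
    # (assumption: same decoder setup as this test class; `raw_events` and
    # `spec_version` are hypothetical placeholders):
    #
    #   RuntimeConfiguration().set_active_spec_version_id(spec_version)
    #   decoder = EventsDecoder(data=ScaleBytes(raw_events), metadata=self.metadata_decoder)
    #   decoder.decode()  # raises ValueError when the active type set mismatches the payload
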
    def test_type_registry_versioning_struct(self):
        RuntimeConfiguration().clear_type_registry()
        RuntimeConfiguration().update_type_registry(load_type_registry_preset("default"))
        RuntimeConfiguration().update_type_registry(load_type_registry_preset("kusama"))

        RuntimeConfiguration().set_active_spec_version_id(1019)
        type_cls = RuntimeConfiguration().get_decoder_class("StakingLedger<AccountId, BalanceOf>")
        self.assertEqual(type_cls.type_mapping, [
            ["stash", "AccountId"],
            ["total", "Compact<Balance>"],
            ["active", "Compact<Balance>"],
            ["unlocking", "Vec<UnlockChunk>"]
        ])

        RuntimeConfiguration().set_active_spec_version_id(1055)
        type_cls = RuntimeConfiguration().get_decoder_class("StakingLedger<AccountId, BalanceOf>")
        self.assertEqual(type_cls.type_mapping, [
            ["stash", "AccountId"],
            ["total", "Compact<Balance>"],
            ["active", "Compact<Balance>"],
            ["unlocking", "Vec<UnlockChunk>"],
            ["lastReward", "Option<EraIndex>"]
        ])

        RuntimeConfiguration().set_active_spec_version_id(2019)
        type_cls = RuntimeConfiguration().get_decoder_class("StakingLedger<AccountId, BalanceOf>")
        self.assertEqual(type_cls.type_mapping, [
            ["stash", "AccountId"],
            ["total", "Compact<Balance>"],
            ["active", "Compact<Balance>"],
            ["unlocking", "Vec<UnlockChunk>"],
            ["claimedRewards", "Vec<EraIndex>"]
        ])
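
    # The three assertions above track the evolution of StakingLedger across
    # the Kusama presets: the base struct (spec 1019), the addition of
    # 'lastReward' (spec 1055), and its later replacement by 'claimedRewards'
    # (spec 2019).
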
    def test_type_registry_versioning_type_changed(self):
        # Extrinsics containing identity.set_identity, without and with the
        # 'twitter' field that was introduced in runtime version 1038
        extrinsic_payload_1030 = '0xdd0284ff8d2879e723893e28c9a7aad53b7c2f464e87019b36d84d990be4509a2e76c64900b84842431eee87e277592d652bc6d911dc11b2b49098c9a307b0e9e774a3149a0a9c205a77d6d74e09cdafdea7ddf815ae210bd54776c68d9557a2d4370d5c0c25000800190100085733462d3031361357656220332e3020466f756e646174696f6e1868747470733a2f2f776562332e666f756e646174696f6e00176465766f707340776562332e666f756e646174696f6e0000'
        extrinsic_payload_1040 = '0x150384ff8ef5289702f6b8c7d22b3562ffda7d5593a5f6414226925e72097efbf9b25720013e8921e59463f8fef5a45b879b0f6f24b689ccfcef9ab793e21fcd0b638aa8744d6b45c11a410feeee122b6f9f8e3f6b51c4583b0ddfe15047162070fcdf730f650340001901000d5265676973747261722023310d5265676973747261722023311868747470733a2f2f7777772e63686576646f722e636f6d144063686576646f723a6d61747269782e6f72671263686576646f7240676d61696c2e636f6d000000'

        # Decode with a runtime version that predates the 'twitter' field
        RuntimeConfiguration().set_active_spec_version_id(1030)

        extrinsics_decoder = ExtrinsicsDecoder(
            data=ScaleBytes(extrinsic_payload_1030),
            metadata=self.metadata_decoder
        )
        extrinsic_data = extrinsics_decoder.decode()

        self.assertEqual(extrinsic_data['call_function'], 'set_identity')
        self.assertEqual(extrinsic_data['call_module'], 'Identity')
        self.assertNotIn('twitter', extrinsic_data['params'][0]['value'])

        # Decode with a runtime version that includes the 'twitter' field
        RuntimeConfiguration().set_active_spec_version_id(1040)

        extrinsics_decoder = ExtrinsicsDecoder(
            data=ScaleBytes(extrinsic_payload_1040),
            metadata=self.metadata_decoder
        )
        extrinsic_data = extrinsics_decoder.decode()

        self.assertEqual(extrinsic_data['call_function'], 'set_identity')
        self.assertEqual(extrinsic_data['call_module'], 'Identity')
        self.assertIn('twitter', extrinsic_data['params'][0]['value'])
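
    # For reference: extrinsic_data['params'][0]['value'] holds the decoded
    # IdentityInfo struct, so the presence of its 'twitter' key is what
    # separates the pre-1038 and post-1038 type definitions.
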
    def test_valid_type_registry_presets(self):
        preset_path = os.path.join(os.path.dirname(__file__), '..', 'scalecodec', 'type_registry')
        for filename in os.listdir(preset_path):
            filename_obj = Path(filename)
            if filename_obj.suffix == '.json':
                type_registry = load_type_registry_preset(filename_obj.stem)

                # Check basic requirements of the JSON file
                self.assertIn('types', type_registry)

                # Try to apply the type registry preset on top of the defaults
                RuntimeConfiguration().clear_type_registry()
                RuntimeConfiguration().update_type_registry(load_type_registry_preset('default'))
                RuntimeConfiguration().update_type_registry(type_registry)

                original_type_reg = copy.deepcopy(RuntimeConfiguration().type_registry)

                if 'runtime_id' in type_registry:
                    self.assertIsInstance(type_registry['runtime_id'], int)
                    latest_runtime_id = type_registry['runtime_id']

                    # Switch the type registry versioning state away and back
                    RuntimeConfiguration().set_active_spec_version_id(0)
                    RuntimeConfiguration().set_active_spec_version_id(latest_runtime_id)

                    # The switch should result in an identical type registry
                    for type_string, type_definition in RuntimeConfiguration().type_registry['types'].items():
                        if type_definition:
                            self.assertEqual(
                                type_definition.__name__,
                                original_type_reg['types'][type_string].__name__,
                                'Type string "{}" mismatch between latest state and when versioning is applied'.format(
                                    type_string
                                )
                            )
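
    # From the checks above, a preset JSON carries at least a 'types' mapping
    # and may carry a 'runtime_id' (an int marking the most recent runtime the
    # preset covers); nothing beyond that is asserted here.
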
    def test_not_existing_type_registry_preset(self):
        with self.assertRaises(ValueError) as cm:
            load_type_registry_preset('unknown')

        self.assertEqual('Unsupported type registry preset "unknown"', str(cm.exception))
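
    # Sketch of registering custom types without a bundled preset (assumes the
    # same RuntimeConfiguration API used throughout this file; the type name
    # and string-alias definition are made-up examples):
    #
    #   RuntimeConfiguration().update_type_registry({
    #       'types': {'MyBalance': 'u128'}
    #   })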
| 790.92511 | 170443 | 0.982839 | 752 | 179540 | 234.375 | 0.292553 | 0.002451 | 0.001123 | 0.001685 | 0.012942 | 0.012448 | 0.010462 | 0.009736 | 0.009736 | 0.009736 | 0 | 0.864691 | 0.013395 | 179540 | 226 | 170444 | 794.424779 | 0.130313 | 0.007191 | 0 | 0.432258 | 0 | 0 | 0.965885 | 0.961884 | 0 | 1 | 0.96148 | 0 | 0.109677 | 1 | 0.03871 | false | 0 | 0.051613 | 0 | 0.096774 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9
4d38113d6ecd3ae5ae08e6ed94ee35da9968d893 | 3,866 | py | Python | clients/python_client/onnx_mnist_client.py | jnclt/simple_tensorflow_serving | 61c7227c8c9c77ae08cf7baa1e315036edd65e7a | [
"Apache-2.0"
] | 771 | 2018-01-23T07:15:53.000Z | 2022-03-21T07:32:19.000Z | clients/python_client/onnx_mnist_client.py | jnclt/simple_tensorflow_serving | 61c7227c8c9c77ae08cf7baa1e315036edd65e7a | [
"Apache-2.0"
] | 90 | 2018-01-24T13:53:24.000Z | 2021-07-23T02:45:13.000Z | clients/python_client/onnx_mnist_client.py | jnclt/simple_tensorflow_serving | 61c7227c8c9c77ae08cf7baa1e315036edd65e7a | [
"Apache-2.0"
] | 211 | 2018-01-25T13:37:40.000Z | 2022-03-30T19:49:39.000Z | #!/usr/bin/env python
import requests
def main():
endpoint = "http://127.0.0.1:8500"
input_data = {
#"model_name": "onnx_mnist_model",
#"model_version": 1,
"data": {
"data": [[[[
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
], [
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1
]]]]
}
}
result = requests.post(endpoint, json=input_data)
print(result.text)
if __name__ == "__main__":
main()
| 36.471698 | 79 | 0.252716 | 825 | 3,866 | 1.167273 | 0.032727 | 1.626168 | 2.436137 | 3.244029 | 0.814123 | 0.814123 | 0.814123 | 0.814123 | 0.814123 | 0.814123 | 0 | 0.423097 | 0.513968 | 3,866 | 105 | 80 | 36.819048 | 0.089409 | 0.018624 | 0 | 0.864583 | 0 | 0 | 0.00976 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.010417 | false | 0 | 0.010417 | 0 | 0.020833 | 0.010417 | 0 | 0 | 1 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 14 |
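The onnx_mnist_client.py content above hard-codes a 1 x 1 x 28 x 28 array of ones as the request body. As a rough sketch (not part of the original repository), the same payload can be built programmatically; the endpoint below assumes the same local simple_tensorflow_serving instance used in the script:

#!/usr/bin/env python
# Sketch only: builds the same all-ones MNIST-shaped payload without writing out
# 784 literal values, then posts it exactly like onnx_mnist_client.py does.
import requests

def build_all_ones_payload(height=28, width=28):
    image = [[1] * width for _ in range(height)]   # 28 x 28 pixels, all set to 1
    return {"data": {"data": [[image]]}}           # add batch and channel dimensions

if __name__ == "__main__":
    response = requests.post("http://127.0.0.1:8500", json=build_all_ones_payload())
    print(response.text)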
4d39933bc4875a2fb28c3e63f4c824af527a8fd9 | 13,241 | py | Python | unit_commitment/test_cases/case24.py | Matrixeigs/EnergyManagementSourceCodes | 1ea824941fe87528622ec7aa8148024752a3947c | [
"MIT"
] | 3 | 2021-10-21T07:28:38.000Z | 2022-02-17T11:30:52.000Z | unit_commitment/test_cases/case24.py | Matrixeigs/EnergyManagementSourceCodes | 1ea824941fe87528622ec7aa8148024752a3947c | [
"MIT"
] | null | null | null | unit_commitment/test_cases/case24.py | Matrixeigs/EnergyManagementSourceCodes | 1ea824941fe87528622ec7aa8148024752a3947c | [
"MIT"
] | null | null | null | """
IEEE-24 bus test systems
"""
from numpy import array
def case24():
"""Power flow data for real wind hydro power systems
@return: Power flow data for jointed wind hydro power systems
"""
ppc = {"version": '2'}
##----- Power Flow Data -----##
## system MVA base
ppc["baseMVA"] = 100.0
## bus data
# bus_i type Pd Qd Gs Bs area Vm Va baseKV zone Vmax Vmin
ppc["bus"] = array([
[1, 2, 108, 22, 0, 0, 1, 1, 0, 138, 1, 1.05, 0.95],
[2, 2, 97, 20, 0, 0, 1, 1, 0, 138, 1, 1.05, 0.95],
[3, 1, 180, 37, 0, 0, 1, 1, 0, 138, 1, 1.05, 0.95],
[4, 1, 74, 15, 0, 0, 1, 1, 0, 138, 1, 1.05, 0.95],
[5, 1, 71, 14, 0, 0, 1, 1, 0, 138, 1, 1.05, 0.95],
[6, 1, 136, 28, 0, -100, 2, 1, 0, 138, 1, 1.05, 0.95],
[7, 2, 125, 25, 0, 0, 2, 1, 0, 138, 1, 1.05, 0.95],
[8, 1, 171, 35, 0, 0, 2, 1, 0, 138, 1, 1.05, 0.95],
[9, 1, 175, 36, 0, 0, 1, 1, 0, 138, 1, 1.05, 0.95],
[10, 1, 195, 40, 0, 0, 2, 1, 0, 138, 1, 1.05, 0.95],
[11, 1, 0, 0, 0, 0, 3, 1, 0, 230, 1, 1.05, 0.95],
[12, 1, 0, 0, 0, 0, 3, 1, 0, 230, 1, 1.05, 0.95],
[13, 3, 265, 54, 0, 0, 3, 1, 0, 230, 1, 1.05, 0.95],
[14, 2, 194, 39, 0, 0, 3, 1, 0, 230, 1, 1.05, 0.95],
[15, 2, 317, 64, 0, 0, 4, 1, 0, 230, 1, 1.05, 0.95],
[16, 2, 100, 20, 0, 0, 4, 1, 0, 230, 1, 1.05, 0.95],
[17, 1, 0, 0, 0, 0, 4, 1, 0, 230, 1, 1.05, 0.95],
[18, 2, 333, 68, 0, 0, 4, 1, 0, 230, 1, 1.05, 0.95],
[19, 1, 181, 37, 0, 0, 3, 1, 0, 230, 1, 1.05, 0.95],
[20, 1, 128, 26, 0, 0, 3, 1, 0, 230, 1, 1.05, 0.95],
[21, 2, 0, 0, 0, 0, 4, 1, 0, 230, 1, 1.05, 0.95],
[22, 2, 0, 0, 0, 0, 4, 1, 0, 230, 1, 1.05, 0.95],
[23, 2, 0, 0, 0, 0, 3, 1, 0, 230, 1, 1.05, 0.95],
[24, 1, 0, 0, 0, 0, 4, 1, 0, 230, 1, 1.05, 0.95]
])
## generator data
# bus, Pg, Qg, Qmax, Qmin, Vg, mBase, status, Pmax, Pmin, Pc1, Pc2,
# Qc1min, Qc1max, Qc2min, Qc2max, ramp_agc, ramp_10, ramp_30, ramp_q, apf
ppc["gen"] = array([
[1, 10, 0, 10, 0, 1.035, 100, 1, 20, 16, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U20
[1, 10, 0, 10, 0, 1.035, 100, 1, 20, 16, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U20
[1, 76, 0, 30, -25, 1.035, 100, 1, 76, 15.2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U76
[1, 76, 0, 30, -25, 1.035, 100, 1, 76, 15.2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U76
[2, 10, 0, 10, 0, 1.035, 100, 1, 20, 16, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U20
[2, 10, 0, 10, 0, 1.035, 100, 1, 20, 16, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U20
[2, 76, 0, 30, -25, 1.035, 100, 1, 76, 15.2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U76
[2, 76, 0, 30, -25, 1.035, 100, 1, 76, 15.2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U76
[7, 80, 0, 60, 0, 1.025, 100, 1, 100, 25, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U100
[7, 80, 0, 60, 0, 1.025, 100, 1, 100, 25, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U100
[7, 80, 0, 60, 0, 1.025, 100, 1, 100, 25, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U100
[13, 95.1, 0, 80, 0, 1.02, 100, 1, 197, 69, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U197
[13, 95.1, 0, 80, 0, 1.02, 100, 1, 197, 69, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U197
[13, 95.1, 0, 80, 0, 1.02, 100, 1, 197, 69, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U197
[14, 0, 35.3, 200, -50, 0.98, 100, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # SynCond
[15, 12, 0, 6, 0, 1.014, 100, 1, 12, 2.4, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U12
[15, 12, 0, 6, 0, 1.014, 100, 1, 12, 2.4, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U12
[15, 12, 0, 6, 0, 1.014, 100, 1, 12, 2.4, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U12
[15, 12, 0, 6, 0, 1.014, 100, 1, 12, 2.4, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U12
[15, 12, 0, 6, 0, 1.014, 100, 1, 12, 2.4, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U12
[15, 155, 0, 80, -50, 1.014, 100, 1, 155, 54.3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U155
[16, 155, 0, 80, -50, 1.017, 100, 1, 155, 54.3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U155
[18, 400, 0, 200, -50, 1.05, 100, 1, 400, 100, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U400
[21, 400, 0, 200, -50, 1.05, 100, 1, 400, 100, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U400
[22, 50, 0, 16, -10, 1.05, 100, 1, 50, 10, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U50
[22, 50, 0, 16, -10, 1.05, 100, 1, 50, 10, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U50
[22, 50, 0, 16, -10, 1.05, 100, 1, 50, 10, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U50
[22, 50, 0, 16, -10, 1.05, 100, 1, 50, 10, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U50
[22, 50, 0, 16, -10, 1.05, 100, 1, 50, 10, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U50
[22, 50, 0, 16, -10, 1.05, 100, 1, 50, 10, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U50
[23, 155, 0, 80, -50, 1.05, 100, 1, 155, 54.3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U155
[23, 155, 0, 80, -50, 1.05, 100, 1, 155, 54.3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], # U155
[23, 350, 0, 150, -25, 1.05, 100, 1, 350, 140, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] # U350
])
##----- OPF Data -----##
## area data
# area refbus
ppc["areas"] = array([
[1, 1],
[2, 3],
[3, 8],
[4, 6],
])
## branch data
# fbus, tbus, r, x, b, rateA, rateB, rateC, ratio, angle, status, angmin, angmax
ppc["branch"] = array([
[1, 2, 0.0026, 0.0139, 0.4611, 175, 250, 200, 0, 0, 1, -360, 360],
[1, 3, 0.0546, 0.2112, 0.0572, 175, 208, 220, 0, 0, 1, -360, 360],
[1, 5, 0.0218, 0.0845, 0.0229, 175, 208, 220, 0, 0, 1, -360, 360],
[2, 4, 0.0328, 0.1267, 0.0343, 175, 208, 220, 0, 0, 1, -360, 360],
[2, 6, 0.0497, 0.192, 0.052, 175, 208, 220, 0, 0, 1, -360, 360],
[3, 9, 0.0308, 0.119, 0.0322, 175, 208, 220, 0, 0, 1, -360, 360],
[3, 24, 0.0023, 0.0839, 0, 400, 510, 600, 1.03, 0, 1, -360, 360],
[4, 9, 0.0268, 0.1037, 0.0281, 175, 208, 220, 0, 0, 1, -360, 360],
[5, 10, 0.0228, 0.0883, 0.0239, 175, 208, 220, 0, 0, 1, -360, 360],
[6, 10, 0.0139, 0.0605, 2.459, 175, 193, 200, 0, 0, 1, -360, 360],
[7, 8, 0.0159, 0.0614, 0.0166, 175, 208, 220, 0, 0, 1, -360, 360],
[8, 9, 0.0427, 0.1651, 0.0447, 175, 208, 220, 0, 0, 1, -360, 360],
[8, 10, 0.0427, 0.1651, 0.0447, 175, 208, 220, 0, 0, 1, -360, 360],
[9, 11, 0.0023, 0.0839, 0, 400, 510, 600, 1.03, 0, 1, -360, 360],
[9, 12, 0.0023, 0.0839, 0, 400, 510, 600, 1.03, 0, 1, -360, 360],
[10, 11, 0.0023, 0.0839, 0, 400, 510, 600, 1.02, 0, 1, -360, 360],
[10, 12, 0.0023, 0.0839, 0, 400, 510, 600, 1.02, 0, 1, -360, 360],
[11, 13, 0.0061, 0.0476, 0.0999, 500, 600, 625, 0, 0, 1, -360, 360],
[11, 14, 0.0054, 0.0418, 0.0879, 500, 625, 625, 0, 0, 1, -360, 360],
[12, 13, 0.0061, 0.0476, 0.0999, 500, 625, 625, 0, 0, 1, -360, 360],
[12, 23, 0.0124, 0.0966, 0.203, 500, 625, 625, 0, 0, 1, -360, 360],
[13, 23, 0.0111, 0.0865, 0.1818, 500, 625, 625, 0, 0, 1, -360, 360],
[14, 16, 0.005, 0.0389, 0.0818, 500, 625, 625, 0, 0, 1, -360, 360],
[15, 16, 0.0022, 0.0173, 0.0364, 500, 600, 625, 0, 0, 1, -360, 360],
[15, 21, 0.0063, 0.049, 0.103, 500, 600, 625, 0, 0, 1, -360, 360],
[15, 21, 0.0063, 0.049, 0.103, 500, 600, 625, 0, 0, 1, -360, 360],
[15, 24, 0.0067, 0.0519, 0.1091, 500, 600, 625, 0, 0, 1, -360, 360],
[16, 17, 0.0033, 0.0259, 0.0545, 500, 600, 625, 0, 0, 1, -360, 360],
[16, 19, 0.003, 0.0231, 0.0485, 500, 600, 625, 0, 0, 1, -360, 360],
[17, 18, 0.0018, 0.0144, 0.0303, 500, 600, 625, 0, 0, 1, -360, 360],
[17, 22, 0.0135, 0.1053, 0.2212, 500, 600, 625, 0, 0, 1, -360, 360],
[18, 21, 0.0033, 0.0259, 0.0545, 500, 600, 625, 0, 0, 1, -360, 360],
[18, 21, 0.0033, 0.0259, 0.0545, 500, 600, 625, 0, 0, 1, -360, 360],
[19, 20, 0.0051, 0.0396, 0.0833, 500, 600, 625, 0, 0, 1, -360, 360],
[19, 20, 0.0051, 0.0396, 0.0833, 500, 600, 625, 0, 0, 1, -360, 360],
[20, 23, 0.0028, 0.0216, 0.0455, 500, 600, 625, 0, 0, 1, -360, 360],
[20, 23, 0.0028, 0.0216, 0.0455, 500, 600, 625, 0, 0, 1, -360, 360],
[21, 22, 0.0087, 0.0678, 0.1424, 500, 600, 625, 0, 0, 1, -360, 360]
])
##----- OPF Data -----##
## generator cost data
# 1 startup shutdown n x1 y1 ... xn yn
# 2 startup shutdown n c(n-1) ... c0
ppc["gencost"] = array([
[2, 1500, 0, 3, 0.01199, 37.5510, 117.7511, 0, 0, -1, 1, 0, 0.508, 1.167, 1, 20, 20, 2, 0.25, 3.0],
# 1, 16, 20, 0, 10, U20
[2, 1500, 0, 3, 0.01199, 37.5510, 117.7511, 0, 0, -1, 1, 0, 0.508, 1.167, 1, 20, 20, 2, 0.25, 3.0],
# 1, 16, 20, 0, 10, U20
[2, 1500, 0, 3, 0.00876, 13.3272, 81.1364, 3, 2, 3, 2, 1, 0.642, 1.333, 3, 50, 50, 3, 0.93, 1.2],
# 1, 15.2, 76, -25, 30, U76
[2, 1500, 0, 3, 0.00876, 13.3272, 81.1364, 3, 2, 3, 2, 1, 0.642, 1.333, 3, 50, 50, 3, 0.93, 1.2],
# 1, 15.2, 76, -25, 30, U76
[2, 1500, 0, 3, 0.01199, 37.5510, 117.7511, 0, 0, -1, 1, 0, 0.508, 1.167, 1, 20, 20, 2, 0.25, 3.0],
# 2, 16, 20, 0, 10, U20
[2, 1500, 0, 3, 0.01199, 37.5510, 117.7511, 0, 0, -1, 1, 0, 0.508, 1.167, 1, 20, 20, 2, 0.25, 3.0],
# 2, 16, 20, 0, 10, U20
[2, 1500, 0, 3, 0.00876, 13.3272, 81.1364, 3, 2, 3, 2, 1, 0.642, 1.333, 3, 50, 50, 3, 0.93, 1.2],
# 2, 15.2, 76, -25, 30, U76
[2, 1500, 0, 3, 0.00876, 13.3272, 81.1364, 3, 2, 3, 2, 1, 0.642, 1.333, 3, 50, 50, 3, 0.93, 1.2],
# 2, 15.2, 76, -25, 30, U76
[2, 1500, 0, 3, 0.00623, 18, 217.8952, 4, 2, -3, 2, 2, 0.850, 1.233, 3, 70, 70, 4, 0.2, 2.3],
# 7, 25, 100, 0, 60, U100
[2, 1500, 0, 3, 0.00623, 18, 217.8952, 4, 2, -3, 2, 2, 0.850, 1.233, 3, 70, 70, 4, 0.2, 2.3],
# 7, 25, 100, 0, 60, U100
[2, 1500, 0, 3, 0.00623, 18, 217.8952, 4, 2, -3, 2, 2, 0.850, 1.233, 3, 70, 70, 4, 0.2, 2.3],
# 7, 25, 100, 0, 60, U100
[2, 1500, 0, 3, 0.00259, 23, 259.1310, 5, 4, -4, 4, 2, 0.917, 1.650, 6, 200, 200, 8, 0.2, 2.3],
# 13, 69, 197, 0, 80, U197
[2, 1500, 0, 3, 0.00259, 23, 259.1310, 5, 4, -4, 4, 2, 0.917, 1.650, 6, 200, 200, 8, 0.2, 2.3],
# 13, 69, 197, 0, 80, U197
[2, 1500, 0, 3, 0.00259, 23, 259.1310, 5, 4, -4, 4, 2, 0.917, 1.650, 6, 200, 200, 8, 0.2, 2.3],
# 13, 69, 197, 0, 80, U197
[2, 1500, 0, 3, 0.02533, 25.5472, 24.3891, 0, 0, -1, 0, 0, 0.8, 1.00, 0, 0, 0, 1, 0.10, 2.3], # 14 SynCond
[2, 1500, 0, 3, 0.02649, 25.6753, 24.4110, 0, 0, -1, 0, 0, 0.8, 1.00, 0, 0, 0, 1, 0.01, 2.3],
# 15,2.4,12,0,6, U12
[2, 1500, 0, 3, 0.02649, 25.6753, 24.4110, 0, 0, -1, 0, 0, 0.8, 1.00, 0, 0, 0, 1, 0.01, 2.3],
# 15,2.4,12,0,6, U12
[2, 1500, 0, 3, 0.02649, 25.6753, 24.4110, 0, 0, -1, 0, 0, 0.8, 1.00, 0, 0, 0, 1, 0.01, 2.3],
# 15,2.4,12,0,6, U12
[2, 1500, 0, 3, 0.02649, 25.6753, 24.4110, 0, 0, -1, 0, 0, 0.8, 1.00, 0, 0, 0, 1, 0.01, 2.3],
# 15,2.4,12,0,6, U12
[2, 1500, 0, 3, 0.02649, 25.6753, 24.4110, 0, 0, -1, 0, 0, 0.8, 1.00, 0, 0, 0, 1, 0.01, 2.3],
# 15,2.4,12,0,6, U12
[2, 1500, 0, 3, 0.00473, 10.7154, 143.0288, 5, 3, 5, 3, 2, 0.917, 1.300, 5, 150, 150, 6, 1.15, 1.2],
# 15, 54.3, 155, -50, 80, U155
[2, 1500, 0, 3, 0.00473, 10.7154, 143.0288, 5, 3, 5, 3, 2, 0.917, 1.300, 5, 150, 150, 6, 1.15, 1.2],
# 16, 54.3, 155, -50, 80, U155
[2, 1500, 0, 3, 0.00481, 10.7367, 143.3179, 5, 3, 5, 3, 2, 0.917, 1.300, 5, 150, 150, 6, 1.14, 1.2],
# 18, 100, 400, -50, 200, U400
[2, 1500, 0, 3, 0.00481, 10.7367, 143.3179, 5, 3, 5, 3, 2, 0.917, 1.300, 5, 150, 150, 6, 1.14, 1.2],
# 21, 100, 400, -50, 200, U400
[2, 1500, 0, 3, 0.00487, 10.7583, 142.5972, 5, 3, 5, 3, 2, 0.917, 1.300, 5, 150, 150, 6, 1.14, 1.2],
# 22, 10, 50, -10, 16, U50
[2, 1500, 0, 3, 0.00487, 10.7583, 142.5972, 5, 3, 5, 3, 2, 0.917, 1.300, 5, 150, 150, 6, 1.14, 1.2],
# 22, 10, 50, -10, 16, U50
[2, 1500, 0, 3, 0.00487, 10.7583, 142.5972, 5, 3, 5, 3, 2, 0.917, 1.300, 5, 150, 150, 6, 1.14, 1.2],
# 22, 10, 50, -10, 16, U50
[2, 1500, 0, 3, 0.00487, 10.7583, 142.5972, 5, 3, 5, 3, 2, 0.917, 1.300, 5, 150, 150, 6, 1.14, 1.2],
# 22, 10, 50, -10, 16, U50
[2, 1500, 0, 3, 0.00487, 10.7583, 142.5972, 5, 3, 5, 3, 2, 0.917, 1.300, 5, 150, 150, 6, 1.14, 1.2],
# 22, 10, 50, -10, 16, U50
[2, 1500, 0, 3, 0.00487, 10.7583, 142.5972, 5, 3, 5, 3, 2, 0.917, 1.300, 5, 150, 150, 6, 1.14, 1.2],
# 22, 10, 50, -10, 16, U50
[2, 1500, 0, 3, 0.00473, 10.7154, 143.0288, 5, 3, 5, 3, 2, 0.917, 1.300, 5, 150, 150, 6, 1.15, 1.2],
# 23, 54.3, 155, -50, 80, U155
[2, 1500, 0, 3, 0.00473, 10.7154, 143.0288, 5, 3, 5, 3, 2, 0.917, 1.300, 5, 150, 150, 6, 1.15, 1.2],
# 23, 54.3, 155, -50, 80, U155
[2, 1500, 0, 3, 0.00195, 7.5031, 311.9102, 8, 5, 10, 8, 4, 0.842, 1.667, 8, 500, 500, 10, 0, 0.6],
# 23, 140, 350, -25, 150, U350
])
return ppc
| 63.052381 | 115 | 0.429122 | 2,988 | 13,241 | 1.899933 | 0.104083 | 0.155364 | 0.171746 | 0.192355 | 0.78087 | 0.775586 | 0.771006 | 0.770654 | 0.739299 | 0.702131 | 0 | 0.557212 | 0.316215 | 13,241 | 209 | 116 | 63.354067 | 0.069803 | 0.127634 | 0 | 0.482993 | 0 | 0 | 0.003411 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.006803 | false | 0 | 0.006803 | 0 | 0.020408 | 0 | 0 | 0 | 1 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
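The case24() function above returns a MATPOWER-style ppc dict of numpy arrays. As a small illustration (not from the original repository; the import path and column indices are assumptions read off the column comments in the case file), a caller could summarize the system like this:

# Sketch only: totals come from the Pd column of ppc["bus"] and the Pmax column
# of ppc["gen"], following the column comments in case24.py.
from unit_commitment.test_cases.case24 import case24  # assumed import path

def summarize_case(ppc):
    bus, gen, branch = ppc["bus"], ppc["gen"], ppc["branch"]
    peak_load_mw = bus[:, 2].sum()        # column 2 = Pd (active power demand)
    installed_mw = gen[:, 8].sum()        # column 8 = Pmax
    print("buses=%d generators=%d branches=%d" % (len(bus), len(gen), len(branch)))
    print("peak load %.1f MW, installed capacity %.1f MW" % (peak_load_mw, installed_mw))

summarize_case(case24())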
4dc16fcea29c2c572af0ca40e5173683011a2b58 | 119 | py | Python | tests/test_dependencies.py | bird-house/kingfisher | d3623e8e71e1b0e081833e216369926607dfc594 | [
"Apache-2.0"
] | null | null | null | tests/test_dependencies.py | bird-house/kingfisher | d3623e8e71e1b0e081833e216369926607dfc594 | [
"Apache-2.0"
] | 12 | 2018-11-27T15:51:28.000Z | 2019-02-04T14:04:51.000Z | tests/test_dependencies.py | bird-house/kingfisher | d3623e8e71e1b0e081833e216369926607dfc594 | [
"Apache-2.0"
] | null | null | null | def test_dependencies():
    # The bare imports are the whole test: it fails if the optional SNAP (ProductIO)
    # and jpy runtime dependencies cannot be resolved in the current environment.
    from kingfisher.dependencies import ProductIO
    from kingfisher.dependencies import jpy
| 29.75 | 49 | 0.806723 | 13 | 119 | 7.307692 | 0.615385 | 0.294737 | 0.547368 | 0.673684 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.151261 | 119 | 3 | 50 | 39.666667 | 0.940594 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.666667 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 8 |
4ded6e0af160dcd7a9da85048f21d082a273a02f | 3,966 | py | Python | models/old/3_qualitative_scene_variation/results_church/scripts/parseReults.py | thegricean/overinformativeness | d20b66148c13af473b57cc4d1736191a49660349 | [
"MIT"
] | 1 | 2016-10-27T18:41:57.000Z | 2016-10-27T18:41:57.000Z | models/old/3_qualitative_scene_variation/results_church/scripts/parseReults.py | thegricean/overinformativeness | d20b66148c13af473b57cc4d1736191a49660349 | [
"MIT"
] | 9 | 2015-11-30T21:44:31.000Z | 2020-04-21T01:26:05.000Z | models/old/3_qualitative_scene_variation/results_church/scripts/parseReults.py | thegricean/overinformativeness | d20b66148c13af473b57cc4d1736191a49660349 | [
"MIT"
] | 2 | 2015-11-25T09:53:20.000Z | 2017-03-17T21:51:18.000Z | f = open("../raw/results.txt")
lines = [l.rstrip().split(",,,") for l in f.readlines()]
f.close()
outfile = open("../parsed/results.txt","w")
allutterances = {"fan":0,"tv":0,"desk":0,"couch":0,"chair":0,"big_fan":0,"small_fan":0,"green_fan":0,"blue_fan":0,"gray_fan":0,"red_fan":0,"brown_fan":0,"big_green_fan":0,"small_green_fan":0,"big_blue_fan":0,"small_blue_fan":0,"big_gray_fan":0,"small_gray_fan":0,"big_red_fan":0,"small_red_fan":0,"big_brown_fan":0,"small_brown_fan":0,"big_tv":0,"small_tv":0,"green_tv":0,"blue_tv":0,"gray_tv":0,"red_tv":0,"brown_tv":0,"big_green_tv":0,"small_green_tv":0,"big_blue_tv":0,"small_blue_tv":0,"big_gray_tv":0,"small_gray_tv":0,"big_red_tv":0,"small_red_tv":0,"big_brown_tv":0,"small_brown_tv":0,"big_desk":0,"small_desk":0,"green_desk":0,"blue_desk":0,"gray_desk":0,"red_desk":0,"brown_desk":0,"big_green_desk":0,"small_green_desk":0,"big_blue_desk":0,"small_blue_desk":0,"big_gray_desk":0,"small_gray_desk":0,"big_red_desk":0,"small_red_desk":0,"big_brown_desk":0,"small_brown_desk":0,"big_couch":0,"small_couch":0,"green_couch":0,"blue_couch":0,"gray_couch":0,"red_couch":0,"brown_couch":0,"big_green_couch":0,"small_green_couch":0,"big_blue_couch":0,"small_blue_couch":0,"big_gray_couch":0,"small_gray_couch":0,"big_red_couch":0,"small_red_couch":0,"big_brown_couch":0,"small_brown_couch":0,"big_chair":0,"small_chair":0,"green_chair":0,"blue_chair":0,"gray_chair":0,"red_chair":0,"brown_chair":0,"big_green_chair":0,"small_green_chair":0,"big_blue_chair":0,"small_blue_chair":0,"big_gray_chair":0,"small_gray_chair":0,"big_red_chair":0,"small_red_chair":0,"big_brown_chair":0,"small_brown_chair":0}
headers = lines[0][0].split(",,")[0].split(",")[0:9] + allutterances.keys()
print(len(headers))
outfile.write(",".join(headers)+"\n")
for l in lines:
for case in l:
utts = {"fan":0,"tv":0,"desk":0,"couch":0,"chair":0,"big_fan":0,"small_fan":0,"green_fan":0,"blue_fan":0,"gray_fan":0,"red_fan":0,"brown_fan":0,"big_green_fan":0,"small_green_fan":0,"big_blue_fan":0,"small_blue_fan":0,"big_gray_fan":0,"small_gray_fan":0,"big_red_fan":0,"small_red_fan":0,"big_brown_fan":0,"small_brown_fan":0,"big_tv":0,"small_tv":0,"green_tv":0,"blue_tv":0,"gray_tv":0,"red_tv":0,"brown_tv":0,"big_green_tv":0,"small_green_tv":0,"big_blue_tv":0,"small_blue_tv":0,"big_gray_tv":0,"small_gray_tv":0,"big_red_tv":0,"small_red_tv":0,"big_brown_tv":0,"small_brown_tv":0,"big_desk":0,"small_desk":0,"green_desk":0,"blue_desk":0,"gray_desk":0,"red_desk":0,"brown_desk":0,"big_green_desk":0,"small_green_desk":0,"big_blue_desk":0,"small_blue_desk":0,"big_gray_desk":0,"small_gray_desk":0,"big_red_desk":0,"small_red_desk":0,"big_brown_desk":0,"small_brown_desk":0,"big_couch":0,"small_couch":0,"green_couch":0,"blue_couch":0,"gray_couch":0,"red_couch":0,"brown_couch":0,"big_green_couch":0,"small_green_couch":0,"big_blue_couch":0,"small_blue_couch":0,"big_gray_couch":0,"small_gray_couch":0,"big_red_couch":0,"small_red_couch":0,"big_brown_couch":0,"small_brown_couch":0,"big_chair":0,"small_chair":0,"green_chair":0,"blue_chair":0,"gray_chair":0,"red_chair":0,"brown_chair":0,"big_green_chair":0,"small_green_chair":0,"big_blue_chair":0,"small_blue_chair":0,"big_gray_chair":0,"small_gray_chair":0,"big_red_chair":0,"small_red_chair":0,"big_brown_chair":0,"small_brown_chair":0}
try:
splited = case.split(",,")
splagain = splited[1].split(",")
if splagain[0] == "object":
results = splited[2].split(",")
# print results
caseutts = splagain[9:]
for i,k in enumerate(caseutts):
utts[k] = results[9+i]
outfile.write(",".join(results[:9]+[str(utts[u]) for u in utts.keys()])+"\n")
else:
results = splited[1].split(",")
caseutts = splited[0].split(",")[9:]
# print splited[0].split(",")
for i,k in enumerate(caseutts):
utts[k] = results[9+i]
if results[0] == "o1":
outfile.write(",".join(results[:9]+[str(utts[u]) for u in utts.keys()])+"\n")
except IndexError:
continue
| 101.692308 | 1,499 | 0.712557 | 773 | 3,966 | 3.306598 | 0.075032 | 0.093897 | 0.042254 | 0.018779 | 0.838811 | 0.838811 | 0.838811 | 0.838811 | 0.838811 | 0.838811 | 0 | 0.052548 | 0.045134 | 3,966 | 38 | 1,500 | 104.368421 | 0.622392 | 0.011094 | 0 | 0.206897 | 0 | 0 | 0.544967 | 0.005365 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.034483 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
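The parseReults.py script above relies on its two identical dict literals (allutterances and utts) iterating their keys in the same order, which holds for identically written literals in CPython 2 but breaks easily if either key list is edited. A minimal sketch of pinning the column order explicitly; base_headers is an illustrative stand-in for the first nine header fields and is not a name from the original script:

# Sketch only: fix the utterance column order once and reuse it for both the
# header row and every data row, instead of depending on dict iteration order.
utterance_names = sorted(allutterances)          # deterministic column order
header_row = base_headers + utterance_names      # base_headers: first 9 fields

def utterance_columns(utts):
    # Values come out in exactly the same order as the header columns.
    return [str(utts.get(name, 0)) for name in utterance_names]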
4df35265984ae431553ebd620fabbe75ed5e9903 | 34,216 | py | Python | tests/test_cache.py | Ma233/olo | 54eb3bd4e1330a0467159f9c968557d471537621 | [
"Apache-2.0"
] | null | null | null | tests/test_cache.py | Ma233/olo | 54eb3bd4e1330a0467159f9c968557d471537621 | [
"Apache-2.0"
] | null | null | null | tests/test_cache.py | Ma233/olo | 54eb3bd4e1330a0467159f9c968557d471537621 | [
"Apache-2.0"
] | null | null | null | # coding: utf-8
from .base import db, TestCase, Dummy, Foo, Bar, Lala
from .utils import (
auto_use_cache_ctx, patched_execute, no_cache_client,
no_pk, AE
)
from olo.cache import create_cache
from olo.utils import missing
from olo.errors import CacheError
attrs = dict(
name='foo',
tags=['a', 'b', 'c'],
password='password',
payload={
'abc': ['1', 2, 3],
'def': [4, '5', 6]
}
)
class TestCache(TestCase):
def test_get(self):
dummy = Dummy.create(**attrs)
with patched_execute as execute:
_dummy = Dummy.cache.get(dummy.id)
self.assertEqual(dummy.id, _dummy.id)
self.assertTrue(execute.called)
with patched_execute as execute:
_dummy = Dummy.cache.get(dummy.id)
self.assertEqual(dummy.id, _dummy.id)
self.assertFalse(execute.called)
_dummy.update(age=666)
with patched_execute as execute:
_dummy = Dummy.cache.get(dummy.id)
self.assertTrue(execute.called)
with patched_execute as execute:
_dummy = Dummy.cache.get(dummy.id)
self.assertFalse(execute.called)
self.assertEqual(dummy.id, _dummy.id)
self.assertEqual(_dummy.age, 666)
with patched_execute as execute:
Dummy.get(dummy.id)
self.assertTrue(execute.called)
with patched_execute as execute:
_dummy = Dummy.cache.get(233)
self.assertIsNone(_dummy)
self.assertTrue(execute.called)
with patched_execute as execute:
_dummy = Dummy.cache.get(233)
self.assertIsNone(_dummy)
self.assertFalse(execute.called)
Dummy.create(id=233, **attrs)
with patched_execute as execute:
_dummy = Dummy.cache.get(233)
self.assertIsNotNone(_dummy)
self.assertEqual(_dummy.id, 233)
self.assertTrue(execute.called)
with patched_execute as execute:
_dummy = Dummy.cache.get(233)
self.assertIsNotNone(_dummy)
self.assertEqual(_dummy.id, 233)
self.assertFalse(execute.called)
with patched_execute as execute:
with auto_use_cache_ctx(Dummy):
_dummy = Dummy.cache.get(233)
self.assertIsNotNone(_dummy)
self.assertEqual(_dummy.id, 233)
self.assertFalse(execute.called)
with no_cache_client(Dummy):
with patched_execute as execute:
_dummy = Dummy.cache.get(233)
self.assertIsNotNone(_dummy)
self.assertEqual(_dummy.id, 233)
self.assertTrue(execute.called)
with patched_execute as execute:
foo = Foo.cache.get(name='170331', age=1)
self.assertIsNone(foo)
self.assertTrue(execute.called)
with patched_execute as execute:
foo = Foo.cache.get(name='170331', age=1)
self.assertIsNone(foo)
self.assertFalse(execute.called)
Foo.create(name='170331', age=1)
with patched_execute as execute:
foo = Foo.cache.get(name='170331', age=1)
self.assertIsNotNone(foo)
self.assertTrue(execute.called)
with patched_execute as execute:
foo = Foo.cache.get(name='170331', age=1)
self.assertIsNotNone(foo)
self.assertFalse(execute.called)
def test_update(self):
dummy = Dummy.create(**attrs)
dummy.name = 'xixi'
dummy.save()
_dummy = Dummy.cache.get(dummy.id)
self.assertEqual(_dummy.name, dummy.name)
dummy.update(name='hehe')
_dummy = Dummy.cache.get(dummy.id)
self.assertEqual(_dummy.name, dummy.name)
_dummy.update(name='wow')
_dummy = Dummy.cache.get(dummy.id)
self.assertEqual(_dummy.name, 'wow')
def test_delete(self):
dummy = Dummy.create(**attrs)
_dummy = Dummy.cache.get(dummy.id)
self.assertEqual(_dummy.id, dummy.id)
dummy.delete()
_dummy = Dummy.cache.get(dummy.id)
self.assertTrue(_dummy is None)
dummy = Dummy.create(**attrs)
_dummy = Dummy.cache.get(dummy.id)
self.assertEqual(_dummy.id, dummy.id)
_dummy.delete()
_dummy = Dummy.cache.get(dummy.id)
self.assertTrue(_dummy is None)
foo = Foo.create(name='foo', age=1)
_foo = Foo.cache.get_by(name='foo', age='1')
self.assertEqual(foo.id, _foo.id)
_foo.delete()
_foo = Foo.cache.get_by(name='foo', age='1')
self.assertTrue(_foo is None)
dummy = Dummy.create(**attrs)
_dummy = Dummy.cache.get(dummy.id)
with no_cache_client(Dummy):
dummy.delete()
_dummy = Dummy.cache.get(dummy.id)
self.assertTrue(_dummy is not None)
def test_transaction(self):
dummy = Dummy.create(**attrs)
_dummy = Dummy.cache.get(dummy.id)
self.assertEqual(_dummy.name, dummy.name)
with db.transaction():
dummy.update(name='lala')
_dummy = Dummy.cache.get(dummy.id)
self.assertEqual(_dummy.name, 'lala')
try:
with db.transaction():
dummy.update(name='hehe')
raise AE
except AE:
pass
_dummy = Dummy.cache.get(dummy.id)
self.assertEqual(_dummy.name, 'lala')
try:
with db.transaction():
dummy.update(name='hehe')
_dummy = Dummy.cache.get(dummy.id)
self.assertEqual(_dummy.name, 'hehe')
dummy.update(name='xixi')
_dummy = Dummy.cache.get(dummy.id)
self.assertEqual(_dummy.name, 'xixi')
raise AE
except AE:
pass
_dummy = Dummy.cache.get(dummy.id)
self.assertEqual(_dummy.name, 'lala')
with db.transaction():
dummy.delete()
_dummy = Dummy.cache.get(dummy.id)
self.assertTrue(_dummy is None)
foo = Foo.create(name='lala', age=1)
try:
with db.transaction():
foo.update(name='xixi')
foo = Foo.cache.get_by(age=foo.age)
self.assertEqual(foo.name, 'xixi')
raise AE
except AE:
pass
foo = Foo.cache.get_by(age=foo.age)
self.assertEqual(foo.name, 'lala')
def test_gets(self):
Foo.create(name='abc', age=1)
Foo.create(name='qwe', age=2)
Foo.create(name='xxx', age=1)
Foo.create(name='yyy', age=1)
idents = [
{'name': 'xxx', 'age': 1},
{'name': 'abc', 'age': 1},
]
with patched_execute as execute:
foos = Foo.cache.gets(idents)
self.assertTrue(execute.called)
self.assertEqual(len(foos), 2)
self.assertEqual(foos[0].name, 'xxx')
self.assertEqual(foos[1].name, 'abc')
with patched_execute as execute:
foos = Foo.cache.gets(idents)
self.assertFalse(execute.called)
self.assertEqual(len(foos), 2)
self.assertEqual(foos[0].name, 'xxx')
self.assertEqual(foos[1].name, 'abc')
with patched_execute as execute:
idents.extend([
{'name': 'qwe', 'age': 1},
{'name': 'yyy', 'age': 1}
])
foos = Foo.cache.gets(idents, filter_none=False)
self.assertTrue(execute.called)
self.assertEqual(len(foos), 4)
self.assertIsNone(foos[2])
with patched_execute as execute:
foos = Foo.cache.gets(idents, filter_none=False)
self.assertFalse(execute.called)
self.assertEqual(len(foos), 4)
self.assertIsNone(foos[2])
with patched_execute as execute:
foos = Foo.cache.gets(idents, filter_none=True)
self.assertFalse(execute.called)
self.assertEqual(len(foos), 3)
with patched_execute as execute:
with auto_use_cache_ctx(Foo):
foos = Foo.cache.gets(idents, filter_none=False)
self.assertFalse(execute.called)
self.assertEqual(len(foos), 4)
self.assertIsNone(foos[2])
with patched_execute as execute:
with no_cache_client(Foo):
foos = Foo.cache.gets(idents, filter_none=False)
self.assertTrue(execute.called)
self.assertEqual(len(foos), 4)
self.assertIsNone(foos[2])
with patched_execute as execute:
with no_pk(Foo):
foos = Foo.cache.gets(idents, filter_none=False)
self.assertFalse(execute.called)
self.assertEqual(len(foos), 4)
self.assertIsNone(foos[2])
with self.assertRaises(CacheError):
Foo.cache.gets([{'age_str': 'b'}])
def test_get_by(self):
Foo.create(name='abc', age=1)
with patched_execute as execute:
foo = Foo.cache.get_by(name='abc', age=1)
self.assertTrue(execute.called)
self.assertEqual(foo.name, 'abc')
with patched_execute as execute:
foo = Foo.cache.get_by(name='abc', age=1)
self.assertFalse(execute.called)
self.assertEqual(foo.name, 'abc')
with patched_execute as execute:
foo = Foo.cache.get_by(key=foo.key)
self.assertTrue(execute.called)
self.assertEqual(foo.name, 'abc')
with patched_execute as execute:
foo = Foo.cache.get_by(key=foo.key)
self.assertFalse(execute.called)
self.assertEqual(foo.name, 'abc')
foo.name = 'qwe'
foo.name = 'hehe'
foo.save()
with patched_execute as execute:
Foo.cache.get_by(key=foo.key)
self.assertTrue(execute.called)
foo = Foo.cache.get_by(key=foo.key)
self.assertEqual(foo.name, 'hehe')
with patched_execute as execute:
foo = Foo.cache.get_by(key=foo.key)
self.assertFalse(execute.called)
self.assertEqual(foo.name, 'hehe')
with patched_execute as execute:
foo = Foo.cache.get_by(name='abc', age=1)
self.assertTrue(execute.called)
self.assertIsNone(foo)
with patched_execute as execute:
foo = Foo.cache.get_by(name='missing', age=1)
self.assertTrue(execute.called)
self.assertIsNone(foo)
with patched_execute as execute:
foo = Foo.cache.get_by(name='missing', age=1)
self.assertFalse(execute.called)
self.assertIsNone(foo)
Foo.create(name='missing', age=1)
with patched_execute as execute:
foo = Foo.cache.get_by(name='missing', age=1)
self.assertTrue(execute.called)
self.assertIsNotNone(foo)
with patched_execute as execute:
foo = Foo.cache.get_by(name='missing', age=1)
self.assertFalse(execute.called)
self.assertIsNotNone(foo)
self.assertEqual(foo.name, 'missing')
with patched_execute as execute:
foo = Foo.cache.get_by(name='missing', age=1)
self.assertFalse(execute.called)
self.assertIsNotNone(foo)
self.assertEqual(foo.name, 'missing')
with patched_execute as execute:
foo = Foo.cache.get_by(age=1)
self.assertIsNotNone(foo)
self.assertTrue(execute.called)
with patched_execute as execute:
foo = Foo.cache.get_by(key='aaa')
self.assertIsNone(foo)
self.assertTrue(execute.called)
with patched_execute as execute:
foo = Foo.cache.get_by(key='aaa')
self.assertIsNone(foo)
self.assertFalse(execute.called)
Foo.create(key='aaa')
with patched_execute as execute:
foo = Foo.cache.get_by(key='aaa')
self.assertIsNotNone(foo)
self.assertTrue(execute.called)
bar = Bar.create(name='a', xixi='a', age=1)
with patched_execute as execute:
_bar = Bar.cache.get_by(xixi='b', age=1)
self.assertIsNone(_bar)
self.assertTrue(execute.called)
with patched_execute as execute:
_bar = Bar.cache.get_by(xixi='b', age=1)
self.assertIsNone(_bar)
self.assertFalse(execute.called)
with patched_execute as execute:
_bar = Bar.cache.get_by(xixi='a', age=1)
self.assertEqual(bar.name, _bar.name)
self.assertTrue(execute.called)
with patched_execute as execute:
_bar = Bar.cache.get_by(xixi='a', age=1)
self.assertEqual(bar.name, _bar.name)
self.assertFalse(execute.called)
bar = Bar.create(name='ab', xixi='ab', age=1, word='1')
with patched_execute as execute:
_bar = Bar.cache.get_by(xixi='ab', age=1, word='1')
self.assertEqual(bar.word, _bar.word)
self.assertTrue(execute.called)
with patched_execute as execute:
_bar = Bar.cache.get_by(xixi='ab', age=1, word='1')
self.assertEqual(bar.word, _bar.word)
self.assertTrue(execute.called)
def test_uk_update(self):
with patched_execute as execute:
foo = Foo.cache.get_by(name='170331', age=1)
self.assertIsNone(foo)
self.assertTrue(execute.called)
with patched_execute as execute:
foo = Foo.cache.get_by(name='170331', age=1)
self.assertIsNone(foo)
self.assertFalse(execute.called)
foo = Foo.create(name='abc', age=1)
foo.update(name='170331')
with patched_execute as execute:
foo = Foo.cache.get_by(name='170331', age=1)
self.assertIsNotNone(foo)
self.assertTrue(execute.called)
with patched_execute as execute:
foo = Foo.cache.get_by(name='170331', age=1)
self.assertIsNotNone(foo)
self.assertFalse(execute.called)
def test_gets_by(self):
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='a', age=1)
self.assertEqual(bars, [])
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='a', age=1)
self.assertEqual(bars, [])
self.assertFalse(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='a', age=1, limit=10)
self.assertEqual(bars, [])
self.assertFalse(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='a', age=1, limit=11)
self.assertEqual(bars, [])
self.assertFalse(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(limit=10)
self.assertEqual(bars, [])
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(limit=11)
self.assertEqual(bars, [])
self.assertFalse(execute.called)
bar = Bar.create(name='a', xixi='a', age=1)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='a', age=1, limit=11)
self.assertEqual(len(bars), 1)
self.assertTrue(execute.called)
self.assertEqual(execute.call_count, 2)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='a', age=1, limit=11)
self.assertEqual(len(bars), 1)
self.assertFalse(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(limit=10)
self.assertEqual(len(bars), 1)
self.assertTrue(execute.called)
bar.update(name='a+')
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='a', age=1, limit=11)
self.assertEqual(len(bars), 1)
self.assertTrue(execute.called)
self.assertEqual(execute.call_count, 2)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='a', age=1, limit=11)
self.assertEqual(len(bars), 1)
self.assertFalse(execute.called)
bar.update(name='a')
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='a', age=1, limit=11)
self.assertEqual(len(bars), 1)
self.assertTrue(execute.called)
self.assertEqual(execute.call_count, 2)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='a', age=1, limit=11)
self.assertEqual(len(bars), 1)
self.assertFalse(execute.called)
bar.update(word='1')
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='a', age=1, limit=11)
self.assertEqual(len(bars), 1)
self.assertTrue(execute.called)
self.assertEqual(execute.call_count, 1)
self.assertEqual(bars[0].word, bar.word)
bar.update(word='2')
Bar.cache.get(bar.name)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='a', age=1, limit=11)
self.assertEqual(len(bars), 1)
self.assertFalse(execute.called)
self.assertEqual(bars[0].word, bar.word)
bar.update(xixi='b')
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='a', age=1, limit=11)
self.assertEqual(len(bars), 0)
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='a', age=1, limit=11)
self.assertEqual(len(bars), 0)
self.assertFalse(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, limit=11)
self.assertEqual(len(bars), 1)
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, limit=11)
self.assertEqual(len(bars), 1)
self.assertFalse(execute.called)
bar = Bar.create(name='b', xixi='b', age=1)
bar = Bar.create(name='c', xixi='b', age=1)
bar = Bar.create(name='d', xixi='b', age=1)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, limit=11)
self.assertEqual(len(bars), 4)
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, limit=11)
self.assertEqual(len(bars), 4)
self.assertFalse(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1)
self.assertEqual(len(bars), 4)
self.assertFalse(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=1)
self.assertEqual(len(bars), 3)
self.assertFalse(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, limit=2)
self.assertEqual(len(bars), 2)
self.assertFalse(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=3,
limit=2)
self.assertEqual(len(bars), 1)
self.assertFalse(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1,
limit=Bar.cache.MAX_COUNT + 1)
self.assertEqual(len(bars), 4)
self.assertFalse(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1,
limit=Bar.cache.MAX_COUNT + 1)
self.assertEqual(len(bars), 4)
self.assertFalse(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=3,
limit=2, order_by='xixi')
self.assertEqual(len(bars), 1)
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=3,
limit=2, order_by='xixi')
self.assertEqual(len(bars), 1)
self.assertFalse(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=3,
limit=2, order_by='age')
self.assertEqual(len(bars), 1)
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=3,
limit=2, order_by='age')
self.assertEqual(len(bars), 1)
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1,
limit=3, order_by='-name')
self.assertEqual(len(bars), 3)
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1,
limit=3, order_by='-name')
self.assertEqual(len(bars), 3)
            self.assertEqual(['d', 'c', 'b'], [x.name for x in bars])
self.assertFalse(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1,
limit=3, order_by='name')
self.assertEqual(len(bars), 3)
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1,
limit=3, order_by='name')
self.assertEqual(len(bars), 3)
            self.assertEqual(['a', 'b', 'c'], [x.name for x in bars])
self.assertFalse(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=3,
limit=2, order_by=('-age', 'xixi'))
self.assertEqual(len(bars), 1)
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=3,
limit=2, order_by=('-age', 'xixi'))
self.assertEqual(len(bars), 1)
self.assertFalse(execute.called)
with patched_execute as execute:
with auto_use_cache_ctx(Bar):
bars = Bar.gets_by(xixi='b', age=1, start=3,
limit=2, order_by=('-age', 'xixi'))
self.assertEqual(len(bars), 1)
self.assertFalse(execute.called)
_bar = bars[0]
_bar.update(xixi='c')
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=2,
limit=2, order_by=('-age', 'xixi'))
self.assertEqual(len(bars), 1)
self.assertTrue(execute.called)
_bar.update(xixi='e')
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=2,
order_by=('-age', 'xixi'))
self.assertEqual(len(bars), 1)
self.assertFalse(execute.called)
_bar.update(xixi='b')
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=3,
limit=2, order_by=('xixi', 'age'))
self.assertEqual(len(bars), 1)
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=3,
limit=2, order_by=('xixi', 'age'))
self.assertEqual(len(bars), 1)
self.assertFalse(execute.called)
_bar.update(xixi='e')
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=2,
order_by=('-age', 'xixi'))
self.assertEqual(len(bars), 1)
self.assertTrue(execute.called)
Bar.create(name='e', xixi='b', age=1)
Bar.create(name='f', xixi='b', age=1)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=3,
limit=2, order_by=('xixi', 'age'))
self.assertEqual(len(bars), 2)
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=3,
limit=2, order_by=['xixi', 'age'])
self.assertEqual(len(bars), 2)
self.assertFalse(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(name='e')
self.assertEqual(len(bars), 1)
self.assertFalse(execute.called)
Foo.create(name='1', age=1)
Foo.create(name='2', age=1)
Foo.create(name='3', age=2)
with no_pk(Foo):
Foo.cache.gets_by(age=1, limit=3)
foos = Foo.cache.gets_by(age=3, limit=3)
self.assertEqual(foos, [])
# test unique key
foos = Foo.cache.gets_by(name=1, age=1)
self.assertEqual(len(foos), 1)
foos = Foo.cache.gets_by(name=100, age=1)
self.assertEqual(foos, [])
def test_gets_by_with_order_by(self):
b0 = Bar.create(name='e', xixi='b', age=1)
b1 = Bar.create(name='f', xixi='a', age=1)
with patched_execute as execute:
bars = Bar.cache.gets_by(age=1, order_by=('xixi', 'age'))
self.assertEqual(bars, [b1, b0])
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(age=1, order_by=['xixi', 'age'])
self.assertEqual(bars, [b1, b0])
self.assertFalse(execute.called)
b1.update(xixi='c')
with patched_execute as execute:
bars = Bar.cache.gets_by(age=1, order_by=['xixi', 'age'])
self.assertEqual(bars, [b0, b1])
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(age=1, order_by=['xixi', 'age'])
self.assertEqual(bars, [b0, b1])
self.assertFalse(execute.called)
def test_gets_by_missing_value(self):
Bar.create(name='b', xixi='b', age=1)
Bar.create(name='c', xixi='b', age=1)
Bar.create(name='d', xixi='b', age=1)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=missing)
self.assertEqual(len(bars), 3)
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=missing)
self.assertEqual(len(bars), 3)
self.assertFalse(execute.called)
def test_gets_by_over_limit(self):
max_count = Bar.cache.MAX_COUNT
Bar.create(name='b', xixi='b', age=1)
Bar.create(name='c', xixi='b', age=1)
Bar.create(name='d', xixi='b', age=1)
Bar.cache.MAX_COUNT = 2
try:
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1)
self.assertEqual(len(bars), 3)
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1)
self.assertEqual(len(bars), 3)
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, limit=2)
self.assertEqual(len(bars), 2)
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, limit=2)
self.assertEqual(len(bars), 2)
self.assertFalse(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=1)
self.assertEqual(len(bars), 3)
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=1)
self.assertEqual(len(bars), 3)
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=3)
self.assertEqual(len(bars), 3)
self.assertTrue(execute.called)
with patched_execute as execute:
bars = Bar.cache.gets_by(xixi='b', age=1, start=3)
self.assertEqual(len(bars), 3)
self.assertTrue(execute.called)
finally:
Bar.cache.MAX_COUNT = max_count
def test_count_by(self):
with patched_execute as execute:
c = Bar.cache.count_by(xixi='a', age=1)
self.assertEqual(c, 0)
self.assertTrue(execute.called)
with patched_execute as execute:
c = Bar.cache.count_by(xixi='a', age=1)
self.assertEqual(c, 0)
self.assertFalse(execute.called)
with patched_execute as execute:
c = Bar.cache.count_by(xixi='b', age=1)
self.assertEqual(c, 0)
self.assertTrue(execute.called)
with patched_execute as execute:
c = Bar.cache.count_by(xixi='b', age=1)
self.assertEqual(c, 0)
self.assertFalse(execute.called)
with patched_execute as execute:
c = Bar.cache.count_by()
self.assertEqual(c, 0)
self.assertTrue(execute.called)
with patched_execute as execute:
c = Bar.cache.count_by()
self.assertEqual(c, 0)
self.assertFalse(execute.called)
with patched_execute as execute:
c = Bar.cache.count_by(name='a')
self.assertEqual(c, 0)
self.assertTrue(execute.called)
with patched_execute as execute:
c = Bar.cache.count_by(name='a')
self.assertEqual(c, 0)
self.assertFalse(execute.called)
with patched_execute as execute:
c = Bar.cache.count_by(word='a')
self.assertEqual(c, 0)
self.assertTrue(execute.called)
with patched_execute as execute:
c = Bar.cache.count_by(word='a')
self.assertEqual(c, 0)
self.assertTrue(execute.called)
Bar.create(name='a', xixi='b', age=1)
with patched_execute as execute:
c = Bar.cache.count_by(xixi='a', age=1)
self.assertEqual(c, 0)
self.assertFalse(execute.called)
with patched_execute as execute:
c = Bar.cache.count_by(xixi='b', age=1)
self.assertEqual(c, 1)
self.assertTrue(execute.called)
with patched_execute as execute:
c = Bar.cache.count_by()
self.assertEqual(c, 1)
self.assertTrue(execute.called)
with patched_execute as execute:
c = Bar.cache.count_by(name='a')
self.assertEqual(c, 1)
self.assertTrue(execute.called)
Bar.create(name='b', xixi='a', age=1)
with patched_execute as execute:
c = Bar.cache.count_by(xixi='a', age=1)
self.assertEqual(c, 1)
self.assertTrue(execute.called)
with patched_execute as execute:
c = Bar.cache.count_by(xixi='b', age=1)
self.assertEqual(c, 1)
self.assertFalse(execute.called)
bar = Bar.create(name='c', xixi='b', age=1)
with patched_execute as execute:
c = Bar.cache.count_by(xixi='b', age=1)
self.assertEqual(c, 2)
self.assertTrue(execute.called)
with patched_execute as execute:
c = Bar.cache.count_by(xixi='b', age=1)
self.assertEqual(c, 2)
self.assertFalse(execute.called)
bar.update(xixi='c')
with patched_execute as execute:
c = Bar.cache.count_by(xixi='b', age=1)
self.assertEqual(c, 1)
self.assertTrue(execute.called)
with patched_execute as execute:
c = Bar.cache.count_by(xixi='b', age=1)
self.assertEqual(c, 1)
self.assertFalse(execute.called)
def test_create_cache(self):
bar = Bar.create(name='b', xixi='a', age=1)
with no_cache_client(Bar):
create_cache(bar)
create_cache(bar)
def test_add_handler(self):
bar = Bar.create(name='b', xixi='a', age=1)
with db.transaction():
Bar.cache.add_handler(bar)
Bar.cache.add_handler(None)
def test_build_report_miss_msg(self):
msg = Bar.cache._build_report_miss_msg('get_by', 1)
self.assertEqual(msg, 'Miss cache method invocation: `Bar.get_by(1)`') # noqa
msg = Bar.cache._build_report_miss_msg('get_by', c=1)
self.assertEqual(msg, 'Miss cache method invocation: `Bar.get_by(c=1)`') # noqa
msg = Bar.cache._build_report_miss_msg('get_by', 1, c=1, a=2)
self.assertEqual(msg, 'Miss cache method invocation: `Bar.get_by(1, a=2, c=1)`') # noqa
def test_before_create_bug(self):
class _Lala(Lala):
__table_name__ = 'lala'
def before_create(self):
self.age = 2
l = _Lala.create(name='a')
self.assertEqual(l.age, 2)
l = _Lala.get(l.id)
self.assertEqual(l.age, 2)
| 41.473939 | 96 | 0.56193 | 4,216 | 34,216 | 4.451613 | 0.031309 | 0.113491 | 0.123721 | 0.137468 | 0.92711 | 0.9099 | 0.89903 | 0.886243 | 0.871217 | 0.850224 | 0 | 0.019375 | 0.316665 | 34,216 | 824 | 97 | 41.524272 | 0.783328 | 0.001286 | 0 | 0.816537 | 0 | 0 | 0.02321 | 0 | 0 | 0 | 0 | 0 | 0.397933 | 1 | 0.021964 | false | 0.005168 | 0.00646 | 0 | 0.031008 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1507122bb082b463a6302896530bfbee7b5eaf5e | 16,543 | py | Python | data/level/level1090.py | levelupai/match3-level-similarity | cc9b28b8741b41bea1273c8bc9b4d265d79a1dca | [
"Apache-2.0"
] | null | null | null | data/level/level1090.py | levelupai/match3-level-similarity | cc9b28b8741b41bea1273c8bc9b4d265d79a1dca | [
"Apache-2.0"
] | 6 | 2020-07-04T02:53:08.000Z | 2022-03-11T23:53:14.000Z | data/level/level1090.py | levelupai/match3-level-similarity | cc9b28b8741b41bea1273c8bc9b4d265d79a1dca | [
"Apache-2.0"
] | 3 | 2019-12-31T11:42:59.000Z | 2021-03-28T20:06:13.000Z | data = {
'level_index': 1090,
'move_count': 41,
'board_info': {
(0, 2): {
'background_number': 41,
'bg_number': 41,
'next': (0, 1),
'prev': (0, -1)
},
(0, 3): {
'background_number': 41,
'bg_number': 41,
'next': (0, 1),
'prev': (0, -1)
},
(0, 4): {
'background_number': 41,
'bg_number': 41,
'next': (0, 1),
'prev': (0, -1)
},
(0, 5): {
'background_number': 41,
'bg_number': 41,
'next': (0, 1),
'prev': (0, -1)
},
(0, 6): {
'background_number': 41,
'bg_number': 41,
'next': (0, 1),
'prev': (0, -1)
},
(0, 10): {
'base': (2, 1),
'element_number': 2,
'next': (0, 1),
'prev': (0, -1)
},
(0, 11): {
'base': (1, 1),
'element_number': 1,
'next': (0, 1),
'prev': (0, -1)
},
(0, 12): {
'base': (5, 1),
'element_number': 5,
'next': (0, 1),
'prev': (0, -1)
},
(0, 13): {
'base': (5, 1),
'element_number': 5,
'next': (0, 1),
'prev': (0, -1)
},
(0, 14): {
'bg_number': 41,
'cover': (60, 1),
'cover_level': 1,
'cover_number': 60,
'next': (0, 1),
'prev': (0, -1)
},
(0, 15): {
'background_number': 41,
'bg_number': 41,
'next': (0, 1),
'prev': (0, -1)
},
(0, 16): {
'background_number': 41,
'bg_number': 41,
'next': (0, 1),
'prev': (0, -1)
},
(0, 17): {
'background_number': 41,
'bg_number': 41,
'next': (0, 1),
'prev': (0, -1)
},
(1, 1): {
'cover': (60, 1),
'cover_level': 1,
'cover_number': 60,
'next': (0, 1),
'prev': (0, -1)
},
(1, 2): {
'cover': (60, 1),
'cover_level': 1,
'cover_number': 60,
'next': (0, 1),
'prev': (0, -1)
},
(1, 3): {
'cover': (60, 1),
'cover_level': 1,
'cover_number': 60,
'next': (0, 1),
'prev': (0, -1)
},
(1, 4): {
'cover': (60, 1),
'cover_level': 1,
'cover_number': 60,
'next': (0, 1),
'prev': (0, -1)
},
(1, 5): {
'cover': (60, 1),
'cover_level': 1,
'cover_number': 60,
'next': (0, 1),
'prev': (0, -1)
},
(1, 6): {
'cover': (60, 1),
'cover_level': 1,
'cover_number': 60,
'next': (0, 1),
'prev': (0, -1)
},
(1, 7): {
'cover': (60, 1),
'cover_level': 1,
'cover_number': 60,
'next': (0, 1),
'prev': (0, -1)
},
(1, 9): {
'base': (6, 1),
'element_number': 6,
'next': (0, 1),
'prev': (0, -1)
},
(1, 10): {
'base': (1, 1),
'element_number': 1,
'next': (0, 1),
'prev': (0, -1)
},
(1, 11): {
'base': (6, 1),
'element_number': 6,
'next': (0, 1),
'prev': (0, -1)
},
(1, 12): {
'base': (1, 1),
'element_number': 1,
'next': (0, 1),
'prev': (0, -1)
},
(1, 13): {
'base': (6, 1),
'element_number': 6,
'next': (0, 1),
'prev': (0, -1)
},
(2, 0): {
'base': (1, 1),
'element_number': 1,
'fall_point': (0, -1),
'next': (0, 1),
'prev': (0, -1)
},
(2, 1): {
'base': (4, 1),
'element_number': 4,
'next': (0, 1),
'prev': (0, -1)
},
(2, 2): {
'base': (1, 1),
'element_number': 1,
'next': (0, 1),
'prev': (0, -1)
},
(2, 3): {
'base': (5, 1),
'element_number': 5,
'next': (0, 1),
'prev': (0, -1)
},
(2, 4): {
'base': (1, 1),
'element_number': 1,
'next': (0, 1),
'prev': (0, -1)
},
(2, 5): {
'base': (4, 1),
'element_number': 4,
'next': (0, 1),
'prev': (0, -1)
},
(2, 6): {
'base': (2, 1),
'element_number': 2,
'next': (0, 1),
'prev': (0, -1)
},
(2, 7): {
'base': (6, 1),
'element_number': 6,
'next': (0, 1),
'prev': (0, -1)
},
(2, 8): {
'base': (5, 1),
'element_number': 5,
'next': (0, 1),
'prev': (0, -1)
},
(2, 9): {
'base': (6, 1),
'element_number': 6,
'next': (0, 1),
'prev': (0, -1)
},
(2, 10): {
'base': (6, 1),
'element_number': 6,
'next': (0, 1),
'prev': (0, -1)
},
(2, 11): {
'base': (2, 1),
'element_number': 2,
'next': (0, 1),
'prev': (0, -1)
},
(2, 12): {
'base': (6, 1),
'element_number': 6,
'next': (0, 1),
'prev': (0, -1)
},
(2, 13): {
'base': (5, 1),
'element_number': 5,
'next': (0, 1),
'prev': (0, -1)
},
(2, 14): {
'cover': (60, 1),
'cover_level': 1,
'cover_number': 60,
'next': (0, 1),
'prev': (0, -1),
'bg_number': 41
},
(2, 15): {
'next': (0, 1),
'prev': (0, -1),
'bg_number': 41
},
(2, 16): {
'next': (0, 1),
'prev': (0, -1),
'bg_number': 41
},
(2, 17): {
'next': (0, 1),
'prev': (0, -1),
'bg_number': 41
},
(3, 0): {
'base': (5, 1),
'element_number': 5,
'fall_point': (0, -1),
'next': (0, 1),
'prev': (0, -1)
},
(3, 1): {
'base': (1, 1),
'element_number': 1,
'next': (0, 1),
'prev': (0, -1)
},
(3, 2): {
'base': (4, 1),
'element_number': 4,
'next': (0, 1),
'prev': (0, -1)
},
(3, 3): {
'base': (1, 1),
'element_number': 1,
'next': (0, 1),
'prev': (0, -1)
},
(3, 4): {
'base': (6, 1),
'element_number': 6,
'next': (0, 1),
'prev': (0, -1)
},
(3, 5): {
'base': (1, 1),
'element_number': 1,
'next': (0, 1),
'prev': (0, -1)
},
(3, 6): {
'base': (2, 1),
'element_number': 2,
'next': (0, 1),
'prev': (0, -1)
},
(3, 7): {
'base': (4, 1),
'element_number': 4,
'next': (0, 1),
'prev': (0, -1)
},
(3, 8): {
'base': (5, 1),
'element_number': 5,
'next': (0, 1),
'prev': (0, -1)
},
(3, 9): {
'base': (2, 1),
'element_number': 2,
'next': (0, 1),
'prev': (0, -1)
},
(3, 10): {
'base': (1, 1),
'element_number': 1,
'next': (0, 1),
'prev': (0, -1)
},
(3, 11): {
'base': (2, 1),
'element_number': 2,
'next': (0, 1),
'prev': (0, -1)
},
(3, 12): {
'base': (6, 1),
'element_number': 6,
'next': (0, 1),
'prev': (0, -1)
},
(3, 13): {
'base': (6, 1),
'element_number': 6,
'next': (0, 1),
'prev': (0, -1)
},
(4, 0): {
'base': (6, 1),
'element_number': 6,
'fall_point': (0, -1),
'next': (0, 1),
'prev': (0, -1)
},
(4, 1): {
'base': (6, 1),
'element_number': 6,
'next': (0, 1),
'prev': (0, -1)
},
(4, 2): {
'base': (4, 1),
'element_number': 4,
'next': (0, 1),
'prev': (0, -1)
},
(4, 3): {
'base': (5, 1),
'element_number': 5,
'next': (0, 1),
'prev': (0, -1)
},
(4, 4): {
'base': (1, 1),
'element_number': 1,
'next': (0, 1),
'prev': (0, -1)
},
(4, 5): {
'base': (2, 1),
'element_number': 2,
'next': (0, 1),
'prev': (0, -1)
},
(4, 6): {
'base': (5, 1),
'element_number': 5,
'next': (0, 1),
'prev': (0, -1)
},
(4, 7): {
'base': (1, 1),
'element_number': 1,
'next': (0, 1),
'prev': (0, -1)
},
(4, 8): {
'base': (1, 1),
'element_number': 1,
'next': (0, 1),
'prev': (0, -1)
},
(4, 9): {
'base': (5, 1),
'element_number': 5,
'next': (0, 1),
'prev': (0, -1)
},
(4, 10): {
'base': (5, 1),
'element_number': 5,
'next': (0, 1),
'prev': (0, -1)
},
(4, 11): {
'base': (4, 1),
'element_number': 4,
'next': (0, 1),
'prev': (0, -1)
},
(4, 12): {
'base': (1, 1),
'element_number': 1,
'next': (0, 1),
'prev': (0, -1)
},
(4, 13): {
'base': (6, 1),
'element_number': 6,
'next': (0, 1),
'prev': (0, -1)
},
(4, 14): {
'cover': (60, 1),
'cover_level': 1,
'cover_number': 60,
'next': (0, 1),
'prev': (0, -1),
'bg_number': 41
},
(4, 15): {
'next': (0, 1),
'prev': (0, -1),
'bg_number': 41
},
(4, 16): {
'next': (0, 1),
'prev': (0, -1),
'bg_number': 41
},
(4, 17): {
'next': (0, 1),
'prev': (0, -1),
'bg_number': 41
},
(5, 1): {
'cover': (60, 1),
'cover_level': 1,
'cover_number': 60,
'next': (0, 1),
'prev': (0, -1)
},
(5, 2): {
'cover': (60, 1),
'cover_level': 1,
'cover_number': 60,
'next': (0, 1),
'prev': (0, -1)
},
(5, 3): {
'cover': (60, 1),
'cover_level': 1,
'cover_number': 60,
'next': (0, 1),
'prev': (0, -1)
},
(5, 4): {
'cover': (60, 1),
'cover_level': 1,
'cover_number': 60,
'next': (0, 1),
'prev': (0, -1)
},
(5, 5): {
'cover': (60, 1),
'cover_level': 1,
'cover_number': 60,
'next': (0, 1),
'prev': (0, -1)
},
(5, 6): {
'cover': (60, 1),
'cover_level': 1,
'cover_number': 60,
'next': (0, 1),
'prev': (0, -1)
},
(5, 7): {
'cover': (60, 1),
'cover_level': 1,
'cover_number': 60,
'next': (0, 1),
'prev': (0, -1)
},
(5, 9): {
'base': (1, 1),
'element_number': 1,
'next': (0, 1),
'prev': (0, -1)
},
(5, 10): {
'base': (5, 1),
'element_number': 5,
'next': (0, 1),
'prev': (0, -1)
},
(5, 11): {
'base': (5, 1),
'element_number': 5,
'next': (0, 1),
'prev': (0, -1)
},
(5, 12): {
'base': (2, 1),
'element_number': 2,
'next': (0, 1),
'prev': (0, -1)
},
(5, 13): {
'base': (5, 1),
'element_number': 5,
'next': (0, 1),
'prev': (0, -1)
},
(6, 2): {
'background_number': 41,
'bg_number': 41,
'next': (0, 1),
'prev': (0, -1)
},
(6, 3): {
'background_number': 41,
'bg_number': 41,
'next': (0, 1),
'prev': (0, -1)
},
(6, 4): {
'background_number': 41,
'bg_number': 41,
'next': (0, 1),
'prev': (0, -1)
},
(6, 5): {
'background_number': 41,
'bg_number': 41,
'next': (0, 1),
'prev': (0, -1)
},
(6, 6): {
'background_number': 41,
'bg_number': 41,
'next': (0, 1),
'prev': (0, -1)
},
(6, 10): {
'base': (1, 1),
'element_number': 1,
'next': (0, 1),
'prev': (0, -1)
},
(6, 11): {
'base': (6, 1),
'element_number': 6,
'next': (0, 1),
'prev': (0, -1)
},
(6, 12): {
'base': (5, 1),
'element_number': 5,
'next': (0, 1),
'prev': (0, -1)
},
(6, 13): {
'base': (5, 1),
'element_number': 5,
'next': (0, 1),
'prev': (0, -1)
},
(6, 14): {
'cover': (60, 1),
'cover_level': 1,
'cover_number': 60,
'next': (0, 1),
'prev': (0, -1),
'bg_number': 41
},
(6, 15): {
'background_number': 41,
'bg_number': 41,
'next': (0, 1),
'prev': (0, -1)
},
(6, 16): {
'background_number': 41,
'bg_number': 41,
'next': (0, 1),
'prev': (0, -1)
},
(6, 17): {
'background_number': 41,
'bg_number': 41,
'next': (0, 1),
'prev': (0, -1)
},
(5, 17): {
'prev': (0, -1),
'next': (0, 1),
'bg_number': 41
},
(5, 16): {
'prev': (0, -1),
'next': (0, 1),
'bg_number': 41
},
(5, 14): {
'prev': (0, -1),
'next': (0, 1),
'bg_number': 41,
'cover': (60, 1)
},
(5, 15): {
'prev': (0, -1),
'next': (0, 1),
'bg_number': 41
},
(1, 14): {
'prev': (0, -1),
'next': (0, 1),
'bg_number': 41,
'cover': (60, 1)
},
(1, 15): {
'prev': (0, -1),
'next': (0, 1),
'bg_number': 41
},
(1, 16): {
'prev': (0, -1),
'next': (0, 1),
'bg_number': 41
},
(1, 17): {
'prev': (0, -1),
'next': (0, 1),
'bg_number': 41
}
},
'trans_info': {
(0, 0): {
41: 41
},
(0, 9): {
41: 33
}
},
'max_info': [
7,
18
]
}
| 24.327941 | 36 | 0.256302 | 1,577 | 16,543 | 2.590996 | 0.026633 | 0.107195 | 0.15859 | 0.244738 | 0.973568 | 0.973568 | 0.956192 | 0.9535 | 0.952766 | 0.952766 | 0 | 0.145226 | 0.533398 | 16,543 | 679 | 37 | 24.36377 | 0.384117 | 0 | 0 | 0.655376 | 0 | 0 | 0.188297 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
12fc0f12a0d888a04b37f8a72f2316d56ac94ada | 22,748 | py | Python | PNN/author/models.py | suhuating/ML_CIA | 3240cd0b1dec37aade6aacca93fb42dcc68cf01e | [
"MIT"
] | 572 | 2018-05-10T10:09:09.000Z | 2022-03-30T08:04:23.000Z | PNN/author/models.py | juli25/ML_CIA | 37838eb655d3e432393cee7dda11ea693217eb42 | [
"MIT"
] | 5 | 2018-08-10T01:56:48.000Z | 2020-01-20T07:15:51.000Z | PNN/author/models.py | juli25/ML_CIA | 37838eb655d3e432393cee7dda11ea693217eb42 | [
"MIT"
] | 290 | 2018-05-22T01:39:09.000Z | 2022-03-09T11:25:52.000Z | from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import sys
if sys.version[0] == '2':
import cPickle as pkl
else:
import pickle as pkl
import numpy as np
import tensorflow as tf
from PNN.author import utils
dtype = utils.DTYPE
class Model:
def __init__(self):
self.sess = None
self.X = None
self.y = None
self.layer_keeps = None
self.vars = None
self.keep_prob_train = None
self.keep_prob_test = None
def run(self, fetches, X=None, y=None, mode='train'):
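        # Builds the feed dict from X (a list of per-field sparse inputs or a single
        # sparse input), optional labels y, and the dropout keep probabilities, then
        # evaluates `fetches` in the session. A typical call might look like this
        # (the batch variables here are hypothetical):
        #   loss_val, _ = model.run([model.loss, model.optimizer], X_batch, y_batch, mode='train')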
feed_dict = {}
if type(self.X) is list:
for i in range(len(X)):
feed_dict[self.X[i]] = X[i]
else:
feed_dict[self.X] = X
if y is not None:
feed_dict[self.y] = y
if self.layer_keeps is not None:
if mode == 'train':
feed_dict[self.layer_keeps] = self.keep_prob_train
elif mode == 'test':
feed_dict[self.layer_keeps] = self.keep_prob_test
return self.sess.run(fetches, feed_dict)
def dump(self, model_path):
var_map = {}
        for name, var in self.vars.items():
var_map[name] = self.run(var)
pkl.dump(var_map, open(model_path, 'wb'))
print('model dumped at', model_path)
class LR(Model):
def __init__(self, input_dim=None, output_dim=1, init_path=None, opt_algo='gd', learning_rate=1e-2, l2_weight=0,
random_seed=None):
Model.__init__(self)
init_vars = [('w', [input_dim, output_dim], 'xavier', dtype),
('b', [output_dim], 'zero', dtype)]
self.graph = tf.Graph()
with self.graph.as_default():
if random_seed is not None:
tf.set_random_seed(random_seed)
self.X = tf.sparse_placeholder(dtype)
self.y = tf.placeholder(dtype)
self.vars = utils.init_var_map(init_vars, init_path)
w = self.vars['w']
b = self.vars['b']
xw = tf.sparse_tensor_dense_matmul(self.X, w)
logits = tf.reshape(xw + b, [-1])
self.y_prob = tf.sigmoid(logits)
self.loss = tf.reduce_mean(
tf.nn.sigmoid_cross_entropy_with_logits(labels=self.y, logits=logits)) + \
l2_weight * tf.nn.l2_loss(xw)
self.optimizer = utils.get_optimizer(opt_algo, learning_rate, self.loss)
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
self.sess = tf.Session(config=config)
tf.global_variables_initializer().run(session=self.sess)
class FM(Model):
def __init__(self, input_dim=None, output_dim=1, factor_order=10, init_path=None, opt_algo='gd', learning_rate=1e-2,
l2_w=0, l2_v=0, random_seed=None):
Model.__init__(self)
init_vars = [('w', [input_dim, output_dim], 'xavier', dtype),
('v', [input_dim, factor_order], 'xavier', dtype),
('b', [output_dim], 'zero', dtype)]
self.graph = tf.Graph()
with self.graph.as_default():
if random_seed is not None:
tf.set_random_seed(random_seed)
self.X = tf.sparse_placeholder(dtype)
self.y = tf.placeholder(dtype)
self.vars = utils.init_var_map(init_vars, init_path)
w = self.vars['w']
v = self.vars['v']
b = self.vars['b']
X_square = tf.SparseTensor(self.X.indices, tf.square(self.X.values), tf.to_int64(tf.shape(self.X)))
xv = tf.square(tf.sparse_tensor_dense_matmul(self.X, v))
p = 0.5 * tf.reshape(
tf.reduce_sum(xv - tf.sparse_tensor_dense_matmul(X_square, tf.square(v)), 1),
[-1, output_dim])
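            # The pairwise term above follows the standard FM identity
            #   sum_{i<j} <v_i, v_j> x_i x_j
            #     = 0.5 * sum_f [ (sum_i v_{i,f} x_i)^2 - sum_i (v_{i,f} x_i)^2 ],
            # where xv holds the squared first sum and the X_square matmul gives the second.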
xw = tf.sparse_tensor_dense_matmul(self.X, w)
logits = tf.reshape(xw + b + p, [-1])
self.y_prob = tf.sigmoid(logits)
self.loss = tf.reduce_mean(
tf.nn.sigmoid_cross_entropy_with_logits(logits=logits, labels=self.y)) + \
l2_w * tf.nn.l2_loss(xw) + \
l2_v * tf.nn.l2_loss(xv)
self.optimizer = utils.get_optimizer(opt_algo, learning_rate, self.loss)
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
self.sess = tf.Session(config=config)
tf.global_variables_initializer().run(session=self.sess)
class FNN(Model):
def __init__(self, field_sizes=None, embed_size=10, layer_sizes=None, layer_acts=None, drop_out=None,
embed_l2=None, layer_l2=None, init_path=None, opt_algo='gd', learning_rate=1e-2, random_seed=None):
Model.__init__(self)
init_vars = []
num_inputs = len(field_sizes)
for i in range(num_inputs):
init_vars.append(('embed_%d' % i, [field_sizes[i], embed_size], 'xavier', dtype))
node_in = num_inputs * embed_size
for i in range(len(layer_sizes)):
init_vars.append(('w%d' % i, [node_in, layer_sizes[i]], 'xavier', dtype))
init_vars.append(('b%d' % i, [layer_sizes[i]], 'zero', dtype))
node_in = layer_sizes[i]
self.graph = tf.Graph()
with self.graph.as_default():
if random_seed is not None:
tf.set_random_seed(random_seed)
self.X = [tf.sparse_placeholder(dtype) for i in range(num_inputs)]
self.y = tf.placeholder(dtype)
self.keep_prob_train = 1 - np.array(drop_out)
self.keep_prob_test = np.ones_like(drop_out)
self.layer_keeps = tf.placeholder(dtype)
self.vars = utils.init_var_map(init_vars, init_path)
w0 = [self.vars['embed_%d' % i] for i in range(num_inputs)]
xw = tf.concat([tf.sparse_tensor_dense_matmul(self.X[i], w0[i]) for i in range(num_inputs)], 1)
l = xw
for i in range(len(layer_sizes)):
wi = self.vars['w%d' % i]
bi = self.vars['b%d' % i]
print(l.shape, wi.shape, bi.shape)
l = tf.nn.dropout(
utils.activate(
tf.matmul(l, wi) + bi,
layer_acts[i]),
self.layer_keeps[i])
l = tf.squeeze(l)
self.y_prob = tf.sigmoid(l)
self.loss = tf.reduce_mean(
tf.nn.sigmoid_cross_entropy_with_logits(logits=l, labels=self.y))
if layer_l2 is not None:
self.loss += embed_l2 * tf.nn.l2_loss(xw)
for i in range(len(layer_sizes)):
wi = self.vars['w%d' % i]
self.loss += layer_l2[i] * tf.nn.l2_loss(wi)
self.optimizer = utils.get_optimizer(opt_algo, learning_rate, self.loss)
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
self.sess = tf.Session(config=config)
tf.global_variables_initializer().run(session=self.sess)
class DeepFM(Model):
def __init__(self, field_sizes=None, embed_size=10, layer_sizes=None, layer_acts=None, drop_out=None,
embed_l2=None, layer_l2=None, init_path=None, opt_algo='gd', learning_rate=1e-2, random_seed=None):
Model.__init__(self)
init_vars = []
num_inputs = len(field_sizes)
for i in range(num_inputs):
init_vars.append(('embed_%d' % i, [field_sizes[i], embed_size], 'xavier', dtype))
init_vars.append(('weight_%d' % i, [field_sizes[i], 1], 'xavier', dtype))
init_vars.append(('bias', [1], 'zero', dtype))
node_in = num_inputs * embed_size
for i in range(len(layer_sizes)):
init_vars.append(('w%d' % i, [node_in, layer_sizes[i]], 'xavier', dtype))
init_vars.append(('b%d' % i, [layer_sizes[i]], 'zero', dtype))
node_in = layer_sizes[i]
self.graph = tf.Graph()
with self.graph.as_default():
if random_seed is not None:
tf.set_random_seed(random_seed)
self.X = [tf.sparse_placeholder(dtype) for i in range(num_inputs)]
self.y = tf.placeholder(dtype)
self.keep_prob_train = 1 - np.array(drop_out)
self.keep_prob_test = np.ones_like(drop_out)
self.layer_keeps = tf.placeholder(dtype)
self.vars = utils.init_var_map(init_vars, init_path)
w = [self.vars['weight_%d' % i] for i in range(num_inputs)]
v = [self.vars['embed_%d' % i] for i in range(num_inputs)]
b = self.vars['bias']
xw = tf.concat([tf.sparse_tensor_dense_matmul(self.X[i], w[i]) for i in range(num_inputs)], 1)
xv = tf.concat([tf.sparse_tensor_dense_matmul(self.X[i], v[i]) for i in range(num_inputs)], 1)
l = xv
for i in range(len(layer_sizes)):
wi = self.vars['w%d' % i]
bi = self.vars['b%d' % i]
print(l.shape, wi.shape, bi.shape)
l = tf.nn.dropout(
utils.activate(
tf.matmul(l, wi) + bi,
layer_acts[i]),
self.layer_keeps[i])
l = tf.squeeze(l)
xv = tf.reshape(xv, [-1, num_inputs, embed_size])
p = 0.5 * tf.reduce_sum(
tf.square(tf.reduce_sum(xv, 1)) -
tf.reduce_sum(tf.square(xv), 1),
1)
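            # Same 0.5 * ((sum of embeddings)^2 - sum of squared embeddings) trick as in FM,
            # applied per sample to the field embeddings xv to get the second-order term.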
xw = tf.reduce_sum(xw, 1)
logits = tf.reshape(l + xw + b + p, [-1])
self.y_prob = tf.sigmoid(logits)
self.loss = tf.reduce_mean(
tf.nn.sigmoid_cross_entropy_with_logits(logits=logits, labels=self.y))
if layer_l2 is not None:
self.loss += embed_l2 * tf.nn.l2_loss(xw)
for i in range(len(layer_sizes)):
wi = self.vars['w%d' % i]
self.loss += layer_l2[i] * tf.nn.l2_loss(wi)
self.optimizer = utils.get_optimizer(opt_algo, learning_rate, self.loss)
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
self.sess = tf.Session(config=config)
tf.global_variables_initializer().run(session=self.sess)
class CCPM(Model):
def __init__(self, field_sizes=None, embed_size=10, filter_sizes=None, layer_acts=None, drop_out=None,
init_path=None, opt_algo='gd', learning_rate=1e-2, random_seed=None):
Model.__init__(self)
init_vars = []
num_inputs = len(field_sizes)
for i in range(num_inputs):
init_vars.append(('embed_%d' % i, [field_sizes[i], embed_size], 'xavier', dtype))
init_vars.append(('f1', [embed_size, filter_sizes[0], 1, 2], 'xavier', dtype))
init_vars.append(('f2', [embed_size, filter_sizes[1], 2, 2], 'xavier', dtype))
init_vars.append(('w1', [2 * 3 * embed_size, 1], 'xavier', dtype))
init_vars.append(('b1', [1], 'zero', dtype))
self.graph = tf.Graph()
with self.graph.as_default():
if random_seed is not None:
tf.set_random_seed(random_seed)
self.X = [tf.sparse_placeholder(dtype) for i in range(num_inputs)]
self.y = tf.placeholder(dtype)
self.keep_prob_train = 1 - np.array(drop_out)
self.keep_prob_test = np.ones_like(drop_out)
self.layer_keeps = tf.placeholder(dtype)
self.vars = utils.init_var_map(init_vars, init_path)
w0 = [self.vars['embed_%d' % i] for i in range(num_inputs)]
xw = tf.concat([tf.sparse_tensor_dense_matmul(self.X[i], w0[i]) for i in range(num_inputs)], 1)
l = xw
l = tf.transpose(tf.reshape(l, [-1, num_inputs, embed_size, 1]), [0, 2, 1, 3])
f1 = self.vars['f1']
l = tf.nn.conv2d(l, f1, [1, 1, 1, 1], 'SAME')
l = tf.transpose(
utils.max_pool_4d(
tf.transpose(l, [0, 1, 3, 2]),
int(num_inputs / 2)),
[0, 1, 3, 2])
f2 = self.vars['f2']
l = tf.nn.conv2d(l, f2, [1, 1, 1, 1], 'SAME')
l = tf.transpose(
utils.max_pool_4d(
tf.transpose(l, [0, 1, 3, 2]), 3),
[0, 1, 3, 2])
l = tf.nn.dropout(
utils.activate(
tf.reshape(l, [-1, embed_size * 3 * 2]),
layer_acts[0]),
self.layer_keeps[0])
w1 = self.vars['w1']
b1 = self.vars['b1']
l = tf.matmul(l, w1) + b1
l = tf.squeeze(l)
self.y_prob = tf.sigmoid(l)
self.loss = tf.reduce_mean(
tf.nn.sigmoid_cross_entropy_with_logits(logits=l, labels=self.y))
self.optimizer = utils.get_optimizer(opt_algo, learning_rate, self.loss)
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
self.sess = tf.Session(config=config)
tf.global_variables_initializer().run(session=self.sess)
class PNN1(Model):
def __init__(self, field_sizes=None, embed_size=10, layer_sizes=None, layer_acts=None, drop_out=None,
embed_l2=None, layer_l2=None, init_path=None, opt_algo='gd', learning_rate=1e-2, random_seed=None):
Model.__init__(self)
init_vars = []
num_inputs = len(field_sizes) # 26
        for i in range(num_inputs):  # each field gets its own embedding parameter matrix
init_vars.append(('embed_%d' % i, [field_sizes[i], embed_size], 'xavier', dtype))
num_pairs = int(num_inputs * (num_inputs - 1) / 2)
        node_in = num_inputs * embed_size + num_pairs  # input size of the first hidden layer: the lz part is the concatenated field embeddings (num_inputs * k), the lp part adds one value per feature pair
# node_in = num_inputs * (embed_size + num_inputs)
for i in range(len(layer_sizes)):
init_vars.append(('w%d' % i, [node_in, layer_sizes[i]], 'xavier', dtype))
init_vars.append(('b%d' % i, [layer_sizes[i]], 'zero', dtype))
node_in = layer_sizes[i]
self.graph = tf.Graph()
with self.graph.as_default():
if random_seed is not None:
tf.set_random_seed(random_seed)
            self.X = [tf.sparse_placeholder(dtype) for i in range(num_inputs)]  # num_inputs is the number of fields N, i.e. the raw input is fed per field instead of as one big one-hot vector
self.y = tf.placeholder(dtype)
self.keep_prob_train = 1 - np.array(drop_out)
self.keep_prob_test = np.ones_like(drop_out)
self.layer_keeps = tf.placeholder(dtype)
self.vars = utils.init_var_map(init_vars, init_path)
w0 = [self.vars['embed_%d' % i] for i in range(num_inputs)]
            xw = tf.concat([tf.sparse_tensor_dense_matmul(self.X[i], w0[i]) for i in range(num_inputs)], 1)  # the sparse matmul performs the embedding lookup; concat stitches the per-field results together
xw3d = tf.reshape(xw, [-1, num_inputs, embed_size]) # [num_samples, num_field, embed_sz]
row = []
col = []
for i in range(num_inputs-1):
for j in range(i+1, num_inputs):
row.append(i)
col.append(j)
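            # row/col enumerate all unordered field pairs; e.g. with num_inputs = 3 the
            # pairs are (0, 1), (0, 2), (1, 2), so len(row) == len(col) == num_pairs.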
# batch * pair * k
p = tf.transpose(
# pair * batch * k
tf.gather(
# num * batch * k
tf.transpose(
xw3d, [1, 0, 2]),
row),
[1, 0, 2])
# batch * pair * k
q = tf.transpose(
tf.gather(
tf.transpose(
xw3d, [1, 0, 2]),
col),
[1, 0, 2])
p = tf.reshape(p, [-1, num_pairs, embed_size])
q = tf.reshape(q, [-1, num_pairs, embed_size])
ip = tf.reshape(tf.reduce_sum(p * q, [-1]), [-1, num_pairs])
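            # ip[b, m] is the inner product <e_i, e_j> of the m-th field pair (i, j),
            # i.e. the inner-product signal lp of PNN1.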
# simple but redundant
# batch * n * 1 * k, batch * 1 * n * k
# ip = tf.reshape(
# tf.reduce_sum(
# tf.expand_dims(xw3d, 2) *
# tf.expand_dims(xw3d, 1),
# 3),
# [-1, num_inputs**2])
l = tf.concat([xw, ip], 1)
for i in range(len(layer_sizes)):
wi = self.vars['w%d' % i]
bi = self.vars['b%d' % i]
l = tf.nn.dropout(
utils.activate(
tf.matmul(l, wi) + bi,
layer_acts[i]),
self.layer_keeps[i])
l = tf.squeeze(l)
self.y_prob = tf.sigmoid(l)
self.loss = tf.reduce_mean(
tf.nn.sigmoid_cross_entropy_with_logits(logits=l, labels=self.y))
if layer_l2 is not None:
self.loss += embed_l2 * tf.nn.l2_loss(xw)
for i in range(len(layer_sizes)):
wi = self.vars['w%d' % i]
self.loss += layer_l2[i] * tf.nn.l2_loss(wi)
self.optimizer = utils.get_optimizer(opt_algo, learning_rate, self.loss)
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
self.sess = tf.Session(config=config)
tf.global_variables_initializer().run(session=self.sess)
class PNN2(Model):
def __init__(self, field_sizes=None, embed_size=10, layer_sizes=None, layer_acts=None, drop_out=None,
embed_l2=None, layer_l2=None, init_path=None, opt_algo='gd', learning_rate=1e-2, random_seed=None,
layer_norm=True):
Model.__init__(self)
init_vars = []
num_inputs = len(field_sizes)
for i in range(num_inputs):
init_vars.append(('embed_%d' % i, [field_sizes[i], embed_size], 'xavier', dtype))
num_pairs = int(num_inputs * (num_inputs - 1) / 2)
node_in = num_inputs * embed_size + num_pairs
init_vars.append(('kernel', [embed_size, num_pairs, embed_size], 'xavier', dtype))
for i in range(len(layer_sizes)):
init_vars.append(('w%d' % i, [node_in, layer_sizes[i]], 'xavier', dtype))
init_vars.append(('b%d' % i, [layer_sizes[i]], 'zero', dtype))
node_in = layer_sizes[i]
self.graph = tf.Graph()
with self.graph.as_default():
if random_seed is not None:
tf.set_random_seed(random_seed)
self.X = [tf.sparse_placeholder(dtype) for i in range(num_inputs)]
self.y = tf.placeholder(dtype)
self.keep_prob_train = 1 - np.array(drop_out)
self.keep_prob_test = np.ones_like(drop_out)
self.layer_keeps = tf.placeholder(dtype)
self.vars = utils.init_var_map(init_vars, init_path)
w0 = [self.vars['embed_%d' % i] for i in range(num_inputs)]
xw = tf.concat([tf.sparse_tensor_dense_matmul(self.X[i], w0[i]) for i in range(num_inputs)], 1)
xw3d = tf.reshape(xw, [-1, num_inputs, embed_size])
row = []
col = []
for i in range(num_inputs - 1):
for j in range(i + 1, num_inputs):
row.append(i)
col.append(j)
# batch * pair * k
p = tf.transpose(
# pair * batch * k
tf.gather(
# field * batch * k
tf.transpose(
xw3d, [1, 0, 2]),
row),
[1, 0, 2])
# batch * pair * k
q = tf.transpose(
tf.gather(
tf.transpose(
xw3d, [1, 0, 2]),
col),
[1, 0, 2])
# batch * pair * k
p = tf.reshape(p, [-1, num_pairs, embed_size])
# batch * pair * k
q = tf.reshape(q, [-1, num_pairs, embed_size])
# k * pair * k
            k = self.vars['kernel']  # each pair's outer product gives a 2D matrix; the kernel is "convolved" with it (element-wise multiply and sum)
# batch * 1 * pair * k
            p = tf.expand_dims(p, 1)  # axis=1 inserts a new dimension right after the batch dimension
# batch * pair
kp = tf.reduce_sum(
# batch * pair * k
tf.multiply(
# batch * pair * k
tf.transpose(
# batch * k * pair
tf.reduce_sum(
# batch * k * pair * k
tf.multiply(
p, k),
-1),
[0, 2, 1]),
q),
-1)
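            # Net effect: kp[b, m] = q_m^T K_m p_m for the m-th pair with K_m = kernel[:, m, :],
            # i.e. a learned bilinear form per field pair (the kernel form of PNN2's outer-product layer).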
#
# if layer_norm:
# # x_mean, x_var = tf.nn.moments(xw, [1], keep_dims=True)
# # xw = (xw - x_mean) / tf.sqrt(x_var)
# # x_g = tf.Variable(tf.ones([num_inputs * embed_size]), name='x_g')
# # x_b = tf.Variable(tf.zeros([num_inputs * embed_size]), name='x_b')
# # x_g = tf.Print(x_g, [x_g[:10], x_b])
# # xw = xw * x_g + x_b
# p_mean, p_var = tf.nn.moments(op, [1], keep_dims=True)
# op = (op - p_mean) / tf.sqrt(p_var)
# p_g = tf.Variable(tf.ones([embed_size**2]), name='p_g')
# p_b = tf.Variable(tf.zeros([embed_size**2]), name='p_b')
# # p_g = tf.Print(p_g, [p_g[:10], p_b])
# op = op * p_g + p_b
l = tf.concat([xw, kp], 1)
for i in range(len(layer_sizes)):
wi = self.vars['w%d' % i]
bi = self.vars['b%d' % i]
l = tf.nn.dropout(
utils.activate(
tf.matmul(l, wi) + bi,
layer_acts[i]),
self.layer_keeps[i])
l = tf.squeeze(l)
self.y_prob = tf.sigmoid(l)
self.loss = tf.reduce_mean(
tf.nn.sigmoid_cross_entropy_with_logits(logits=l, labels=self.y))
if layer_l2 is not None:
self.loss += embed_l2 * tf.nn.l2_loss(xw)#tf.concat(w0, 0))
for i in range(len(layer_sizes)):
wi = self.vars['w%d' % i]
self.loss += layer_l2[i] * tf.nn.l2_loss(wi)
self.optimizer = utils.get_optimizer(opt_algo, learning_rate, self.loss)
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
self.sess = tf.Session(config=config)
tf.global_variables_initializer().run(session=self.sess) | 43.746154 | 141 | 0.523519 | 3,040 | 22,748 | 3.695395 | 0.073684 | 0.039256 | 0.019761 | 0.036229 | 0.816005 | 0.788054 | 0.768115 | 0.760637 | 0.748086 | 0.72966 | 0 | 0.017074 | 0.348602 | 22,748 | 520 | 142 | 43.746154 | 0.741058 | 0.06317 | 0 | 0.725768 | 0 | 0 | 0.018443 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.023641 | false | 0 | 0.021277 | 0 | 0.066194 | 0.009456 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
421e5ed396798c0df6fe70927d7e06c6c33e6acf | 808 | py | Python | pseudoregion.py | jon2718/ipycool_2.0 | 34cf74ee99f4a725b997c50a7742ba788ac2dacd | [
"MIT"
] | null | null | null | pseudoregion.py | jon2718/ipycool_2.0 | 34cf74ee99f4a725b997c50a7742ba788ac2dacd | [
"MIT"
] | null | null | null | pseudoregion.py | jon2718/ipycool_2.0 | 34cf74ee99f4a725b997c50a7742ba788ac2dacd | [
"MIT"
] | null | null | null | from region import *
class PseudoRegion(Region):
"""
PseudoRegion commands include: APERTURE, CUTV, DENP, DENS, DISP, DUMMY, DVAR, EDGE, GRID
OUTPUT, REFP, REF2, RESET, RKICK, ROTATE, TAPER, TILT, TRANSPORT, BACKGROUND, BFIELD, ENDB, ! or &
"""
def __init__(self, **kwargs):
pass
def __str__(self):
return '[A PseudoRegion can be either a APERTURE, CUTV, DENP, DENS, DISP, DUMMY, DVAR, EDGE, GRID\
OUTPUT, REFP, REF2, RESET, RKICK, ROTATE, TAPER, TILT, TRANSPORT, BACKGROUND, BFIELD, ENDB, ! or &]'
def __repr__(self):
return '[A PseudoRegion can be either a APERTURE, CUTV, DENP, DENS, DISP, DUMMY, DVAR, EDGE, GRID\
OUTPUT, REFP, REF2, RESET, RKICK, ROTATE, TAPER, TILT, TRANSPORT, BACKGROUND, BFIELD, ENDB, ! or &]' | 40.4 | 116 | 0.637376 | 100 | 808 | 5.03 | 0.41 | 0.071571 | 0.095427 | 0.119284 | 0.813121 | 0.813121 | 0.813121 | 0.813121 | 0.813121 | 0.813121 | 0 | 0.004886 | 0.240099 | 808 | 20 | 117 | 40.4 | 0.814332 | 0.231436 | 0 | 0.4 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3 | false | 0.1 | 0.1 | 0.2 | 0.7 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 11 |
425d3e6f877573af690a0c36a58c1638a29efc34 | 38 | py | Python | uteis.py | LucasGoes-123/Python_Project | 2bdd3620239711abde1462f7310bac7117fa2805 | [
"MIT"
] | null | null | null | uteis.py | LucasGoes-123/Python_Project | 2bdd3620239711abde1462f7310bac7117fa2805 | [
"MIT"
] | null | null | null | uteis.py | LucasGoes-123/Python_Project | 2bdd3620239711abde1462f7310bac7117fa2805 | [
"MIT"
] | null | null | null | def linhas():
return print(20*"=") | 19 | 24 | 0.578947 | 5 | 38 | 4.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.064516 | 0.184211 | 38 | 2 | 24 | 19 | 0.645161 | 0 | 0 | 0 | 0 | 0 | 0.025641 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 8 |
42aebd45b438fb6d618aab3fff76c4b0688e25f1 | 125 | py | Python | cargo/etc/__init__.py | jaredlunde/cargo-orm | 1d5524d359bd52a991edc738982b7df2149d9c69 | [
"MIT"
] | 3 | 2017-02-10T08:03:21.000Z | 2017-02-25T04:55:48.000Z | cargo/etc/__init__.py | jaredlunde/cargo-orm | 1d5524d359bd52a991edc738982b7df2149d9c69 | [
"MIT"
] | null | null | null | cargo/etc/__init__.py | jaredlunde/cargo-orm | 1d5524d359bd52a991edc738982b7df2149d9c69 | [
"MIT"
] | null | null | null | from cargo.etc import operators
from cargo.etc import translator
from cargo.etc import types
from cargo.etc import usernames
| 25 | 32 | 0.84 | 20 | 125 | 5.25 | 0.4 | 0.342857 | 0.457143 | 0.685714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.128 | 125 | 4 | 33 | 31.25 | 0.963303 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
35f67b6bdd9df209820bc49083763c100bb7ac79 | 83,648 | py | Python | makeBones.py | warweasle/blenderTools | 8037901133572bb68c48e863ac0597fb5112794e | [
"Apache-2.0"
] | null | null | null | makeBones.py | warweasle/blenderTools | 8037901133572bb68c48e863ac0597fb5112794e | [
"Apache-2.0"
] | null | null | null | makeBones.py | warweasle/blenderTools | 8037901133572bb68c48e863ac0597fb5112794e | [
"Apache-2.0"
] | null | null | null | import bpy
from mathutils import Color
def create(obj, srcBones):
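    # Builds a Rigify-style human metarig inside obj.data. srcBones is expected to map
    # source bone names such as 'pelvis', 'spine01', 'thigh_L' or 'hand_R' to dicts with
    # 'head', 'tail' and 'roll' entries copied from an existing armature; bones without a
    # matching entry (e.g. the heel helpers) keep the hard-coded default positions.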
# generated by rigify.utils.write_metarig
#arm = bpy.data.armatures.new("metaRig")
#ob = bpy.data.objects.new("MetatRigObject", arm)
#scn = bpy.context.scene
#scn.objects.link(ob)
#scn.objects.active = ob
#ob.select = True
#bpy.ops.object.mode_set(mode='EDIT')
#arm = bpy.data.armatures.new("testArm")
#bpy.ops.object.mode_set(mode='EDIT')
#arm = obj.data
bpy.ops.object.mode_set(mode='EDIT')
arm = obj.data
for i in range(12):
arm.rigify_colors.add()
arm.rigify_colors[0].name = "Root"
arm.rigify_colors[0].active = Color((0.5490196347236633, 1.0, 1.0))
arm.rigify_colors[0].normal = Color((0.4352940022945404, 0.18431399762630463, 0.4156860113143921))
arm.rigify_colors[0].select = Color((0.31372547149658203, 0.7843138575553894, 1.0))
arm.rigify_colors[0].standard_colors_lock = True
arm.rigify_colors[1].name = "IK"
arm.rigify_colors[1].active = Color((0.5490196347236633, 1.0, 1.0))
arm.rigify_colors[1].normal = Color((0.6039220094680786, 0.0, 0.0))
arm.rigify_colors[1].select = Color((0.31372547149658203, 0.7843138575553894, 1.0))
arm.rigify_colors[1].standard_colors_lock = True
arm.rigify_colors[2].name = "Special"
arm.rigify_colors[2].active = Color((0.5490196347236633, 1.0, 1.0))
arm.rigify_colors[2].normal = Color((0.9568629860877991, 0.7882350087165833, 0.04705899953842163))
arm.rigify_colors[2].select = Color((0.31372547149658203, 0.7843138575553894, 1.0))
arm.rigify_colors[2].standard_colors_lock = True
arm.rigify_colors[3].name = "Tweak"
arm.rigify_colors[3].active = Color((0.5490196347236633, 1.0, 1.0))
arm.rigify_colors[3].normal = Color((0.03921600058674812, 0.21176500618457794, 0.5803920030593872))
arm.rigify_colors[3].select = Color((0.31372547149658203, 0.7843138575553894, 1.0))
arm.rigify_colors[3].standard_colors_lock = True
arm.rigify_colors[4].name = "FK"
arm.rigify_colors[4].active = Color((0.5490196347236633, 1.0, 1.0))
arm.rigify_colors[4].normal = Color((0.11764699965715408, 0.5686269998550415, 0.035294000059366226))
arm.rigify_colors[4].select = Color((0.31372547149658203, 0.7843138575553894, 1.0))
arm.rigify_colors[4].standard_colors_lock = True
arm.rigify_colors[5].name = "Extra"
arm.rigify_colors[5].active = Color((0.5490196347236633, 1.0, 1.0))
arm.rigify_colors[5].normal = Color((0.9686279892921448, 0.2509799897670746, 0.09411799907684326))
arm.rigify_colors[5].select = Color((0.31372547149658203, 0.7843138575553894, 1.0))
arm.rigify_colors[5].standard_colors_lock = True
arm.rigify_colors[6].name = " "
arm.rigify_colors[6].active = Color((1.0, 1.0, 1.0))
arm.rigify_colors[6].normal = Color((1.0, 1.0, 1.0))
arm.rigify_colors[6].select = Color((1.0, 1.0, 1.0))
arm.rigify_colors[6].standard_colors_lock = True
arm.rigify_colors[7].name = " "
arm.rigify_colors[7].active = Color((1.0, 1.0, 1.0))
arm.rigify_colors[7].normal = Color((1.0, 1.0, 1.0))
arm.rigify_colors[7].select = Color((1.0, 1.0, 1.0))
arm.rigify_colors[7].standard_colors_lock = True
arm.rigify_colors[8].name = " "
arm.rigify_colors[8].active = Color((1.0, 1.0, 1.0))
arm.rigify_colors[8].normal = Color((1.0, 1.0, 1.0))
arm.rigify_colors[8].select = Color((1.0, 1.0, 1.0))
arm.rigify_colors[8].standard_colors_lock = True
arm.rigify_colors[9].name = " "
arm.rigify_colors[9].active = Color((1.0, 1.0, 1.0))
arm.rigify_colors[9].normal = Color((1.0, 1.0, 1.0))
arm.rigify_colors[9].select = Color((1.0, 1.0, 1.0))
arm.rigify_colors[9].standard_colors_lock = True
arm.rigify_colors[10].name = " "
arm.rigify_colors[10].active = Color((1.0, 1.0, 1.0))
arm.rigify_colors[10].normal = Color((1.0, 1.0, 1.0))
arm.rigify_colors[10].select = Color((1.0, 1.0, 1.0))
arm.rigify_colors[10].standard_colors_lock = True
arm.rigify_colors[11].name = " "
arm.rigify_colors[11].active = Color((1.0, 1.0, 1.0))
arm.rigify_colors[11].normal = Color((1.0, 1.0, 1.0))
arm.rigify_colors[11].select = Color((1.0, 1.0, 1.0))
arm.rigify_colors[11].standard_colors_lock = True
for i in range(58):
arm.rigify_layers.add()
arm.rigify_layers[0].name = "Face"
arm.rigify_layers[0].row = 1
arm.rigify_layers[0].set = False
arm.rigify_layers[0].group = 5
arm.rigify_layers[1].name = "Face (Primary)"
arm.rigify_layers[1].row = 2
arm.rigify_layers[1].set = False
arm.rigify_layers[1].group = 2
arm.rigify_layers[2].name = "Face (Secondary)"
arm.rigify_layers[2].row = 2
arm.rigify_layers[2].set = False
arm.rigify_layers[2].group = 3
arm.rigify_layers[3].name = "Torso"
arm.rigify_layers[3].row = 3
arm.rigify_layers[3].set = False
arm.rigify_layers[3].group = 3
arm.rigify_layers[4].name = "Torso (Tweak)"
arm.rigify_layers[4].row = 4
arm.rigify_layers[4].set = False
arm.rigify_layers[4].group = 4
arm.rigify_layers[5].name = "Fingers"
arm.rigify_layers[5].row = 5
arm.rigify_layers[5].set = False
arm.rigify_layers[5].group = 6
arm.rigify_layers[6].name = "Fingers (Tweak)"
arm.rigify_layers[6].row = 6
arm.rigify_layers[6].set = False
arm.rigify_layers[6].group = 4
arm.rigify_layers[7].name = "Arm.L (IK)"
arm.rigify_layers[7].row = 7
arm.rigify_layers[7].set = False
arm.rigify_layers[7].group = 2
arm.rigify_layers[8].name = "Arm.L (FK)"
arm.rigify_layers[8].row = 8
arm.rigify_layers[8].set = False
arm.rigify_layers[8].group = 5
arm.rigify_layers[9].name = "Arm.L (Tweak)"
arm.rigify_layers[9].row = 9
arm.rigify_layers[9].set = False
arm.rigify_layers[9].group = 4
arm.rigify_layers[10].name = "Arm.R (IK)"
arm.rigify_layers[10].row = 7
arm.rigify_layers[10].set = False
arm.rigify_layers[10].group = 2
arm.rigify_layers[11].name = "Arm.R (FK)"
arm.rigify_layers[11].row = 8
arm.rigify_layers[11].set = False
arm.rigify_layers[11].group = 5
arm.rigify_layers[12].name = "Arm.R (Tweak)"
arm.rigify_layers[12].row = 9
arm.rigify_layers[12].set = False
arm.rigify_layers[12].group = 4
arm.rigify_layers[13].name = "Leg.L (IK)"
arm.rigify_layers[13].row = 10
arm.rigify_layers[13].set = False
arm.rigify_layers[13].group = 2
arm.rigify_layers[14].name = "Leg.L (FK)"
arm.rigify_layers[14].row = 11
arm.rigify_layers[14].set = False
arm.rigify_layers[14].group = 5
arm.rigify_layers[15].name = "Leg.L (Tweak)"
arm.rigify_layers[15].row = 12
arm.rigify_layers[15].set = False
arm.rigify_layers[15].group = 4
arm.rigify_layers[16].name = "Leg.R (IK)"
arm.rigify_layers[16].row = 10
arm.rigify_layers[16].set = False
arm.rigify_layers[16].group = 2
arm.rigify_layers[17].name = "Leg.R (FK)"
arm.rigify_layers[17].row = 11
arm.rigify_layers[17].set = False
arm.rigify_layers[17].group = 5
arm.rigify_layers[18].name = "Leg.R (Tweak)"
arm.rigify_layers[18].row = 12
arm.rigify_layers[18].set = False
arm.rigify_layers[18].group = 4
arm.rigify_layers[19].name = ""
arm.rigify_layers[19].row = 1
arm.rigify_layers[19].set = False
arm.rigify_layers[19].group = 0
arm.rigify_layers[20].name = ""
arm.rigify_layers[20].row = 1
arm.rigify_layers[20].set = False
arm.rigify_layers[20].group = 0
arm.rigify_layers[21].name = ""
arm.rigify_layers[21].row = 1
arm.rigify_layers[21].set = False
arm.rigify_layers[21].group = 0
arm.rigify_layers[22].name = ""
arm.rigify_layers[22].row = 1
arm.rigify_layers[22].set = False
arm.rigify_layers[22].group = 0
arm.rigify_layers[23].name = ""
arm.rigify_layers[23].row = 1
arm.rigify_layers[23].set = False
arm.rigify_layers[23].group = 0
arm.rigify_layers[24].name = ""
arm.rigify_layers[24].row = 1
arm.rigify_layers[24].set = False
arm.rigify_layers[24].group = 0
arm.rigify_layers[25].name = ""
arm.rigify_layers[25].row = 1
arm.rigify_layers[25].set = False
arm.rigify_layers[25].group = 0
arm.rigify_layers[26].name = ""
arm.rigify_layers[26].row = 1
arm.rigify_layers[26].set = False
arm.rigify_layers[26].group = 0
arm.rigify_layers[27].name = ""
arm.rigify_layers[27].row = 1
arm.rigify_layers[27].set = False
arm.rigify_layers[27].group = 0
arm.rigify_layers[28].name = "Root"
arm.rigify_layers[28].row = 14
arm.rigify_layers[28].set = False
arm.rigify_layers[28].group = 1
arm.rigify_layers[29].name = " "
arm.rigify_layers[29].row = 1
arm.rigify_layers[29].set = False
arm.rigify_layers[29].group = 0
arm.rigify_layers[30].name = " "
arm.rigify_layers[30].row = 1
arm.rigify_layers[30].set = False
arm.rigify_layers[30].group = 0
arm.rigify_layers[31].name = " "
arm.rigify_layers[31].row = 1
arm.rigify_layers[31].set = False
arm.rigify_layers[31].group = 0
arm.rigify_layers[32].name = " "
arm.rigify_layers[32].row = 1
arm.rigify_layers[32].set = False
arm.rigify_layers[32].group = 0
arm.rigify_layers[33].name = " "
arm.rigify_layers[33].row = 1
arm.rigify_layers[33].set = False
arm.rigify_layers[33].group = 0
arm.rigify_layers[34].name = " "
arm.rigify_layers[34].row = 1
arm.rigify_layers[34].set = False
arm.rigify_layers[34].group = 0
arm.rigify_layers[35].name = " "
arm.rigify_layers[35].row = 1
arm.rigify_layers[35].set = False
arm.rigify_layers[35].group = 0
arm.rigify_layers[36].name = " "
arm.rigify_layers[36].row = 1
arm.rigify_layers[36].set = False
arm.rigify_layers[36].group = 0
arm.rigify_layers[37].name = " "
arm.rigify_layers[37].row = 1
arm.rigify_layers[37].set = False
arm.rigify_layers[37].group = 0
arm.rigify_layers[38].name = " "
arm.rigify_layers[38].row = 1
arm.rigify_layers[38].set = False
arm.rigify_layers[38].group = 0
arm.rigify_layers[39].name = " "
arm.rigify_layers[39].row = 1
arm.rigify_layers[39].set = False
arm.rigify_layers[39].group = 0
arm.rigify_layers[40].name = " "
arm.rigify_layers[40].row = 1
arm.rigify_layers[40].set = False
arm.rigify_layers[40].group = 0
arm.rigify_layers[41].name = " "
arm.rigify_layers[41].row = 1
arm.rigify_layers[41].set = False
arm.rigify_layers[41].group = 0
arm.rigify_layers[42].name = " "
arm.rigify_layers[42].row = 1
arm.rigify_layers[42].set = False
arm.rigify_layers[42].group = 0
arm.rigify_layers[43].name = " "
arm.rigify_layers[43].row = 1
arm.rigify_layers[43].set = False
arm.rigify_layers[43].group = 0
arm.rigify_layers[44].name = " "
arm.rigify_layers[44].row = 1
arm.rigify_layers[44].set = False
arm.rigify_layers[44].group = 0
arm.rigify_layers[45].name = " "
arm.rigify_layers[45].row = 1
arm.rigify_layers[45].set = False
arm.rigify_layers[45].group = 0
arm.rigify_layers[46].name = " "
arm.rigify_layers[46].row = 1
arm.rigify_layers[46].set = False
arm.rigify_layers[46].group = 0
arm.rigify_layers[47].name = " "
arm.rigify_layers[47].row = 1
arm.rigify_layers[47].set = False
arm.rigify_layers[47].group = 0
arm.rigify_layers[48].name = " "
arm.rigify_layers[48].row = 1
arm.rigify_layers[48].set = False
arm.rigify_layers[48].group = 0
arm.rigify_layers[49].name = " "
arm.rigify_layers[49].row = 1
arm.rigify_layers[49].set = False
arm.rigify_layers[49].group = 0
arm.rigify_layers[50].name = " "
arm.rigify_layers[50].row = 1
arm.rigify_layers[50].set = False
arm.rigify_layers[50].group = 0
arm.rigify_layers[51].name = " "
arm.rigify_layers[51].row = 1
arm.rigify_layers[51].set = False
arm.rigify_layers[51].group = 0
arm.rigify_layers[52].name = " "
arm.rigify_layers[52].row = 1
arm.rigify_layers[52].set = False
arm.rigify_layers[52].group = 0
arm.rigify_layers[53].name = " "
arm.rigify_layers[53].row = 1
arm.rigify_layers[53].set = False
arm.rigify_layers[53].group = 0
arm.rigify_layers[54].name = " "
arm.rigify_layers[54].row = 1
arm.rigify_layers[54].set = False
arm.rigify_layers[54].group = 0
arm.rigify_layers[55].name = " "
arm.rigify_layers[55].row = 1
arm.rigify_layers[55].set = False
arm.rigify_layers[55].group = 0
arm.rigify_layers[56].name = " "
arm.rigify_layers[56].row = 1
arm.rigify_layers[56].set = False
arm.rigify_layers[56].group = 0
arm.rigify_layers[57].name = " "
arm.rigify_layers[57].row = 1
arm.rigify_layers[57].set = False
arm.rigify_layers[57].group = 0
bones = {}
bone = arm.edit_bones.new('spine')
#bone.head[:] = 0.0000, 0.0552, 1.0099
#print(srcBones['pelvis'].head)
##print(bone.head)
#bone.tail[:] = 0.0000, 0.0172, 1.1573
#print(bone.tail)
bone.head[:] = srcBones['pelvis']['head']
bone.tail[:] = srcBones['pelvis']['tail']
bone.roll = srcBones['pelvis']['roll']
##bone.roll = 0.0000
bone.use_connect = False
bones['spine'] = bone.name
bone = arm.edit_bones.new('spine.001')
#bone.head[:] = 0.0000, 0.0172, 1.1573
#bone.tail[:] = 0.0000, 0.0004, 1.2929
#bone.roll = 0.0000
tmp = srcBones['spine01']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['spine']]
bones['spine.001'] = bone.name
bone = arm.edit_bones.new('thigh.L')
bone.head[:] = 0.0980, 0.0124, 1.0720
bone.tail[:] = 0.0980, -0.0286, 0.5372
bone.roll = 0.0000
tmp = srcBones['thigh_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['spine']]
bones['thigh.L'] = bone.name
bone = arm.edit_bones.new('thigh.R')
bone.head[:] = -0.0980, 0.0124, 1.0720
bone.tail[:] = -0.0980, -0.0286, 0.5372
bone.roll = 0.000
tmp = srcBones['thigh_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['spine']]
bones['thigh.R'] = bone.name
bone = arm.edit_bones.new('spine.002')
bone.head[:] = 0.0000, 0.0004, 1.2929
bone.tail[:] = 0.0000, 0.0059, 1.4657
bone.roll = 0.0000
tmp = srcBones['spine02']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['spine.001']]
bones['spine.002'] = bone.name
bone = arm.edit_bones.new('shin.L')
bone.head[:] = 0.0980, -0.0286, 0.5372
bone.tail[:] = 0.0980, 0.0162, 0.0852
bone.roll = 0.0000
tmp = srcBones['calf_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['thigh.L']]
bones['shin.L'] = bone.name
bone = arm.edit_bones.new('shin.R')
bone.head[:] = -0.0980, -0.0286, 0.5372
bone.tail[:] = -0.0980, 0.0162, 0.0852
bone.roll = 0.0000
tmp = srcBones['calf_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['thigh.R']]
bones['shin.R'] = bone.name
bone = arm.edit_bones.new('spine.003')
bone.head[:] = 0.0000, 0.0059, 1.4657
bone.tail[:] = 0.0000, 0.0114, 1.6582
bone.roll = 0.0000
tmp = srcBones['spine03']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['spine.002']]
bones['spine.003'] = bone.name
bone = arm.edit_bones.new('foot.L')
bone.head[:] = 0.0980, 0.0162, 0.0852
bone.tail[:] = 0.0980, -0.0934, 0.0167
bone.roll = 0.0000
tmp = srcBones['foot_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['shin.L']]
bones['foot.L'] = bone.name
bone = arm.edit_bones.new('foot.R')
bone.head[:] = -0.0980, 0.0162, 0.0852
bone.tail[:] = -0.0980, -0.0934, 0.0167
bone.roll = -0.0000
tmp = srcBones['foot_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['shin.R']]
bones['foot.R'] = bone.name
bone = arm.edit_bones.new('spine.004')
bone.head[:] = 0.0000, 0.0114, 1.6582
bone.tail[:] = 0.0000, -0.0130, 1.7197
bone.roll = 0.0000
tmp = srcBones['neck']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['spine.003']]
bones['spine.004'] = bone.name
bone = arm.edit_bones.new('shoulder.L')
bone.head[:] = 0.0183, -0.0684, 1.6051
bone.tail[:] = 0.1694, 0.0205, 1.6050
bone.roll = 0.0004
tmp = srcBones['clavicle_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['spine.003']]
bones['shoulder.L'] = bone.name
bone = arm.edit_bones.new('shoulder.R')
bone.head[:] = -0.0183, -0.0684, 1.6051
bone.tail[:] = -0.1694, 0.0205, 1.6050
bone.roll = -0.0004
tmp = srcBones['clavicle_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['spine.003']]
bones['shoulder.R'] = bone.name
bone = arm.edit_bones.new('breast.L')
bone.head[:] = 0.1184, 0.0485, 1.4596
bone.tail[:] = 0.1184, -0.0907, 1.4596
bone.roll = 0.0000
tmp = srcBones['breast_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['spine.003']]
bones['breast.L'] = bone.name
bone = arm.edit_bones.new('breast.R')
bone.head[:] = -0.1184, 0.0485, 1.4596
bone.tail[:] = -0.1184, -0.0907, 1.4596
bone.roll = -0.0000
tmp = srcBones['breast_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['spine.003']]
bones['breast.R'] = bone.name
bone = arm.edit_bones.new('toe.L')
bone.head[:] = 0.0980, -0.0934, 0.0167
bone.tail[:] = 0.0980, -0.1606, 0.0167
bone.roll = -0.0000
tmp = srcBones['toes_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['foot.L']]
bones['toe.L'] = bone.name
bone = arm.edit_bones.new('heel.02.L')
bone.head[:] = 0.0600, 0.0459, 0.0000
bone.tail[:] = 0.1400, 0.0459, 0.0000
bone.roll = 0.0000
bone.use_connect = False
bone.parent = arm.edit_bones[bones['foot.L']]
bones['heel.02.L'] = bone.name
bone = arm.edit_bones.new('toe.R')
bone.head[:] = -0.0980, -0.0934, 0.0167
bone.tail[:] = -0.0980, -0.1606, 0.0167
bone.roll = 0.0000
tmp = srcBones['toes_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['foot.R']]
bones['toe.R'] = bone.name
bone = arm.edit_bones.new('heel.02.R')
bone.head[:] = -0.0600, 0.0459, 0.0000
bone.tail[:] = -0.1400, 0.0459, 0.0000
bone.roll = -0.0000
bone.use_connect = False
bone.parent = arm.edit_bones[bones['foot.R']]
bones['heel.02.R'] = bone.name
bone = arm.edit_bones.new('spine.005')
bone.head[:] = 0.0000, -0.0130, 1.7197
bone.tail[:] = 0.0000, -0.0247, 1.7813
bone.roll = 0.0000
tmp = srcBones['head']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['spine.004']]
bones['spine.005'] = bone.name
bone = arm.edit_bones.new('upper_arm.L')
bone.head[:] = 0.1953, 0.0267, 1.5846
bone.tail[:] = 0.4424, 0.0885, 1.4491
bone.roll = 2.0724
tmp = srcBones['upperarm_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['shoulder.L']]
bones['upper_arm.L'] = bone.name
bone = arm.edit_bones.new('upper_arm.R')
bone.head[:] = -0.1953, 0.0267, 1.5846
bone.tail[:] = -0.4424, 0.0885, 1.4491
bone.roll = -2.0724
tmp = srcBones['upperarm_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['shoulder.R']]
bones['upper_arm.R'] = bone.name
bone = arm.edit_bones.new('forearm.L')
bone.head[:] = 0.4424, 0.0885, 1.4491
bone.tail[:] = 0.6594, 0.0492, 1.3061
bone.roll = 2.1535
tmp = srcBones['lowerarm_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['upper_arm.L']]
bones['forearm.L'] = bone.name
bone = arm.edit_bones.new('forearm.R')
bone.head[:] = -0.4424, 0.0885, 1.4491
bone.tail[:] = -0.6594, 0.0492, 1.3061
bone.roll = -2.1535
tmp = srcBones['lowerarm_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['upper_arm.R']]
bones['forearm.R'] = bone.name
bone = arm.edit_bones.new('hand.L')
bone.head[:] = 0.6594, 0.0492, 1.3061
bone.tail[:] = 0.7234, 0.0412, 1.2585
bone.roll = 2.2103
tmp = srcBones['hand_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['forearm.L']]
bones['hand.L'] = bone.name
bone = arm.edit_bones.new('hand.R')
bone.head[:] = -0.6594, 0.0492, 1.3061
bone.tail[:] = -0.7234, 0.0412, 1.2585
bone.roll = -2.2103
tmp = srcBones['hand_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['forearm.R']]
bones['hand.R'] = bone.name
bone = arm.edit_bones.new('palm.01.L')
bone.head[:] = 0.6921, 0.0224, 1.2882
bone.tail[:] = 0.7464, 0.0051, 1.2482
bone.roll = -2.4928
tmp = srcBones['index00_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['hand.L']]
bones['palm.01.L'] = bone.name
bone = arm.edit_bones.new('palm.02.L')
bone.head[:] = 0.6970, 0.0389, 1.2877
bone.tail[:] = 0.7518, 0.0277, 1.2487
bone.roll = -2.5274
tmp = srcBones['middle00_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['hand.L']]
bones['palm.02.L'] = bone.name
bone = arm.edit_bones.new('palm.03.L')
bone.head[:] = 0.6963, 0.0545, 1.2874
bone.tail[:] = 0.7540, 0.0521, 1.2482
bone.roll = -2.5843
tmp = srcBones['ring00_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['hand.L']]
bones['palm.03.L'] = bone.name
bone = arm.edit_bones.new('palm.04.L')
bone.head[:] = 0.6929, 0.0696, 1.2871
bone.tail[:] = 0.7528, 0.0763, 1.2428
bone.roll = -2.5155
tmp = srcBones['pinky00_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['hand.L']]
bones['palm.04.L'] = bone.name
bone = arm.edit_bones.new('palm.01.R')
bone.head[:] = -0.6921, 0.0224, 1.2882
bone.tail[:] = -0.7464, 0.0051, 1.2482
bone.roll = 2.4928
tmp = srcBones['index00_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['hand.R']]
bones['palm.01.R'] = bone.name
bone = arm.edit_bones.new('palm.02.R')
bone.head[:] = -0.6970, 0.0389, 1.2877
bone.tail[:] = -0.7518, 0.0277, 1.2487
bone.roll = 2.5274
tmp = srcBones['middle00_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['hand.R']]
bones['palm.02.R'] = bone.name
bone = arm.edit_bones.new('palm.03.R')
bone.head[:] = -0.6963, 0.0544, 1.2874
bone.tail[:] = -0.7540, 0.0521, 1.2482
bone.roll = 2.5843
tmp = srcBones['ring00_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['hand.R']]
bones['palm.03.R'] = bone.name
bone = arm.edit_bones.new('palm.04.R')
bone.head[:] = -0.6929, 0.0696, 1.2871
bone.tail[:] = -0.7528, 0.0763, 1.2428
bone.roll = 2.5155
tmp = srcBones['pinky00_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['hand.R']]
bones['palm.04.R'] = bone.name
bone = arm.edit_bones.new('f_index.01.L')
bone.head[:] = 0.7464, 0.0051, 1.2482
bone.tail[:] = 0.7718, 0.0013, 1.2112
bone.roll = -2.0315
tmp = srcBones['index01_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['palm.01.L']]
bones['f_index.01.L'] = bone.name
bone = arm.edit_bones.new('thumb.01.L')
bone.head[:] = 0.6705, 0.0214, 1.2738
bone.tail[:] = 0.6857, 0.0015, 1.2404
bone.roll = -0.1587
tmp = srcBones['thumb01_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['palm.01.L']]
bones['thumb.01.L'] = bone.name
bone = arm.edit_bones.new('f_middle.01.L')
bone.head[:] = 0.7518, 0.0277, 1.2487
bone.tail[:] = 0.7762, 0.0234, 1.2058
bone.roll = -2.0067
tmp = srcBones['middle01_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['palm.02.L']]
bones['f_middle.01.L'] = bone.name
bone = arm.edit_bones.new('f_ring.01.L')
bone.head[:] = 0.7540, 0.0521, 1.2482
bone.tail[:] = 0.7715, 0.0499, 1.2070
bone.roll = -2.0082
tmp = srcBones['ring01_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['palm.03.L']]
bones['f_ring.01.L'] = bone.name
bone = arm.edit_bones.new('f_pinky.01.L')
bone.head[:] = 0.7528, 0.0763, 1.2428
bone.tail[:] = 0.7589, 0.0765, 1.2156
bone.roll = -1.9749
    tmp = srcBones['pinky01_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['palm.04.L']]
bones['f_pinky.01.L'] = bone.name
bone = arm.edit_bones.new('f_index.01.R')
bone.head[:] = -0.7464, 0.0051, 1.2482
bone.tail[:] = -0.7718, 0.0012, 1.2112
bone.roll = 2.0315
tmp = srcBones['index01_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['palm.01.R']]
bones['f_index.01.R'] = bone.name
bone = arm.edit_bones.new('thumb.01.R')
bone.head[:] = -0.6705, 0.0214, 1.2738
bone.tail[:] = -0.6857, 0.0015, 1.2404
bone.roll = 0.1587
tmp = srcBones['thumb01_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['palm.01.R']]
bones['thumb.01.R'] = bone.name
bone = arm.edit_bones.new('f_middle.01.R')
bone.head[:] = -0.7518, 0.0277, 1.2487
bone.tail[:] = -0.7762, 0.0233, 1.2058
bone.roll = 2.0067
tmp = srcBones['middle01_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['palm.02.R']]
bones['f_middle.01.R'] = bone.name
bone = arm.edit_bones.new('f_ring.01.R')
bone.head[:] = -0.7540, 0.0521, 1.2482
bone.tail[:] = -0.7715, 0.0499, 1.2070
bone.roll = 2.0082
tmp = srcBones['ring01_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['palm.03.R']]
bones['f_ring.01.R'] = bone.name
bone = arm.edit_bones.new('f_pinky.01.R')
bone.head[:] = -0.7528, 0.0763, 1.2428
bone.tail[:] = -0.7589, 0.0765, 1.2156
bone.roll = 1.9749
tmp = srcBones['pinky01_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = False
bone.parent = arm.edit_bones[bones['palm.04.R']]
bones['f_pinky.01.R'] = bone.name
bone = arm.edit_bones.new('f_index.02.L')
bone.head[:] = 0.7718, 0.0013, 1.2112
bone.tail[:] = 0.7840, -0.0003, 1.1858
bone.roll = -1.8799
tmp = srcBones['index02_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['f_index.01.L']]
bones['f_index.02.L'] = bone.name
bone = arm.edit_bones.new('thumb.02.L')
bone.head[:] = 0.6857, 0.0015, 1.2404
bone.tail[:] = 0.7056, -0.0057, 1.2145
bone.roll = -0.4798
tmp = srcBones['thumb02_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['thumb.01.L']]
bones['thumb.02.L'] = bone.name
bone = arm.edit_bones.new('f_middle.02.L')
bone.head[:] = 0.7762, 0.0234, 1.2058
bone.tail[:] = 0.7851, 0.0218, 1.1749
bone.roll = -1.8283
tmp = srcBones['middle02_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['f_middle.01.L']]
bones['f_middle.02.L'] = bone.name
bone = arm.edit_bones.new('f_ring.02.L')
bone.head[:] = 0.7715, 0.0499, 1.2070
bone.tail[:] = 0.7794, 0.0494, 1.1762
bone.roll = -1.8946
tmp = srcBones['ring02_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['f_ring.01.L']]
bones['f_ring.02.L'] = bone.name
bone = arm.edit_bones.new('f_pinky.02.L')
bone.head[:] = 0.7589, 0.0765, 1.2156
bone.tail[:] = 0.7618, 0.0770, 1.1932
bone.roll = -1.9059
tmp = srcBones['pinky02_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['f_pinky.01.L']]
bones['f_pinky.02.L'] = bone.name
bone = arm.edit_bones.new('f_index.02.R')
bone.head[:] = -0.7718, 0.0012, 1.2112
bone.tail[:] = -0.7840, -0.0003, 1.1858
bone.roll = 1.8799
tmp = srcBones['index02_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['f_index.01.R']]
bones['f_index.02.R'] = bone.name
bone = arm.edit_bones.new('thumb.02.R')
bone.head[:] = -0.6857, 0.0015, 1.2404
bone.tail[:] = -0.7056, -0.0057, 1.2145
bone.roll = 0.4798
tmp = srcBones['thumb02_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['thumb.01.R']]
bones['thumb.02.R'] = bone.name
bone = arm.edit_bones.new('f_middle.02.R')
bone.head[:] = -0.7762, 0.0233, 1.2058
bone.tail[:] = -0.7851, 0.0218, 1.1749
bone.roll = 1.8283
tmp = srcBones['middle02_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['f_middle.01.R']]
bones['f_middle.02.R'] = bone.name
bone = arm.edit_bones.new('f_ring.02.R')
bone.head[:] = -0.7715, 0.0499, 1.2070
bone.tail[:] = -0.7794, 0.0494, 1.1762
bone.roll = 1.8946
tmp = srcBones['ring02_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['f_ring.01.R']]
bones['f_ring.02.R'] = bone.name
bone = arm.edit_bones.new('f_pinky.02.R')
bone.head[:] = -0.7589, 0.0765, 1.2156
bone.tail[:] = -0.7618, 0.0770, 1.1932
bone.roll = 1.9059
tmp = srcBones['pinky02_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['f_pinky.01.R']]
bones['f_pinky.02.R'] = bone.name
bone = arm.edit_bones.new('f_index.03.L')
bone.head[:] = 0.7840, -0.0003, 1.1858
bone.tail[:] = 0.7892, 0.0006, 1.1636
bone.roll = -1.6760
tmp = srcBones['index03_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['f_index.02.L']]
bones['f_index.03.L'] = bone.name
bone = arm.edit_bones.new('thumb.03.L')
bone.head[:] = 0.7056, -0.0057, 1.2145
bone.tail[:] = 0.7194, -0.0098, 1.1995
bone.roll = -0.5826
tmp = srcBones['thumb03_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['thumb.02.L']]
bones['thumb.03.L'] = bone.name
bone = arm.edit_bones.new('f_middle.03.L')
bone.head[:] = 0.7851, 0.0218, 1.1749
bone.tail[:] = 0.7888, 0.0216, 1.1525
bone.roll = -1.7483
tmp = srcBones['middle03_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['f_middle.02.L']]
bones['f_middle.03.L'] = bone.name
bone = arm.edit_bones.new('f_ring.03.L')
bone.head[:] = 0.7794, 0.0494, 1.1762
bone.tail[:] = 0.7781, 0.0498, 1.1577
bone.roll = -1.6582
tmp = srcBones['ring03_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['f_ring.02.L']]
bones['f_ring.03.L'] = bone.name
bone = arm.edit_bones.new('f_pinky.03.L')
bone.head[:] = 0.7618, 0.0770, 1.1932
bone.tail[:] = 0.7611, 0.0772, 1.1782
bone.roll = -1.7639
tmp = srcBones['pinky03_L']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['f_pinky.02.L']]
bones['f_pinky.03.L'] = bone.name
bone = arm.edit_bones.new('f_index.03.R')
bone.head[:] = -0.7840, -0.0003, 1.1858
bone.tail[:] = -0.7892, 0.0006, 1.1636
bone.roll = 1.6760
tmp = srcBones['index03_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['f_index.02.R']]
bones['f_index.03.R'] = bone.name
bone = arm.edit_bones.new('thumb.03.R')
bone.head[:] = -0.7056, -0.0057, 1.2145
bone.tail[:] = -0.7194, -0.0098, 1.1995
bone.roll = 0.5826
tmp = srcBones['thumb03_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['thumb.02.R']]
bones['thumb.03.R'] = bone.name
bone = arm.edit_bones.new('f_middle.03.R')
bone.head[:] = -0.7851, 0.0218, 1.1749
bone.tail[:] = -0.7888, 0.0216, 1.1525
bone.roll = 1.7483
tmp = srcBones['middle03_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['f_middle.02.R']]
bones['f_middle.03.R'] = bone.name
bone = arm.edit_bones.new('f_ring.03.R')
bone.head[:] = -0.7794, 0.0494, 1.1762
bone.tail[:] = -0.7781, 0.0498, 1.1577
bone.roll = 1.6582
tmp = srcBones['ring03_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['f_ring.02.R']]
bones['f_ring.03.R'] = bone.name
bone = arm.edit_bones.new('f_pinky.03.R')
bone.head[:] = -0.7618, 0.0770, 1.1932
bone.tail[:] = -0.7611, 0.0772, 1.1782
bone.roll = 1.7639
tmp = srcBones['pinky03_R']
bone.head[:] = tmp['head']
bone.tail[:] = tmp['tail']
bone.roll = tmp['roll']
bone.use_connect = True
bone.parent = arm.edit_bones[bones['f_pinky.02.R']]
bones['f_pinky.03.R'] = bone.name
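# Edit-bone creation is complete. Switch back to object mode and configure every pose
# bone: rigify_type, transform locks, rotation mode and bone layers, plus any
# rigify_parameters (each wrapped in try/except so a missing parameter attribute is
# simply skipped).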
bpy.ops.object.mode_set(mode='OBJECT')
pbone = obj.pose.bones[bones['spine']]
pbone.rigify_type = 'spines.super_spine'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
try:
pbone.rigify_parameters.neck_pos = 4
except AttributeError:
pass
try:
pbone.rigify_parameters.tweak_layers = [False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
pbone = obj.pose.bones[bones['spine.001']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['thigh.L']]
pbone.rigify_type = 'limbs.super_limb'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
try:
pbone.rigify_parameters.limb_type = "leg"
except AttributeError:
pass
try:
pbone.rigify_parameters.fk_layers = [False, False, False, False, False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
try:
pbone.rigify_parameters.tweak_layers = [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
pbone = obj.pose.bones[bones['thigh.R']]
pbone.rigify_type = 'limbs.super_limb'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
try:
pbone.rigify_parameters.fk_layers = [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
try:
pbone.rigify_parameters.tweak_layers = [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
try:
pbone.rigify_parameters.limb_type = "leg"
except AttributeError:
pass
pbone = obj.pose.bones[bones['spine.002']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['shin.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['shin.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['spine.003']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['foot.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['foot.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['spine.004']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['shoulder.L']]
pbone.rigify_type = 'basic.super_copy'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'YXZ'
pbone.bone.layers = [False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
try:
pbone.rigify_parameters.make_widget = False
except AttributeError:
pass
pbone = obj.pose.bones[bones['shoulder.R']]
pbone.rigify_type = 'basic.super_copy'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'YXZ'
pbone.bone.layers = [False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
try:
pbone.rigify_parameters.make_widget = False
except AttributeError:
pass
pbone = obj.pose.bones[bones['breast.L']]
pbone.rigify_type = 'basic.super_copy'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'YXZ'
pbone.bone.layers = [False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['breast.R']]
pbone.rigify_type = 'basic.super_copy'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'YXZ'
pbone.bone.layers = [False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['toe.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['heel.02.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['toe.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['heel.02.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['spine.005']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['upper_arm.L']]
pbone.rigify_type = 'limbs.super_limb'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
try:
pbone.rigify_parameters.tweak_layers = [False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
try:
pbone.rigify_parameters.fk_layers = [False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
pbone = obj.pose.bones[bones['upper_arm.R']]
pbone.rigify_type = 'limbs.super_limb'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
try:
pbone.rigify_parameters.tweak_layers = [False, False, False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
try:
pbone.rigify_parameters.fk_layers = [False, False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
pbone = obj.pose.bones[bones['forearm.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['forearm.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['hand.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['hand.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['palm.01.L']]
pbone.rigify_type = 'limbs.super_palm'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'YXZ'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['palm.02.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'YXZ'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['palm.03.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'YXZ'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['palm.04.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'YXZ'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['palm.01.R']]
pbone.rigify_type = 'limbs.super_palm'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'YXZ'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['palm.02.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'YXZ'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['palm.03.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'YXZ'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['palm.04.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'YXZ'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['f_index.01.L']]
pbone.rigify_type = 'limbs.simple_tentacle'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
try:
pbone.rigify_parameters.tweak_extra_layers = True
except AttributeError:
pass
try:
pbone.rigify_parameters.tweak_layers = [False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
pbone = obj.pose.bones[bones['thumb.01.L']]
pbone.rigify_type = 'limbs.simple_tentacle'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
try:
pbone.rigify_parameters.tweak_extra_layers = True
except AttributeError:
pass
try:
pbone.rigify_parameters.tweak_layers = [False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
pbone = obj.pose.bones[bones['f_middle.01.L']]
pbone.rigify_type = 'limbs.simple_tentacle'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
try:
pbone.rigify_parameters.tweak_extra_layers = True
except AttributeError:
pass
try:
pbone.rigify_parameters.tweak_layers = [False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
pbone = obj.pose.bones[bones['f_ring.01.L']]
pbone.rigify_type = 'limbs.simple_tentacle'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
try:
pbone.rigify_parameters.tweak_extra_layers = True
except AttributeError:
pass
try:
pbone.rigify_parameters.tweak_layers = [False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
pbone = obj.pose.bones[bones['f_pinky.01.L']]
pbone.rigify_type = 'limbs.simple_tentacle'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
try:
pbone.rigify_parameters.tweak_extra_layers = True
except AttributeError:
pass
try:
pbone.rigify_parameters.tweak_layers = [False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
pbone = obj.pose.bones[bones['f_index.01.R']]
pbone.rigify_type = 'limbs.simple_tentacle'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
try:
pbone.rigify_parameters.tweak_extra_layers = True
except AttributeError:
pass
try:
pbone.rigify_parameters.tweak_layers = [False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
pbone = obj.pose.bones[bones['thumb.01.R']]
pbone.rigify_type = 'limbs.simple_tentacle'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
try:
pbone.rigify_parameters.tweak_extra_layers = True
except AttributeError:
pass
try:
pbone.rigify_parameters.tweak_layers = [False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
pbone = obj.pose.bones[bones['f_middle.01.R']]
pbone.rigify_type = 'limbs.simple_tentacle'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
try:
pbone.rigify_parameters.tweak_extra_layers = True
except AttributeError:
pass
try:
pbone.rigify_parameters.tweak_layers = [False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
pbone = obj.pose.bones[bones['f_ring.01.R']]
pbone.rigify_type = 'limbs.simple_tentacle'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
try:
pbone.rigify_parameters.tweak_extra_layers = True
except AttributeError:
pass
try:
pbone.rigify_parameters.tweak_layers = [False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
pbone = obj.pose.bones[bones['f_pinky.01.R']]
pbone.rigify_type = 'limbs.simple_tentacle'
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
try:
pbone.rigify_parameters.tweak_extra_layers = True
except AttributeError:
pass
try:
pbone.rigify_parameters.tweak_layers = [False, False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
except AttributeError:
pass
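# The second and third finger segments keep an empty rigify_type; they are driven as part
# of the connected chain handled by the limbs.simple_tentacle rig assigned to the first
# segment of each finger.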
pbone = obj.pose.bones[bones['f_index.02.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['thumb.02.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['f_middle.02.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['f_ring.02.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['f_pinky.02.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['f_index.02.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['thumb.02.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['f_middle.02.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['f_ring.02.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['f_pinky.02.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['f_index.03.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['thumb.03.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['f_middle.03.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['f_ring.03.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['f_pinky.03.L']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['f_index.03.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['thumb.03.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['f_middle.03.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['f_ring.03.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
pbone = obj.pose.bones[bones['f_pinky.03.R']]
pbone.rigify_type = ''
pbone.lock_location = (False, False, False)
pbone.lock_rotation = (False, False, False)
pbone.lock_rotation_w = False
pbone.lock_scale = (False, False, False)
pbone.rotation_mode = 'QUATERNION'
pbone.bone.layers = [False, False, False, False, False, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False]
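# Pose-bone configuration is done. Re-enter edit mode, deselect everything, then select
# all bones created above (leaving the last one active) and restrict the visible armature
# layers to the ones used by this rig.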
bpy.ops.object.mode_set(mode='EDIT')
for bone in arm.edit_bones:
bone.select = False
bone.select_head = False
bone.select_tail = False
for b in bones:
bone = arm.edit_bones[bones[b]]
bone.select = True
bone.select_head = True
bone.select_tail = True
arm.edit_bones.active = bone
arm.layers = [(x in [3, 5, 7, 10, 13, 16]) for x in range(32)]
#ob = bpy.data.objects.new("testObj", arm)
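# Module-level part: grab the active object, find the armature that deforms it and
# capture every bone's roll/head/tail so the data can be replayed through create().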
print("Sadfsdfasdf")
o = bpy.context.object
mblName = o.name
print(mblName)
armature = bpy.context.object.find_armature()
print(armature)
#bones = armature.data.bones
#bpy.ops.object.mode_set(mode='EDIT')
boneData = {}
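# Make the source armature the selected, active object; edit mode (and edit_bones) can
# only be entered on the active object.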
bpy.data.objects[armature.name].select = True
bpy.context.scene.objects.active = bpy.data.objects[armature.name]
bpy.ops.object.mode_set(mode='EDIT')
for b in bpy.context.object.data.edit_bones:
    # Copy the vectors: edit-bone data is freed when leaving edit mode, so storing
    # b.head / b.tail directly would leave dangling references in boneData.
    boneData[b.name] = {'roll': b.roll, 'head': b.head.copy(), 'tail': b.tail.copy()}
bpy.ops.object.mode_set(mode='OBJECT')
#print(boneData)
# print(b.name, b.roll, b.head, b.tail)
#bpy.ops.object.armature_human_metarig_add()
#target = bpy.context.object
#targetBones = target.data.bones;
#print(targetBones)
#for b in targetBones:
# print(b.name)
#targetBones = target.bones
#print(armature.name)
#for b in targetBones:
# print(type(b))
bpy.ops.object.select_all(action='DESELECT')
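# Entry point: build a fresh armature object and rebuild the rig by running the
# generated create() with the captured bone data.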
if __name__ == "__main__":
    amt = bpy.data.armatures.new("ArmatureTest")
    ob = bpy.data.objects.new("ObjectTest2", amt)
    scn = bpy.context.scene
    scn.objects.link(ob)
    scn.objects.active = ob
    ob.select = True
    print(boneData['pelvis'])
    create(ob, boneData)
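# Note: the object/scene calls above (scn.objects.link, scn.objects.active, ob.select)
# follow the Blender 2.7x Python API. Under Blender 2.8+ the rough equivalents would be
# bpy.context.collection.objects.link(ob), bpy.context.view_layer.objects.active = ob and
# ob.select_set(True); no such port is attempted here.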
\x0d\x68\xda\x87\x4c\xb6\x05\xf6\x05\xda\xf0\x9a\x2a\x44\x35\x81\xf4\x81\x10\xec\x55\x43\x6a\x46\x2a\x23\xc6\x00\x2c\xdb\xb6\xb8\x9a\xd8\x0d\x22\x09\xcf\x57\x7d\xd5\x5c\x3c\x89\x9f\x15\x27\xf1\x05\x6f\x09\x54\xbd\xd7\xa1\xd1\xd0\x6d\x20\xef\x58\xa2\xa5\xcf\x41\x8a\xb8\xc1\x60\xa9\x97\xbe\x8d\xfd\x08\x87\xea\x89\xda\x6d\xd0\x02\xfd\xfa\x99\xb4\xc1\x22\x23\x59\x37\x98\x29\xa9\xbd\x30\x61\x8b\x15\x66\x9a\x4b\x3b\xf9\xb9\x4f\xcf\x24\xec\x5e\x02\xfb\x4f\xfb\x67\x4f\x3e\x3a\x02\x68\x4b\xc9\x81\xeb\x97\x1e\xdc\x32\x57\x2d\x18\x0b\xa5\xe4\xbc\xde\x41\x9c\x20\xaf\x5d\x2e\x01\xf4\x8f\x40\x5d\xec\x19\x6d\xd7\xb7\x39\x6f\x37\xa6\xfb\x17\xb8\x4e\x17\x74\x10\x5f\x1d\x64\x65\x71\x26\x30\xa3\x57\x28\x04\x9f\x40\x5d\x4f\xd6\x7b\xf4\x0c\x26\x42\x9d\x94\xc1\x86\xa2\x16\x14\x47\x53\xaa\xaa\xea\xdd\x2c\x0c\xe1\x01\xfb\x2d\x4b\xee\xee\x3e\xe0\x3b\x07\x7e\x9d\xe7\x65\xce\x3f\x0b\x60\x63\x89\x8a\xc7\x4c\x3e\x2f\xf4\x6e\xcc\x26\x73\xa4\xe2\x4d\xc4\xc4\x84\xc3\xa4\xbd\x41\x05\x9b\x90\x5e\x73\xaa\xb7\xeb\xae\xee\xc2\xcc\xf2\xc7\x51\x6e\x29\xf4\xe0\x5e\x88\xb4\x6d\x66\x4d\xc7\x04\xfb\xb4\x3e\x63\x97\xcf\x5e\xbc\x57\xcf\x2b\x26\x8e\x72\x27\x09\xf1\xa0\x92\x38\xa4\x9b\xad\xcd\x5e\x65\xc1\x86\x27\x7a\x37\x63\xf6\x64\x4e\x98\x38\x9a\x3a\x33\x82\xb3\xe4\x2e\x13\xa7\x63\x8a\xb0\xd5\x59\xa4\xd2\x92\xf6\x24\x71\x33\x25\x9b\xf9\xbc\x96\x93\xaa\x4d\x6f\x92\xdf\xab\x8f\x58\xba\x92\x29\xa4\x4c\x95\x38\xaf\x1f\x8a\x74\x25\xd1\x88\xff\x38\x75\x40\xa9\xb7\xa3\x7a\x25\xfd\x84\xec\xd1\x89\xc1\xdf\xdf\x84\x58\xb2\x54\x4c\xee\x2f\x12\x0d\x41\x88\xa4\x28\xdd\x10\x39\xe4\xdd\x6a\xcd\x05\x59\x95\x81\x3a\x94\xd3\x2c\xa8\x9d\xa4\x84\xaa\xb4\xa7\x0e\xd5\xa2\x24\x4d\xf3\xf6\xff\x7e\xef\x5e\x22\x12\xa1\x74\x2c\x77\xbf\x82\x72\x5d\x27\xfc\x55\x0f\x74\x1a\xe4\xa7\x15\x95\x89\x47\x35\xdd\x76\x53\x41\xad\xae\x29\x5e\xcb\x23\xa0\x42\xdf\x85\xf8\x56\x5b\xd0\x97\x2e\x55\xa8\xd7\x2e\xe7\x58\x10\xc3\xe3\x95\x94\x7b\xf4\x45\x58\xc7\x94\x50\x98\x01\x39\x34\x1c\x9c\x3b\x83\x38\x36\xbd\xde\x96\xd8\x01\xd2\xb6\x6b\x24\x99\x8d\xc8\x7d\x52\xf9\x5b\xf6\x89\x0e\x5d\xf0\xcb\xb1\x64\xca\x67\x86\x65\x3e\xb5\x65\x1d\x8f\x5c\xe4\x06\x90\x2d\xcb\xe4\xf8\x53\xb0\xa0\xad\x8a\xd8\x27\x1b\xaa\xd7\x14\x10\xe0\xd0\x09\x4d\x5b\x58\x17\x61\xab\x04\xe0\x0a\x5a\x72\xa5\x36\xe4\xd9\x3f\x14\x71\x58\x37\x00\xc5\x66\xaf\x82\x8e\x3e\x97\xca\x67\x70\xfe\x23\x3b\xaf\xd4\x1d\x4b\xf2\xa0\xc2\xda\x94\x58\xf4\x12\xe2\x7a\xb7\xa2\xfe\xc3\x7b\x6f\x48\xa5\x6a\xa4\x56\xc8\x97\x44\x5e\x59\x7f\x70\x1b\x0a\x88\x6c\x37\xe6\x8b\xfd\x71\xca\x9a\xcd\x35\x5c\x81\x27\x1f\x42\x32\x88\x11\xdb\x6e\x8f\xae\xc0\x35\x27\x8b\x3d\xee\x50\x75\xcb\x00\xfc\x3d\x65\x28\x44\xa5\xbf\xa7\x29\x6e\xf9\x48\x66\xd4\x17\xd7\x60\x10\xfe\xf8\xa8\x76\x7e\x3d\xb2\x71\x50\xb1\x19\x22\xd3\xe4\x3d\x75\x2f\x67\xb0\xf4\xfb\xbf\x77\xf5\x7f\xad\xe1\xea\x4b\xed\xa6\xf8\x35\x58\x77\x26\xc3\x90\xfd\x45\xa0\xd6\xbb\x9c\x16\x4d\xe8\x6c\xd1\x3c\x72\xa8\x10\xa4\x0c\x40\xe8\x07\x6f\xf9\xd6\x03\x17\x5d\x8f\xf4\xd0\xb0\xfe\x2f\x50\x32\x96\xbd\x23\x09\x19\x2c\x00\x5d\x44\x3c\x61\x0f\xfb\xa2\xd7\x67\x02\xdb\x40\xbf\x8a\x63\x52\x50\x72\xe3\xbb\x46\x19\x1d\xf4\x28\xc6\x59\xce\x70\x4b\x54\xd5\x6f\x18\x88\xb3\x0e\xe0\x82\x96\x1b\x0e\xe4\x3b\xd0\x1c\xe7\x3f\xec\xf9\x63\xcb\x99\x7a\xa8\x5f\x5c\xf3\xb3\x91\x91\xbc\x20\x91\xf7\x38\x76\xd0\x2a\xc1\xf7\xba\x49\x35\x79\x53\x42\xf4\xbf\xef\x7a\xf8\x9c\x7c\x96\x9e\x28\x53\x91\x46\x8c\x6a\xf3\x16\x0c\x51\x2a\x7c\xae\xb0\x3f\x0a\xc7\x2d\x0e\x09\x00\x8c\x4f\x77\x75\x0e\x96\x25\x3c\x3b\xff\x15\xb7\x56\x7a\x46\x6f\xff\x25\xed\x41\x46\xb0\xc9\xd5\x14\x85\x7d\x07\xa4\x00\xe6\xa0\x27\xee\xdb\x30\x12\x39\x8e\xe4\x6
d\x76\xdf\xc1\xfc\xec\x1b\x70\xe1\xab\x08\xda\xf1\xc8\xa4\x8c\x93\xd9\x22\x71\x5f\xd2\xbb\xf3\x22\x42\x11\x39\x1d\xcd\x8f\xe3\x82\x6d\x77\xe4\x05\xaa\xec\xc6\x63\x47\xbe\xf4\x3a\x6c\x78\x01\x58\x7c\x39\xde\xc6\xc3\x06\xc5\x22\xf9\x6a\x28\x44\x16\x51\x65\x46\xa7\x25\x89\xba\x98\xb3\xa8\x1f\xda\x81\x56\x94\xe6\x1e\x32\x7d\xd9\x6a\xb7\x64\xb6\x8a\x39\xb7\x54\x98\x5d\x85\x2f\xd7\x30\x12\xf1\x6b\x3c\xd7\xcb\xe9\x79\x92\x74\x6d\x4a\x5e\x74\x20\x6c\xec\x9f\xae\x0d\xcf\x99\x3e\x67\xea\x20\xfa\xa9\x07\xcd\x06\xe1\xfe\x7f\xe9\xb2\x35\xe9\x1e\x4e\x17\x43\x2c\x7e\x46\xf0\x8f\x7a\xc3\xbd\x23\x55\xdf\xef\x2b\x26\x5a\x5e\xf1\xd5\xf3\x0c\x63\x77\x14\xbc\x53\x2b\x1e\x68\xf2\x25\x20\xd0\x27\x00\x1e\x40\x59\x1b\xeb\x35\x18\xcc\xde\xa7\x59\x2c\x4d\xfe\xd9\xbc\xa0\xa7\xaf\x3c\x75\x81\x17\x82\x1a\x38\xb3\xfd\xa2\x49\x28\x83\x72\x4f\x6f\x59\x42\xba\x54\x13\xbc\xad\x03\x18\x4a\x7e\x39\x7c\x3d\xf3\x63\xa1\x6e\xd2\x27\x7a\xfc\xf7\x45\xd6\x33\xfd\x1b\xb2\xf3\xc6\xcb\x2a\xb0\x43\x06\xfd\xaa\xb5\x80\x50\x1c\xb1\x2b\xc2\xf9\xd6\xa0\x49\x46\x0c\xe6\x9a\x38\x92\x8a\x67\x34\xe8\x64\x63\x20\xdb\xe4\xdd\x1e\x9c\xcf\xa1\x01\xd8\xe7\x5e\x55\x4a\xc3\xbb\xe7\xda\xd9\x12\xd4\xee\x28\x32\x0a\x20\x4c\xbd\xf5\x8f\x07\x32\x5b\xba\x55\x1c\xcd\xdd\xcb\xcb\x89\x04\x28\x7d\xdd\xd8\x16\x5d\x0a\xf9\x3f\xcd\x53\xf6\xec\x81\x90\x07\xb5\xca\xb7\x7b\x49\x19\xda\xac\x61\x29\xa3\x9a\x1b\x6f\x90\xcc\x9b\x47\xd6\xe8\xf9\x25\x7b\x1f\x47\xe8\x6d\x5b\x47\x8a\xa1\xcf\x3b\x49\x17\x3a\xd0\x50\x89\x74\x71\x1c\x82\x12\x5a\x48\xa9\xeb\xf5\xb8\x0f\x24\x15\x1b\x0a\xfd\x33\xba\x1c\xcc\xd7\xef\xc8\x9c\x40\x82\x25\x0e\x3d\x45\x58\xf1\xc4\xfe\x4d\xf5\xe4\xf1\x84\x60\x84\x3f\x59\xa2\x13\xd3\x17\x78\x16\xdf\x4d\x94\xca\x0c\x32\x52\xa3\x63\x21\x51\xc9\x51\xe5\x37\x71\x8e\xcc\xcf\x94\xb8\xbc\xd5\x39\x41\x7f\x05\xce\xc8\x33\xe1\xea\xb6\x8c\xe2\x63\xa7\xc0\x95\x64\x41\x0b\x0e\x82\x4a\xef\xc3\xf3\x7f\x7a\xa1\x05\x56\xdb\xf6\xa1\xa3\x27\xa2\x20\xfe\xc8\x90\x4f\xae\xef\x91\x2b\x89\xcc\x29\xcc\xf6\xdb\xd7\xde\x91\x6f\x4a\x22\x18\x3b\x5f\xd2\xef\x1c\x35\xda\x07\x9f\x84\x8f\xa5\x89\x8d\xd0\x41\x23\x78\xbd\xae\x17\xe4\x7b\xcd\xc9\x23\x81\x2a\xaf\x0b\x3f\x8f\xca\xa7\x34\xd8\x4d\x6c\x56\x5a\x90\xec\x25\xc5\x9a\xd3\x6b\x35\x23\xd9\x9b\x25\xe4\x14\x19\x70\x67\x35\x9c\xe5\xad\xe6\x76\x0e\x9e\xa3\xdc\xba\x9a\xe8\x3b\x95\x93\x40\xce\x48\xc1\x8c\x11\x4e\xe2\x82\xb5\xe9\x9a\xb0\x9a\x1a\x55\xa1\x04\x63\xf5\xd9\x80\xb9\x8a\xf8\xfd\xcc\x94\x6a\x25\xe4\xdd\xbd\xbd\xe4\x74\xdb\x77\xf0\x91\x11\x14\xcf\xff\xe3\xd6\xf8\x09\x4e\x9d\xc4\x74\x79\x62\x9a\x4e\x0f\x25\x71\x3c\xd8\x1a\x08\x76\x32\xf9\xca\x8c\xb1\xd6\xde\xc8\x28\x26\xc9\xdd\x7b\x4f\x48\x56\xad\x5c\xfb\xdb\x9f\x78\x18\x00\x0c\x1a\xda\xcc\xce\xa7\x55\xbc\x10\x75\xb5\x33\xed\x9a\x95\x2b\xb4\x93\xd8\xd8\x45\x13\xe2\xb4\x82\x4a\xed\x9d\xb2\x7d\xb7\x86\xe3\x72\xff\x73\xe3\x47\xbd\xad\x7f\xfb', 2) | 24,103 | 24,103 | 0.74999 | 6,021 | 24,103 | 3.000332 | 0.043348 | 0.007307 | 0.007971 | 0.007307 | 0.003321 | 0.002491 | 0.002491 | 0 | 0 | 0 | 0 | 0.315975 | 0.000124 | 24,103 | 1 | 24,103 | 24,103 | 0.43361 | 0 | 0 | 0 | 0 | 1 | 0.998341 | 0.998341 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
c45a4e3896b5acaa4ed4698ad1a0889215747b25 | 10,769 | py | Python | model/convlstm_models/convlstm_models.py | zhaoyutim/vit-keras | 33f620ca737c563cc0c0806a92123620ecec957e | [
"Apache-2.0"
] | null | null | null | model/convlstm_models/convlstm_models.py | zhaoyutim/vit-keras | 33f620ca737c563cc0c0806a92123620ecec957e | [
"Apache-2.0"
] | null | null | null | model/convlstm_models/convlstm_models.py | zhaoyutim/vit-keras | 33f620ca737c563cc0c0806a92123620ecec957e | [
"Apache-2.0"
] | null | null | null | from tensorflow.keras import Input
from tensorflow.keras.layers import TimeDistributed, MaxPooling2D, ConvLSTM2D, UpSampling2D, \
Convolution2D, Concatenate, Conv2D, Dropout
from tensorflow.keras.models import Model
import tensorflow as tf
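# ConvLSTM U-Net variant 1: the encoder applies per-frame Conv2D/MaxPooling2D
# via TimeDistributed, while ConvLSTM2D branches taken off each encoder stage
# are kept as temporal skip connections for the per-frame decoder below.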
def get_convlstm_unet1(input_shape):
inputs = Input(input_shape)
conv1 = TimeDistributed(Conv2D(64, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(inputs)
# conv1 = TimeDistributed(Conv2D(64, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(conv1)
convlstm1 = ConvLSTM2D(64, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal', return_sequences=True)(conv1)
pool1 = TimeDistributed(MaxPooling2D(pool_size=(2, 2)))(conv1)
conv2 = TimeDistributed(Conv2D(128, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(pool1)
# conv2 = TimeDistributed(Conv2D(128, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(conv2)
convlstm2 = ConvLSTM2D(128, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal', return_sequences=True)(conv2)
pool2 = TimeDistributed(MaxPooling2D(pool_size=(2, 2)))(conv2)
conv3 = TimeDistributed(Conv2D(256, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(pool2)
# conv3 =TimeDistributed(Conv2D(256, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(conv3)
convlstm3 = ConvLSTM2D(256, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal', return_sequences=True)(conv3)
pool3 = TimeDistributed(MaxPooling2D(pool_size=(2, 2)))(conv3)
conv4 = TimeDistributed(Conv2D(512, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(pool3)
# conv4 = TimeDistributed(Conv2D(512, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(conv4)
# convlstm4 = ConvLSTM2D(512, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal', return_sequences=True)(conv4)
# drop4 = TimeDistributed(Dropout(0.5))(conv4)
# pool4 = TimeDistributed(MaxPooling2D(pool_size=(2, 2)))(drop4)
# conv5 = TimeDistributed(Conv2D(512, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(pool4)
# conv5 = TimeDistributed(Conv2D(512, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(conv5)
# drop5 = TimeDistributed(Dropout(0.5))(conv5)
# up6 = TimeDistributed(Conv2D(512, 2, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(drop4)
# merge6 = tf.concat([conv4,up6], axis = 4)
conv6 = TimeDistributed(Conv2D(512, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(conv4)
# conv6 = TimeDistributed(Conv2D(512, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(conv6)
up7 = TimeDistributed(Conv2D(256, 2, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(TimeDistributed(UpSampling2D(size = (2,2)))(conv6))
merge7 = tf.concat([convlstm3,up7], axis = 4)
conv7 = TimeDistributed(Conv2D(256, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(merge7)
conv7 = TimeDistributed(Conv2D(256, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(conv7)
up8 = TimeDistributed(Conv2D(128, 2, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(TimeDistributed(UpSampling2D(size = (2,2)))(conv7))
merge8 = tf.concat([convlstm2,up8], axis = 4)
conv8 = TimeDistributed(Conv2D(128, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(merge8)
conv8 = TimeDistributed(Conv2D(128, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(conv8)
up9 = TimeDistributed(Conv2D(64, 2, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(TimeDistributed(UpSampling2D(size = (2,2)))(conv8))
merge9 = tf.concat([convlstm1,up9], axis = 4)
conv9 = TimeDistributed(Conv2D(64, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(merge9)
conv9 = TimeDistributed(Conv2D(64, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(conv9)
conv9 = TimeDistributed(Conv2D(2, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(conv9)
conv10 = TimeDistributed(Conv2D(1, 1, activation = 'sigmoid'))(conv9)
model = Model(inputs = inputs, outputs = conv10)
return model
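# A minimal usage sketch (illustrative, not part of the original file); the
# 5-D input shape (time, height, width, channels) and the compile settings
# below are assumptions, not values taken from this repository:
#   model = get_convlstm_unet1((10, 256, 256, 5))
#   model.compile(optimizer='adam', loss='binary_crossentropy')
#   model.summary()

# ConvLSTM U-Net variant 2: here the encoder stages themselves are ConvLSTM2D
# layers (temporal mixing happens on the downsampling path), and the decoder
# reuses those encoder outputs directly as skip connections.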
def get_convlstm_unet2(input_shape):
inputs = Input(input_shape)
conv1 = ConvLSTM2D(64, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal', return_sequences=True)(inputs)
# conv1 = TimeDistributed(Conv2D(64, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(conv1)
pool1 = TimeDistributed(MaxPooling2D(pool_size=(2, 2)))(conv1)
conv2 = ConvLSTM2D(128, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal', return_sequences=True)(pool1)
# conv2 = ConvLSTM2D(128, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal', return_sequences=True)(conv2)
pool2 = TimeDistributed(MaxPooling2D(pool_size=(2, 2)))(conv2)
conv3 = ConvLSTM2D(256, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal', return_sequences=True)(pool2)
# conv3 = ConvLSTM2D(256, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal', return_sequences=True)(conv3)
pool3 = TimeDistributed(MaxPooling2D(pool_size=(2, 2)))(conv3)
conv4 = TimeDistributed(Conv2D(512, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(pool3)
conv6 = TimeDistributed(Conv2D(512, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(conv4)
up7 = TimeDistributed(Conv2D(256, 2, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(TimeDistributed(UpSampling2D(size = (2,2)))(conv6))
merge7 = tf.concat([conv3,up7], axis = 4)
conv7 = TimeDistributed(Conv2D(256, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(merge7)
conv7 = TimeDistributed(Conv2D(256, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(conv7)
up8 = TimeDistributed(Conv2D(128, 2, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(TimeDistributed(UpSampling2D(size = (2,2)))(conv7))
merge8 = tf.concat([conv2,up8], axis = 4)
conv8 = TimeDistributed(Conv2D(128, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(merge8)
conv8 = TimeDistributed(Conv2D(128, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(conv8)
up9 = TimeDistributed(Conv2D(64, 2, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(TimeDistributed(UpSampling2D(size = (2,2)))(conv8))
merge9 = tf.concat([conv1,up9], axis = 4)
conv9 = TimeDistributed(Conv2D(64, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(merge9)
conv9 = TimeDistributed(Conv2D(64, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(conv9)
conv9 = TimeDistributed(Conv2D(2, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal'))(conv9)
conv10 = TimeDistributed(Conv2D(1, 1, activation = 'sigmoid'))(conv9)
model = Model(inputs = inputs, outputs = conv10)
return model
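# Plain (non-recurrent) 2-D U-Net baseline on single images; the default
# input_size of (256, 256, 5) suggests 5-channel input tiles, e.g.
#   model = unet()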
def unet(input_size = (256,256,5)):
inputs = Input(input_size)
conv1 = Conv2D(64, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(inputs)
conv1 = Conv2D(64, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(conv1)
pool1 = MaxPooling2D(pool_size=(2, 2))(conv1)
conv2 = Conv2D(128, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(pool1)
conv2 = Conv2D(128, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(conv2)
pool2 = MaxPooling2D(pool_size=(2, 2))(conv2)
conv3 = Conv2D(256, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(pool2)
conv3 = Conv2D(256, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(conv3)
pool3 = MaxPooling2D(pool_size=(2, 2))(conv3)
conv4 = Conv2D(512, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(pool3)
conv4 = Conv2D(512, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(conv4)
drop4 = Dropout(0.5)(conv4)
pool4 = MaxPooling2D(pool_size=(2, 2))(drop4)
conv5 = Conv2D(1024, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(pool4)
conv5 = Conv2D(1024, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(conv5)
drop5 = Dropout(0.5)(conv5)
up6 = Conv2D(512, 2, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(UpSampling2D(size = (2,2))(drop5))
merge6 = tf.concat([drop4,up6], axis = 3)
conv6 = Conv2D(512, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(merge6)
conv6 = Conv2D(512, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(conv6)
up7 = Conv2D(256, 2, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(UpSampling2D(size = (2,2))(conv6))
merge7 = tf.concat([conv3,up7], axis = 3)
conv7 = Conv2D(256, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(merge7)
conv7 = Conv2D(256, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(conv7)
up8 = Conv2D(128, 2, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(UpSampling2D(size = (2,2))(conv7))
merge8 = tf.concat([conv2,up8], axis = 3)
conv8 = Conv2D(128, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(merge8)
conv8 = Conv2D(128, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(conv8)
up9 = Conv2D(64, 2, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(UpSampling2D(size = (2,2))(conv8))
merge9 = tf.concat([conv1,up9], axis = 3)
conv9 = Conv2D(64, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(merge9)
conv9 = Conv2D(64, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(conv9)
conv9 = Conv2D(2, 3, activation = 'relu', padding = 'same', kernel_initializer = 'he_normal')(conv9)
conv10 = Conv2D(1, 1, activation = 'sigmoid')(conv9)
model = Model(inputs = inputs, outputs = conv10)
return model | 76.921429 | 166 | 0.689386 | 1,282 | 10,769 | 5.661466 | 0.063963 | 0.131166 | 0.196748 | 0.234224 | 0.930559 | 0.920639 | 0.918573 | 0.890741 | 0.890741 | 0.890741 | 0 | 0.069326 | 0.14811 | 10,769 | 140 | 167 | 76.921429 | 0.721823 | 0.154425 | 0 | 0.408163 | 0 | 0 | 0.107041 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.030612 | false | 0 | 0.040816 | 0 | 0.102041 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
674eb7dfe2f0521787874ec532b3afaaf8fce054 | 940 | py | Python | app009.py | ChloeRuan/HelloWorld | e1297ee871c9a84a6e7c50e0d3aa1c332daef27f | [
"MIT"
] | null | null | null | app009.py | ChloeRuan/HelloWorld | e1297ee871c9a84a6e7c50e0d3aa1c332daef27f | [
"MIT"
] | null | null | null | app009.py | ChloeRuan/HelloWorld | e1297ee871c9a84a6e7c50e0d3aa1c332daef27f | [
"MIT"
] | null | null | null | # condition
is_hot = True
if is_hot:
print("It's a hot day")
print("Enjoy your day")
is_hot = False
if is_hot:
print("It's a hot day")
print("Enjoy your day")
is_hot = True
if is_hot:
print("It's a hot day")
print("Drink plenty of water")
else:
print("It's a cold day")
print("Wear warm clothes")
print("Enjoy your day")
is_hot = False
if is_hot:
print("It's a hot day")
print("Drink plenty of water")
else:
print("It's a cold day")
print("Wear warm clothes")
print("Enjoy your day")
is_hot = False
is_cold = False
if is_hot:
print("It's a hot day")
print("Drink plenty of water")
elif is_cold:
print("It's a cold day")
print("Wear warm clothes")
else:
print("It's a lovely day")
print("Enjoy your day")
# exercise
price = 1000000
good_credit = True
if good_credit:
down_payment = 0.1 * price
else:
down_payment = 0.2 * price
print(f"Down payment: ${down_payment}")
| 18.431373 | 39 | 0.648936 | 167 | 940 | 3.550898 | 0.209581 | 0.084317 | 0.121417 | 0.136594 | 0.757167 | 0.701518 | 0.701518 | 0.701518 | 0.701518 | 0.701518 | 0 | 0.014986 | 0.219149 | 940 | 50 | 40 | 18.8 | 0.792916 | 0.019149 | 0 | 0.785714 | 0 | 0 | 0.375408 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
6764ecb40f2815b7b1772cb5aef8a8e1660e3ff2 | 2,768 | py | Python | avalanche/training/plugins/strategy_plugin.py | tachyonicClock/avalanche | 6c3b84b4b9e3123c838092433f29590d955bfdf2 | [
"MIT"
] | 810 | 2018-10-08T15:49:05.000Z | 2022-03-31T15:28:09.000Z | avalanche/training/plugins/strategy_plugin.py | tachyonicClock/avalanche | 6c3b84b4b9e3123c838092433f29590d955bfdf2 | [
"MIT"
] | 477 | 2021-03-01T17:50:51.000Z | 2022-03-31T14:51:23.000Z | avalanche/training/plugins/strategy_plugin.py | tachyonicClock/avalanche | 6c3b84b4b9e3123c838092433f29590d955bfdf2 | [
"MIT"
] | 147 | 2018-10-08T15:49:18.000Z | 2022-03-31T04:08:45.000Z | from typing import Any, TYPE_CHECKING
from avalanche.core import StrategyCallbacks
if TYPE_CHECKING:
from avalanche.training import BaseStrategy
class StrategyPlugin(StrategyCallbacks[Any]):
"""
Base class for strategy plugins. Implements all the callbacks required
by the BaseStrategy with an empty function. Subclasses should override
the callbacks.
"""
def __init__(self):
super().__init__()
pass
def before_training(self, strategy: 'BaseStrategy', **kwargs):
pass
def before_training_exp(self, strategy: 'BaseStrategy', **kwargs):
pass
def before_train_dataset_adaptation(self, strategy: 'BaseStrategy',
**kwargs):
pass
def after_train_dataset_adaptation(self, strategy: 'BaseStrategy',
**kwargs):
pass
def before_training_epoch(self, strategy: 'BaseStrategy', **kwargs):
pass
def before_training_iteration(self, strategy: 'BaseStrategy', **kwargs):
pass
def before_forward(self, strategy: 'BaseStrategy', **kwargs):
pass
def after_forward(self, strategy: 'BaseStrategy', **kwargs):
pass
def before_backward(self, strategy: 'BaseStrategy', **kwargs):
pass
def after_backward(self, strategy: 'BaseStrategy', **kwargs):
pass
def after_training_iteration(self, strategy: 'BaseStrategy', **kwargs):
pass
def before_update(self, strategy: 'BaseStrategy', **kwargs):
pass
def after_update(self, strategy: 'BaseStrategy', **kwargs):
pass
def after_training_epoch(self, strategy: 'BaseStrategy', **kwargs):
pass
def after_training_exp(self, strategy: 'BaseStrategy', **kwargs):
pass
def after_training(self, strategy: 'BaseStrategy', **kwargs):
pass
def before_eval(self, strategy: 'BaseStrategy', **kwargs):
pass
def before_eval_dataset_adaptation(self, strategy: 'BaseStrategy',
**kwargs):
pass
def after_eval_dataset_adaptation(self, strategy: 'BaseStrategy', **kwargs):
pass
def before_eval_exp(self, strategy: 'BaseStrategy', **kwargs):
pass
def after_eval_exp(self, strategy: 'BaseStrategy', **kwargs):
pass
def after_eval(self, strategy: 'BaseStrategy', **kwargs):
pass
def before_eval_iteration(self, strategy: 'BaseStrategy', **kwargs):
pass
def before_eval_forward(self, strategy: 'BaseStrategy', **kwargs):
pass
def after_eval_forward(self, strategy: 'BaseStrategy', **kwargs):
pass
def after_eval_iteration(self, strategy: 'BaseStrategy', **kwargs):
pass
| 27.68 | 80 | 0.641618 | 279 | 2,768 | 6.164875 | 0.175627 | 0.105814 | 0.362791 | 0.453488 | 0.809302 | 0.809302 | 0.809302 | 0.751744 | 0.396512 | 0.123256 | 0 | 0 | 0.255058 | 2,768 | 99 | 81 | 27.959596 | 0.834142 | 0.056358 | 0 | 0.47619 | 0 | 0 | 0.120556 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0.428571 | 0.047619 | 0 | 0.492063 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
6770925dad8fdbba41447cc3abbca1a781ed404f | 37,621 | py | Python | Packages/backrefs/st3/backrefs/uniprops/unidata/numericvalue.py | aimee5/sublime_packages | 071e3d0a5892e177d7f93365b20ebccb3f60aedd | [
"MIT"
] | 2 | 2018-04-24T10:02:26.000Z | 2019-06-02T13:53:31.000Z | Packages/backrefs/st3/backrefs/uniprops/unidata/numericvalue.py | aimee5/sublime_packages | 071e3d0a5892e177d7f93365b20ebccb3f60aedd | [
"MIT"
] | null | null | null | Packages/backrefs/st3/backrefs/uniprops/unidata/numericvalue.py | aimee5/sublime_packages | 071e3d0a5892e177d7f93365b20ebccb3f60aedd | [
"MIT"
] | 2 | 2019-04-11T04:13:02.000Z | 2019-06-02T13:53:33.000Z | """Unicode Properties from Unicode version 6.1.0 (autogen)."""
from __future__ import unicode_literals
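# Each key is a Unicode numeric value ("1/2", "10", ...) and each value is a
# regex-style run of code points / code-point ranges carrying that value;
# keys prefixed with "^" hold the complementary (negated) set.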
unicode_numeric_values = {
"0": "\u0030\u0660\u06f0\u07c0\u0966\u09e6\u0a66\u0ae6\u0b66\u0be6\u0c66\u0c78\u0ce6\u0d66\u0e50\u0ed0\u0f20\u1040\u1090\u17e0\u17f0\u1810\u1946\u19d0\u1a80\u1a90\u1b50\u1bb0\u1c40\u1c50\u2070\u2080\u2189\u24ea\u24ff\u3007\u96f6\ua620\ua6ef\ua8d0\ua900\ua9d0\uaa50\uabf0\uf9b2\uff10\U0001018a\U000104a0\U00011066\U000110f0\U00011136\U000111d0\U000116c0\U0001d7ce\U0001d7d8\U0001d7e2\U0001d7ec\U0001d7f6\U0001f100-\U0001f101",
"1": "\u0031\u00b9\u0661\u06f1\u07c1\u0967\u09e7\u0a67\u0ae7\u0b67\u0be7\u0c67\u0c79\u0c7c\u0ce7\u0d67\u0e51\u0ed1\u0f21\u1041\u1091\u1369\u17e1\u17f1\u1811\u1947\u19d1\u19da\u1a81\u1a91\u1b51\u1bb1\u1c41\u1c51\u2081\u215f-\u2160\u2170\u2460\u2474\u2488\u24f5\u2776\u2780\u278a\u3021\u3192\u3220\u3280\u4e00\u58f1\u58f9\u5e7a\u5f0c\ua621\ua6e6\ua8d1\ua901\ua9d1\uaa51\uabf1\uff11\U00010107\U00010142\U00010158-\U0001015a\U00010320\U000103d1\U000104a1\U00010858\U00010916\U00010a40\U00010a7d\U00010b58\U00010b78\U00010e60\U00011052\U00011067\U000110f1\U00011137\U000111d1\U000116c1\U00012415\U0001241e\U0001242c\U00012434\U0001244f\U00012458\U0001d360\U0001d7cf\U0001d7d9\U0001d7e3\U0001d7ed\U0001d7f7\U0001f102\U0002092a",
"1/10": "\u2152",
"1/16": "\u09f4\u0b75\ua833",
"1/2": "\u00bd\u0b73\u0d74\u0f2a\u0f33\u2cfd\ua831\U00010141\U00010175-\U00010176\U00010e7b",
"1/3": "\u2153\U00010e7d\U0001245a\U0001245d",
"1/4": "\u00bc\u09f7\u0b72\u0d73\ua830\U00010140\U00010e7c\U00012460\U00012462",
"1/5": "\u2155",
"1/6": "\u2159\U00012461",
"1/7": "\u2150",
"1/8": "\u09f5\u0b76\u215b\ua834\U0001245f",
"1/9": "\u2151",
"10": "\u0bf0\u0d70\u1372\u2169\u2179\u2469\u247d\u2491\u24fe\u277f\u2789\u2793\u3038\u3229\u3248\u3289\u4ec0\u5341\u62fe\uf973\uf9fd\U00010110\U00010149\U00010150\U00010157\U00010160-\U00010164\U00010322\U000103d3\U0001085b\U00010917\U00010a44\U00010b5c\U00010b7c\U00010e69\U0001105b\U0001d369",
"100": "\u0bf1\u0d71\u137b\u216d\u217d\u4f70\u767e\u964c\U00010119\U0001014b\U00010152\U0001016a\U000103d5\U0001085d\U00010919\U00010a46\U00010b5e\U00010b7e\U00010e72\U00011064",
"1000": "\u0bf2\u0d72\u216f\u217f-\u2180\u4edf\u5343\u9621\U00010122\U0001014d\U00010154\U00010171\U0001085e\U00010a47\U00010b5f\U00010b7f\U00011065",
"10000": "\u137c\u2182\u4e07\u842c\U0001012b\U00010155\U0001085f",
"100000": "\u2188",
"100000000": "\u4ebf\u5104",
"1000000000000": "\u5146",
"11": "\u216a\u217a\u246a\u247e\u2492\u24eb",
"11/2": "\u0f2f",
"12": "\u216b\u217b\u246b\u247f\u2493\u24ec",
"13": "\u246c\u2480\u2494\u24ed",
"13/2": "\u0f30",
"14": "\u246d\u2481\u2495\u24ee",
"15": "\u246e\u2482\u2496\u24ef",
"15/2": "\u0f31",
"16": "\u09f9\u246f\u2483\u2497\u24f0",
"17": "\u16ee\u2470\u2484\u2498\u24f1",
"17/2": "\u0f32",
"18": "\u16ef\u2471\u2485\u2499\u24f2",
"19": "\u16f0\u2472\u2486\u249a\u24f3",
"2": "\u0032\u00b2\u0662\u06f2\u07c2\u0968\u09e8\u0a68\u0ae8\u0b68\u0be8\u0c68\u0c7a\u0c7d\u0ce8\u0d68\u0e52\u0ed2\u0f22\u1042\u1092\u136a\u17e2\u17f2\u1812\u1948\u19d2\u1a82\u1a92\u1b52\u1bb2\u1c42\u1c52\u2082\u2161\u2171\u2461\u2475\u2489\u24f6\u2777\u2781\u278b\u3022\u3193\u3221\u3281\u3483\u4e8c\u5169\u5f0d\u5f10\u8cae\u8cb3\u8d30\ua622\ua6e7\ua8d2\ua902\ua9d2\uaa52\uabf2\uf978\uff12\U00010108\U0001015b-\U0001015e\U000103d2\U000104a2\U00010859\U0001091a\U00010a41\U00010b59\U00010b79\U00010e61\U00011053\U00011068\U000110f2\U00011138\U000111d2\U000116c2\U00012400\U00012416\U0001241f\U00012423\U0001242d\U00012435\U0001244a\U00012450\U00012459\U0001d361\U0001d7d0\U0001d7da\U0001d7e4\U0001d7ee\U0001d7f8\U0001f103\U00022390",
"2/3": "\u2154\U00010177\U00010e7e\U0001245b\U0001245e",
"2/5": "\u2156",
"20": "\u1373\u2473\u2487\u249b\u24f4\u3039\u3249\u5344\u5eff\U00010111\U000103d4\U0001085c\U00010918\U00010a45\U00010b5d\U00010b7d\U00010e6a\U0001105c\U0001d36a",
"200": "\U0001011a\U00010e73",
"2000": "\U00010123",
"20000": "\U0001012c",
"21": "\u3251",
"22": "\u3252",
"23": "\u3253",
"24": "\u3254",
"25": "\u3255",
"26": "\u3256",
"27": "\u3257",
"28": "\u3258",
"29": "\u3259",
"3": "\u0033\u00b3\u0663\u06f3\u07c3\u0969\u09e9\u0a69\u0ae9\u0b69\u0be9\u0c69\u0c7b\u0c7e\u0ce9\u0d69\u0e53\u0ed3\u0f23\u1043\u1093\u136b\u17e3\u17f3\u1813\u1949\u19d3\u1a83\u1a93\u1b53\u1bb3\u1c43\u1c53\u2083\u2162\u2172\u2462\u2476\u248a\u24f7\u2778\u2782\u278c\u3023\u3194\u3222\u3282\u4e09\u4ee8\u53c1-\u53c4\u5f0e\ua623\ua6e8\ua8d3\ua903\ua9d3\uaa53\uabf3\uf96b\uff13\U00010109\U000104a3\U0001085a\U0001091b\U00010a42\U00010b5a\U00010b7a\U00010e62\U00011054\U00011069\U000110f3\U00011139\U000111d3\U000116c3\U00012401\U00012408\U00012417\U00012420\U00012424-\U00012425\U0001242e-\U0001242f\U00012436-\U00012437\U0001243a-\U0001243b\U0001244b\U00012451\U0001d362\U0001d7d1\U0001d7db\U0001d7e5\U0001d7ef\U0001d7f9\U0001f104\U00020afd\U00020b19\U00022998\U00023b1b",
"3/16": "\u09f6\u0b77\ua835",
"3/2": "\u0f2b",
"3/4": "\u00be\u09f8\u0b74\u0d75\ua832\U00010178",
"3/5": "\u2157",
"3/8": "\u215c",
"30": "\u1374\u303a\u324a\u325a\u5345\U00010112\U00010165\U00010e6b\U0001105d\U0001d36b\U00020983",
"300": "\U0001011b\U0001016b\U00010e74",
"3000": "\U00010124",
"30000": "\U0001012d",
"31": "\u325b",
"32": "\u325c",
"33": "\u325d",
"34": "\u325e",
"35": "\u325f",
"36": "\u32b1",
"37": "\u32b2",
"38": "\u32b3",
"39": "\u32b4",
"4": "\u0034\u0664\u06f4\u07c4\u096a\u09ea\u0a6a\u0aea\u0b6a\u0bea\u0c6a\u0cea\u0d6a\u0e54\u0ed4\u0f24\u1044\u1094\u136c\u17e4\u17f4\u1814\u194a\u19d4\u1a84\u1a94\u1b54\u1bb4\u1c44\u1c54\u2074\u2084\u2163\u2173\u2463\u2477\u248b\u24f8\u2779\u2783\u278d\u3024\u3195\u3223\u3283\u4e96\u56db\u8086\ua624\ua6e9\ua8d4\ua904\ua9d4\uaa54\uabf4\uff14\U0001010a\U000104a4\U00010a43\U00010b5b\U00010b7b\U00010e63\U00011055\U0001106a\U000110f4\U0001113a\U000111d4\U000116c4\U00012402\U00012409\U0001240f\U00012418\U00012421\U00012426\U00012430\U00012438\U0001243c-\U0001243f\U0001244c\U00012452-\U00012453\U0001d363\U0001d7d2\U0001d7dc\U0001d7e6\U0001d7f0\U0001d7fa\U0001f105\U00020064\U000200e2\U0002626d",
"4/5": "\u2158",
"40": "\u1375\u324b\u32b5\u534c\U00010113\U00010e6c\U0001105e\U0001d36c\U0002098c\U0002099c",
"400": "\U0001011c\U00010e75",
"4000": "\U00010125",
"40000": "\U0001012e",
"41": "\u32b6",
"42": "\u32b7",
"43": "\u32b8",
"44": "\u32b9",
"45": "\u32ba",
"46": "\u32bb",
"47": "\u32bc",
"48": "\u32bd",
"49": "\u32be",
"5": "\u0035\u0665\u06f5\u07c5\u096b\u09eb\u0a6b\u0aeb\u0b6b\u0beb\u0c6b\u0ceb\u0d6b\u0e55\u0ed5\u0f25\u1045\u1095\u136d\u17e5\u17f5\u1815\u194b\u19d5\u1a85\u1a95\u1b55\u1bb5\u1c45\u1c55\u2075\u2085\u2164\u2174\u2464\u2478\u248c\u24f9\u277a\u2784\u278e\u3025\u3224\u3284\u3405\u382a\u4e94\u4f0d\ua625\ua6ea\ua8d5\ua905\ua9d5\uaa55\uabf5\uff15\U0001010b\U00010143\U00010148\U0001014f\U0001015f\U00010173\U00010321\U000104a5\U00010e64\U00011056\U0001106b\U000110f5\U0001113b\U000111d5\U000116c5\U00012403\U0001240a\U00012410\U00012419\U00012422\U00012427\U00012431\U00012439\U0001244d\U00012454-\U00012455\U0001d364\U0001d7d3\U0001d7dd\U0001d7e7\U0001d7f1\U0001d7fb\U0001f106\U00020121",
"5/2": "\u0f2c",
"5/6": "\u215a\U0001245c",
"5/8": "\u215d",
"50": "\u1376\u216c\u217c\u2186\u324c\u32bf\U00010114\U00010144\U0001014a\U00010151\U00010166-\U00010169\U00010174\U00010323\U00010a7e\U00010e6d\U0001105f\U0001d36d",
"500": "\u216e\u217e\U0001011d\U00010145\U0001014c\U00010153\U0001016c-\U00010170\U00010e76",
"5000": "\u2181\U00010126\U00010146\U0001014e\U00010172",
"50000": "\u2187\U0001012f\U00010147\U00010156",
"6": "\u0036\u0666\u06f6\u07c6\u096c\u09ec\u0a6c\u0aec\u0b6c\u0bec\u0c6c\u0cec\u0d6c\u0e56\u0ed6\u0f26\u1046\u1096\u136e\u17e6\u17f6\u1816\u194c\u19d6\u1a86\u1a96\u1b56\u1bb6\u1c46\u1c56\u2076\u2086\u2165\u2175\u2185\u2465\u2479\u248d\u24fa\u277b\u2785\u278f\u3026\u3225\u3285\u516d\u9646\u9678\ua626\ua6eb\ua8d6\ua906\ua9d6\uaa56\uabf6\uf9d1\uf9d3\uff16\U0001010c\U000104a6\U00010e65\U00011057\U0001106c\U000110f6\U0001113c\U000111d6\U000116c6\U00012404\U0001240b\U00012411\U0001241a\U00012428\U00012440\U0001244e\U0001d365\U0001d7d4\U0001d7de\U0001d7e8\U0001d7f2\U0001d7fc\U0001f107\U00020aea",
"60": "\u1377\u324d\U00010115\U00010e6e\U00011060\U0001d36e",
"600": "\U0001011e\U00010e77",
"6000": "\U00010127",
"60000": "\U00010130",
"7": "\u0037\u0667\u06f7\u07c7\u096d\u09ed\u0a6d\u0aed\u0b6d\u0bed\u0c6d\u0ced\u0d6d\u0e57\u0ed7\u0f27\u1047\u1097\u136f\u17e7\u17f7\u1817\u194d\u19d7\u1a87\u1a97\u1b57\u1bb7\u1c47\u1c57\u2077\u2087\u2166\u2176\u2466\u247a\u248e\u24fb\u277c\u2786\u2790\u3027\u3226\u3286\u3b4d\u4e03\u67d2\u6f06\ua627\ua6ec\ua8d7\ua907\ua9d7\uaa57\uabf7\uff17\U0001010d\U000104a7\U00010e66\U00011058\U0001106d\U000110f7\U0001113d\U000111d7\U000116c7\U00012405\U0001240c\U00012412\U0001241b\U00012429\U00012441-\U00012443\U0001d366\U0001d7d5\U0001d7df\U0001d7e9\U0001d7f3\U0001d7fd\U0001f108\U00020001",
"7/2": "\u0f2d",
"7/8": "\u215e",
"70": "\u1378\u324e\U00010116\U00010e6f\U00011061\U0001d36f",
"700": "\U0001011f\U00010e78",
"7000": "\U00010128",
"70000": "\U00010131",
"8": "\u0038\u0668\u06f8\u07c8\u096e\u09ee\u0a6e\u0aee\u0b6e\u0bee\u0c6e\u0cee\u0d6e\u0e58\u0ed8\u0f28\u1048\u1098\u1370\u17e8\u17f8\u1818\u194e\u19d8\u1a88\u1a98\u1b58\u1bb8\u1c48\u1c58\u2078\u2088\u2167\u2177\u2467\u247b\u248f\u24fc\u277d\u2787\u2791\u3028\u3227\u3287\u516b\u634c\ua628\ua6ed\ua8d8\ua908\ua9d8\uaa58\uabf8\uff18\U0001010e\U000104a8\U00010e67\U00011059\U0001106e\U000110f8\U0001113e\U000111d8\U000116c8\U00012406\U0001240d\U00012413\U0001241c\U0001242a\U00012444-\U00012445\U0001d367\U0001d7d6\U0001d7e0\U0001d7ea\U0001d7f4\U0001d7fe\U0001f109",
"80": "\u1379\u324f\U00010117\U00010e70\U00011062\U0001d370",
"800": "\U00010120\U00010e79",
"8000": "\U00010129",
"80000": "\U00010132",
"9": "\u0039\u0669\u06f9\u07c9\u096f\u09ef\u0a6f\u0aef\u0b6f\u0bef\u0c6f\u0cef\u0d6f\u0e59\u0ed9\u0f29\u1049\u1099\u1371\u17e9\u17f9\u1819\u194f\u19d9\u1a89\u1a99\u1b59\u1bb9\u1c49\u1c59\u2079\u2089\u2168\u2178\u2468\u247c\u2490\u24fd\u277e\u2788\u2792\u3029\u3228\u3288\u4e5d\u5efe\u7396\ua629\ua6ee\ua8d9\ua909\ua9d9\uaa59\uabf9\uff19\U0001010f\U000104a9\U00010e68\U0001105a\U0001106f\U000110f9\U0001113f\U000111d9\U000116c9\U00012407\U0001240e\U00012414\U0001241d\U0001242b\U00012446-\U00012449\U0001d368\U0001d7d7\U0001d7e1\U0001d7eb\U0001d7f5\U0001d7ff\U0001f10a\U0002f890",
"9/2": "\u0f2e",
"90": "\u137a\U00010118\U00010341\U00010e71\U00011063\U0001d371",
"900": "\U00010121\U0001034a\U00010e7a",
"9000": "\U0001012a",
"90000": "\U00010133",
"^0": "\u0000-\u002f\u0031-\u065f\u0661-\u06ef\u06f1-\u07bf\u07c1-\u0965\u0967-\u09e5\u09e7-\u0a65\u0a67-\u0ae5\u0ae7-\u0b65\u0b67-\u0be5\u0be7-\u0c65\u0c67-\u0c77\u0c79-\u0ce5\u0ce7-\u0d65\u0d67-\u0e4f\u0e51-\u0ecf\u0ed1-\u0f1f\u0f21-\u103f\u1041-\u108f\u1091-\u17df\u17e1-\u17ef\u17f1-\u180f\u1811-\u1945\u1947-\u19cf\u19d1-\u1a7f\u1a81-\u1a8f\u1a91-\u1b4f\u1b51-\u1baf\u1bb1-\u1c3f\u1c41-\u1c4f\u1c51-\u206f\u2071-\u207f\u2081-\u2188\u218a-\u24e9\u24eb-\u24fe\u2500-\u3006\u3008-\u96f5\u96f7-\ua61f\ua621-\ua6ee\ua6f0-\ua8cf\ua8d1-\ua8ff\ua901-\ua9cf\ua9d1-\uaa4f\uaa51-\uabef\uabf1-\uf9b1\uf9b3-\uff0f\uff11-\U00010189\U0001018b-\U0001049f\U000104a1-\U00011065\U00011067-\U000110ef\U000110f1-\U00011135\U00011137-\U000111cf\U000111d1-\U000116bf\U000116c1-\U0001d7cd\U0001d7cf-\U0001d7d7\U0001d7d9-\U0001d7e1\U0001d7e3-\U0001d7eb\U0001d7ed-\U0001d7f5\U0001d7f7-\U0001f0ff\U0001f102-\U0010ffff",
"^1": "\u0000-\u0030\u0032-\u00b8\u00ba-\u0660\u0662-\u06f0\u06f2-\u07c0\u07c2-\u0966\u0968-\u09e6\u09e8-\u0a66\u0a68-\u0ae6\u0ae8-\u0b66\u0b68-\u0be6\u0be8-\u0c66\u0c68-\u0c78\u0c7a-\u0c7b\u0c7d-\u0ce6\u0ce8-\u0d66\u0d68-\u0e50\u0e52-\u0ed0\u0ed2-\u0f20\u0f22-\u1040\u1042-\u1090\u1092-\u1368\u136a-\u17e0\u17e2-\u17f0\u17f2-\u1810\u1812-\u1946\u1948-\u19d0\u19d2-\u19d9\u19db-\u1a80\u1a82-\u1a90\u1a92-\u1b50\u1b52-\u1bb0\u1bb2-\u1c40\u1c42-\u1c50\u1c52-\u2080\u2082-\u215e\u2161-\u216f\u2171-\u245f\u2461-\u2473\u2475-\u2487\u2489-\u24f4\u24f6-\u2775\u2777-\u277f\u2781-\u2789\u278b-\u3020\u3022-\u3191\u3193-\u321f\u3221-\u327f\u3281-\u4dff\u4e01-\u58f0\u58f2-\u58f8\u58fa-\u5e79\u5e7b-\u5f0b\u5f0d-\ua620\ua622-\ua6e5\ua6e7-\ua8d0\ua8d2-\ua900\ua902-\ua9d0\ua9d2-\uaa50\uaa52-\uabf0\uabf2-\uff10\uff12-\U00010106\U00010108-\U00010141\U00010143-\U00010157\U0001015b-\U0001031f\U00010321-\U000103d0\U000103d2-\U000104a0\U000104a2-\U00010857\U00010859-\U00010915\U00010917-\U00010a3f\U00010a41-\U00010a7c\U00010a7e-\U00010b57\U00010b59-\U00010b77\U00010b79-\U00010e5f\U00010e61-\U00011051\U00011053-\U00011066\U00011068-\U000110f0\U000110f2-\U00011136\U00011138-\U000111d0\U000111d2-\U000116c0\U000116c2-\U00012414\U00012416-\U0001241d\U0001241f-\U0001242b\U0001242d-\U00012433\U00012435-\U0001244e\U00012450-\U00012457\U00012459-\U0001d35f\U0001d361-\U0001d7ce\U0001d7d0-\U0001d7d8\U0001d7da-\U0001d7e2\U0001d7e4-\U0001d7ec\U0001d7ee-\U0001d7f6\U0001d7f8-\U0001f101\U0001f103-\U00020929\U0002092b-\U0010ffff",
"^1/10": "\u0000-\u2151\u2153-\U0010ffff",
"^1/16": "\u0000-\u09f3\u09f5-\u0b74\u0b76-\ua832\ua834-\U0010ffff",
"^1/2": "\u0000-\u00bc\u00be-\u0b72\u0b74-\u0d73\u0d75-\u0f29\u0f2b-\u0f32\u0f34-\u2cfc\u2cfe-\ua830\ua832-\U00010140\U00010142-\U00010174\U00010177-\U00010e7a\U00010e7c-\U0010ffff",
"^1/3": "\u0000-\u2152\u2154-\U00010e7c\U00010e7e-\U00012459\U0001245b-\U0001245c\U0001245e-\U0010ffff",
"^1/4": "\u0000-\u00bb\u00bd-\u09f6\u09f8-\u0b71\u0b73-\u0d72\u0d74-\ua82f\ua831-\U0001013f\U00010141-\U00010e7b\U00010e7d-\U0001245f\U00012461\U00012463-\U0010ffff",
"^1/5": "\u0000-\u2154\u2156-\U0010ffff",
"^1/6": "\u0000-\u2158\u215a-\U00012460\U00012462-\U0010ffff",
"^1/7": "\u0000-\u214f\u2151-\U0010ffff",
"^1/8": "\u0000-\u09f4\u09f6-\u0b75\u0b77-\u215a\u215c-\ua833\ua835-\U0001245e\U00012460-\U0010ffff",
"^1/9": "\u0000-\u2150\u2152-\U0010ffff",
"^10": "\u0000-\u0bef\u0bf1-\u0d6f\u0d71-\u1371\u1373-\u2168\u216a-\u2178\u217a-\u2468\u246a-\u247c\u247e-\u2490\u2492-\u24fd\u24ff-\u277e\u2780-\u2788\u278a-\u2792\u2794-\u3037\u3039-\u3228\u322a-\u3247\u3249-\u3288\u328a-\u4ebf\u4ec1-\u5340\u5342-\u62fd\u62ff-\uf972\uf974-\uf9fc\uf9fe-\U0001010f\U00010111-\U00010148\U0001014a-\U0001014f\U00010151-\U00010156\U00010158-\U0001015f\U00010165-\U00010321\U00010323-\U000103d2\U000103d4-\U0001085a\U0001085c-\U00010916\U00010918-\U00010a43\U00010a45-\U00010b5b\U00010b5d-\U00010b7b\U00010b7d-\U00010e68\U00010e6a-\U0001105a\U0001105c-\U0001d368\U0001d36a-\U0010ffff",
"^100": "\u0000-\u0bf0\u0bf2-\u0d70\u0d72-\u137a\u137c-\u216c\u216e-\u217c\u217e-\u4f6f\u4f71-\u767d\u767f-\u964b\u964d-\U00010118\U0001011a-\U0001014a\U0001014c-\U00010151\U00010153-\U00010169\U0001016b-\U000103d4\U000103d6-\U0001085c\U0001085e-\U00010918\U0001091a-\U00010a45\U00010a47-\U00010b5d\U00010b5f-\U00010b7d\U00010b7f-\U00010e71\U00010e73-\U00011063\U00011065-\U0010ffff",
"^1000": "\u0000-\u0bf1\u0bf3-\u0d71\u0d73-\u216e\u2170-\u217e\u2181-\u4ede\u4ee0-\u5342\u5344-\u9620\u9622-\U00010121\U00010123-\U0001014c\U0001014e-\U00010153\U00010155-\U00010170\U00010172-\U0001085d\U0001085f-\U00010a46\U00010a48-\U00010b5e\U00010b60-\U00010b7e\U00010b80-\U00011064\U00011066-\U0010ffff",
"^10000": "\u0000-\u137b\u137d-\u2181\u2183-\u4e06\u4e08-\u842b\u842d-\U0001012a\U0001012c-\U00010154\U00010156-\U0001085e\U00010860-\U0010ffff",
"^100000": "\u0000-\u2187\u2189-\U0010ffff",
"^100000000": "\u0000-\u4ebe\u4ec0-\u5103\u5105-\U0010ffff",
"^1000000000000": "\u0000-\u5145\u5147-\U0010ffff",
"^11": "\u0000-\u2169\u216b-\u2179\u217b-\u2469\u246b-\u247d\u247f-\u2491\u2493-\u24ea\u24ec-\U0010ffff",
"^11/2": "\u0000-\u0f2e\u0f30-\U0010ffff",
"^12": "\u0000-\u216a\u216c-\u217a\u217c-\u246a\u246c-\u247e\u2480-\u2492\u2494-\u24eb\u24ed-\U0010ffff",
"^13": "\u0000-\u246b\u246d-\u247f\u2481-\u2493\u2495-\u24ec\u24ee-\U0010ffff",
"^13/2": "\u0000-\u0f2f\u0f31-\U0010ffff",
"^14": "\u0000-\u246c\u246e-\u2480\u2482-\u2494\u2496-\u24ed\u24ef-\U0010ffff",
"^15": "\u0000-\u246d\u246f-\u2481\u2483-\u2495\u2497-\u24ee\u24f0-\U0010ffff",
"^15/2": "\u0000-\u0f30\u0f32-\U0010ffff",
"^16": "\u0000-\u09f8\u09fa-\u246e\u2470-\u2482\u2484-\u2496\u2498-\u24ef\u24f1-\U0010ffff",
"^17": "\u0000-\u16ed\u16ef-\u246f\u2471-\u2483\u2485-\u2497\u2499-\u24f0\u24f2-\U0010ffff",
"^17/2": "\u0000-\u0f31\u0f33-\U0010ffff",
"^18": "\u0000-\u16ee\u16f0-\u2470\u2472-\u2484\u2486-\u2498\u249a-\u24f1\u24f3-\U0010ffff",
"^19": "\u0000-\u16ef\u16f1-\u2471\u2473-\u2485\u2487-\u2499\u249b-\u24f2\u24f4-\U0010ffff",
"^2": "\u0000-\u0031\u0033-\u00b1\u00b3-\u0661\u0663-\u06f1\u06f3-\u07c1\u07c3-\u0967\u0969-\u09e7\u09e9-\u0a67\u0a69-\u0ae7\u0ae9-\u0b67\u0b69-\u0be7\u0be9-\u0c67\u0c69-\u0c79\u0c7b-\u0c7c\u0c7e-\u0ce7\u0ce9-\u0d67\u0d69-\u0e51\u0e53-\u0ed1\u0ed3-\u0f21\u0f23-\u1041\u1043-\u1091\u1093-\u1369\u136b-\u17e1\u17e3-\u17f1\u17f3-\u1811\u1813-\u1947\u1949-\u19d1\u19d3-\u1a81\u1a83-\u1a91\u1a93-\u1b51\u1b53-\u1bb1\u1bb3-\u1c41\u1c43-\u1c51\u1c53-\u2081\u2083-\u2160\u2162-\u2170\u2172-\u2460\u2462-\u2474\u2476-\u2488\u248a-\u24f5\u24f7-\u2776\u2778-\u2780\u2782-\u278a\u278c-\u3021\u3023-\u3192\u3194-\u3220\u3222-\u3280\u3282-\u3482\u3484-\u4e8b\u4e8d-\u5168\u516a-\u5f0c\u5f0e-\u5f0f\u5f11-\u8cad\u8caf-\u8cb2\u8cb4-\u8d2f\u8d31-\ua621\ua623-\ua6e6\ua6e8-\ua8d1\ua8d3-\ua901\ua903-\ua9d1\ua9d3-\uaa51\uaa53-\uabf1\uabf3-\uf977\uf979-\uff11\uff13-\U00010107\U00010109-\U0001015a\U0001015f-\U000103d1\U000103d3-\U000104a1\U000104a3-\U00010858\U0001085a-\U00010919\U0001091b-\U00010a40\U00010a42-\U00010b58\U00010b5a-\U00010b78\U00010b7a-\U00010e60\U00010e62-\U00011052\U00011054-\U00011067\U00011069-\U000110f1\U000110f3-\U00011137\U00011139-\U000111d1\U000111d3-\U000116c1\U000116c3-\U000123ff\U00012401-\U00012415\U00012417-\U0001241e\U00012420-\U00012422\U00012424-\U0001242c\U0001242e-\U00012434\U00012436-\U00012449\U0001244b-\U0001244f\U00012451-\U00012458\U0001245a-\U0001d360\U0001d362-\U0001d7cf\U0001d7d1-\U0001d7d9\U0001d7db-\U0001d7e3\U0001d7e5-\U0001d7ed\U0001d7ef-\U0001d7f7\U0001d7f9-\U0001f102\U0001f104-\U0002238f\U00022391-\U0010ffff",
"^2/3": "\u0000-\u2153\u2155-\U00010176\U00010178-\U00010e7d\U00010e7f-\U0001245a\U0001245c-\U0001245d\U0001245f-\U0010ffff",
"^2/5": "\u0000-\u2155\u2157-\U0010ffff",
"^20": "\u0000-\u1372\u1374-\u2472\u2474-\u2486\u2488-\u249a\u249c-\u24f3\u24f5-\u3038\u303a-\u3248\u324a-\u5343\u5345-\u5efe\u5f00-\U00010110\U00010112-\U000103d3\U000103d5-\U0001085b\U0001085d-\U00010917\U00010919-\U00010a44\U00010a46-\U00010b5c\U00010b5e-\U00010b7c\U00010b7e-\U00010e69\U00010e6b-\U0001105b\U0001105d-\U0001d369\U0001d36b-\U0010ffff",
"^200": "\u0000-\U00010119\U0001011b-\U00010e72\U00010e74-\U0010ffff",
"^2000": "\u0000-\U00010122\U00010124-\U0010ffff",
"^20000": "\u0000-\U0001012b\U0001012d-\U0010ffff",
"^21": "\u0000-\u3250\u3252-\U0010ffff",
"^22": "\u0000-\u3251\u3253-\U0010ffff",
"^23": "\u0000-\u3252\u3254-\U0010ffff",
"^24": "\u0000-\u3253\u3255-\U0010ffff",
"^25": "\u0000-\u3254\u3256-\U0010ffff",
"^26": "\u0000-\u3255\u3257-\U0010ffff",
"^27": "\u0000-\u3256\u3258-\U0010ffff",
"^28": "\u0000-\u3257\u3259-\U0010ffff",
"^29": "\u0000-\u3258\u325a-\U0010ffff",
"^3": "\u0000-\u0032\u0034-\u00b2\u00b4-\u0662\u0664-\u06f2\u06f4-\u07c2\u07c4-\u0968\u096a-\u09e8\u09ea-\u0a68\u0a6a-\u0ae8\u0aea-\u0b68\u0b6a-\u0be8\u0bea-\u0c68\u0c6a-\u0c7a\u0c7c-\u0c7d\u0c7f-\u0ce8\u0cea-\u0d68\u0d6a-\u0e52\u0e54-\u0ed2\u0ed4-\u0f22\u0f24-\u1042\u1044-\u1092\u1094-\u136a\u136c-\u17e2\u17e4-\u17f2\u17f4-\u1812\u1814-\u1948\u194a-\u19d2\u19d4-\u1a82\u1a84-\u1a92\u1a94-\u1b52\u1b54-\u1bb2\u1bb4-\u1c42\u1c44-\u1c52\u1c54-\u2082\u2084-\u2161\u2163-\u2171\u2173-\u2461\u2463-\u2475\u2477-\u2489\u248b-\u24f6\u24f8-\u2777\u2779-\u2781\u2783-\u278b\u278d-\u3022\u3024-\u3193\u3195-\u3221\u3223-\u3281\u3283-\u4e08\u4e0a-\u4ee7\u4ee9-\u53c0\u53c5-\u5f0d\u5f0f-\ua622\ua624-\ua6e7\ua6e9-\ua8d2\ua8d4-\ua902\ua904-\ua9d2\ua9d4-\uaa52\uaa54-\uabf2\uabf4-\uf96a\uf96c-\uff12\uff14-\U00010108\U0001010a-\U000104a2\U000104a4-\U00010859\U0001085b-\U0001091a\U0001091c-\U00010a41\U00010a43-\U00010b59\U00010b5b-\U00010b79\U00010b7b-\U00010e61\U00010e63-\U00011053\U00011055-\U00011068\U0001106a-\U000110f2\U000110f4-\U00011138\U0001113a-\U000111d2\U000111d4-\U000116c2\U000116c4-\U00012400\U00012402-\U00012407\U00012409-\U00012416\U00012418-\U0001241f\U00012421-\U00012423\U00012426-\U0001242d\U00012430-\U00012435\U00012438-\U00012439\U0001243c-\U0001244a\U0001244c-\U00012450\U00012452-\U0001d361\U0001d363-\U0001d7d0\U0001d7d2-\U0001d7da\U0001d7dc-\U0001d7e4\U0001d7e6-\U0001d7ee\U0001d7f0-\U0001d7f8\U0001d7fa-\U0001f103\U0001f105-\U00020afc\U00020afe-\U00020b18\U00020b1a-\U00022997\U00022999-\U00023b1a\U00023b1c-\U0010ffff",
"^3/16": "\u0000-\u09f5\u09f7-\u0b76\u0b78-\ua834\ua836-\U0010ffff",
"^3/2": "\u0000-\u0f2a\u0f2c-\U0010ffff",
"^3/4": "\u0000-\u00bd\u00bf-\u09f7\u09f9-\u0b73\u0b75-\u0d74\u0d76-\ua831\ua833-\U00010177\U00010179-\U0010ffff",
"^3/5": "\u0000-\u2156\u2158-\U0010ffff",
"^3/8": "\u0000-\u215b\u215d-\U0010ffff",
"^30": "\u0000-\u1373\u1375-\u3039\u303b-\u3249\u324b-\u3259\u325b-\u5344\u5346-\U00010111\U00010113-\U00010164\U00010166-\U00010e6a\U00010e6c-\U0001105c\U0001105e-\U0001d36a\U0001d36c-\U00020982\U00020984-\U0010ffff",
"^300": "\u0000-\U0001011a\U0001011c-\U0001016a\U0001016c-\U00010e73\U00010e75-\U0010ffff",
"^3000": "\u0000-\U00010123\U00010125-\U0010ffff",
"^30000": "\u0000-\U0001012c\U0001012e-\U0010ffff",
"^31": "\u0000-\u325a\u325c-\U0010ffff",
"^32": "\u0000-\u325b\u325d-\U0010ffff",
"^33": "\u0000-\u325c\u325e-\U0010ffff",
"^34": "\u0000-\u325d\u325f-\U0010ffff",
"^35": "\u0000-\u325e\u3260-\U0010ffff",
"^36": "\u0000-\u32b0\u32b2-\U0010ffff",
"^37": "\u0000-\u32b1\u32b3-\U0010ffff",
"^38": "\u0000-\u32b2\u32b4-\U0010ffff",
"^39": "\u0000-\u32b3\u32b5-\U0010ffff",
"^4": "\u0000-\u0033\u0035-\u0663\u0665-\u06f3\u06f5-\u07c3\u07c5-\u0969\u096b-\u09e9\u09eb-\u0a69\u0a6b-\u0ae9\u0aeb-\u0b69\u0b6b-\u0be9\u0beb-\u0c69\u0c6b-\u0ce9\u0ceb-\u0d69\u0d6b-\u0e53\u0e55-\u0ed3\u0ed5-\u0f23\u0f25-\u1043\u1045-\u1093\u1095-\u136b\u136d-\u17e3\u17e5-\u17f3\u17f5-\u1813\u1815-\u1949\u194b-\u19d3\u19d5-\u1a83\u1a85-\u1a93\u1a95-\u1b53\u1b55-\u1bb3\u1bb5-\u1c43\u1c45-\u1c53\u1c55-\u2073\u2075-\u2083\u2085-\u2162\u2164-\u2172\u2174-\u2462\u2464-\u2476\u2478-\u248a\u248c-\u24f7\u24f9-\u2778\u277a-\u2782\u2784-\u278c\u278e-\u3023\u3025-\u3194\u3196-\u3222\u3224-\u3282\u3284-\u4e95\u4e97-\u56da\u56dc-\u8085\u8087-\ua623\ua625-\ua6e8\ua6ea-\ua8d3\ua8d5-\ua903\ua905-\ua9d3\ua9d5-\uaa53\uaa55-\uabf3\uabf5-\uff13\uff15-\U00010109\U0001010b-\U000104a3\U000104a5-\U00010a42\U00010a44-\U00010b5a\U00010b5c-\U00010b7a\U00010b7c-\U00010e62\U00010e64-\U00011054\U00011056-\U00011069\U0001106b-\U000110f3\U000110f5-\U00011139\U0001113b-\U000111d3\U000111d5-\U000116c3\U000116c5-\U00012401\U00012403-\U00012408\U0001240a-\U0001240e\U00012410-\U00012417\U00012419-\U00012420\U00012422-\U00012425\U00012427-\U0001242f\U00012431-\U00012437\U00012439-\U0001243b\U00012440-\U0001244b\U0001244d-\U00012451\U00012454-\U0001d362\U0001d364-\U0001d7d1\U0001d7d3-\U0001d7db\U0001d7dd-\U0001d7e5\U0001d7e7-\U0001d7ef\U0001d7f1-\U0001d7f9\U0001d7fb-\U0001f104\U0001f106-\U00020063\U00020065-\U000200e1\U000200e3-\U0002626c\U0002626e-\U0010ffff",
"^4/5": "\u0000-\u2157\u2159-\U0010ffff",
"^40": "\u0000-\u1374\u1376-\u324a\u324c-\u32b4\u32b6-\u534b\u534d-\U00010112\U00010114-\U00010e6b\U00010e6d-\U0001105d\U0001105f-\U0001d36b\U0001d36d-\U0002098b\U0002098d-\U0002099b\U0002099d-\U0010ffff",
"^400": "\u0000-\U0001011b\U0001011d-\U00010e74\U00010e76-\U0010ffff",
"^4000": "\u0000-\U00010124\U00010126-\U0010ffff",
"^40000": "\u0000-\U0001012d\U0001012f-\U0010ffff",
"^41": "\u0000-\u32b5\u32b7-\U0010ffff",
"^42": "\u0000-\u32b6\u32b8-\U0010ffff",
"^43": "\u0000-\u32b7\u32b9-\U0010ffff",
"^44": "\u0000-\u32b8\u32ba-\U0010ffff",
"^45": "\u0000-\u32b9\u32bb-\U0010ffff",
"^46": "\u0000-\u32ba\u32bc-\U0010ffff",
"^47": "\u0000-\u32bb\u32bd-\U0010ffff",
"^48": "\u0000-\u32bc\u32be-\U0010ffff",
"^49": "\u0000-\u32bd\u32bf-\U0010ffff",
"^5": "\u0000-\u0034\u0036-\u0664\u0666-\u06f4\u06f6-\u07c4\u07c6-\u096a\u096c-\u09ea\u09ec-\u0a6a\u0a6c-\u0aea\u0aec-\u0b6a\u0b6c-\u0bea\u0bec-\u0c6a\u0c6c-\u0cea\u0cec-\u0d6a\u0d6c-\u0e54\u0e56-\u0ed4\u0ed6-\u0f24\u0f26-\u1044\u1046-\u1094\u1096-\u136c\u136e-\u17e4\u17e6-\u17f4\u17f6-\u1814\u1816-\u194a\u194c-\u19d4\u19d6-\u1a84\u1a86-\u1a94\u1a96-\u1b54\u1b56-\u1bb4\u1bb6-\u1c44\u1c46-\u1c54\u1c56-\u2074\u2076-\u2084\u2086-\u2163\u2165-\u2173\u2175-\u2463\u2465-\u2477\u2479-\u248b\u248d-\u24f8\u24fa-\u2779\u277b-\u2783\u2785-\u278d\u278f-\u3024\u3026-\u3223\u3225-\u3283\u3285-\u3404\u3406-\u3829\u382b-\u4e93\u4e95-\u4f0c\u4f0e-\ua624\ua626-\ua6e9\ua6eb-\ua8d4\ua8d6-\ua904\ua906-\ua9d4\ua9d6-\uaa54\uaa56-\uabf4\uabf6-\uff14\uff16-\U0001010a\U0001010c-\U00010142\U00010144-\U00010147\U00010149-\U0001014e\U00010150-\U0001015e\U00010160-\U00010172\U00010174-\U00010320\U00010322-\U000104a4\U000104a6-\U00010e63\U00010e65-\U00011055\U00011057-\U0001106a\U0001106c-\U000110f4\U000110f6-\U0001113a\U0001113c-\U000111d4\U000111d6-\U000116c4\U000116c6-\U00012402\U00012404-\U00012409\U0001240b-\U0001240f\U00012411-\U00012418\U0001241a-\U00012421\U00012423-\U00012426\U00012428-\U00012430\U00012432-\U00012438\U0001243a-\U0001244c\U0001244e-\U00012453\U00012456-\U0001d363\U0001d365-\U0001d7d2\U0001d7d4-\U0001d7dc\U0001d7de-\U0001d7e6\U0001d7e8-\U0001d7f0\U0001d7f2-\U0001d7fa\U0001d7fc-\U0001f105\U0001f107-\U00020120\U00020122-\U0010ffff",
"^5/2": "\u0000-\u0f2b\u0f2d-\U0010ffff",
"^5/6": "\u0000-\u2159\u215b-\U0001245b\U0001245d-\U0010ffff",
"^5/8": "\u0000-\u215c\u215e-\U0010ffff",
"^50": "\u0000-\u1375\u1377-\u216b\u216d-\u217b\u217d-\u2185\u2187-\u324b\u324d-\u32be\u32c0-\U00010113\U00010115-\U00010143\U00010145-\U00010149\U0001014b-\U00010150\U00010152-\U00010165\U0001016a-\U00010173\U00010175-\U00010322\U00010324-\U00010a7d\U00010a7f-\U00010e6c\U00010e6e-\U0001105e\U00011060-\U0001d36c\U0001d36e-\U0010ffff",
"^500": "\u0000-\u216d\u216f-\u217d\u217f-\U0001011c\U0001011e-\U00010144\U00010146-\U0001014b\U0001014d-\U00010152\U00010154-\U0001016b\U00010171-\U00010e75\U00010e77-\U0010ffff",
"^5000": "\u0000-\u2180\u2182-\U00010125\U00010127-\U00010145\U00010147-\U0001014d\U0001014f-\U00010171\U00010173-\U0010ffff",
"^50000": "\u0000-\u2186\u2188-\U0001012e\U00010130-\U00010146\U00010148-\U00010155\U00010157-\U0010ffff",
"^6": "\u0000-\u0035\u0037-\u0665\u0667-\u06f5\u06f7-\u07c5\u07c7-\u096b\u096d-\u09eb\u09ed-\u0a6b\u0a6d-\u0aeb\u0aed-\u0b6b\u0b6d-\u0beb\u0bed-\u0c6b\u0c6d-\u0ceb\u0ced-\u0d6b\u0d6d-\u0e55\u0e57-\u0ed5\u0ed7-\u0f25\u0f27-\u1045\u1047-\u1095\u1097-\u136d\u136f-\u17e5\u17e7-\u17f5\u17f7-\u1815\u1817-\u194b\u194d-\u19d5\u19d7-\u1a85\u1a87-\u1a95\u1a97-\u1b55\u1b57-\u1bb5\u1bb7-\u1c45\u1c47-\u1c55\u1c57-\u2075\u2077-\u2085\u2087-\u2164\u2166-\u2174\u2176-\u2184\u2186-\u2464\u2466-\u2478\u247a-\u248c\u248e-\u24f9\u24fb-\u277a\u277c-\u2784\u2786-\u278e\u2790-\u3025\u3027-\u3224\u3226-\u3284\u3286-\u516c\u516e-\u9645\u9647-\u9677\u9679-\ua625\ua627-\ua6ea\ua6ec-\ua8d5\ua8d7-\ua905\ua907-\ua9d5\ua9d7-\uaa55\uaa57-\uabf5\uabf7-\uf9d0\uf9d2\uf9d4-\uff15\uff17-\U0001010b\U0001010d-\U000104a5\U000104a7-\U00010e64\U00010e66-\U00011056\U00011058-\U0001106b\U0001106d-\U000110f5\U000110f7-\U0001113b\U0001113d-\U000111d5\U000111d7-\U000116c5\U000116c7-\U00012403\U00012405-\U0001240a\U0001240c-\U00012410\U00012412-\U00012419\U0001241b-\U00012427\U00012429-\U0001243f\U00012441-\U0001244d\U0001244f-\U0001d364\U0001d366-\U0001d7d3\U0001d7d5-\U0001d7dd\U0001d7df-\U0001d7e7\U0001d7e9-\U0001d7f1\U0001d7f3-\U0001d7fb\U0001d7fd-\U0001f106\U0001f108-\U00020ae9\U00020aeb-\U0010ffff",
"^60": "\u0000-\u1376\u1378-\u324c\u324e-\U00010114\U00010116-\U00010e6d\U00010e6f-\U0001105f\U00011061-\U0001d36d\U0001d36f-\U0010ffff",
"^600": "\u0000-\U0001011d\U0001011f-\U00010e76\U00010e78-\U0010ffff",
"^6000": "\u0000-\U00010126\U00010128-\U0010ffff",
"^60000": "\u0000-\U0001012f\U00010131-\U0010ffff",
"^7": "\u0000-\u0036\u0038-\u0666\u0668-\u06f6\u06f8-\u07c6\u07c8-\u096c\u096e-\u09ec\u09ee-\u0a6c\u0a6e-\u0aec\u0aee-\u0b6c\u0b6e-\u0bec\u0bee-\u0c6c\u0c6e-\u0cec\u0cee-\u0d6c\u0d6e-\u0e56\u0e58-\u0ed6\u0ed8-\u0f26\u0f28-\u1046\u1048-\u1096\u1098-\u136e\u1370-\u17e6\u17e8-\u17f6\u17f8-\u1816\u1818-\u194c\u194e-\u19d6\u19d8-\u1a86\u1a88-\u1a96\u1a98-\u1b56\u1b58-\u1bb6\u1bb8-\u1c46\u1c48-\u1c56\u1c58-\u2076\u2078-\u2086\u2088-\u2165\u2167-\u2175\u2177-\u2465\u2467-\u2479\u247b-\u248d\u248f-\u24fa\u24fc-\u277b\u277d-\u2785\u2787-\u278f\u2791-\u3026\u3028-\u3225\u3227-\u3285\u3287-\u3b4c\u3b4e-\u4e02\u4e04-\u67d1\u67d3-\u6f05\u6f07-\ua626\ua628-\ua6eb\ua6ed-\ua8d6\ua8d8-\ua906\ua908-\ua9d6\ua9d8-\uaa56\uaa58-\uabf6\uabf8-\uff16\uff18-\U0001010c\U0001010e-\U000104a6\U000104a8-\U00010e65\U00010e67-\U00011057\U00011059-\U0001106c\U0001106e-\U000110f6\U000110f8-\U0001113c\U0001113e-\U000111d6\U000111d8-\U000116c6\U000116c8-\U00012404\U00012406-\U0001240b\U0001240d-\U00012411\U00012413-\U0001241a\U0001241c-\U00012428\U0001242a-\U00012440\U00012444-\U0001d365\U0001d367-\U0001d7d4\U0001d7d6-\U0001d7de\U0001d7e0-\U0001d7e8\U0001d7ea-\U0001d7f2\U0001d7f4-\U0001d7fc\U0001d7fe-\U0001f107\U0001f109-\U00020000\U00020002-\U0010ffff",
"^7/2": "\u0000-\u0f2c\u0f2e-\U0010ffff",
"^7/8": "\u0000-\u215d\u215f-\U0010ffff",
"^70": "\u0000-\u1377\u1379-\u324d\u324f-\U00010115\U00010117-\U00010e6e\U00010e70-\U00011060\U00011062-\U0001d36e\U0001d370-\U0010ffff",
"^700": "\u0000-\U0001011e\U00010120-\U00010e77\U00010e79-\U0010ffff",
"^7000": "\u0000-\U00010127\U00010129-\U0010ffff",
"^70000": "\u0000-\U00010130\U00010132-\U0010ffff",
"^8": "\u0000-\u0037\u0039-\u0667\u0669-\u06f7\u06f9-\u07c7\u07c9-\u096d\u096f-\u09ed\u09ef-\u0a6d\u0a6f-\u0aed\u0aef-\u0b6d\u0b6f-\u0bed\u0bef-\u0c6d\u0c6f-\u0ced\u0cef-\u0d6d\u0d6f-\u0e57\u0e59-\u0ed7\u0ed9-\u0f27\u0f29-\u1047\u1049-\u1097\u1099-\u136f\u1371-\u17e7\u17e9-\u17f7\u17f9-\u1817\u1819-\u194d\u194f-\u19d7\u19d9-\u1a87\u1a89-\u1a97\u1a99-\u1b57\u1b59-\u1bb7\u1bb9-\u1c47\u1c49-\u1c57\u1c59-\u2077\u2079-\u2087\u2089-\u2166\u2168-\u2176\u2178-\u2466\u2468-\u247a\u247c-\u248e\u2490-\u24fb\u24fd-\u277c\u277e-\u2786\u2788-\u2790\u2792-\u3027\u3029-\u3226\u3228-\u3286\u3288-\u516a\u516c-\u634b\u634d-\ua627\ua629-\ua6ec\ua6ee-\ua8d7\ua8d9-\ua907\ua909-\ua9d7\ua9d9-\uaa57\uaa59-\uabf7\uabf9-\uff17\uff19-\U0001010d\U0001010f-\U000104a7\U000104a9-\U00010e66\U00010e68-\U00011058\U0001105a-\U0001106d\U0001106f-\U000110f7\U000110f9-\U0001113d\U0001113f-\U000111d7\U000111d9-\U000116c7\U000116c9-\U00012405\U00012407-\U0001240c\U0001240e-\U00012412\U00012414-\U0001241b\U0001241d-\U00012429\U0001242b-\U00012443\U00012446-\U0001d366\U0001d368-\U0001d7d5\U0001d7d7-\U0001d7df\U0001d7e1-\U0001d7e9\U0001d7eb-\U0001d7f3\U0001d7f5-\U0001d7fd\U0001d7ff-\U0001f108\U0001f10a-\U0010ffff",
"^80": "\u0000-\u1378\u137a-\u324e\u3250-\U00010116\U00010118-\U00010e6f\U00010e71-\U00011061\U00011063-\U0001d36f\U0001d371-\U0010ffff",
"^800": "\u0000-\U0001011f\U00010121-\U00010e78\U00010e7a-\U0010ffff",
"^8000": "\u0000-\U00010128\U0001012a-\U0010ffff",
"^80000": "\u0000-\U00010131\U00010133-\U0010ffff",
"^9": "\u0000-\u0038\u003a-\u0668\u066a-\u06f8\u06fa-\u07c8\u07ca-\u096e\u0970-\u09ee\u09f0-\u0a6e\u0a70-\u0aee\u0af0-\u0b6e\u0b70-\u0bee\u0bf0-\u0c6e\u0c70-\u0cee\u0cf0-\u0d6e\u0d70-\u0e58\u0e5a-\u0ed8\u0eda-\u0f28\u0f2a-\u1048\u104a-\u1098\u109a-\u1370\u1372-\u17e8\u17ea-\u17f8\u17fa-\u1818\u181a-\u194e\u1950-\u19d8\u19da-\u1a88\u1a8a-\u1a98\u1a9a-\u1b58\u1b5a-\u1bb8\u1bba-\u1c48\u1c4a-\u1c58\u1c5a-\u2078\u207a-\u2088\u208a-\u2167\u2169-\u2177\u2179-\u2467\u2469-\u247b\u247d-\u248f\u2491-\u24fc\u24fe-\u277d\u277f-\u2787\u2789-\u2791\u2793-\u3028\u302a-\u3227\u3229-\u3287\u3289-\u4e5c\u4e5e-\u5efd\u5eff-\u7395\u7397-\ua628\ua62a-\ua6ed\ua6ef-\ua8d8\ua8da-\ua908\ua90a-\ua9d8\ua9da-\uaa58\uaa5a-\uabf8\uabfa-\uff18\uff1a-\U0001010e\U00010110-\U000104a8\U000104aa-\U00010e67\U00010e69-\U00011059\U0001105b-\U0001106e\U00011070-\U000110f8\U000110fa-\U0001113e\U00011140-\U000111d8\U000111da-\U000116c8\U000116ca-\U00012406\U00012408-\U0001240d\U0001240f-\U00012413\U00012415-\U0001241c\U0001241e-\U0001242a\U0001242c-\U00012445\U0001244a-\U0001d367\U0001d369-\U0001d7d6\U0001d7d8-\U0001d7e0\U0001d7e2-\U0001d7ea\U0001d7ec-\U0001d7f4\U0001d7f6-\U0001d7fe\U0001d800-\U0001f109\U0001f10b-\U0002f88f\U0002f891-\U0010ffff",
"^9/2": "\u0000-\u0f2d\u0f2f-\U0010ffff",
"^90": "\u0000-\u1379\u137b-\U00010117\U00010119-\U00010340\U00010342-\U00010e70\U00010e72-\U00011062\U00011064-\U0001d370\U0001d372-\U0010ffff",
"^900": "\u0000-\U00010120\U00010122-\U00010349\U0001034b-\U00010e79\U00010e7b-\U0010ffff",
"^9000": "\u0000-\U00010129\U0001012b-\U0010ffff",
"^90000": "\u0000-\U00010132\U00010134-\U0010ffff",
"^nan": "\u0030-\u0039\u00b2-\u00b3\u00b9\u00bc-\u00be\u0660-\u0669\u06f0-\u06f9\u07c0-\u07c9\u0966-\u096f\u09e6-\u09ef\u09f4-\u09f9\u0a66-\u0a6f\u0ae6-\u0aef\u0b66-\u0b6f\u0b72-\u0b77\u0be6-\u0bf2\u0c66-\u0c6f\u0c78-\u0c7e\u0ce6-\u0cef\u0d66-\u0d75\u0e50-\u0e59\u0ed0-\u0ed9\u0f20-\u0f33\u1040-\u1049\u1090-\u1099\u1369-\u137c\u16ee-\u16f0\u17e0-\u17e9\u17f0-\u17f9\u1810-\u1819\u1946-\u194f\u19d0-\u19da\u1a80-\u1a89\u1a90-\u1a99\u1b50-\u1b59\u1bb0-\u1bb9\u1c40-\u1c49\u1c50-\u1c59\u2070\u2074-\u2079\u2080-\u2089\u2150-\u2182\u2185-\u2189\u2460-\u249b\u24ea-\u24ff\u2776-\u2793\u2cfd\u3007\u3021-\u3029\u3038-\u303a\u3192-\u3195\u3220-\u3229\u3248-\u324f\u3251-\u325f\u3280-\u3289\u32b1-\u32bf\u3405\u3483\u382a\u3b4d\u4e00\u4e03\u4e07\u4e09\u4e5d\u4e8c\u4e94\u4e96\u4ebf-\u4ec0\u4edf\u4ee8\u4f0d\u4f70\u5104\u5146\u5169\u516b\u516d\u5341\u5343-\u5345\u534c\u53c1-\u53c4\u56db\u58f1\u58f9\u5e7a\u5efe-\u5eff\u5f0c-\u5f0e\u5f10\u62fe\u634c\u67d2\u6f06\u7396\u767e\u8086\u842c\u8cae\u8cb3\u8d30\u9621\u9646\u964c\u9678\u96f6\ua620-\ua629\ua6e6-\ua6ef\ua830-\ua835\ua8d0-\ua8d9\ua900-\ua909\ua9d0-\ua9d9\uaa50-\uaa59\uabf0-\uabf9\uf96b\uf973\uf978\uf9b2\uf9d1\uf9d3\uf9fd\uff10-\uff19\U00010107-\U00010133\U00010140-\U00010178\U0001018a\U00010320-\U00010323\U00010341\U0001034a\U000103d1-\U000103d5\U000104a0-\U000104a9\U00010858-\U0001085f\U00010916-\U0001091b\U00010a40-\U00010a47\U00010a7d-\U00010a7e\U00010b58-\U00010b5f\U00010b78-\U00010b7f\U00010e60-\U00010e7e\U00011052-\U0001106f\U000110f0-\U000110f9\U00011136-\U0001113f\U000111d0-\U000111d9\U000116c0-\U000116c9\U00012400-\U00012431\U00012434-\U00012455\U00012458-\U00012462\U0001d360-\U0001d371\U0001d7ce-\U0001d7ff\U0001f100-\U0001f10a\U00020001\U00020064\U000200e2\U00020121\U0002092a\U00020983\U0002098c\U0002099c\U00020aea\U00020afd\U00020b19\U00022390\U00022998\U00023b1b\U0002626d\U0002f890",
"nan": "\u0000-\u002f\u003a-\u00b1\u00b4-\u00b8\u00ba-\u00bb\u00bf-\u065f\u066a-\u06ef\u06fa-\u07bf\u07ca-\u0965\u0970-\u09e5\u09f0-\u09f3\u09fa-\u0a65\u0a70-\u0ae5\u0af0-\u0b65\u0b70-\u0b71\u0b78-\u0be5\u0bf3-\u0c65\u0c70-\u0c77\u0c7f-\u0ce5\u0cf0-\u0d65\u0d76-\u0e4f\u0e5a-\u0ecf\u0eda-\u0f1f\u0f34-\u103f\u104a-\u108f\u109a-\u1368\u137d-\u16ed\u16f1-\u17df\u17ea-\u17ef\u17fa-\u180f\u181a-\u1945\u1950-\u19cf\u19db-\u1a7f\u1a8a-\u1a8f\u1a9a-\u1b4f\u1b5a-\u1baf\u1bba-\u1c3f\u1c4a-\u1c4f\u1c5a-\u206f\u2071-\u2073\u207a-\u207f\u208a-\u214f\u2183-\u2184\u218a-\u245f\u249c-\u24e9\u2500-\u2775\u2794-\u2cfc\u2cfe-\u3006\u3008-\u3020\u302a-\u3037\u303b-\u3191\u3196-\u321f\u322a-\u3247\u3250\u3260-\u327f\u328a-\u32b0\u32c0-\u3404\u3406-\u3482\u3484-\u3829\u382b-\u3b4c\u3b4e-\u4dff\u4e01-\u4e02\u4e04-\u4e06\u4e08\u4e0a-\u4e5c\u4e5e-\u4e8b\u4e8d-\u4e93\u4e95\u4e97-\u4ebe\u4ec1-\u4ede\u4ee0-\u4ee7\u4ee9-\u4f0c\u4f0e-\u4f6f\u4f71-\u5103\u5105-\u5145\u5147-\u5168\u516a\u516c\u516e-\u5340\u5342\u5346-\u534b\u534d-\u53c0\u53c5-\u56da\u56dc-\u58f0\u58f2-\u58f8\u58fa-\u5e79\u5e7b-\u5efd\u5f00-\u5f0b\u5f0f\u5f11-\u62fd\u62ff-\u634b\u634d-\u67d1\u67d3-\u6f05\u6f07-\u7395\u7397-\u767d\u767f-\u8085\u8087-\u842b\u842d-\u8cad\u8caf-\u8cb2\u8cb4-\u8d2f\u8d31-\u9620\u9622-\u9645\u9647-\u964b\u964d-\u9677\u9679-\u96f5\u96f7-\ua61f\ua62a-\ua6e5\ua6f0-\ua82f\ua836-\ua8cf\ua8da-\ua8ff\ua90a-\ua9cf\ua9da-\uaa4f\uaa5a-\uabef\uabfa-\uf96a\uf96c-\uf972\uf974-\uf977\uf979-\uf9b1\uf9b3-\uf9d0\uf9d2\uf9d4-\uf9fc\uf9fe-\uff0f\uff1a-\U00010106\U00010134-\U0001013f\U00010179-\U00010189\U0001018b-\U0001031f\U00010324-\U00010340\U00010342-\U00010349\U0001034b-\U000103d0\U000103d6-\U0001049f\U000104aa-\U00010857\U00010860-\U00010915\U0001091c-\U00010a3f\U00010a48-\U00010a7c\U00010a7f-\U00010b57\U00010b60-\U00010b77\U00010b80-\U00010e5f\U00010e7f-\U00011051\U00011070-\U000110ef\U000110fa-\U00011135\U00011140-\U000111cf\U000111da-\U000116bf\U000116ca-\U000123ff\U00012432-\U00012433\U00012456-\U00012457\U00012463-\U0001d35f\U0001d372-\U0001d7cd\U0001d800-\U0001f0ff\U0001f10b-\U00020000\U00020002-\U00020063\U00020065-\U000200e1\U000200e3-\U00020120\U00020122-\U00020929\U0002092b-\U00020982\U00020984-\U0002098b\U0002098d-\U0002099b\U0002099d-\U00020ae9\U00020aeb-\U00020afc\U00020afe-\U00020b18\U00020b1a-\U0002238f\U00022391-\U00022997\U00022999-\U00023b1a\U00023b1c-\U0002626c\U0002626e-\U0002f88f\U0002f891-\U0010ffff"
}
| 160.773504 | 2,416 | 0.764254 | 4,674 | 37,621 | 6.149979 | 0.346812 | 0.003827 | 0.001044 | 0.001392 | 0.004175 | 0.004175 | 0 | 0 | 0 | 0 | 0 | 0.541987 | 0.036788 | 37,621 | 233 | 2,417 | 161.463519 | 0.251263 | 0.001489 | 0 | 0 | 1 | 0.233766 | 0.925291 | 0.888655 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.004329 | 0 | 0.004329 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
6774473adfd5c95c5e3d1c22e9194c315f9270c8 | 232,038 | py | Python | misc/baxter/src_py_/JRx_dots.py | YoshimitsuMatsutaIe/rmp_test | a7c94ff68b518ef51821484795c308c2c8519c4c | [
"MIT"
] | null | null | null | misc/baxter/src_py_/JRx_dots.py | YoshimitsuMatsutaIe/rmp_test | a7c94ff68b518ef51821484795c308c2c8519c4c | [
"MIT"
] | null | null | null | misc/baxter/src_py_/JRx_dots.py | YoshimitsuMatsutaIe/rmp_test | a7c94ff68b518ef51821484795c308c2c8519c4c | [
"MIT"
] | null | null | null | import numpy
from math import cos as c
from math import sin as s
from math import tan as ta
from math import sqrt as sq
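# Time derivatives of the 3x7 rotation-axis Jacobian blocks for a 7-DoF
# (Baxter) arm, apparently machine-generated (e.g. via SymPy code printing):
# the body of every function references fully qualified numpy.* names, which
# is why plain `import numpy` is required above.
# q and dq are assumed to be 7x1 numpy column vectors of joint angles and
# joint velocities, since every element is accessed as q[i, 0] / dq[i, 0].
# The math aliases imported above (c, s, ta, sq) are unused in the portion
# shown here.
#
# Hypothetical usage:
#   q  = numpy.zeros((7, 1))   # joint angles  [rad]
#   dq = numpy.ones((7, 1))    # joint velocities  [rad/s]
#   J_dot = jrx_0_dot(q, dq)   # -> (3, 7) ndarray
#
# jrx_W0_dot and jrx_BR_dot appear to describe frames that do not move with
# the joints (world / base), so their derivatives are identically zero.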
def jrx_W0_dot(q, dq):
return numpy.array([[0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0]])
def jrx_BR_dot(q, dq):
return numpy.array([[0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0]])
def jrx_0_dot(q, dq):
return numpy.array([[0.707106781186548*numpy.sqrt(2)*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0], 0, 0, 0, 0, 0, 0], [0.707106781186548*numpy.sqrt(2)*numpy.sin(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0], 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0]])
def jrx_1_dot(q, dq):
return numpy.array([[-0.707106781186548*numpy.sqrt(2)*(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0]), -0.707106781186548*numpy.sqrt(2)*(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0]), 0, 0, 0, 0, 0], [0.707106781186548*numpy.sqrt(2)*(-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0]), 0.707106781186548*numpy.sqrt(2)*(-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0]), 0, 0, 0, 0, 0], [0, numpy.cos(q[1, 0])*dq[1, 0], 0, 0, 0, 0, 0]])
def jrx_2_dot(q, dq):
return numpy.array([[0.707106781186548*numpy.sqrt(2)*(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0]), -0.707106781186548*numpy.sqrt(2)*(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[2, 0]), 0.707106781186548*numpy.sqrt(2)*(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0]), 0, 0, 0, 0], [0.707106781186548*numpy.sqrt(2)*(-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0]), 0.707106781186548*numpy.sqrt(2)*(-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[2, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0]), 0.707106781186548*numpy.sqrt(2)*(-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0]), 0, 0, 0, 0], [0, -numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0], -numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[1, 0] + numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0], 0, 0, 0, 0]])
def jrx_3_dot(q, dq):
return numpy.array([[0.707106781186548*numpy.sqrt(2)*((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0]), 0.707106781186548*numpy.sqrt(2)*(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0]), 0.707106781186548*numpy.sqrt(2)*(-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.cos(q[3, 0])), 0.707106781186548*numpy.sqrt(2)*(-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*dq[3, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[1, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[3, 0]), 0, 0, 0], [0.707106781186548*numpy.sqrt(2)*(-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + 
(1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0]), 0.707106781186548*numpy.sqrt(2)*(-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[0, 0]), 0.707106781186548*numpy.sqrt(2)*((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.cos(q[3, 0])), 0.707106781186548*numpy.sqrt(2)*(-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0]), 0, 0, 0], [0, -numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[3, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[3, 0] - numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] + numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0], -numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] - numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0] + numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[2, 0], -numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[1, 0] - numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[3, 0] - numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[2, 0] + numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0], 0, 0, 0]])
def jrx_4_dot(q, dq):
return numpy.array([[0.707106781186548*numpy.sqrt(2)*(((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.sin(q[4, 0])), 0.707106781186548*numpy.sqrt(2)*((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0])*numpy.cos(q[4, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[4, 0] - numpy.sin(q[4, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0]), 0.707106781186548*numpy.sqrt(2)*(-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*numpy.cos(q[4, 0])*dq[3, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + 
(1/4)*numpy.pi))*numpy.sin(q[4, 0])*numpy.cos(q[3, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[4, 0])), 0.707106781186548*numpy.sqrt(2)*(((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*dq[4, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[3, 0])*numpy.cos(q[4, 0])), 0.707106781186548*numpy.sqrt(2)*(-((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0])*dq[4, 0] - (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + 
(1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.cos(q[4, 0])), 0, 0], [0.707106781186548*numpy.sqrt(2)*(-((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0])*dq[4, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[4, 0])), 0.707106781186548*numpy.sqrt(2)*((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.cos(q[4, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[4, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0]), 0.707106781186548*numpy.sqrt(2)*((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*numpy.cos(q[4, 0])*dq[3, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + 
(1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[3, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0])*dq[4, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0])), 0.707106781186548*numpy.sqrt(2)*(((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*dq[4, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.cos(q[4, 0])), 0.707106781186548*numpy.sqrt(2)*(-((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.sin(q[4, 0]) - (numpy.sin(q[0, 0] + 
(1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.cos(q[4, 0])), 0, 0], [0, -(numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] - (numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] - numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.cos(q[4, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[4, 0])*dq[4, 0] - numpy.sin(q[1, 0])*numpy.sin(q[4, 0])*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[1, 0], -numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0])*dq[1, 0] - numpy.sin(q[1, 0])*numpy.sin(q[4, 0])*numpy.cos(q[2, 0])*dq[1, 0] - numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[3, 0] - numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[4, 0] - numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[2, 0] + numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0])*dq[2, 0] + numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[4, 0])*dq[4, 0], -(numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*dq[4, 0] - (numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[2, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0])*numpy.cos(q[4, 0]), -(numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0])*dq[4, 0] - (numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.sin(q[4, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[4, 0])*dq[1, 0] - numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[4, 0] + numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[4, 0])*dq[2, 0], 0, 0]])
def jrx_5_dot(q, dq):
return numpy.array([[0.707106781186548*numpy.sqrt(2)*(((-(-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] - (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.sin(q[5, 0])), 0.707106781186548*numpy.sqrt(2)*(-((-numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) + numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) - numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[0, 0] + 
(1/4)*numpy.pi)*dq[5, 0] - (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[5, 0])*dq[5, 0] + ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0])*numpy.cos(q[4, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[4, 0] - numpy.sin(q[4, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[5, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[3, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[2, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0])*numpy.sin(q[5, 0])), 0.707106781186548*numpy.sqrt(2)*(((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]) + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*numpy.cos(q[5, 0])*dq[5, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[5, 0])*numpy.cos(q[3, 0])*dq[3, 0] + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*numpy.cos(q[4, 0])*dq[3, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0])*numpy.cos(q[3, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - 
numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[3, 0])*numpy.sin(q[5, 0])), 0.707106781186548*numpy.sqrt(2)*(((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[5, 0])*dq[4, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[4, 0])*dq[5, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.sin(q[5, 0]) - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0] + 
numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[3, 0])*numpy.cos(q[4, 0])*numpy.cos(q[5, 0])), 0.707106781186548*numpy.sqrt(2)*((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] - (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0])*dq[4, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.sin(q[4, 0]) + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[5, 0])), -0.707106781186548*numpy.sqrt(2)*((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0])*dq[5, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0])*dq[5, 0] - (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0])*dq[4, 0] + ((numpy.sin(q[0, 0] + 
(1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.cos(q[4, 0]) + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[3, 0])*numpy.cos(q[5, 0])), 0], [-0.707106781186548*numpy.sqrt(2)*((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0])*dq[4, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + 
(1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.cos(q[4, 0]) + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[3, 0])*numpy.sin(q[5, 0])), 0.707106781186548*numpy.sqrt(2)*(((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[5, 0])*dq[5, 0] - (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[5, 0])*dq[5, 0] + ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.cos(q[4, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[4, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[0, 0] + 
(1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.cos(q[5, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[3, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[2, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[0, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[5, 0])), 0.707106781186548*numpy.sqrt(2)*(((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*numpy.cos(q[5, 0])*dq[5, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[3, 0])*dq[3, 0] - (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*numpy.cos(q[4, 0])*dq[3, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[3, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0])*dq[4, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[5, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.sin(q[3, 0])*numpy.sin(q[5, 0])), 0.707106781186548*numpy.sqrt(2)*(((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + 
(1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[5, 0])*dq[4, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[4, 0])*dq[5, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.sin(q[5, 0]) - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.cos(q[4, 0])*numpy.cos(q[5, 0])), 0.707106781186548*numpy.sqrt(2)*((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] - (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 
0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[5, 0])), 0.707106781186548*numpy.sqrt(2)*(-(((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0])*dq[4, 0] - (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0]) - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 
0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.cos(q[5, 0])), 0], [0, -((numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] - (numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0])*dq[5, 0] - ((numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] - numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.cos(q[4, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[4, 0])*dq[4, 0] + numpy.sin(q[1, 0])*numpy.sin(q[4, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.cos(q[5, 0]) - (-numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0])*numpy.sin(q[5, 0]), -(numpy.sin(q[2, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]) + numpy.sin(q[4, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[1, 0])*dq[5, 0] - (numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[4, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[4, 0] + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[2, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0])*dq[2, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[4, 0])*dq[4, 0])*numpy.cos(q[5, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.sin(q[5, 0])*dq[1, 0] - numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[5, 0])*dq[5, 0] - numpy.sin(q[2, 0])*numpy.sin(q[5, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.sin(q[5, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0], -(numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0])*dq[5, 0] - (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[5, 0])*dq[4, 0] - (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[4, 0])*dq[5, 0] - 
(numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[2, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0])*numpy.cos(q[4, 0])*numpy.cos(q[5, 0]) - (numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.sin(q[5, 0]), ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] - ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.sin(q[4, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[4, 0])*dq[1, 0] + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[4, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[4, 0])*dq[2, 0])*numpy.cos(q[5, 0]), -((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[5, 0])*dq[5, 0] - (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[5, 0])*dq[5, 0] - (-(numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.cos(q[4, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*dq[1, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[4, 0] + numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[5, 0]) - (numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[2, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0])*numpy.cos(q[5, 0]), 0]])
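# jrx_6_dot(q, dq): symbolically generated time derivative of a Jacobian block
# (the "_6" in the name suggests frame/link index 6). Both q (joint positions)
# and dq (joint velocities) are consumed as column vectors indexed q[i, 0] and
# dq[i, 0]; the indices 0..6 used in the expressions imply 7x1 arrays.
#
# Illustrative call (a minimal sketch, not executed here; the 7x1 shapes are an
# assumption inferred from the q[0..6, 0] / dq[0..6, 0] indexing below):
#
#     q  = numpy.zeros((7, 1))    # joint positions [rad], 7x1 column vector
#     dq = numpy.zeros((7, 1))    # joint velocities [rad/s], 7x1 column vector
#     jdot = jrx_6_dot(q, dq)     # numpy array of Jacobian-derivative entries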
def jrx_6_dot(q, dq):
return numpy.array([[0.707106781186548*numpy.sqrt(2)*((-(-(-(-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) + ((-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] - (((-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0]))*numpy.cos(q[6, 0])*dq[6, 0] - ((-(-(-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + ((-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0])*dq[5, 0] - ((-(-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + 
(1/4)*numpy.pi)*dq[2, 0])*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) - (-(-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*dq[3, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0]) - (((-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (-(-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.sin(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.cos(q[4, 0]))*numpy.sin(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*((-((-numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) + numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) - numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[5, 0]) + (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[6, 0] + ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[6, 0])*dq[6, 0] + (((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[5, 0] - 
(numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[5, 0])*dq[5, 0] + (-(-numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) + numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[4, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0])*numpy.cos(q[4, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[4, 0] - numpy.sin(q[4, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[5, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[3, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[2, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0])*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0]) - (-(numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0])*numpy.sin(q[4, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[4, 0])*dq[1, 0] - numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[4, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[4, 0])*dq[2, 0])*numpy.sin(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*((((numpy.sin(q[0, 0] + 
(1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]) + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[5, 0]) - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0]) + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[6, 0])*dq[6, 0] - (-((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]) + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*numpy.cos(q[5, 0])*dq[5, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[5, 0])*numpy.cos(q[3, 0])*dq[3, 0] - (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*numpy.cos(q[4, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0])*numpy.cos(q[3, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[3, 0])*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0]) - (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + 
(1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*numpy.sin(q[4, 0])*dq[3, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[4, 0])*numpy.cos(q[3, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[4, 0]))*numpy.sin(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*(-(-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[6, 0])*dq[6, 0] - (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[6, 0])*numpy.cos(q[4, 0])*dq[4, 0] + (-(-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0])*numpy.cos(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] - (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*dq[3, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[1, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[3, 0])*numpy.sin(q[4, 0])*numpy.sin(q[6, 0]) - ((-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 
0])*numpy.cos(q[5, 0])*dq[4, 0] + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[4, 0])*dq[5, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.sin(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[3, 0])*numpy.cos(q[4, 0])*numpy.cos(q[5, 0]))*numpy.cos(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*(-(-((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[6, 0])*dq[5, 0] - (-((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0]))*numpy.sin(q[6, 0])*numpy.cos(q[5, 0])*dq[6, 0] - (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + 
(1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.cos(q[6, 0])*dq[6, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0])*dq[4, 0] - (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.cos(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[4, 0]))*numpy.sin(q[6, 0]) - (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0])*dq[4, 0] + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.sin(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.cos(q[4, 
0]))*numpy.cos(q[5, 0])*numpy.cos(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*(((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0]) - (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] - ((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + (-((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0])*dq[4, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 
0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[3, 0])*numpy.cos(q[5, 0]))*numpy.cos(q[6, 0])), -0.707106781186548*numpy.sqrt(2)*(((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0])*dq[6, 0] - (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0]))*numpy.sin(q[6, 0])*dq[6, 0] - ((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0])*dq[5, 0] - (-((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0])*dq[4, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.cos(q[0, 
0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.cos(q[4, 0]) - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[3, 0])*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0]) + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0])*dq[4, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.sin(q[4, 0]) + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[6, 0]))], [-0.707106781186548*numpy.sqrt(2)*(((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) - 
((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0]))*numpy.cos(q[6, 0])*dq[6, 0] + ((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0])*dq[4, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.cos(q[4, 0]) + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + 
(1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[3, 0])*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0]) + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0])*dq[4, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.sin(q[4, 0]) + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.cos(q[4, 0]))*numpy.sin(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*((((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[5, 0]) + (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[5, 0]))*numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[6, 0])*dq[6, 0] + ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0]))*numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[6, 0])*dq[6, 0] + (((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[5, 0])*dq[5, 0] - (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[5, 0])*dq[5, 0] + ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 
0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.cos(q[4, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[4, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.cos(q[5, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[3, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[2, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[0, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0]) + ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.sin(q[4, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[4, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[4, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[4, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[0, 0])*numpy.sin(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] - (-(numpy.sin(q[0, 0] 
+ (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[3, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0]))*numpy.cos(q[6, 0])*dq[6, 0] - (-((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*numpy.cos(q[5, 0])*dq[5, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[3, 0])*dq[3, 0] + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*numpy.cos(q[4, 0])*dq[3, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[3, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[5, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.sin(q[3, 0])*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0]) + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*numpy.sin(q[4, 0])*dq[3, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0])*dq[4, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + 
(1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*(((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[6, 0])*dq[6, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[6, 0])*numpy.cos(q[4, 0])*dq[4, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0])*numpy.cos(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.sin(q[4, 0])*numpy.sin(q[6, 0]) + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[5, 0])*dq[4, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[4, 0])*dq[5, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 
0]))*numpy.cos(q[5, 0])*dq[5, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.sin(q[5, 0]) - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.cos(q[4, 0])*numpy.cos(q[5, 0]))*numpy.cos(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[6, 0])*dq[5, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0]))*numpy.sin(q[6, 0])*numpy.cos(q[5, 0])*dq[6, 0] - (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]))*numpy.cos(q[6, 0])*dq[6, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 
0])*dq[4, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.sin(q[4, 0]))*numpy.sin(q[6, 0]) - (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[5, 0])*numpy.cos(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*(((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - 
numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] - ((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0])*dq[5, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0])*dq[5, 0] - (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0])*dq[4, 0] - (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.cos(q[5, 0]))*numpy.cos(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*(-((((numpy.sin(q[0, 0] + 
(1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0])*dq[6, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0]))*numpy.sin(q[6, 0])*dq[6, 0] + ((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) + ((numpy.sin(q[0, 0] + 
(1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0]) - (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[6, 0]))], [0, -(((numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) - (numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] - ((numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[6, 0])*dq[6, 0] - (((numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + (numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + ((numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 
0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] - numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.cos(q[4, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[4, 0])*dq[4, 0] + numpy.sin(q[1, 0])*numpy.sin(q[4, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.cos(q[5, 0]) + (-numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0])*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0]) - ((numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0])*dq[4, 0] - (numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] - numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.sin(q[4, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*dq[4, 0] + numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[4, 0])*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[1, 0])*numpy.sin(q[6, 0]), -((numpy.sin(q[2, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]) + numpy.sin(q[4, 0])*numpy.cos(q[2, 0]))*numpy.cos(q[5, 0]) - numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0])*numpy.cos(q[1, 0])*dq[6, 0] - (numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[3, 0]) - numpy.cos(q[2, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[1, 0])*numpy.cos(q[6, 0])*dq[6, 0] - ((numpy.sin(q[2, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]) + numpy.sin(q[4, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[1, 0])*dq[5, 0] + (numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[4, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[4, 0] + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[2, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0])*dq[2, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[4, 0])*dq[4, 0])*numpy.cos(q[5, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.sin(q[5, 0])*dq[1, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[5, 0])*dq[5, 0] + numpy.sin(q[2, 0])*numpy.sin(q[5, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.sin(q[5, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[6, 0]) - (-numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[4, 0])*dq[1, 0] - numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0])*dq[4, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[2, 0] + numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[2, 0] + 
numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[4, 0])*numpy.sin(q[6, 0]), ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0]) - (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0])*numpy.cos(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] - (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[6, 0])*dq[6, 0] - (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[6, 0])*numpy.cos(q[4, 0])*dq[4, 0] - ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[5, 0])*dq[4, 0] + (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[4, 0])*dq[5, 0] + (numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[2, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0])*numpy.cos(q[4, 0])*numpy.cos(q[5, 0]) + (numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0]) + (numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[2, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0])*numpy.sin(q[4, 0])*numpy.sin(q[6, 0]), ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[6, 0])*dq[5, 0] + ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0]))*numpy.sin(q[6, 0])*numpy.cos(q[5, 0])*dq[6, 0] - ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[6, 0])*dq[6, 0] - (-(numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.cos(q[4, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*dq[1, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[4, 0] + numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[6, 0]) - ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 
0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.sin(q[4, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[4, 0])*dq[1, 0] + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[4, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[4, 0])*dq[2, 0])*numpy.cos(q[5, 0])*numpy.cos(q[6, 0]), (((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[5, 0]) - (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.cos(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] - (((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + (-(numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.cos(q[4, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*dq[1, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[4, 0] + numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[5, 0]) + (numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[2, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0])*numpy.cos(q[5, 0]))*numpy.cos(q[6, 0]), -(((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[5, 0]) + (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0])*dq[6, 0] + ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0]))*numpy.sin(q[6, 0])*dq[6, 0] + (((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[5, 0])*dq[5, 0] - (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.cos(q[5, 0])*dq[5, 0] - (-(numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + 
numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.cos(q[4, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*dq[1, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[4, 0] + numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[5, 0]) + (numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[2, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0])*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0]) - ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.sin(q[4, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[4, 0])*dq[1, 0] + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[4, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[4, 0])*dq[2, 0])*numpy.cos(q[6, 0])]])
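# ---------------------------------------------------------------------------
# jrx_ee_dot(q, dq): time derivative of what appears to be one row block of
# the end-effector rotation Jacobian for a 7-joint arm (the "jrx"/"ee"/"dot"
# naming is the only hint; treat the kinematic interpretation as inferred).
# Joint positions q and joint velocities dq are indexed as 7x1 column
# vectors (q[i, 0], dq[i, 0], i = 0..6) throughout the generated expressions.
# The closed-form entries are machine-generated (SymPy-style numpy printing)
# and intentionally left unsimplified; for instance
# 0.707106781186548*numpy.sqrt(2) evaluates to 1.0, and
# q[0, 0] + (1/4)*numpy.pi encodes a fixed quarter-pi offset on joint 1.
#
# Minimal usage sketch (the 7x1 shapes are inferred from the indexing above,
# not stated explicitly in this file):
#
#     q = numpy.zeros((7, 1))    # joint positions [rad]
#     dq = numpy.zeros((7, 1))   # joint velocities [rad/s]
#     Jdot_rx = jrx_ee_dot(q, dq)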
def jrx_ee_dot(q, dq):
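    # Each entry of the returned array is linear in dq: it is the chain-rule
    # expansion sum_k (d J_rx / d q_k) * dq[k, 0], emitted fully expanded with
    # no common subexpressions factored out.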
return numpy.array([[0.707106781186548*numpy.sqrt(2)*((-(-(-(-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) + ((-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] - (((-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0]))*numpy.cos(q[6, 0])*dq[6, 0] - ((-(-(-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + ((-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0])*dq[5, 0] - ((-(-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + 
(1/4)*numpy.pi)*dq[2, 0])*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) - (-(-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*dq[3, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0]) - (((-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (-(-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.sin(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.cos(q[4, 0]))*numpy.sin(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*((-((-numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) + numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) - numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[5, 0]) + (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[6, 0] + ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[6, 0])*dq[6, 0] + (((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[5, 0] - 
(numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[5, 0])*dq[5, 0] + (-(-numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) + numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[4, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0])*numpy.cos(q[4, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[4, 0] - numpy.sin(q[4, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[5, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[3, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[2, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0])*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0]) - (-(numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0])*numpy.sin(q[4, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[4, 0])*dq[1, 0] - numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[4, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[4, 0])*dq[2, 0])*numpy.sin(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*((((numpy.sin(q[0, 0] + 
(1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]) + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[5, 0]) - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0]) + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[6, 0])*dq[6, 0] - (-((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]) + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*numpy.cos(q[5, 0])*dq[5, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[5, 0])*numpy.cos(q[3, 0])*dq[3, 0] - (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*numpy.cos(q[4, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0])*numpy.cos(q[3, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[3, 0])*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0]) - (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + 
(1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*numpy.sin(q[4, 0])*dq[3, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[4, 0])*numpy.cos(q[3, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[4, 0]))*numpy.sin(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*(-(-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[6, 0])*dq[6, 0] - (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[6, 0])*numpy.cos(q[4, 0])*dq[4, 0] + (-(-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0])*numpy.cos(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] - (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*dq[3, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[1, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[3, 0])*numpy.sin(q[4, 0])*numpy.sin(q[6, 0]) - ((-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 
0])*numpy.cos(q[5, 0])*dq[4, 0] + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[4, 0])*dq[5, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.sin(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[3, 0])*numpy.cos(q[4, 0])*numpy.cos(q[5, 0]))*numpy.cos(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*(-(-((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[6, 0])*dq[5, 0] - (-((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0]))*numpy.sin(q[6, 0])*numpy.cos(q[5, 0])*dq[6, 0] - (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + 
(1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.cos(q[6, 0])*dq[6, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0])*dq[4, 0] - (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.cos(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[4, 0]))*numpy.sin(q[6, 0]) - (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0])*dq[4, 0] + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] - numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.sin(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.cos(q[4, 
0]))*numpy.cos(q[5, 0])*numpy.cos(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*(((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0]) - (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] - ((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + (-((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0])*dq[4, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 
0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[3, 0])*numpy.cos(q[5, 0]))*numpy.cos(q[6, 0])), -0.707106781186548*numpy.sqrt(2)*(((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0])*dq[6, 0] - (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0]))*numpy.sin(q[6, 0])*dq[6, 0] - ((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0])*dq[5, 0] - (-((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0])*dq[4, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.cos(q[0, 
0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.cos(q[4, 0]) - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[3, 0])*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0]) + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0])*dq[4, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.sin(q[4, 0]) + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[6, 0]))], [-0.707106781186548*numpy.sqrt(2)*(((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) - 
((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0]))*numpy.cos(q[6, 0])*dq[6, 0] + ((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0]) - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0])*dq[4, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.cos(q[4, 0]) + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + 
(1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[3, 0])*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0]) + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0])*dq[4, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0]) + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.sin(q[4, 0]) + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[1, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0])*numpy.cos(q[4, 0]))*numpy.sin(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*((((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[5, 0]) + (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[5, 0]))*numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[6, 0])*dq[6, 0] + ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0]))*numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[6, 0])*dq[6, 0] + (((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[5, 0])*dq[5, 0] - (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[5, 0])*dq[5, 0] + ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 
0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.cos(q[4, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[4, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.cos(q[5, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[3, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[2, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[3, 0])*dq[0, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[0, 0])*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0]) + ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.sin(q[4, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[4, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[4, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[4, 0])*dq[2, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[0, 0])*numpy.sin(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] - (-(numpy.sin(q[0, 0] 
+ (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[3, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0]))*numpy.cos(q[6, 0])*dq[6, 0] - (-((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*numpy.cos(q[5, 0])*dq[5, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[3, 0])*dq[3, 0] + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*numpy.cos(q[4, 0])*dq[3, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[3, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] - numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[5, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.sin(q[3, 0])*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0]) + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[3, 0])*numpy.sin(q[4, 0])*dq[3, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[3, 0])*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[4, 0])*dq[4, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + 
(1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*(((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[6, 0])*dq[6, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[6, 0])*numpy.cos(q[4, 0])*dq[4, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0])*numpy.cos(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.sin(q[4, 0])*numpy.sin(q[6, 0]) + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[5, 0])*dq[4, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[4, 0])*dq[5, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 
0]))*numpy.cos(q[5, 0])*dq[5, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.sin(q[5, 0]) - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.cos(q[4, 0])*numpy.cos(q[5, 0]))*numpy.cos(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[6, 0])*dq[5, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0]))*numpy.sin(q[6, 0])*numpy.cos(q[5, 0])*dq[6, 0] - (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]))*numpy.cos(q[6, 0])*dq[6, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 
0])*dq[4, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.sin(q[4, 0]))*numpy.sin(q[6, 0]) - (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[5, 0])*numpy.cos(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*(((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - 
numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] - ((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0])*dq[5, 0] - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0])*dq[5, 0] - (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0])*dq[4, 0] - (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0]) + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.cos(q[5, 0]))*numpy.cos(q[6, 0])), 0.707106781186548*numpy.sqrt(2)*(-((((numpy.sin(q[0, 0] + 
(1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) - ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0])*dq[6, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0]))*numpy.sin(q[6, 0])*dq[6, 0] + ((((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + ((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] - (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.cos(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) + ((numpy.sin(q[0, 0] + 
(1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[0, 0])*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0]) - (((numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.cos(q[3, 0]) + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0])*dq[4, 0] - (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0]) + numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (-(numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi))*numpy.sin(q[3, 0])*dq[3, 0] + (-numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*dq[0, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] - numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[3, 0]) - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[1, 0])*dq[0, 0])*numpy.sin(q[4, 0]) + (numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*dq[1, 0] - numpy.sin(q[0, 0] + (1/4)*numpy.pi)*numpy.cos(q[2, 0])*dq[0, 0] + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[0, 0] - numpy.sin(q[2, 0])*numpy.cos(q[0, 0] + (1/4)*numpy.pi)*dq[2, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[6, 0]))], [0, -(((numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0]))*numpy.cos(q[5, 0]) - (numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] - ((numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[4, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[6, 0])*dq[6, 0] - (((numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + (numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + ((numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 
0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] - numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.cos(q[4, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[4, 0])*dq[4, 0] + numpy.sin(q[1, 0])*numpy.sin(q[4, 0])*numpy.cos(q[2, 0])*dq[2, 0] + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.cos(q[5, 0]) + (-numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[3, 0])*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0]) - ((numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[4, 0])*dq[4, 0] - (numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[1, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] - numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0])*numpy.sin(q[4, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*dq[4, 0] + numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[4, 0])*dq[2, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[1, 0])*numpy.sin(q[6, 0]), -((numpy.sin(q[2, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]) + numpy.sin(q[4, 0])*numpy.cos(q[2, 0]))*numpy.cos(q[5, 0]) - numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0])*numpy.cos(q[1, 0])*dq[6, 0] - (numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[3, 0]) - numpy.cos(q[2, 0])*numpy.cos(q[4, 0]))*numpy.cos(q[1, 0])*numpy.cos(q[6, 0])*dq[6, 0] - ((numpy.sin(q[2, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0]) + numpy.sin(q[4, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[1, 0])*dq[5, 0] + (numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[4, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[4, 0] + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[2, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0])*dq[2, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[4, 0])*dq[4, 0])*numpy.cos(q[5, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.sin(q[5, 0])*dq[1, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[5, 0])*dq[5, 0] + numpy.sin(q[2, 0])*numpy.sin(q[5, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.sin(q[5, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[6, 0]) - (-numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[4, 0])*dq[1, 0] - numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*numpy.cos(q[4, 0])*dq[4, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[2, 0] + numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[2, 0] + 
numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[4, 0])*numpy.sin(q[6, 0]), ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[5, 0]) - (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.cos(q[4, 0])*numpy.cos(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] - (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[6, 0])*dq[6, 0] - (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[6, 0])*numpy.cos(q[4, 0])*dq[4, 0] - ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[4, 0])*numpy.cos(q[5, 0])*dq[4, 0] + (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[4, 0])*dq[5, 0] + (numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[2, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0])*numpy.cos(q[4, 0])*numpy.cos(q[5, 0]) + (numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0]) + (numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[2, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0])*numpy.sin(q[4, 0])*numpy.sin(q[6, 0]), ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0]))*numpy.sin(q[5, 0])*numpy.cos(q[6, 0])*dq[5, 0] + ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0]))*numpy.sin(q[6, 0])*numpy.cos(q[5, 0])*dq[6, 0] - ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[6, 0])*dq[6, 0] - (-(numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.cos(q[4, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*dq[1, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[4, 0] + numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[6, 0]) - ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 
0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.sin(q[4, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[4, 0])*dq[1, 0] + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[4, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[4, 0])*dq[2, 0])*numpy.cos(q[5, 0])*numpy.cos(q[6, 0]), (((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[5, 0]) - (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.cos(q[5, 0]))*numpy.sin(q[6, 0])*dq[6, 0] - (((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[5, 0])*dq[5, 0] + (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[5, 0])*dq[5, 0] + (-(numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.cos(q[4, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*dq[1, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[4, 0] + numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0])*numpy.sin(q[5, 0]) + (numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[2, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0])*numpy.cos(q[5, 0]))*numpy.cos(q[6, 0]), -(((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.cos(q[5, 0]) + (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.sin(q[5, 0]))*numpy.cos(q[6, 0])*dq[6, 0] + ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0]) - numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0]))*numpy.sin(q[6, 0])*dq[6, 0] + (((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0]) + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0]))*numpy.sin(q[5, 0])*dq[5, 0] - (numpy.sin(q[1, 0])*numpy.cos(q[3, 0]) + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0]))*numpy.cos(q[5, 0])*dq[5, 0] - (-(numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.sin(q[4, 0])*dq[4, 0] + (numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + 
numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.cos(q[4, 0]) - numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*dq[1, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[4, 0])*dq[4, 0] + numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[2, 0])*numpy.cos(q[5, 0]) + (numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*numpy.cos(q[2, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.sin(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[2, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[3, 0] - numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[1, 0])*numpy.sin(q[5, 0]))*numpy.sin(q[6, 0]) - ((numpy.sin(q[1, 0])*numpy.sin(q[3, 0]) - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0]))*numpy.cos(q[4, 0])*dq[4, 0] + (numpy.sin(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[3, 0])*dq[1, 0] + numpy.sin(q[1, 0])*numpy.cos(q[3, 0])*dq[3, 0] + numpy.sin(q[2, 0])*numpy.cos(q[1, 0])*numpy.cos(q[3, 0])*dq[2, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*dq[3, 0] + numpy.sin(q[3, 0])*numpy.cos(q[1, 0])*dq[1, 0])*numpy.sin(q[4, 0]) + numpy.sin(q[1, 0])*numpy.sin(q[2, 0])*numpy.cos(q[4, 0])*dq[1, 0] + numpy.sin(q[2, 0])*numpy.sin(q[4, 0])*numpy.cos(q[1, 0])*dq[4, 0] - numpy.cos(q[1, 0])*numpy.cos(q[2, 0])*numpy.cos(q[4, 0])*dq[2, 0])*numpy.cos(q[6, 0])]])
| 8,924.538462 | 82,893 | 0.542907 | 58,840 | 232,038 | 2.140636 | 0.000561 | 0.32121 | 0.314612 | 0.272637 | 0.998976 | 0.998928 | 0.998865 | 0.998761 | 0.998682 | 0.998515 | 0 | 0.127722 | 0.088654 | 232,038 | 25 | 82,894 | 9,281.52 | 0.467903 | 0 | 0 | 0.16 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0.2 | 0.4 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 15 |
6795689ac6eb59acadb89f57aaaee6eaa14beed7 | 5,326 | py | Python | test/testifs.py | mvz/vb2py | 6ea046f6fc202527a1b3fcd3ef5a67b969dea715 | [
"BSD-3-Clause"
] | 2 | 2015-12-01T10:52:36.000Z | 2021-04-20T05:15:01.000Z | test/testifs.py | mvz/vb2py | 6ea046f6fc202527a1b3fcd3ef5a67b969dea715 | [
"BSD-3-Clause"
] | 4 | 2016-07-18T18:28:24.000Z | 2016-07-19T08:30:14.000Z | test/testifs.py | mvz/vb2py | 6ea046f6fc202527a1b3fcd3ef5a67b969dea715 | [
"BSD-3-Clause"
] | 3 | 2015-07-15T21:08:19.000Z | 2021-02-25T09:39:12.000Z | from testframework import *
# << If tests >> (1 of 7)
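# Each entry below is a (vb_source, expected_namespace) pair: the VB fragment
# is translated to Python and executed, and the resulting variables are
# checked against the expected dict (convention inferred from the
# testframework helpers used at the bottom of this file).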
# Test main branch of If
tests.append(
("""a = 10
b = 0
If a = 10 Then
b = 1
End If
""",
{"a" : 10, "b" : 1}
))
# Test else branch of If
tests.append(
("""a = 20
b = 0
If a = 10 Then
b = 1
End If
""",
{"a" : 20, "b" : 0}
))
# Test main branch of If with not
tests.append(
("""a = 10
b = 0
If Not a = 10 Then
b = 1
End If
""",
{"a" : 10, "b" : 0}
))
tests.append(
("""a = 11
b = 0
If Not a = 10 Then
b = 1
End If
""",
{"a" : 11, "b" : 1}
))
# This test with the redundant parentheses used to fail
tests.append(
("""a = 11
b = 0
If (Not a = 10) Then
b = 1
End If
""",
{"a" : 11, "b" : 1}
))
# << If tests >> (2 of 7)
# Test main branch of If
tests.append(
("""a = 10
If a = 10 Then
b = 1
Else
b = 0
End If
""",
{"a" : 10, "b" : 1}
))
# Test else branch of If
tests.append(
("""a = 20
If a = 10 Then
b = 1
Else
b = 0
End If
""",
{"a" : 20, "b" : 0}
))
# << If tests >> (3 of 7)
# Test main branch of If
tests.append(
("""a = 10
If a = 10 Then
b = 1
ElseIf a = 20 Then
b = 2
Else
b = 0
End If
""",
{"a" : 10, "b" : 1}
))
# Test elseif branch of If
tests.append(
("""a = 20
If a = 10 Then
b = 1
ElseIf a = 20 Then
b = 2
Else
b = 0
End If
""",
{"a" : 20, "b" : 2}
))
# Test else branch of If
tests.append(
("""a = 30
If a = 10 Then
b = 1
ElseIf a = 20 Then
b = 2
Else
b = 0
End If
""",
{"a" : 30, "b" : 0}
))
# << If tests >> (4 of 7)
# Test main branch of If
tests.append(
("""a = 10
b = 0
c = 20
If a = 10 Then
If c = 20 Then
b = 1
End If
End If
""",
{"a" : 10, "b" : 1, "c" : 20}
))
# Test else branch of If
tests.append(
("""a = 10
b = 0
c = 20
If a = 10 Then
If c = 30 Then
b = 1
End If
End If
""",
{"a" : 10, "b" : 0, "c" : 20}
))
# << If tests >> (5 of 7)
# Test main branch of If
tests.append(
("""a = 10
b = 0
c = 20
If a = 10 Then
If c = 20 Then
b = 1
Else
b = 2
End If
Else
b = 3
End If
""",
{"a" : 10, "b" : 1, "c" : 20}
))
# Test else branch of If
tests.append(
("""a = 10
b = 0
c = 20
If a = 10 Then
If c = 25 Then
b = 1
Else
b = 2
End If
Else
b = 3
End If
""",
{"a" : 10, "b" : 2, "c" : 20}
))
tests.append(
("""a = 10
b = 0
c = 20
If a = 15 Then
If c = 25 Then
b = 1
Else
b = 2
End If
Else
b = 3
End If
""",
{"a" : 10, "b" : 3, "c" : 20}
))
# << If tests >> (6 of 7)
# Test main branch of If
tests.append(
("""a = 10
b = 0
c = 20
If a = 10 Then
If c = 20 Then
b = 1
ElseIf c = 30 Then
b = 4
Else
b = 2
End If
ElseIf a = 15 Then
b = 5
Else
b = 3
End If
""",
{"a" : 10, "b" : 1, "c" : 20}
))
# Test else branch of If
tests.append(
("""a = 10
b = 0
c = 30
If a = 10 Then
If c = 20 Then
b = 1
ElseIf c = 30 Then
b = 4
Else
b = 2
End If
ElseIf a = 15 Then
b = 5
Else
b = 3
End If
""",
{"a" : 10, "b" : 4, "c" : 30}
))
# Test else branch of If
tests.append(
("""a = 15
b = 0
c = 30
If a = 10 Then
If c = 20 Then
b = 1
ElseIf c = 30 Then
b = 4
Else
b = 2
End If
ElseIf a = 15 Then
b = 5
Else
b = 3
End If
""",
{"a" : 15, "b" : 5, "c" : 30}
))
# << If tests >> (7 of 7)
# Lots of inline ifs
tests.extend([
("a = 0\nIf 1 < 2 Then a = 10", {"a" : 10,}),
("a = 0\nIf 2 < 1 Then a = 10", {"a" : 0,}),
("If 1 < 2 Then a = 10 Else a = 20", {"a" : 10,}),
("If 1 > 2 Then a = 10 Else a = 20", {"a" : 20,}),
])
# Bug #810401: Python if statements may be missing a body
tests.append((
"""
a = 0
If 1 < 2 Then Resume Next
a = 10
""", {"a" : 10,}))
# -- end -- << If tests >>
import vb2py.vbparser
vb2py.vbparser.log.setLevel(0) # Don't print all logging stuff
TestClass = addTestsTo(BasicTest, tests)
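# addTestsTo presumably generates one unittest method per (source, expected)
# tuple on BasicTest; both helpers come from testframework and are not shown here.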
if __name__ == "__main__":
main()
| 18.177474 | 62 | 0.338152 | 723 | 5,326 | 2.479945 | 0.095436 | 0.078639 | 0.069716 | 0.117122 | 0.769102 | 0.751813 | 0.741774 | 0.730619 | 0.693809 | 0.693809 | 0 | 0.122417 | 0.5276 | 5,326 | 292 | 63 | 18.239726 | 0.590223 | 0.13237 | 0 | 0.890688 | 0 | 0 | 0.686098 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.008097 | 0 | 0.008097 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
67d4052733885cf66eadb40caa2e8856a39982a1 | 34,563 | py | Python | tests/tibanna/unicorn/test_ec2_utils.py | nhartwic/tibanna | 889490e5895c6c3e081b65c54573903e8c0daa53 | [
"MIT"
] | null | null | null | tests/tibanna/unicorn/test_ec2_utils.py | nhartwic/tibanna | 889490e5895c6c3e081b65c54573903e8c0daa53 | [
"MIT"
] | null | null | null | tests/tibanna/unicorn/test_ec2_utils.py | nhartwic/tibanna | 889490e5895c6c3e081b65c54573903e8c0daa53 | [
"MIT"
] | null | null | null | from tibanna.ec2_utils import (
UnicornInput,
Args,
Config,
Execution,
upload_workflow_to_s3,
get_file_size
)
from tibanna.utils import create_jobid
from tibanna.exceptions import (
MissingFieldInInputJsonException,
MalFormattedInputJsonException,
EC2InstanceLimitException,
EC2InstanceLimitWaitException
)
import boto3
import pytest
def fun():
raise Exception("InstanceLimitExceeded")
def test_args():
input_dict = {'args': {'input_files': {}, 'output_S3_bucket': 'somebucket', 'app_name': 'someapp'}}
args = Args(**input_dict['args'])
args_dict = args.as_dict()
assert 'input_files' in args_dict
assert 'app_name' in args_dict
assert args_dict['app_name'] == 'someapp'
def test_args_missing_field():
input_dict = {'args': {'input_files': {}, 'app_name': 'someapp'}}
with pytest.raises(MissingFieldInInputJsonException) as ex:
Args(**input_dict['args'])
assert ex
assert 'output_S3_bucket' in str(ex.value)
def test_args_parse_input_files():
input_dict = {'args': {'input_files': {"file1": "s3://somebucket/somekey"},
'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'main.cwl',
'cwl_directory_url': 'someurl',
'app_name': 'someapp'}}
args = Args(**input_dict['args'])
args.fill_default()
assert hasattr(args, 'input_files')
assert 'file1' in args.input_files
assert 'bucket_name' in args.input_files['file1']
assert 'object_key' in args.input_files['file1']
assert args.input_files['file1']['bucket_name'] == 'somebucket'
assert args.input_files['file1']['object_key'] == 'somekey'
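    # Parsing convention shown above: 's3://somebucket/somekey' splits into
    # bucket_name='somebucket' and object_key='somekey'; the next two tests
    # show that nested lists of URLs keep their shape.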
def test_args_parse_input_files2():
input_dict = {'args': {'input_files': {"file1": [["s3://somebucket/somekey1",
"s3://somebucket/somekey2"],
["s3://somebucket/somekey3",
"s3://somebucket/somekey4"]]},
'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'main.cwl',
'cwl_directory_url': 'someurl',
'app_name': 'someapp'}}
args = Args(**input_dict['args'])
args.fill_default()
assert hasattr(args, 'input_files')
assert 'file1' in args.input_files
assert 'bucket_name' in args.input_files['file1']
assert 'object_key' in args.input_files['file1']
assert args.input_files['file1']['bucket_name'] == 'somebucket'
assert isinstance(args.input_files['file1']['object_key'], list)
assert len(args.input_files['file1']['object_key']) == 2
assert isinstance(args.input_files['file1']['object_key'][0], list)
assert len(args.input_files['file1']['object_key'][0]) == 2
assert isinstance(args.input_files['file1']['object_key'][1], list)
assert len(args.input_files['file1']['object_key'][1]) == 2
assert args.input_files['file1']['object_key'][0][0] == 'somekey1'
assert args.input_files['file1']['object_key'][0][1] == 'somekey2'
assert args.input_files['file1']['object_key'][1][0] == 'somekey3'
assert args.input_files['file1']['object_key'][1][1] == 'somekey4'
def test_args_parse_input_files3():
input_dict = {'args': {'input_files': {"file1": ["s3://somebucket/somekey1",
"s3://somebucket/somekey2"]},
'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'main.cwl',
'cwl_directory_url': 'someurl',
'app_name': 'someapp'}}
args = Args(**input_dict['args'])
args.fill_default()
assert hasattr(args, 'input_files')
assert 'file1' in args.input_files
assert 'bucket_name' in args.input_files['file1']
assert 'object_key' in args.input_files['file1']
assert args.input_files['file1']['bucket_name'] == 'somebucket'
assert isinstance(args.input_files['file1']['object_key'], list)
assert len(args.input_files['file1']['object_key']) == 2
assert args.input_files['file1']['object_key'][0] == 'somekey1'
assert args.input_files['file1']['object_key'][1] == 'somekey2'
def test_args_parse_input_files_format_error():
input_dict = {'args': {'input_files': {"file1": "somerandomstr"},
'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'main.cwl',
'cwl_directory_url': 'someurl',
'app_name': 'someapp'}}
args = Args(**input_dict['args'])
with pytest.raises(MalFormattedInputJsonException) as ex:
args.fill_default()
assert ex
assert 'S3 url must begin with' in str(ex.value)
def test_args_parse_input_files_format_error2():
input_dict = {'args': {'input_files': {"file1": ["s3://somebucket/somekey1",
"s3://otherbucket/somekey2"]},
'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'main.cwl',
'cwl_directory_url': 'someurl',
'app_name': 'someapp'}}
args = Args(**input_dict['args'])
with pytest.raises(MalFormattedInputJsonException) as ex:
args.fill_default()
assert ex
assert 'bucket' in str(ex.value)
def test_args_input_files_w_mount():
input_dict = {'args': {'input_files': {
"file1": {"bucket_name": "a", "object_key": "b", "mount": True}
},
'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'main.cwl',
'cwl_directory_url': 'someurl',
'app_name': 'someapp'}}
args = Args(**input_dict['args'])
assert args.input_files['file1']['mount']
def test_parse_command():
input_dict = {'args': {'command': ['command1', 'command2', 'command3'],
'output_S3_bucket': 'somebucket',
'language': 'shell',
'container_image': 'someimage',
'app_name': 'someapp'}}
args = Args(**input_dict['args'])
args.fill_default()
assert args.command == 'command1; command2; command3'
def test_config():
input_dict = {'config': {'log_bucket': 'tibanna-output', 'shutdown_min': 30}}
cfg = Config(**input_dict['config'])
cfg_dict = cfg.as_dict()
assert 'log_bucket' in cfg_dict
assert 'shutdown_min' in cfg_dict
assert cfg_dict['shutdown_min'] == 30
def test_config2():
input_dict = {'config': {'log_bucket': 'tibanna-output'}}
cfg = Config(**input_dict['config'])
cfg.fill_default()
cfg_dict = cfg.as_dict()
assert 'log_bucket' in cfg_dict
assert 'shutdown_min' in cfg_dict
assert 'root_ebs_size' in cfg_dict
assert cfg_dict['shutdown_min'] == 'now'
assert cfg_dict['root_ebs_size'] == 8
def test_config_root_ebs_size():
input_dict = {'config': {'log_bucket': 'tibanna-output', 'root_ebs_size': 20}}
cfg = Config(**input_dict['config'])
cfg.fill_default()
cfg_dict = cfg.as_dict()
assert 'log_bucket' in cfg_dict
assert cfg_dict['root_ebs_size'] == 20
def test_unicorn_input():
input_dict = {'args': {'input_files': {}, 'app_name': 'bwa-mem',
'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'main.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': 'tibanna-output', 'shutdown_min': 30}}
unicorn_input = UnicornInput(input_dict)
unicorn_dict = unicorn_input.as_dict()
print(unicorn_dict)
assert 'args' in unicorn_dict
assert 'config' in unicorn_dict
assert 'jobid' in unicorn_dict # should be created
def test_unicorn_input2():
"""instance_type is provided but not app_name, which should be fine.
ebs_size is not provided (no benchmarking) so default value (10) is entered
also testing non-conventional fields
language is wdl this time"""
input_dict = {'args': {'input_files': {}, 'language': 'wdl',
'output_S3_bucket': 'somebucket',
'wdl_main_filename': 'main.wdl',
'wdl_directory_url': 'someurl'},
'config': {'log_bucket': 'tibanna-output', 'instance_type': 't2.nano'},
'_tibanna': {}}
unicorn_input = UnicornInput(input_dict)
unicorn_dict = unicorn_input.as_dict()
print(unicorn_dict)
assert 'args' in unicorn_dict
assert 'config' in unicorn_dict
assert 'ebs_size' in unicorn_dict['config']
assert unicorn_dict['config']['ebs_size'] == 10
def test_execution_mem_cpu():
"""mem and cpu are provided but not app_name or instance_type,
which should be fine.
language is snakemake this time"""
input_dict = {'args': {'input_files': {}, 'language': 'snakemake',
'output_S3_bucket': 'somebucket',
'snakemake_main_filename': 'Snakefile',
'snakemake_directory_url': 'someurl',
'command': 'snakemake',
'container_image': 'quay.io/snakemake/snakemake'},
'config': {'log_bucket': 'tibanna-output', 'mem': 1, 'cpu': 1}}
execution = Execution(input_dict)
unicorn_dict = execution.input_dict
assert len(execution.instance_type_list) == 10
assert 'args' in unicorn_dict
assert 'config' in unicorn_dict
assert 'instance_type' in unicorn_dict['config']
assert unicorn_dict['config']['instance_type'] == 't3.micro'
def test_execution_benchmark():
randomstr = 'test-' + create_jobid()
s3 = boto3.client('s3')
s3.put_object(Body='haha'.encode('utf-8'),
Bucket='tibanna-output', Key=randomstr)
input_dict = {'args': {'input_files': {'input_file': {'bucket_name': 'tibanna-output',
'object_key': randomstr}},
'output_S3_bucket': 'somebucket',
'app_name': 'md5',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': 'tibanna-output'}}
execution = Execution(input_dict)
unicorn_dict = execution.input_dict
print(unicorn_dict)
assert 'args' in unicorn_dict
assert 'config' in unicorn_dict
assert 'instance_type' in unicorn_dict['config']
assert unicorn_dict['config']['instance_type'] == 't3.micro'
assert unicorn_dict['config']['ebs_size'] == 10
# cleanup afterwards
s3.delete_objects(Bucket='tibanna-output',
Delete={'Objects': [{'Key': randomstr}]})
def test_get_file_size():
randomstr = 'test-' + create_jobid()
s3 = boto3.client('s3')
s3.put_object(Body='haha'.encode('utf-8'),
Bucket='tibanna-output', Key=randomstr)
size = get_file_size(randomstr, 'tibanna-output')
assert size == 4
# cleanup afterwards
s3.delete_objects(Bucket='tibanna-output',
Delete={'Objects': [{'Key': randomstr}]})
def test_get_file_size2():
randomstr = 'test-' + create_jobid()
s3 = boto3.client('s3')
s3.put_object(Body='haha'.encode('utf-8'),
Bucket='tibanna-output', Key=randomstr + '/1')
s3.put_object(Body='haha'.encode('utf-8'),
Bucket='tibanna-output', Key=randomstr + '/2')
size = get_file_size(randomstr, 'tibanna-output')
assert size == 8
# cleanup afterwards
s3.delete_objects(Bucket='tibanna-output',
Delete={'Objects': [{'Key': randomstr + '/1'},
{'Key': randomstr + '/2'}]})
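# With two 4-byte objects under the same key prefix summing to 8, get_file_size
# evidently treats its first argument as a prefix and totals all matching objects.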
def test_get_input_size_in_bytes():
randomstr = 'test-' + create_jobid()
s3 = boto3.client('s3')
s3.put_object(Body='haha'.encode('utf-8'),
Bucket='tibanna-output', Key=randomstr)
input_dict = {'args': {'input_files': {'input_file': {'bucket_name': 'tibanna-output',
'object_key': randomstr}},
'output_S3_bucket': 'somebucket',
'app_name': 'md5',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': 'tibanna-output'}}
execution = Execution(input_dict)
execution.input_size_in_bytes = execution.get_input_size_in_bytes()
assert execution.total_input_size_in_gb > 3E-9
assert execution.total_input_size_in_gb < 4E-9
# cleanup afterwards
s3.delete_objects(Bucket='tibanna-output',
Delete={'Objects': [{'Key': randomstr}]})
def test_get_input_size_in_bytes_with_secondary_files():
randomstr, randomstr_1, randomstr_2 = 'test-' + create_jobid(), 'test-' + create_jobid(), 'test-' + create_jobid()
s3 = boto3.client('s3')
s3.put_object(Body='haha'.encode('utf-8'),
Bucket='tibanna-output', Key=randomstr)
s3.put_object(Body='fooooooo'.encode('utf-8'),
Bucket='tibanna-output', Key=randomstr_1)
s3.put_object(Body='pippo'.encode('utf-8'),
Bucket='tibanna-output', Key=randomstr_2)
input_dict = {'args': {'input_files': {'input_file': {'bucket_name': 'tibanna-output',
'object_key': randomstr}},
'secondary_files': {'input_file': {'bucket_name': 'tibanna-output',
'object_key': [randomstr_1, randomstr_2]}},
'output_S3_bucket': 'somebucket',
'app_name': 'md5',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': 'tibanna-output'}}
execution = Execution(input_dict)
execution.input_size_in_bytes = execution.get_input_size_in_bytes()
assert execution.total_input_size_in_gb == 1.5832483768463135E-8
# cleanup afterwards
s3.delete_objects(Bucket='tibanna-output',
Delete={'Objects': [{'Key': randomstr}, {'Key': randomstr_1}, {'Key': randomstr_2}]})
def test_update_config_ebs_size():
"""ebs_size is given as the 'x' format. The total estimated ebs_size is smaller than 10"""
randomstr = 'test-' + create_jobid()
s3 = boto3.client('s3')
s3.put_object(Body='haha'.encode('utf-8'),
Bucket='tibanna-output', Key=randomstr)
input_dict = {'args': {'input_files': {'input_file': {'bucket_name': 'tibanna-output',
'object_key': randomstr}},
'output_S3_bucket': 'somebucket',
'app_name': 'md5',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': 'tibanna-output', 'ebs_size': '5.5x'}}
execution = Execution(input_dict)
execution.input_size_in_bytes = execution.get_input_size_in_bytes()
execution.update_config_ebs_size()
assert execution.cfg.ebs_size == 10
# cleanup afterwards
s3.delete_objects(Bucket='tibanna-output',
Delete={'Objects': [{'Key': randomstr}]})
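# The 'x' suffix appears to scale the total input size: 4 bytes is roughly
# 3.7e-9 GB, so 5.5x stays under the 10 GB floor above, while the
# 5000000000x multiplier in the next test lands at ceil(5e9 * 3.7e-9) = 19 GB
# (behavior inferred from these assertions, not from the Execution source).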
def test_update_config_ebs_size2():
"""ebs_size is given as the 'x' format. The total estimated ebs_size is larger than 10"""
randomstr = 'test-' + create_jobid()
s3 = boto3.client('s3')
s3.put_object(Body='haha'.encode('utf-8'),
Bucket='tibanna-output', Key=randomstr)
input_dict = {'args': {'input_files': {'input_file': {'bucket_name': 'tibanna-output',
'object_key': randomstr}},
'output_S3_bucket': 'somebucket',
'app_name': 'md5',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': 'tibanna-output', 'ebs_size': '5000000000x'}}
execution = Execution(input_dict)
execution.input_size_in_bytes = execution.get_input_size_in_bytes()
execution.update_config_ebs_size()
assert execution.cfg.ebs_size == 19
# cleanup afterwards
s3.delete_objects(Bucket='tibanna-output',
Delete={'Objects': [{'Key': randomstr}]})
def test_unicorn_input_missing_field():
"""app_name that doesn't exist in benchmark, without instance type, mem, cpu info"""
input_dict = {'args': {'input_files': {}, 'app_name': 'app_name_not_in_benchmark',
'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'main.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': 'tibanna-output', 'shutdown_min': 30}}
with pytest.raises(MissingFieldInInputJsonException) as ex:
UnicornInput(input_dict)
assert ex
assert 'app_name' in str(ex.value)
def test_unicorn_input_missing_field2():
"""no app_name without instance type, mem, cpu info"""
input_dict = {'args': {'input_files': {},
'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'main.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': 'tibanna-output', 'shutdown_min': 30}}
with pytest.raises(MissingFieldInInputJsonException) as ex:
UnicornInput(input_dict)
assert ex
assert 'app_name' in str(ex.value)
def test_unicorn_input_missing_field3():
"""cwl_main_filename missing for cwl workflow
(language is not specified which means it is cwl)
"""
input_dict = {'args': {'input_files': {}, 'app_name': 'bwa-mem',
'output_S3_bucket': 'somebucket',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': 'tibanna-output', 'shutdown_min': 30}}
with pytest.raises(MissingFieldInInputJsonException) as ex:
UnicornInput(input_dict)
assert ex
assert 'cwl_main_filename' in str(ex.value)
def test_unicorn_input_missing_field4():
"""neither cwl_directory_url nor cwl_directory_local is provided"""
input_dict = {'args': {'input_files': {}, 'app_name': 'app_name_not_in_benchmark',
'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'main.cwl'},
'config': {'log_bucket': 'tibanna-output', 'shutdown_min': 30}}
with pytest.raises(MissingFieldInInputJsonException) as ex:
UnicornInput(input_dict)
assert ex
assert 'cwl_directory_url' in str(ex.value)
def test_execution_missing_field5():
"""language is snakemake but command is missing"""
input_dict = {'args': {'input_files': {}, 'language': 'snakemake',
'output_S3_bucket': 'somebucket',
'snakemake_main_filename': 'Snakefile',
'snakemake_directory_url': 'someurl',
'container_image': 'quay.io/snakemake/snakemake'},
'config': {'log_bucket': 'tibanna-output', 'mem': 1, 'cpu': 1}}
with pytest.raises(MissingFieldInInputJsonException) as ex:
Execution(input_dict)
assert ex
assert 'command' in str(ex.value)
def test_execution_missing_field6():
"""language is shell but container_image is missing"""
input_dict = {'args': {'input_files': {}, 'language': 'shell',
'output_S3_bucket': 'somebucket',
'command': 'some command'},
'config': {'log_bucket': 'tibanna-output', 'mem': 1, 'cpu': 1}}
with pytest.raises(MissingFieldInInputJsonException) as ex:
Execution(input_dict)
assert ex
assert 'container_image' in str(ex.value)
def test_create_run_json_dict():
randomstr = 'test-' + create_jobid()
s3 = boto3.client('s3')
s3.put_object(Body='haha'.encode('utf-8'),
Bucket='tibanna-output', Key=randomstr)
input_dict = {'args': {'input_files': {'input_file': {'bucket_name': 'tibanna-output',
'object_key': randomstr}},
'output_S3_bucket': 'somebucket',
'app_name': 'md5',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': 'tibanna-output'}}
execution = Execution(input_dict)
runjson = execution.create_run_json_dict()
assert runjson
# cleanup afterwards
s3.delete_objects(Bucket='tibanna-output',
Delete={'Objects': [{'Key': randomstr}]})
def test_create_userdata():
randomstr = 'test-' + create_jobid()
s3 = boto3.client('s3')
s3.put_object(Body='haha'.encode('utf-8'),
Bucket='tibanna-output', Key=randomstr)
input_dict = {'args': {'input_files': {'input_file': {'bucket_name': 'tibanna-output',
'object_key': randomstr}},
'output_S3_bucket': 'somebucket',
'app_name': 'md5',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': 'tibanna-output'},
'jobid': 'myjobid'}
execution = Execution(input_dict)
userdata = execution.create_userdata()
print(userdata)
assert userdata
assert 'JOBID=myjobid' in userdata
# cleanup afterwards
s3.delete_objects(Bucket='tibanna-output',
Delete={'Objects': [{'Key': randomstr}]})
def test_create_userdata_w_profile():
randomstr = 'test-' + create_jobid()
s3 = boto3.client('s3')
s3.put_object(Body='haha'.encode('utf-8'),
Bucket='tibanna-output', Key=randomstr)
input_dict = {'args': {'input_files': {'input_file': {'bucket_name': 'tibanna-output',
'object_key': randomstr}},
'output_S3_bucket': 'somebucket',
'app_name': 'md5',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': 'tibanna-output'},
'jobid': 'myjobid'}
execution = Execution(input_dict)
profile = {'access_key': 'haha', 'secret_key': 'lala'}
userdata = execution.create_userdata(profile=profile)
print(userdata)
assert userdata
assert '-a haha -s lala' in userdata
# cleanup afterwards
s3.delete_objects(Bucket='tibanna-output',
Delete={'Objects': [{'Key': randomstr}]})
def test_upload_run_json():
jobid = create_jobid()
log_bucket = 'tibanna-output'
input_dict = {'args': {'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': log_bucket, 'mem': 1, 'cpu': 1},
'jobid': jobid}
somejson = {'haha': 'lala'}
execution = Execution(input_dict)
execution.upload_run_json(somejson)
s3 = boto3.client('s3')
res = s3.get_object(Bucket=log_bucket, Key=jobid + '.run.json')
assert res
# clean up afterwards
s3.delete_objects(Bucket=log_bucket,
Delete={'Objects': [{'Key': jobid + '.run.json'}]})
def test_launch_args():
"""test creating launch arguments - also test spot_instance"""
jobid = create_jobid()
log_bucket = 'tibanna-output'
input_dict = {'args': {'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': log_bucket, 'mem': 1, 'cpu': 1,
'spot_instance': True},
'jobid': jobid}
execution = Execution(input_dict)
# userdata is required before launch_args is created
execution.userdata = execution.create_userdata()
launch_args = execution.launch_args
print(launch_args)
assert launch_args
assert 't3.micro' in str(launch_args)
assert 'InstanceMarketOptions' in str(launch_args)
def test_launch_and_get_instance_id():
"""test dryrun of ec2 launch"""
jobid = create_jobid()
log_bucket = 'tibanna-output'
input_dict = {'args': {'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': log_bucket, 'mem': 1, 'cpu': 1,
'spot_instance': True},
'jobid': jobid}
execution = Execution(input_dict, dryrun=True)
# userdata is required before launch_args is created
execution.userdata = execution.create_userdata()
with pytest.raises(Exception) as ex:
execution.launch_and_get_instance_id()
assert 'Request would have succeeded, but DryRun flag is set' in str(ex.value)
def test_ec2_exception_coordinator2():
"""ec2 limit exceptions with 'fail'"""
jobid = create_jobid()
log_bucket = 'tibanna-output'
input_dict = {'args': {'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': log_bucket, 'instance_type': 'c5.4xlarge',
'spot_instance': True},
'jobid': jobid}
execution = Execution(input_dict, dryrun=True)
execution.userdata = execution.create_userdata()
with pytest.raises(EC2InstanceLimitException) as exec_info:
execution.ec2_exception_coordinator(fun)()
assert exec_info
def test_ec2_exception_coordinator3():
"""ec2 exceptions with 'wait_and_retry'"""
jobid = create_jobid()
log_bucket = 'tibanna-output'
input_dict = {'args': {'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': log_bucket, 'instance_type': 'c5.4xlarge',
'spot_instance': True,
'behavior_on_capacity_limit': 'wait_and_retry'},
'jobid': jobid}
execution = Execution(input_dict, dryrun=True)
execution.userdata = execution.create_userdata()
with pytest.raises(EC2InstanceLimitWaitException) as exec_info:
execution.ec2_exception_coordinator(fun)()
assert exec_info
def test_ec2_exception_coordinator4():
"""ec2 exceptions with 'other_instance_types'"""
jobid = create_jobid()
log_bucket = 'tibanna-output'
input_dict = {'args': {'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': log_bucket, 'mem': 1, 'cpu': 1,
'spot_instance': True,
'behavior_on_capacity_limit': 'other_instance_types'},
'jobid': jobid}
execution = Execution(input_dict, dryrun=True)
assert execution.cfg.instance_type == 't3.micro'
execution.userdata = execution.create_userdata()
res = execution.ec2_exception_coordinator(fun)()
assert res == 'continue'
assert execution.cfg.instance_type == 't2.micro'
res = execution.ec2_exception_coordinator(fun)()
assert res == 'continue'
assert execution.cfg.instance_type == 't3.small'
res = execution.ec2_exception_coordinator(fun)()
assert res == 'continue'
assert execution.cfg.instance_type == 't2.small'
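# Fallback ordering inferred from the assertions above: the coordinator walks
# an instance_type_list sized from mem/cpu, alternating t3/t2 families from
# the smallest type upward (test 8 below shows that a type already tried is skipped).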
def test_ec2_exception_coordinator5():
"""ec2 exceptions with 'other_instance_types' but had only one option"""
jobid = create_jobid()
log_bucket = 'tibanna-output'
input_dict = {'args': {'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': log_bucket, 'instance_type': 't2.micro',
'spot_instance': True,
'behavior_on_capacity_limit': 'other_instance_types'},
'jobid': jobid}
execution = Execution(input_dict, dryrun=True)
assert execution.cfg.instance_type == 't2.micro'
execution.userdata = execution.create_userdata()
with pytest.raises(EC2InstanceLimitException) as exec_info:
execution.ec2_exception_coordinator(fun)()
assert 'No more instance type available' in str(exec_info.value)
def test_ec2_exception_coordinator6():
"""ec2 exceptions with 'retry_without_spot'"""
jobid = create_jobid()
log_bucket = 'tibanna-output'
input_dict = {'args': {'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': log_bucket, 'instance_type': 't2.micro',
'spot_instance': True,
'behavior_on_capacity_limit': 'retry_without_spot'},
'jobid': jobid}
execution = Execution(input_dict, dryrun=True)
execution.userdata = execution.create_userdata()
res = execution.ec2_exception_coordinator(fun)()
assert res == 'continue'
assert execution.cfg.spot_instance is False # changed to non-spot
    assert execution.cfg.behavior_on_capacity_limit == 'fail'  # behavior falls back to 'fail'
with pytest.raises(EC2InstanceLimitException) as exec_info:
res = execution.ec2_exception_coordinator(fun)() # this time, it fails
assert exec_info
def test_ec2_exception_coordinator7():
"""ec2 exceptions with 'retry_without_spot' without spot instance"""
jobid = create_jobid()
log_bucket = 'tibanna-output'
input_dict = {'args': {'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': log_bucket, 'instance_type': 't2.micro',
'behavior_on_capacity_limit': 'retry_without_spot'},
'jobid': jobid}
execution = Execution(input_dict, dryrun=True)
assert execution.cfg.spot_instance is False
execution.userdata = execution.create_userdata()
with pytest.raises(Exception) as exec_info:
execution.ec2_exception_coordinator(fun)()
assert "'retry_without_spot' works only with 'spot_instance'" in str(exec_info.value)
def test_ec2_exception_coordinator8():
"""ec2 exceptions with 'other_instance_types' with both instance_type and mem/cpu
specified"""
jobid = create_jobid()
log_bucket = 'tibanna-output'
input_dict = {'args': {'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': log_bucket, 'instance_type': 't2.micro',
'mem': 1, 'cpu': 1,
'behavior_on_capacity_limit': 'other_instance_types'},
'jobid': jobid}
execution = Execution(input_dict, dryrun=True)
assert execution.cfg.instance_type == 't2.micro'
execution.userdata = execution.create_userdata()
res = execution.ec2_exception_coordinator(fun)()
assert res == 'continue'
assert execution.cfg.instance_type == 't3.micro'
execution.userdata = execution.create_userdata()
res = execution.ec2_exception_coordinator(fun)()
assert res == 'continue'
    assert execution.cfg.instance_type == 't3.small'  # skip t2.micro since it was already tried
def test_ec2_exception_coordinator9():
"""ec2 exceptions with 'other_instance_types' with both instance_type and mem/cpu
specified"""
jobid = create_jobid()
log_bucket = 'tibanna-output'
input_dict = {'args': {'output_S3_bucket': 'somebucket',
'cwl_main_filename': 'md5.cwl',
'cwl_directory_url': 'someurl'},
'config': {'log_bucket': log_bucket,
'mem': 2, 'cpu': 1,
'behavior_on_capacity_limit': 'other_instance_types'},
'jobid': jobid}
execution = Execution(input_dict, dryrun=True)
assert execution.cfg.instance_type == 't3.small'
execution.userdata = execution.create_userdata()
res = execution.ec2_exception_coordinator(fun)()
assert res == 'continue'
assert execution.cfg.instance_type == 't2.small'
def test_upload_workflow_to_s3(run_task_awsem_event_cwl_upload):
jobid = create_jobid()
run_task_awsem_event_cwl_upload['jobid'] = jobid
log_bucket = run_task_awsem_event_cwl_upload['config']['log_bucket']
unicorn_input = UnicornInput(run_task_awsem_event_cwl_upload)
upload_workflow_to_s3(unicorn_input)
s3 = boto3.client('s3')
res1 = s3.get_object(Bucket=log_bucket, Key=jobid + '.workflow/main.cwl')
res2 = s3.get_object(Bucket=log_bucket, Key=jobid + '.workflow/child1.cwl')
res3 = s3.get_object(Bucket=log_bucket, Key=jobid + '.workflow/child2.cwl')
assert res1
assert res2
assert res3
assert unicorn_input.args.cwl_directory_url == 's3://tibanna-output/' + jobid + '.workflow/'
# clean up afterwards
s3.delete_objects(Bucket=log_bucket,
Delete={'Objects': [{'Key': jobid + '.workflow/main.cwl'},
{'Key': jobid + '.workflow/child1.cwl'},
{'Key': jobid + '.workflow/child2.cwl'}]})
| 45.12141 | 118 | 0.586986 | 3,719 | 34,563 | 5.172627 | 0.075827 | 0.038364 | 0.041483 | 0.044913 | 0.840828 | 0.814472 | 0.789624 | 0.763633 | 0.733378 | 0.686282 | 0 | 0.016848 | 0.280473 | 34,563 | 765 | 119 | 45.180392 | 0.756685 | 0.053873 | 0 | 0.701893 | 0 | 0 | 0.253741 | 0.019542 | 0 | 0 | 0 | 0 | 0.198738 | 1 | 0.069401 | false | 0 | 0.007886 | 0 | 0.077287 | 0.009464 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
db331a78f98b57a1a1512ed970ef713f41addedd | 105 | py | Python | python/src/test/resources/pyfunc/numpy_random3_test.py | maropu/lljvm-translator | 322fbe24a27976948c8e8081a9552152dda58b4b | [
"Apache-2.0"
] | 70 | 2017-12-12T10:54:00.000Z | 2022-03-22T07:45:19.000Z | python/src/test/resources/pyfunc/numpy_random3_test.py | maropu/lljvm-as | 322fbe24a27976948c8e8081a9552152dda58b4b | [
"Apache-2.0"
] | 14 | 2018-02-28T01:29:46.000Z | 2019-12-10T01:42:22.000Z | python/src/test/resources/pyfunc/numpy_random3_test.py | maropu/lljvm-as | 322fbe24a27976948c8e8081a9552152dda58b4b | [
"Apache-2.0"
] | 4 | 2019-07-21T07:58:25.000Z | 2021-02-01T09:46:59.000Z | import numpy as np
def numpy_random3_test(low, high, size):
return np.random.randint(low, high, size)
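# Usage note: np.random.randint draws integers from the half-open interval
# [low, high), so e.g. numpy_random3_test(0, 10, 5) returns five ints in 0..9.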
| 21 | 43 | 0.752381 | 18 | 105 | 4.277778 | 0.722222 | 0.181818 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011111 | 0.142857 | 105 | 4 | 44 | 26.25 | 0.844444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
c039fda6ec0b8db70e5207914b7d61a38a60aee8 | 32,476 | py | Python | tests/platforms/test_lambda.py | tirkarthi/python-sensor | 9872d146ac00baff2673fde5ba97fdbe596869a4 | [
"MIT"
] | 61 | 2017-09-27T02:50:17.000Z | 2022-03-22T12:13:37.000Z | tests/platforms/test_lambda.py | tirkarthi/python-sensor | 9872d146ac00baff2673fde5ba97fdbe596869a4 | [
"MIT"
] | 82 | 2017-07-11T13:47:33.000Z | 2022-03-22T10:10:38.000Z | tests/platforms/test_lambda.py | takeaway/python-sensor | 52d6eaa2d6a8e625201bad36ac2448201c4bd63d | [
"MIT"
] | 27 | 2017-09-11T16:22:32.000Z | 2022-03-11T17:21:49.000Z | # (c) Copyright IBM Corp. 2021
# (c) Copyright Instana Inc. 2020
from __future__ import absolute_import
import os
import sys
import json
import time
import wrapt
import logging
import unittest
from instana.tracer import InstanaTracer
from instana.agent.aws_lambda import AWSLambdaAgent
from instana.options import AWSLambdaOptions
from instana.recorder import StanRecorder
from instana import lambda_handler
from instana import get_lambda_handler_or_default
from instana.instrumentation.aws.lambda_inst import lambda_handler_with_instana
from instana.instrumentation.aws.triggers import read_http_query_params
from instana.singletons import get_agent, set_agent, get_tracer, set_tracer
from instana.util.aws import normalize_aws_lambda_arn
# Mock Context object
class MockContext(dict):
def __init__(self, **kwargs):
super(MockContext, self).__init__(**kwargs)
self.invoked_function_arn = "arn:aws:lambda:us-east-2:12345:function:TestPython:1"
self.function_name = "TestPython"
self.function_version = "1"
# This is the target handler that will be instrumented for these tests
def my_lambda_handler(event, context):
# print("target_handler called")
return {
'statusCode': 200,
'headers': {'Content-Type': 'application/json'},
'body': json.dumps({'site': 'pwpush.com', 'response': 204})
}
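# The dict above mirrors an API Gateway proxy-integration response
# (statusCode/headers/body); the instrumentation injects a Server-Timing
# header into it, which the tests below assert on.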
# We only want to monkey patch the test handler once so do it here
os.environ["LAMBDA_HANDLER"] = "tests.platforms.test_lambda.my_lambda_handler"
module_name, function_name = get_lambda_handler_or_default()
wrapt.wrap_function_wrapper(module_name, function_name, lambda_handler_with_instana)
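# For reference, wrapt.wrap_function_wrapper(module, name, wrapper) replaces
# module.name in place; the wrapper receives (wrapped, instance, args, kwargs)
# and must call the original itself. A minimal sketch, assuming the standard
# wrapt signature (tracing_wrapper is an illustrative name):
#
#     def tracing_wrapper(wrapped, instance, args, kwargs):
#         # start a span here, then delegate to the original handler
#         return wrapped(*args, **kwargs)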
class TestLambda(unittest.TestCase):
def __init__(self, methodName='runTest'):
super(TestLambda, self).__init__(methodName)
self.agent = None
self.span_recorder = None
self.tracer = None
self.pwd = os.path.dirname(os.path.realpath(__file__))
self.original_agent = get_agent()
self.original_tracer = get_tracer()
def setUp(self):
os.environ["AWS_EXECUTION_ENV"] = "AWS_Lambda_python_3.8"
os.environ["LAMBDA_HANDLER"] = "tests.platforms.test_lambda.my_lambda_handler"
os.environ["INSTANA_ENDPOINT_URL"] = "https://localhost/notreal"
os.environ["INSTANA_AGENT_KEY"] = "Fake_Key"
self.context = MockContext()
def tearDown(self):
""" Reset all environment variables of consequence """
if "AWS_EXECUTION_ENV" in os.environ:
os.environ.pop("AWS_EXECUTION_ENV")
if "LAMBDA_HANDLER" in os.environ:
os.environ.pop("LAMBDA_HANDLER")
if "INSTANA_EXTRA_HTTP_HEADERS" in os.environ:
os.environ.pop("INSTANA_EXTRA_HTTP_HEADERS")
if "INSTANA_ENDPOINT_URL" in os.environ:
os.environ.pop("INSTANA_ENDPOINT_URL")
if "INSTANA_ENDPOINT_PROXY" in os.environ:
os.environ.pop("INSTANA_ENDPOINT_PROXY")
if "INSTANA_AGENT_KEY" in os.environ:
os.environ.pop("INSTANA_AGENT_KEY")
if "INSTANA_SERVICE_NAME" in os.environ:
os.environ.pop("INSTANA_SERVICE_NAME")
if "INSTANA_DEBUG" in os.environ:
os.environ.pop("INSTANA_DEBUG")
if "INSTANA_LOG_LEVEL" in os.environ:
os.environ.pop("INSTANA_LOG_LEVEL")
set_agent(self.original_agent)
set_tracer(self.original_tracer)
def create_agent_and_setup_tracer(self):
self.agent = AWSLambdaAgent()
self.span_recorder = StanRecorder(self.agent)
self.tracer = InstanaTracer(recorder=self.span_recorder)
set_agent(self.agent)
set_tracer(self.tracer)
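        # Each test swaps a fresh AWSLambdaAgent/InstanaTracer into the
        # process-wide singletons; tearDown() restores the originals captured
        # in __init__.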
def test_invalid_options(self):
# None of the required env vars are available...
if "LAMBDA_HANDLER" in os.environ:
os.environ.pop("LAMBDA_HANDLER")
if "INSTANA_EXTRA_HTTP_HEADERS" in os.environ:
os.environ.pop("INSTANA_EXTRA_HTTP_HEADERS")
if "INSTANA_ENDPOINT_URL" in os.environ:
os.environ.pop("INSTANA_ENDPOINT_URL")
if "INSTANA_AGENT_KEY" in os.environ:
os.environ.pop("INSTANA_AGENT_KEY")
agent = AWSLambdaAgent()
self.assertFalse(agent._can_send)
self.assertIsNone(agent.collector)
def test_secrets(self):
self.create_agent_and_setup_tracer()
self.assertTrue(hasattr(self.agent.options, 'secrets_matcher'))
self.assertEqual(self.agent.options.secrets_matcher, 'contains-ignore-case')
self.assertTrue(hasattr(self.agent.options, 'secrets_list'))
self.assertEqual(self.agent.options.secrets_list, ['key', 'pass', 'secret'])
def test_has_extra_http_headers(self):
self.create_agent_and_setup_tracer()
self.assertTrue(hasattr(self.agent, 'options'))
self.assertTrue(hasattr(self.agent.options, 'extra_http_headers'))
def test_has_options(self):
self.create_agent_and_setup_tracer()
self.assertTrue(hasattr(self.agent, 'options'))
self.assertTrue(isinstance(self.agent.options, AWSLambdaOptions))
assert(self.agent.options.endpoint_proxy == { })
def test_get_handler(self):
os.environ["LAMBDA_HANDLER"] = "tests.lambda_handler"
handler_module, handler_function = get_lambda_handler_or_default()
self.assertEqual("tests", handler_module)
self.assertEqual("lambda_handler", handler_function)
def test_get_handler_with_multi_subpackages(self):
os.environ["LAMBDA_HANDLER"] = "tests.one.two.three.lambda_handler"
handler_module, handler_function = get_lambda_handler_or_default()
self.assertEqual("tests.one.two.three", handler_module)
self.assertEqual("lambda_handler", handler_function)
def test_get_handler_with_space_in_it(self):
os.environ["LAMBDA_HANDLER"] = " tests.another_module.lambda_handler"
handler_module, handler_function = get_lambda_handler_or_default()
self.assertEqual("tests.another_module", handler_module)
self.assertEqual("lambda_handler", handler_function)
os.environ["LAMBDA_HANDLER"] = "tests.another_module.lambda_handler "
handler_module, handler_function = get_lambda_handler_or_default()
self.assertEqual("tests.another_module", handler_module)
self.assertEqual("lambda_handler", handler_function)
def test_agent_extra_http_headers(self):
os.environ['INSTANA_EXTRA_HTTP_HEADERS'] = "X-Test-Header;X-Another-Header;X-And-Another-Header"
self.create_agent_and_setup_tracer()
self.assertIsNotNone(self.agent.options.extra_http_headers)
should_headers = ['x-test-header', 'x-another-header', 'x-and-another-header']
self.assertEqual(should_headers, self.agent.options.extra_http_headers)
def test_custom_proxy(self):
os.environ["INSTANA_ENDPOINT_PROXY"] = "http://myproxy.123"
self.create_agent_and_setup_tracer()
assert(self.agent.options.endpoint_proxy == { 'https': "http://myproxy.123" })
def test_custom_service_name(self):
os.environ['INSTANA_SERVICE_NAME'] = "Legion"
with open(self.pwd + '/../data/lambda/api_gateway_event.json', 'r') as json_file:
event = json.load(json_file)
self.create_agent_and_setup_tracer()
# Call the Instana Lambda Handler as we do in the real world. It will initiate tracing and then
# figure out the original (the users') Lambda Handler and execute it.
# The original Lambda handler is set in os.environ["LAMBDA_HANDLER"]
result = lambda_handler(event, self.context)
os.environ.pop('INSTANA_SERVICE_NAME')
assert isinstance(result, dict)
assert 'headers' in result
assert 'Server-Timing' in result['headers']
time.sleep(1)
payload = self.agent.collector.prepare_payload()
self.assertTrue("metrics" in payload)
self.assertTrue("spans" in payload)
self.assertEqual(2, len(payload.keys()))
self.assertTrue(isinstance(payload['metrics']['plugins'], list))
self.assertTrue(len(payload['metrics']['plugins']) == 1)
plugin_data = payload['metrics']['plugins'][0]
self.assertEqual('com.instana.plugin.aws.lambda', plugin_data['name'])
self.assertEqual('arn:aws:lambda:us-east-2:12345:function:TestPython:1', plugin_data['entityId'])
self.assertEqual(1, len(payload['spans']))
span = payload['spans'][0]
self.assertEqual('aws.lambda.entry', span.n)
self.assertEqual('d5cb361b256413a9', span.t)
self.assertIsNotNone(span.s)
self.assertEqual('0901d8ae4fbf1529', span.p)
self.assertIsNotNone(span.ts)
self.assertIsNotNone(span.d)
server_timing_value = "intid;desc=%s" % span.t
assert result['headers']['Server-Timing'] == server_timing_value
self.assertEqual({'hl': True, 'cp': 'aws', 'e': 'arn:aws:lambda:us-east-2:12345:function:TestPython:1'},
span.f)
self.assertTrue(span.sy)
self.assertIsNone(span.ec)
self.assertIsNone(span.data['lambda']['error'])
self.assertEqual('arn:aws:lambda:us-east-2:12345:function:TestPython:1', span.data['lambda']['arn'])
self.assertEqual(None, span.data['lambda']['alias'])
self.assertEqual('python', span.data['lambda']['runtime'])
self.assertEqual('TestPython', span.data['lambda']['functionName'])
self.assertEqual('1', span.data['lambda']['functionVersion'])
self.assertEqual('Legion', span.data['service'])
self.assertEqual('aws:api.gateway', span.data['lambda']['trigger'])
self.assertEqual('POST', span.data['http']['method'])
self.assertEqual('/path/to/resource', span.data['http']['url'])
self.assertEqual('/{proxy+}', span.data['http']['path_tpl'])
if sys.version[:3] == '2.7':
self.assertEqual(u"foo=[u'bar']", span.data['http']['params'])
else:
self.assertEqual("foo=['bar']", span.data['http']['params'])
def test_api_gateway_trigger_tracing(self):
with open(self.pwd + '/../data/lambda/api_gateway_event.json', 'r') as json_file:
event = json.load(json_file)
self.create_agent_and_setup_tracer()
# Call the Instana Lambda Handler as we do in the real world. It will initiate tracing and then
# figure out the original (the users') Lambda Handler and execute it.
# The original Lambda handler is set in os.environ["LAMBDA_HANDLER"]
result = lambda_handler(event, self.context)
assert isinstance(result, dict)
assert 'headers' in result
assert 'Server-Timing' in result['headers']
time.sleep(1)
payload = self.agent.collector.prepare_payload()
self.assertTrue("metrics" in payload)
self.assertTrue("spans" in payload)
self.assertEqual(2, len(payload.keys()))
self.assertTrue(isinstance(payload['metrics']['plugins'], list))
self.assertTrue(len(payload['metrics']['plugins']) == 1)
plugin_data = payload['metrics']['plugins'][0]
self.assertEqual('com.instana.plugin.aws.lambda', plugin_data['name'])
self.assertEqual('arn:aws:lambda:us-east-2:12345:function:TestPython:1', plugin_data['entityId'])
self.assertEqual(1, len(payload['spans']))
span = payload['spans'][0]
self.assertEqual('aws.lambda.entry', span.n)
self.assertEqual('d5cb361b256413a9', span.t)
self.assertIsNotNone(span.s)
self.assertEqual('0901d8ae4fbf1529', span.p)
self.assertIsNotNone(span.ts)
self.assertIsNotNone(span.d)
server_timing_value = "intid;desc=%s" % span.t
assert result['headers']['Server-Timing'] == server_timing_value
self.assertEqual({'hl': True, 'cp': 'aws', 'e': 'arn:aws:lambda:us-east-2:12345:function:TestPython:1'},
span.f)
self.assertTrue(span.sy)
self.assertIsNone(span.ec)
self.assertIsNone(span.data['lambda']['error'])
self.assertEqual('arn:aws:lambda:us-east-2:12345:function:TestPython:1', span.data['lambda']['arn'])
self.assertEqual(None, span.data['lambda']['alias'])
self.assertEqual('python', span.data['lambda']['runtime'])
self.assertEqual('TestPython', span.data['lambda']['functionName'])
self.assertEqual('1', span.data['lambda']['functionVersion'])
self.assertIsNone(span.data['service'])
self.assertEqual('aws:api.gateway', span.data['lambda']['trigger'])
self.assertEqual('POST', span.data['http']['method'])
self.assertEqual('/path/to/resource', span.data['http']['url'])
self.assertEqual('/{proxy+}', span.data['http']['path_tpl'])
if sys.version[:3] == '2.7':
self.assertEqual(u"foo=[u'bar']", span.data['http']['params'])
else:
self.assertEqual("foo=['bar']", span.data['http']['params'])
def test_api_gateway_v2_trigger_tracing(self):
with open(self.pwd + '/../data/lambda/api_gateway_v2_event.json', 'r') as json_file:
event = json.load(json_file)
self.create_agent_and_setup_tracer()
# Call the Instana Lambda Handler as we do in the real world. It will initiate tracing and then
# figure out the original (the users') Lambda Handler and execute it.
# The original Lambda handler is set in os.environ["LAMBDA_HANDLER"]
result = lambda_handler(event, self.context)
assert isinstance(result, dict)
assert 'headers' in result
assert 'Server-Timing' in result['headers']
time.sleep(1)
payload = self.agent.collector.prepare_payload()
self.assertTrue("metrics" in payload)
self.assertTrue("spans" in payload)
self.assertEqual(2, len(payload.keys()))
self.assertTrue(isinstance(payload['metrics']['plugins'], list))
self.assertTrue(len(payload['metrics']['plugins']) == 1)
plugin_data = payload['metrics']['plugins'][0]
self.assertEqual('com.instana.plugin.aws.lambda', plugin_data['name'])
self.assertEqual('arn:aws:lambda:us-east-2:12345:function:TestPython:1', plugin_data['entityId'])
self.assertEqual(1, len(payload['spans']))
span = payload['spans'][0]
self.assertEqual('aws.lambda.entry', span.n)
self.assertEqual('0000000000001234', span.t)
self.assertIsNotNone(span.s)
self.assertEqual('0000000000004567', span.p)
self.assertIsNotNone(span.ts)
self.assertIsNotNone(span.d)
server_timing_value = "intid;desc=%s" % span.t
assert result['headers']['Server-Timing'] == server_timing_value
self.assertEqual({'hl': True, 'cp': 'aws', 'e': 'arn:aws:lambda:us-east-2:12345:function:TestPython:1'},
span.f)
self.assertTrue(span.sy)
self.assertIsNone(span.ec)
self.assertIsNone(span.data['lambda']['error'])
self.assertEqual('arn:aws:lambda:us-east-2:12345:function:TestPython:1', span.data['lambda']['arn'])
self.assertEqual(None, span.data['lambda']['alias'])
self.assertEqual('python', span.data['lambda']['runtime'])
self.assertEqual('TestPython', span.data['lambda']['functionName'])
self.assertEqual('1', span.data['lambda']['functionVersion'])
self.assertIsNone(span.data['service'])
self.assertEqual('aws:api.gateway', span.data['lambda']['trigger'])
self.assertEqual('POST', span.data['http']['method'])
self.assertEqual('/my/path', span.data['http']['url'])
self.assertEqual('/my/{resource}', span.data['http']['path_tpl'])
if sys.version[:3] == '2.7':
self.assertEqual(u"q=term&secret=key", span.data['http']['params'])
else:
self.assertEqual("secret=key&q=term", span.data['http']['params'])
def test_application_lb_trigger_tracing(self):
with open(self.pwd + '/../data/lambda/api_gateway_event.json', 'r') as json_file:
event = json.load(json_file)
self.create_agent_and_setup_tracer()
# Call the Instana Lambda Handler as we do in the real world. It will initiate tracing and then
# figure out the original (the users') Lambda Handler and execute it.
# The original Lambda handler is set in os.environ["LAMBDA_HANDLER"]
result = lambda_handler(event, self.context)
assert isinstance(result, dict)
assert 'headers' in result
assert 'Server-Timing' in result['headers']
time.sleep(1)
payload = self.agent.collector.prepare_payload()
self.assertTrue("metrics" in payload)
self.assertTrue("spans" in payload)
self.assertEqual(2, len(payload.keys()))
self.assertTrue(isinstance(payload['metrics']['plugins'], list))
self.assertTrue(len(payload['metrics']['plugins']) == 1)
plugin_data = payload['metrics']['plugins'][0]
self.assertEqual('com.instana.plugin.aws.lambda', plugin_data['name'])
self.assertEqual('arn:aws:lambda:us-east-2:12345:function:TestPython:1', plugin_data['entityId'])
self.assertEqual(1, len(payload['spans']))
span = payload['spans'][0]
self.assertEqual('aws.lambda.entry', span.n)
self.assertEqual('d5cb361b256413a9', span.t)
self.assertIsNotNone(span.s)
self.assertEqual('0901d8ae4fbf1529', span.p)
self.assertIsNotNone(span.ts)
self.assertIsNotNone(span.d)
server_timing_value = "intid;desc=%s" % span.t
assert result['headers']['Server-Timing'] == server_timing_value
self.assertEqual({'hl': True, 'cp': 'aws', 'e': 'arn:aws:lambda:us-east-2:12345:function:TestPython:1'},
span.f)
self.assertTrue(span.sy)
self.assertIsNone(span.ec)
self.assertIsNone(span.data['lambda']['error'])
self.assertEqual('arn:aws:lambda:us-east-2:12345:function:TestPython:1', span.data['lambda']['arn'])
self.assertEqual(None, span.data['lambda']['alias'])
self.assertEqual('python', span.data['lambda']['runtime'])
self.assertEqual('TestPython', span.data['lambda']['functionName'])
self.assertEqual('1', span.data['lambda']['functionVersion'])
self.assertIsNone(span.data['service'])
self.assertEqual('aws:api.gateway', span.data['lambda']['trigger'])
self.assertEqual('POST', span.data['http']['method'])
self.assertEqual('/path/to/resource', span.data['http']['url'])
if sys.version[:3] == '2.7':
self.assertEqual(u"foo=[u'bar']", span.data['http']['params'])
else:
self.assertEqual("foo=['bar']", span.data['http']['params'])

    def test_cloudwatch_trigger_tracing(self):
        with open(self.pwd + '/../data/lambda/cloudwatch_event.json', 'r') as json_file:
            event = json.load(json_file)

        self.create_agent_and_setup_tracer()

        # Call the Instana Lambda Handler as we do in the real world. It will initiate tracing and then
        # figure out the original (the users') Lambda Handler and execute it.
        # The original Lambda handler is set in os.environ["LAMBDA_HANDLER"]
        result = lambda_handler(event, self.context)
        assert isinstance(result, dict)
        assert 'headers' in result
        assert 'Server-Timing' in result['headers']

        time.sleep(1)
        payload = self.agent.collector.prepare_payload()

        self.assertTrue("metrics" in payload)
        self.assertTrue("spans" in payload)
        self.assertEqual(2, len(payload.keys()))

        self.assertTrue(isinstance(payload['metrics']['plugins'], list))
        self.assertTrue(len(payload['metrics']['plugins']) == 1)
        plugin_data = payload['metrics']['plugins'][0]
        self.assertEqual('com.instana.plugin.aws.lambda', plugin_data['name'])
        self.assertEqual('arn:aws:lambda:us-east-2:12345:function:TestPython:1', plugin_data['entityId'])

        self.assertEqual(1, len(payload['spans']))
        span = payload['spans'][0]
        self.assertEqual('aws.lambda.entry', span.n)
        self.assertIsNotNone(span.t)
        self.assertIsNotNone(span.s)
        self.assertIsNone(span.p)
        self.assertIsNotNone(span.ts)
        self.assertIsNotNone(span.d)

        server_timing_value = "intid;desc=%s" % span.t
        assert result['headers']['Server-Timing'] == server_timing_value

        self.assertEqual({'hl': True, 'cp': 'aws', 'e': 'arn:aws:lambda:us-east-2:12345:function:TestPython:1'},
                         span.f)
        self.assertIsNone(span.sy)
        self.assertIsNone(span.ec)
        self.assertIsNone(span.data['lambda']['error'])

        self.assertEqual('arn:aws:lambda:us-east-2:12345:function:TestPython:1', span.data['lambda']['arn'])
        self.assertEqual(None, span.data['lambda']['alias'])
        self.assertEqual('python', span.data['lambda']['runtime'])
        self.assertEqual('TestPython', span.data['lambda']['functionName'])
        self.assertEqual('1', span.data['lambda']['functionVersion'])
        self.assertIsNone(span.data['service'])

        self.assertEqual('aws:cloudwatch.events', span.data['lambda']['trigger'])
        self.assertEqual('cdc73f9d-aea9-11e3-9d5a-835b769c0d9c', span.data["lambda"]["cw"]["events"]["id"])
        self.assertEqual(False, span.data["lambda"]["cw"]["events"]["more"])
        self.assertTrue(isinstance(span.data["lambda"]["cw"]["events"]["resources"], list))
        self.assertEqual(1, len(span.data["lambda"]["cw"]["events"]["resources"]))
        self.assertEqual('arn:aws:events:eu-west-1:123456789012:rule/ExampleRule',
                         span.data["lambda"]["cw"]["events"]["resources"][0])

    def test_cloudwatch_logs_trigger_tracing(self):
        with open(self.pwd + '/../data/lambda/cloudwatch_logs_event.json', 'r') as json_file:
            event = json.load(json_file)

        self.create_agent_and_setup_tracer()

        # Call the Instana Lambda Handler as we do in the real world. It will initiate tracing and then
        # figure out the original (the users') Lambda Handler and execute it.
        # The original Lambda handler is set in os.environ["LAMBDA_HANDLER"]
        result = lambda_handler(event, self.context)
        assert isinstance(result, dict)
        assert 'headers' in result
        assert 'Server-Timing' in result['headers']

        time.sleep(1)
        payload = self.agent.collector.prepare_payload()

        self.assertTrue("metrics" in payload)
        self.assertTrue("spans" in payload)
        self.assertEqual(2, len(payload.keys()))

        self.assertTrue(isinstance(payload['metrics']['plugins'], list))
        self.assertTrue(len(payload['metrics']['plugins']) == 1)
        plugin_data = payload['metrics']['plugins'][0]
        self.assertEqual('com.instana.plugin.aws.lambda', plugin_data['name'])
        self.assertEqual('arn:aws:lambda:us-east-2:12345:function:TestPython:1', plugin_data['entityId'])

        self.assertEqual(1, len(payload['spans']))
        span = payload['spans'][0]
        self.assertEqual('aws.lambda.entry', span.n)
        self.assertIsNotNone(span.t)
        self.assertIsNotNone(span.s)
        self.assertIsNone(span.p)
        self.assertIsNotNone(span.ts)
        self.assertIsNotNone(span.d)

        server_timing_value = "intid;desc=%s" % span.t
        assert result['headers']['Server-Timing'] == server_timing_value

        self.assertEqual({'hl': True, 'cp': 'aws', 'e': 'arn:aws:lambda:us-east-2:12345:function:TestPython:1'},
                         span.f)
        self.assertIsNone(span.sy)
        self.assertIsNone(span.ec)
        self.assertIsNone(span.data['lambda']['error'])

        self.assertEqual('arn:aws:lambda:us-east-2:12345:function:TestPython:1', span.data['lambda']['arn'])
        self.assertEqual(None, span.data['lambda']['alias'])
        self.assertEqual('python', span.data['lambda']['runtime'])
        self.assertEqual('TestPython', span.data['lambda']['functionName'])
        self.assertEqual('1', span.data['lambda']['functionVersion'])
        self.assertIsNone(span.data['service'])

        self.assertEqual('aws:cloudwatch.logs', span.data['lambda']['trigger'])
        self.assertFalse("decodingError" in span.data['lambda']['cw']['logs'])
        self.assertEqual('testLogGroup', span.data['lambda']['cw']['logs']['group'])
        self.assertEqual('testLogStream', span.data['lambda']['cw']['logs']['stream'])
        self.assertEqual(None, span.data['lambda']['cw']['logs']['more'])
        self.assertTrue(isinstance(span.data['lambda']['cw']['logs']['events'], list))
        self.assertEqual(2, len(span.data['lambda']['cw']['logs']['events']))
        self.assertEqual('[ERROR] First test message', span.data['lambda']['cw']['logs']['events'][0])
        self.assertEqual('[ERROR] Second test message', span.data['lambda']['cw']['logs']['events'][1])

    def test_s3_trigger_tracing(self):
        with open(self.pwd + '/../data/lambda/s3_event.json', 'r') as json_file:
            event = json.load(json_file)

        self.create_agent_and_setup_tracer()

        # Call the Instana Lambda Handler as we do in the real world. It will initiate tracing and then
        # figure out the original (the users') Lambda Handler and execute it.
        # The original Lambda handler is set in os.environ["LAMBDA_HANDLER"]
        result = lambda_handler(event, self.context)
        assert isinstance(result, dict)
        assert 'headers' in result
        assert 'Server-Timing' in result['headers']

        time.sleep(1)
        payload = self.agent.collector.prepare_payload()

        self.assertTrue("metrics" in payload)
        self.assertTrue("spans" in payload)
        self.assertEqual(2, len(payload.keys()))

        self.assertTrue(isinstance(payload['metrics']['plugins'], list))
        self.assertTrue(len(payload['metrics']['plugins']) == 1)
        plugin_data = payload['metrics']['plugins'][0]
        self.assertEqual('com.instana.plugin.aws.lambda', plugin_data['name'])
        self.assertEqual('arn:aws:lambda:us-east-2:12345:function:TestPython:1', plugin_data['entityId'])

        self.assertEqual(1, len(payload['spans']))
        span = payload['spans'][0]
        self.assertEqual('aws.lambda.entry', span.n)
        self.assertIsNotNone(span.t)
        self.assertIsNotNone(span.s)
        self.assertIsNone(span.p)
        self.assertIsNotNone(span.ts)
        self.assertIsNotNone(span.d)

        server_timing_value = "intid;desc=%s" % span.t
        assert result['headers']['Server-Timing'] == server_timing_value

        self.assertEqual({'hl': True, 'cp': 'aws', 'e': 'arn:aws:lambda:us-east-2:12345:function:TestPython:1'},
                         span.f)
        self.assertIsNone(span.sy)
        self.assertIsNone(span.ec)
        self.assertIsNone(span.data['lambda']['error'])

        self.assertEqual('arn:aws:lambda:us-east-2:12345:function:TestPython:1', span.data['lambda']['arn'])
        self.assertEqual(None, span.data['lambda']['alias'])
        self.assertEqual('python', span.data['lambda']['runtime'])
        self.assertEqual('TestPython', span.data['lambda']['functionName'])
        self.assertEqual('1', span.data['lambda']['functionVersion'])
        self.assertIsNone(span.data['service'])

        self.assertEqual('aws:s3', span.data['lambda']['trigger'])
        self.assertTrue(isinstance(span.data["lambda"]["s3"]["events"], list))
        events = span.data["lambda"]["s3"]["events"]
        self.assertEqual(1, len(events))
        event = events[0]
        self.assertEqual('ObjectCreated:Put', event['event'])
        self.assertEqual('example-bucket', event['bucket'])
        self.assertEqual('test/key', event['object'])

    def test_sqs_trigger_tracing(self):
        with open(self.pwd + '/../data/lambda/sqs_event.json', 'r') as json_file:
            event = json.load(json_file)

        self.create_agent_and_setup_tracer()

        # Call the Instana Lambda Handler as we do in the real world. It will initiate tracing and then
        # figure out the original (the users') Lambda Handler and execute it.
        # The original Lambda handler is set in os.environ["LAMBDA_HANDLER"]
        result = lambda_handler(event, self.context)
        assert isinstance(result, dict)
        assert 'headers' in result
        assert 'Server-Timing' in result['headers']

        time.sleep(1)
        payload = self.agent.collector.prepare_payload()

        self.assertTrue("metrics" in payload)
        self.assertTrue("spans" in payload)
        self.assertEqual(2, len(payload.keys()))

        self.assertTrue(isinstance(payload['metrics']['plugins'], list))
        self.assertTrue(len(payload['metrics']['plugins']) == 1)
        plugin_data = payload['metrics']['plugins'][0]
        self.assertEqual('com.instana.plugin.aws.lambda', plugin_data['name'])
        self.assertEqual('arn:aws:lambda:us-east-2:12345:function:TestPython:1', plugin_data['entityId'])

        self.assertEqual(1, len(payload['spans']))
        span = payload['spans'][0]
        self.assertEqual('aws.lambda.entry', span.n)
        self.assertIsNotNone(span.t)
        self.assertIsNotNone(span.s)
        self.assertIsNone(span.p)
        self.assertIsNotNone(span.ts)
        self.assertIsNotNone(span.d)

        server_timing_value = "intid;desc=%s" % span.t
        assert result['headers']['Server-Timing'] == server_timing_value

        self.assertEqual({'hl': True, 'cp': 'aws', 'e': 'arn:aws:lambda:us-east-2:12345:function:TestPython:1'},
                         span.f)
        self.assertIsNone(span.sy)
        self.assertIsNone(span.ec)
        self.assertIsNone(span.data['lambda']['error'])

        self.assertEqual('arn:aws:lambda:us-east-2:12345:function:TestPython:1', span.data['lambda']['arn'])
        self.assertEqual(None, span.data['lambda']['alias'])
        self.assertEqual('python', span.data['lambda']['runtime'])
        self.assertEqual('TestPython', span.data['lambda']['functionName'])
        self.assertEqual('1', span.data['lambda']['functionVersion'])
        self.assertIsNone(span.data['service'])

        self.assertEqual('aws:sqs', span.data['lambda']['trigger'])
        self.assertTrue(isinstance(span.data["lambda"]["sqs"]["messages"], list))
        messages = span.data["lambda"]["sqs"]["messages"]
        self.assertEqual(1, len(messages))
        message = messages[0]
        self.assertEqual('arn:aws:sqs:us-west-1:123456789012:MyQueue', message['queue'])

    def test_read_query_params(self):
        event = {"queryStringParameters": {"foo": "bar"},
                 "multiValueQueryStringParameters": {"foo": ["bar"]}}
        params = read_http_query_params(event)
        self.assertEqual("foo=['bar']", params)

    def test_read_query_params_with_none_data(self):
        event = {"queryStringParameters": None,
                 "multiValueQueryStringParameters": None}
        params = read_http_query_params(event)
        self.assertEqual("", params)

    def test_read_query_params_with_bad_event(self):
        event = None
        params = read_http_query_params(event)
        self.assertEqual("", params)

    def test_arn_parsing(self):
        ctx = MockContext()
        assert(normalize_aws_lambda_arn(ctx) == "arn:aws:lambda:us-east-2:12345:function:TestPython:1")

        # An ARN without a version should be normalized to a fully qualified ARN (with version)
        ctx.invoked_function_arn = "arn:aws:lambda:us-east-2:12345:function:TestPython"
        assert(normalize_aws_lambda_arn(ctx) == "arn:aws:lambda:us-east-2:12345:function:TestPython:1")

        # An ARN that is already fully qualified with the '$LATEST' special tag is left unchanged
        ctx.invoked_function_arn = "arn:aws:lambda:us-east-2:12345:function:TestPython:$LATEST"
        assert(normalize_aws_lambda_arn(ctx) == "arn:aws:lambda:us-east-2:12345:function:TestPython:$LATEST")
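
    # A minimal sketch of the normalization these assertions describe
    # (hypothetical re-implementation; the real logic lives in the instana
    # package, and `function_version` is the standard Lambda context attribute):
    #
    #     def normalize_aws_lambda_arn(ctx):
    #         arn = ctx.invoked_function_arn
    #         if len(arn.split(":")) == 7:  # unqualified ARN: append the version
    #             arn = "%s:%s" % (arn, ctx.function_version)
    #         return arn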

    def test_agent_default_log_level(self):
        self.create_agent_and_setup_tracer()
        assert self.agent.options.log_level == logging.WARNING

    def test_agent_custom_log_level(self):
        os.environ['INSTANA_LOG_LEVEL'] = "eRror"
        self.create_agent_and_setup_tracer()
        assert self.agent.options.log_level == logging.ERROR
| 44.00542 | 112 | 0.654052 | 3,975 | 32,476 | 5.216855 | 0.079245 | 0.111395 | 0.049284 | 0.020254 | 0.829628 | 0.815113 | 0.788156 | 0.775522 | 0.757824 | 0.753677 | 0 | 0.017325 | 0.196638 | 32,476 | 737 | 113 | 44.065129 | 0.777501 | 0.070914 | 0 | 0.691011 | 0 | 0 | 0.237672 | 0.093383 | 0 | 0 | 0 | 0 | 0.565543 | 1 | 0.054307 | false | 0.001873 | 0.033708 | 0.001873 | 0.093633 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c03f45cdfc076e766598ae5602f4ebee48f598c5 | 12,857 | py | Python | EvalData/migrations/0019_auto_20170619_1617.py | amalinovskiy/Appraise | 03446dacebd91c556b29420fe917e2b0547047bd | [
"BSD-3-Clause"
] | 11 | 2021-02-08T08:40:23.000Z | 2022-03-30T09:56:40.000Z | EvalData/migrations/0019_auto_20170619_1617.py | amalinovskiy/Appraise | 03446dacebd91c556b29420fe917e2b0547047bd | [
"BSD-3-Clause"
] | 29 | 2021-01-23T16:50:47.000Z | 2022-03-25T13:46:01.000Z | EvalData/migrations/0019_auto_20170619_1617.py | amalinovskiy/Appraise | 03446dacebd91c556b29420fe917e2b0547047bd | [
"BSD-3-Clause"
] | 5 | 2021-05-22T14:34:47.000Z | 2021-08-23T15:50:05.000Z | # -*- coding: utf-8 -*-
# Generated by Django 1.11 on 2017-06-19 23:17
from __future__ import unicode_literals
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion

class Migration(migrations.Migration):

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
        ('Campaign', '0005_trusteduser'),
        ('EvalData', '0018_auto_20170607_2120'),
    ]

    operations = [
        migrations.CreateModel(
            name='MultiModalAssessmentResult',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('dateCreated', models.DateTimeField(auto_now_add=True, verbose_name='Date created')),
                ('dateActivated', models.DateTimeField(blank=True, null=True, verbose_name='Date activated')),
                ('dateCompleted', models.DateTimeField(blank=True, null=True, verbose_name='Date completed')),
                ('dateRetired', models.DateTimeField(blank=True, null=True, verbose_name='Date retired')),
                ('dateModified', models.DateTimeField(blank=True, null=True, verbose_name='Date modified')),
                ('activated', models.BooleanField(db_index=True, default=False, verbose_name='Activated?')),
                ('completed', models.BooleanField(db_index=True, default=False, verbose_name='Completed?')),
                ('retired', models.BooleanField(db_index=True, default=False, verbose_name='Retired?')),
                ('rawData', models.TextField(blank=True, editable=False, verbose_name='Raw data')),
                ('score', models.PositiveSmallIntegerField(help_text='(value in range=[1,100])', verbose_name='Score')),
                ('start_time', models.FloatField(help_text='(in seconds)', verbose_name='Start time')),
                ('end_time', models.FloatField(help_text='(in seconds)', verbose_name='End time')),
                ('activatedBy', models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmentresult_activated_by', related_query_name='evaldata_multimodalassessmentresults', to=settings.AUTH_USER_MODEL, verbose_name='Activated by')),
                ('completedBy', models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmentresult_completed_by', related_query_name='evaldata_multimodalassessmentresults', to=settings.AUTH_USER_MODEL, verbose_name='Completed by')),
                ('createdBy', models.ForeignKey(editable=False, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmentresult_created_by', related_query_name='evaldata_multimodalassessmentresults', to=settings.AUTH_USER_MODEL, verbose_name='Created by')),
            ],
            options={
                'abstract': False,
            },
        ),
        migrations.CreateModel(
            name='MultiModalAssessmentTask',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('dateCreated', models.DateTimeField(auto_now_add=True, verbose_name='Date created')),
                ('dateActivated', models.DateTimeField(blank=True, null=True, verbose_name='Date activated')),
                ('dateCompleted', models.DateTimeField(blank=True, null=True, verbose_name='Date completed')),
                ('dateRetired', models.DateTimeField(blank=True, null=True, verbose_name='Date retired')),
                ('dateModified', models.DateTimeField(blank=True, null=True, verbose_name='Date modified')),
                ('activated', models.BooleanField(db_index=True, default=False, verbose_name='Activated?')),
                ('completed', models.BooleanField(db_index=True, default=False, verbose_name='Completed?')),
                ('retired', models.BooleanField(db_index=True, default=False, verbose_name='Retired?')),
                ('rawData', models.TextField(blank=True, editable=False, verbose_name='Raw data')),
                ('requiredAnnotations', models.PositiveSmallIntegerField(help_text='(value in range=[1,50])', verbose_name='Required annotations')),
                ('batchNo', models.PositiveIntegerField(help_text='(1-based)', verbose_name='Batch number')),
                ('activatedBy', models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmenttask_activated_by', related_query_name='evaldata_multimodalassessmenttasks', to=settings.AUTH_USER_MODEL, verbose_name='Activated by')),
                ('assignedTo', models.ManyToManyField(blank=True, db_index=True, help_text='(users working on this task)', related_name='evaldata_multimodalassessmenttask_assignedTo', related_query_name='evaldata_multimodalassessmenttasks', to=settings.AUTH_USER_MODEL, verbose_name='Assigned to')),
                ('batchData', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmenttask_batchData', related_query_name='evaldata_multimodalassessmenttasks', to='Campaign.CampaignData', verbose_name='Batch data')),
                ('campaign', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmenttask_campaign', related_query_name='evaldata_multimodalassessmenttasks', to='Campaign.Campaign', verbose_name='Campaign')),
                ('completedBy', models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmenttask_completed_by', related_query_name='evaldata_multimodalassessmenttasks', to=settings.AUTH_USER_MODEL, verbose_name='Completed by')),
                ('createdBy', models.ForeignKey(editable=False, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmenttask_created_by', related_query_name='evaldata_multimodalassessmenttasks', to=settings.AUTH_USER_MODEL, verbose_name='Created by')),
            ],
            options={
                'abstract': False,
            },
        ),
        migrations.CreateModel(
            name='TextPairWithImage',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('dateCreated', models.DateTimeField(auto_now_add=True, verbose_name='Date created')),
                ('dateActivated', models.DateTimeField(blank=True, null=True, verbose_name='Date activated')),
                ('dateCompleted', models.DateTimeField(blank=True, null=True, verbose_name='Date completed')),
                ('dateRetired', models.DateTimeField(blank=True, null=True, verbose_name='Date retired')),
                ('dateModified', models.DateTimeField(blank=True, null=True, verbose_name='Date modified')),
                ('activated', models.BooleanField(db_index=True, default=False, verbose_name='Activated?')),
                ('completed', models.BooleanField(db_index=True, default=False, verbose_name='Completed?')),
                ('retired', models.BooleanField(db_index=True, default=False, verbose_name='Retired?')),
                ('rawData', models.TextField(blank=True, editable=False, verbose_name='Raw data')),
                ('itemID', models.PositiveIntegerField(help_text='(1-based)', verbose_name='Item ID')),
                ('itemType', models.CharField(choices=[('SRC', 'Source text'), ('TGT', 'Target text'), ('REF', 'Reference text'), ('BAD', 'Bad reference'), ('CHK', 'Redundant check')], db_index=True, max_length=5, verbose_name='Item type')),
                ('sourceID', models.CharField(help_text='(max. 1000 characters)', max_length=1000, verbose_name='Source ID')),
                ('sourceText', models.CharField(help_text='(max. 2000 characters)', max_length=2000, verbose_name='Source text')),
                ('targetID', models.CharField(help_text='(max. 1000 characters)', max_length=1000, verbose_name='Target ID')),
                ('targetText', models.CharField(help_text='(max. 2000 characters)', max_length=2000, verbose_name='Target text')),
                ('imageURL', models.URLField(verbose_name='image URL')),
                ('activatedBy', models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textpairwithimage_activated_by', related_query_name='evaldata_textpairwithimages', to=settings.AUTH_USER_MODEL, verbose_name='Activated by')),
                ('completedBy', models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textpairwithimage_completed_by', related_query_name='evaldata_textpairwithimages', to=settings.AUTH_USER_MODEL, verbose_name='Completed by')),
                ('createdBy', models.ForeignKey(editable=False, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textpairwithimage_created_by', related_query_name='evaldata_textpairwithimages', to=settings.AUTH_USER_MODEL, verbose_name='Created by')),
                ('metadata', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='EvalData.Metadata')),
                ('modifiedBy', models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textpairwithimage_modified_by', related_query_name='evaldata_textpairwithimages', to=settings.AUTH_USER_MODEL, verbose_name='Modified by')),
                ('retiredBy', models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textpairwithimage_retired_by', related_query_name='evaldata_textpairwithimages', to=settings.AUTH_USER_MODEL, verbose_name='Retired by')),
            ],
            options={
                'abstract': False,
            },
        ),
        migrations.AddField(
            model_name='multimodalassessmenttask',
            name='items',
            field=models.ManyToManyField(related_name='evaldata_multimodalassessmenttask_items', related_query_name='evaldata_multimodalassessmenttasks', to='EvalData.TextPairWithImage', verbose_name='Items'),
        ),
        migrations.AddField(
            model_name='multimodalassessmenttask',
            name='modifiedBy',
            field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmenttask_modified_by', related_query_name='evaldata_multimodalassessmenttasks', to=settings.AUTH_USER_MODEL, verbose_name='Modified by'),
        ),
        migrations.AddField(
            model_name='multimodalassessmenttask',
            name='retiredBy',
            field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmenttask_retired_by', related_query_name='evaldata_multimodalassessmenttasks', to=settings.AUTH_USER_MODEL, verbose_name='Retired by'),
        ),
        migrations.AddField(
            model_name='multimodalassessmentresult',
            name='item',
            field=models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmentresult_item', related_query_name='evaldata_multimodalassessmentresults', to='EvalData.TextPairWithImage', verbose_name='Item'),
        ),
        migrations.AddField(
            model_name='multimodalassessmentresult',
            name='modifiedBy',
            field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmentresult_modified_by', related_query_name='evaldata_multimodalassessmentresults', to=settings.AUTH_USER_MODEL, verbose_name='Modified by'),
        ),
        migrations.AddField(
            model_name='multimodalassessmentresult',
            name='retiredBy',
            field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmentresult_retired_by', related_query_name='evaldata_multimodalassessmentresults', to=settings.AUTH_USER_MODEL, verbose_name='Retired by'),
        ),
        migrations.AddField(
            model_name='multimodalassessmentresult',
            name='task',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmentresult_task', related_query_name='evaldata_multimodalassessmentresults', to='EvalData.MultiModalAssessmentTask', verbose_name='Task'),
        ),
    ]
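
# Applying a migration like this uses Django's standard tooling, e.g.
#
#     python manage.py migrate EvalData
#
# after which the three models created above (MultiModalAssessmentResult,
# MultiModalAssessmentTask, TextPairWithImage) are available through the ORM,
# for instance MultiModalAssessmentResult.objects.filter(completed=True).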
| 94.536765 | 317 | 0.708252 | 1,344 | 12,857 | 6.540179 | 0.119792 | 0.07884 | 0.033447 | 0.05256 | 0.860182 | 0.850057 | 0.809784 | 0.779522 | 0.756086 | 0.734699 | 0 | 0.007251 | 0.163335 | 12,857 | 135 | 318 | 95.237037 | 0.809891 | 0.005133 | 0 | 0.570313 | 1 | 0 | 0.278386 | 0.152174 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.03125 | 0 | 0.054688 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c040125ed279df6d1797a746d027a8fa195284fa | 623 | py | Python | python/minify/tumult_gzip.py | lcary/tmp | 1ea8e06bc25d13f5be6a0ac578d3302ee2134a77 | [
"MIT"
] | null | null | null | python/minify/tumult_gzip.py | lcary/tmp | 1ea8e06bc25d13f5be6a0ac578d3302ee2134a77 | [
"MIT"
] | null | null | null | python/minify/tumult_gzip.py | lcary/tmp | 1ea8e06bc25d13f5be6a0ac578d3302ee2134a77 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
import zlib, base64
exec(zlib.decompress(base64.b64decode('eJyNUUFu2zAQvOsVW/UQxTDkewAdjLYGAuSWAkFPxJpaO1tTJLFc2dHvS0qB7dx6oECMZmZnh9+/bcYkmz37DfkzxEnfg69UpqcKeIhBFHoaeJQj24o+LEWF5xn/JRIks6Kw16Z+Q/Hsj0/wJ4wPQuDDnbKFrdURnZvW8Az6zv6Uv6gPKfNkQNfWj1+HDlNSPrBFpf8b+xPlBOcQ+hBgwAn2BKMXcox7R8V++/tl+9rt0CWCyjpMCXYhNGH/l6w+ZseeDmAMe1ZjmkTusF6hHNN6tTpdyqVwIGbdQr0uZ25ZgyzCHq1ObhawL+E5+O7WRkQXWbGgzSc1M/N/FDuj3d36bZ83M8tmzdWtCIR0FH+vW5IpJV1yXHLHdCZZos+NXaGKy7oeBzKm62pjBmRvTH3rdpefJnfbtqU+OHSlrppiTiX52Kle15EkFbfsOXPaeXb9I/Tpkp87xIz+A/i71IU=')))
# Created by pyminifier (https://github.com/liftoff/pyminifier)
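
# The one-liner above decompresses and exec()s the original script. To inspect
# the embedded source without executing it (a safe-inspection sketch), decode
# the same payload by hand:
#
#     import zlib, base64
#     payload = '...'  # the base64 string passed to b64decode() above
#     print(zlib.decompress(base64.b64decode(payload)).decode('utf-8'))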
| 103.833333 | 515 | 0.895666 | 45 | 623 | 12.4 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.11293 | 0.019262 | 623 | 5 | 516 | 124.6 | 0.800327 | 0.131621 | 0 | 0 | 0 | 0.5 | 0.877323 | 0.877323 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 9 |
222798a06ddb9e1fa756242b1438df6cd2521333 | 1,408 | py | Python | MVC/model.py | abhiWriteCode/design-patterns-in-python | 4400eb9bfadd59a598f0c33c309a55fc41b28b96 | [
"MIT"
] | null | null | null | MVC/model.py | abhiWriteCode/design-patterns-in-python | 4400eb9bfadd59a598f0c33c309a55fc41b28b96 | [
"MIT"
] | null | null | null | MVC/model.py | abhiWriteCode/design-patterns-in-python | 4400eb9bfadd59a598f0c33c309a55fc41b28b96 | [
"MIT"
] | null | null | null | """
Model:
The model contains the pure application logic that interacts with the database
and holds all the information needed to represent data to the end user.
"""
import json


class Person:
    """A person record backed by the db.txt JSON store."""

    def __init__(self, first_name=None, last_name=None):
        self.first_name = first_name
        self.last_name = last_name

    def __repr__(self):
        return f'{self.first_name} {self.last_name}'

    @classmethod
    def get_all(cls):
        persons = []
        with open('db.txt', 'r') as db:
            json_list = json.loads(db.read())
            for item in json_list:
                item = json.loads(item)
                person = cls(item['first_name'], item['last_name'])
                persons.append(person)
        return persons
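
# Example usage (hypothetical data): assuming db.txt holds a JSON array of
# JSON-encoded person dicts (matching the double json.loads above), a view
# could render every person with:
#
#     for person in Person.get_all():
#         print(person)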
| 24.701754 | 76 | 0.620028 | 181 | 1,408 | 4.59116 | 0.292818 | 0.108303 | 0.093863 | 0.081829 | 0.755716 | 0.755716 | 0.755716 | 0.755716 | 0.755716 | 0.755716 | 0 | 0.025714 | 0.254261 | 1,408 | 56 | 77 | 25.142857 | 0.765714 | 0 | 0 | 0.833333 | 0 | 0 | 0.105079 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.027778 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
224e6f25296ae5e630475ab4436267072eec3c32 | 15,441 | py | Python | src/ssnmf/visualization.py | rmadushani/ssnmf | 36e8c8c2202087eafefccb70c7df7cb3a583d99d | [
"MIT"
] | 6 | 2020-05-03T18:02:27.000Z | 2021-11-26T08:24:33.000Z | src/ssnmf/visualization.py | rmadushani/ssnmf | 36e8c8c2202087eafefccb70c7df7cb3a583d99d | [
"MIT"
] | null | null | null | src/ssnmf/visualization.py | rmadushani/ssnmf | 36e8c8c2202087eafefccb70c7df7cb3a583d99d | [
"MIT"
] | 9 | 2020-04-25T01:21:28.000Z | 2022-02-09T19:15:15.000Z | # Import necessary packages
import numpy as np
import torch
#import torchvision
import matplotlib.pyplot as plt
from time import time
import os
#from google.colab import drive
from scipy.optimize import nnls  # nnls is a function in scipy.optimize, not an importable submodule in recent SciPy
from numpy import linalg as la
import ssnmf


def topic_plot(A, vertpixels, horizpixels, colnum):
    topic = np.transpose(np.reshape(A[:, colnum], [horizpixels, vertpixels]))
    plt.imshow(topic, cmap='binary')
    plt.show()
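
# Example (hypothetical data): assuming the columns of A are flattened 28x28
# topic images, the fourth learned topic could be displayed with:
#
#     topic_plot(A, vertpixels=28, horizpixels=28, colnum=3)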


def visualize_reconstr(dictionary, representation, vertpixels, horizpixels, indices):
    # Indices is a list
    recon = np.matmul(dictionary, representation)
    for index in indices:
        image = np.transpose(np.reshape(recon[:, index], [horizpixels, vertpixels]))
        plt.imshow(image, cmap='binary')
        plt.savefig('./'+'reconstruction'+str(index)+'.png')


def plot_util(kchoices,train_errs,test_errs,train_reconerrs,test_reconerrs,train_classerrs,test_classerrs,train_accs,test_accs,namme,iOption):
    # iOption - indicates which x-axis label to use - options are 'k', 'l', 'n'
    # namme - name of the figure to be saved
    fig, axs = plt.subplots(2, 2, figsize=(17,10))

    axs[0, 0].plot(kchoices,train_errs,color='blue',linewidth=4,label='Train Errors')
    axs[0, 0].plot(kchoices,test_errs,color='red',linestyle='dashed',linewidth=4,label='Test Errors')
    axs[0, 0].legend()
    axs[0, 0].set_title('Errors')

    axs[0, 1].plot(kchoices,train_reconerrs,color='blue',linewidth=4,label='Train Reconstruction Errors')
    axs[0, 1].plot(kchoices,test_reconerrs,color='red',linestyle='dashed',linewidth=4,label='Test Reconstruction Errors')
    axs[0, 1].legend()
    axs[0, 1].set_title('Reconstruction Errors')

    axs[1, 0].plot(kchoices,train_classerrs,color='blue',linewidth=4,label='Train Classification Errors')
    axs[1, 0].plot(kchoices,test_classerrs,color='red',linestyle='dashed',linewidth=4,label='Test Classification Errors')
    axs[1, 0].legend()
    axs[1, 0].set_title('Classification Errors')

    axs[1, 1].plot(kchoices,train_accs,color='blue',linewidth=4,label='Train Classification Accuracies')
    axs[1, 1].plot(kchoices,test_accs,color='red',linestyle='dashed',linewidth=4,label='Test Classification Accuracies')
    axs[1, 1].legend()
    axs[1, 1].set_title('Classification Accuracies')

    if(iOption == 'k'):
        axs[0, 0].set_xlabel('k')
        axs[0, 1].set_xlabel('k')
        axs[1, 0].set_xlabel('k')
        axs[1, 1].set_xlabel('k')
    if(iOption == 'l'):
        axs[0, 0].set_xlabel('lambda')
        axs[0, 1].set_xlabel('lambda')
        axs[1, 0].set_xlabel('lambda')
        axs[1, 1].set_xlabel('lambda')
    if(iOption == 'n'):
        axs[0, 0].set_xlabel('Iterations')
        axs[0, 1].set_xlabel('Iterations')
        axs[1, 0].set_xlabel('Iterations')
        axs[1, 1].set_xlabel('Iterations')

    lines, labels = fig.axes[-1].get_legend_handles_labels()
    fig.legend(lines, labels, loc='upper center')
    plt.savefig('./'+namme+'.png')
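
# Example invocation with hypothetical sweep results for three values of k:
#
#     plot_util([5, 10, 15], train_errs, test_errs, train_reconerrs,
#               test_reconerrs, train_classerrs, test_classerrs,
#               train_accs, test_accs, 'k_SSNMF', 'k')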


def k_plots(train_features, train_labels, test_features, test_labels, kchoices, lam, numiters, avgnum):
    # NOTE: TrainTestSetEvaluation is assumed to be provided by the surrounding
    # ssnmf experiment code; it is not defined or imported in this module.
    numks = np.shape(kchoices)[0]
    train_errs = [0]*numks
    train_reconerrs = [0]*numks
    train_classerrs = [0]*numks
    train_accs = [0]*numks
    test_errs = [0]*numks
    test_reconerrs = [0]*numks
    test_classerrs = [0]*numks
    test_accs = [0]*numks
    for i in range(numks):
        for j in range(avgnum):
            module = TrainTestSetEvaluation(train_features, train_labels, test_features, test_labels, kchoices[i], lam, numiters)
            train_model_error, train_acc, numiters, [test_err, test_reconerr, test_classerr, test_acc], S_test = module.tt_eval_ssnmfmult()
            train_model_errors = train_model_error[0]
            train_errs[i] = train_errs[i]+train_model_errors[numiters-1]
            train_model_reconerrs = train_model_error[1]
            train_reconerrs[i] = train_reconerrs[i]+train_model_reconerrs[numiters-1]
            train_model_classerrs = train_model_error[2]
            train_classerrs[i] = train_classerrs[i]+train_model_classerrs[numiters-1]
            train_accs[i] = train_accs[i]+train_acc
            test_errs[i] = test_errs[i]+test_err
            test_reconerrs[i] = test_reconerrs[i]+test_reconerr
            test_classerrs[i] = test_classerrs[i]+test_classerr
            test_accs[i] = test_accs[i]+test_acc
    train_errs = [element/avgnum for element in train_errs]
    train_reconerrs = [element/avgnum for element in train_reconerrs]
    train_classerrs = [element/avgnum for element in train_classerrs]
    train_accs = [element/avgnum for element in train_accs]
    test_errs = [element/avgnum for element in test_errs]
    test_reconerrs = [element/avgnum for element in test_reconerrs]
    test_classerrs = [element/avgnum for element in test_classerrs]
    test_accs = [element/avgnum for element in test_accs]
    plot_util(kchoices,train_errs,test_errs,train_reconerrs,test_reconerrs,train_classerrs,test_classerrs,train_accs,test_accs,'k_SSNMF','k')


def kl_k_plots(train_features, train_labels, test_features, test_labels, kchoices, lam, numiters, avgnum):
    numks = np.shape(kchoices)[0]
    train_errs = [0]*numks
    train_reconerrs = [0]*numks
    train_classerrs = [0]*numks
    train_accs = [0]*numks
    test_errs = [0]*numks
    test_reconerrs = [0]*numks
    test_classerrs = [0]*numks
    test_accs = [0]*numks
    for i in range(numks):
        for j in range(avgnum):
            module = TrainTestSetEvaluation(train_features, train_labels, test_features, test_labels, kchoices[i], lam, numiters)
            train_model_error, train_acc, numiters, [test_err, test_reconerr, test_classerr, test_acc], S_test = module.tt_eval_kl_ssnmfmult()
            train_model_errors = train_model_error[0]
            train_errs[i] = train_errs[i]+train_model_errors[numiters-1]
            train_model_reconerrs = train_model_error[1]
            train_reconerrs[i] = train_reconerrs[i]+train_model_reconerrs[numiters-1]
            train_model_classerrs = train_model_error[2]
            train_classerrs[i] = train_classerrs[i]+train_model_classerrs[numiters-1]
            train_accs[i] = train_accs[i]+train_acc
            test_errs[i] = test_errs[i]+test_err
            test_reconerrs[i] = test_reconerrs[i]+test_reconerr
            test_classerrs[i] = test_classerrs[i]+test_classerr
            test_accs[i] = test_accs[i]+test_acc
    train_errs = [element/avgnum for element in train_errs]
    train_reconerrs = [element/avgnum for element in train_reconerrs]
    train_classerrs = [element/avgnum for element in train_classerrs]
    train_accs = [element/avgnum for element in train_accs]
    test_errs = [element/avgnum for element in test_errs]
    test_reconerrs = [element/avgnum for element in test_reconerrs]
    test_classerrs = [element/avgnum for element in test_classerrs]
    test_accs = [element/avgnum for element in test_accs]
    plot_util(kchoices,train_errs,test_errs,train_reconerrs,test_reconerrs,train_classerrs,test_classerrs,train_accs,test_accs,'k_KLSSNMF','k')


def lam_plots(train_features, train_labels, test_features, test_labels, k, lamchoices, numiters, avgnum):
    numlams = np.shape(lamchoices)[0]
    train_errs = [0]*numlams
    train_reconerrs = [0]*numlams
    train_classerrs = [0]*numlams
    train_accs = [0]*numlams
    test_errs = [0]*numlams
    test_reconerrs = [0]*numlams
    test_classerrs = [0]*numlams
    test_accs = [0]*numlams
    for i in range(numlams):
        for j in range(avgnum):
            module = TrainTestSetEvaluation(train_features, train_labels, test_features, test_labels, k, lamchoices[i], numiters)
            train_model_error, train_acc, numiters, [test_err, test_reconerr, test_classerr, test_acc], S_test = module.tt_eval_ssnmfmult()
            train_model_errors = train_model_error[0]
            train_errs[i] = train_errs[i]+train_model_errors[numiters-1]
            train_model_reconerrs = train_model_error[1]
            train_reconerrs[i] = train_reconerrs[i]+train_model_reconerrs[numiters-1]
            train_model_classerrs = train_model_error[2]
            train_classerrs[i] = train_classerrs[i]+train_model_classerrs[numiters-1]
            train_accs[i] = train_accs[i]+train_acc
            test_errs[i] = test_errs[i]+test_err
            test_reconerrs[i] = test_reconerrs[i]+test_reconerr
            test_classerrs[i] = test_classerrs[i]+test_classerr
            test_accs[i] = test_accs[i]+test_acc
    train_errs = [element/avgnum for element in train_errs]
    train_reconerrs = [element/avgnum for element in train_reconerrs]
    train_classerrs = [element/avgnum for element in train_classerrs]
    train_accs = [element/avgnum for element in train_accs]
    test_errs = [element/avgnum for element in test_errs]
    test_reconerrs = [element/avgnum for element in test_reconerrs]
    test_classerrs = [element/avgnum for element in test_classerrs]
    test_accs = [element/avgnum for element in test_accs]
    plot_util(lamchoices,train_errs,test_errs,train_reconerrs,test_reconerrs,train_classerrs,test_classerrs,train_accs,test_accs,'l_SSNMF','l')


def kl_lam_plots(train_features, train_labels, test_features, test_labels, k, lamchoices, numiters, avgnum):
    numlams = np.shape(lamchoices)[0]
    train_errs = [0]*numlams
    train_reconerrs = [0]*numlams
    train_classerrs = [0]*numlams
    train_accs = [0]*numlams
    test_errs = [0]*numlams
    test_reconerrs = [0]*numlams
    test_classerrs = [0]*numlams
    test_accs = [0]*numlams
    for i in range(numlams):
        for j in range(avgnum):
            module = TrainTestSetEvaluation(train_features, train_labels, test_features, test_labels, k, lamchoices[i], numiters)
            train_model_error, train_acc, numiters, [test_err, test_reconerr, test_classerr, test_acc], S_test = module.tt_eval_kl_ssnmfmult()
            train_model_errors = train_model_error[0]
            train_errs[i] = train_errs[i]+train_model_errors[numiters-1]
            train_model_reconerrs = train_model_error[1]
            train_reconerrs[i] = train_reconerrs[i]+train_model_reconerrs[numiters-1]
            train_model_classerrs = train_model_error[2]
            train_classerrs[i] = train_classerrs[i]+train_model_classerrs[numiters-1]
            train_accs[i] = train_accs[i]+train_acc
            test_errs[i] = test_errs[i]+test_err
            test_reconerrs[i] = test_reconerrs[i]+test_reconerr
            test_classerrs[i] = test_classerrs[i]+test_classerr
            test_accs[i] = test_accs[i]+test_acc
    train_errs = [element/avgnum for element in train_errs]
    train_reconerrs = [element/avgnum for element in train_reconerrs]
    train_classerrs = [element/avgnum for element in train_classerrs]
    train_accs = [element/avgnum for element in train_accs]
    test_errs = [element/avgnum for element in test_errs]
    test_reconerrs = [element/avgnum for element in test_reconerrs]
    test_classerrs = [element/avgnum for element in test_classerrs]
    test_accs = [element/avgnum for element in test_accs]
    plot_util(lamchoices,train_errs,test_errs,train_reconerrs,test_reconerrs,train_classerrs,test_classerrs,train_accs,test_accs,'l_KLSSNMF','l')


def numiters_plots(train_features, train_labels, test_features, test_labels, k, lam, numiterschoices, avgnum):
    numnums = np.shape(numiterschoices)[0]
    train_errs = [0]*numnums
    train_reconerrs = [0]*numnums
    train_classerrs = [0]*numnums
    train_accs = [0]*numnums
    test_errs = [0]*numnums
    test_reconerrs = [0]*numnums
    test_classerrs = [0]*numnums
    test_accs = [0]*numnums
    for i in range(numnums):
        for j in range(avgnum):
            module = TrainTestSetEvaluation(train_features, train_labels, test_features, test_labels, k, lam, numiterschoices[i])
            train_model_error, train_acc, numiters, [test_err, test_reconerr, test_classerr, test_acc], S_test = module.tt_eval_ssnmfmult()
            train_model_errors = train_model_error[0]
            train_errs[i] = train_errs[i]+train_model_errors[numiters-1]
            train_model_reconerrs = train_model_error[1]
            train_reconerrs[i] = train_reconerrs[i]+train_model_reconerrs[numiters-1]
            train_model_classerrs = train_model_error[2]
            train_classerrs[i] = train_classerrs[i]+train_model_classerrs[numiters-1]
            train_accs[i] = train_accs[i]+train_acc
            test_errs[i] = test_errs[i]+test_err
            test_reconerrs[i] = test_reconerrs[i]+test_reconerr
            test_classerrs[i] = test_classerrs[i]+test_classerr
            test_accs[i] = test_accs[i]+test_acc
    train_errs = [element/avgnum for element in train_errs]
    train_reconerrs = [element/avgnum for element in train_reconerrs]
    train_classerrs = [element/avgnum for element in train_classerrs]
    train_accs = [element/avgnum for element in train_accs]
    test_errs = [element/avgnum for element in test_errs]
    test_reconerrs = [element/avgnum for element in test_reconerrs]
    test_classerrs = [element/avgnum for element in test_classerrs]
    test_accs = [element/avgnum for element in test_accs]
    plot_util(numiterschoices,train_errs,test_errs,train_reconerrs,test_reconerrs,train_classerrs,test_classerrs,train_accs,test_accs,'n_SSNMF','n')


def kl_numiters_plots(train_features, train_labels, test_features, test_labels, k, lam, numiterschoices, avgnum):
    numnums = np.shape(numiterschoices)[0]
    train_errs = [0]*numnums
    train_reconerrs = [0]*numnums
    train_classerrs = [0]*numnums
    train_accs = [0]*numnums
    test_errs = [0]*numnums
    test_reconerrs = [0]*numnums
    test_classerrs = [0]*numnums
    test_accs = [0]*numnums
    for i in range(numnums):
        for j in range(avgnum):
            module = TrainTestSetEvaluation(train_features, train_labels, test_features, test_labels, k, lam, numiterschoices[i])
            train_model_error, train_acc, numiters, [test_err, test_reconerr, test_classerr, test_acc], S_test = module.tt_eval_kl_ssnmfmult()
            train_model_errors = train_model_error[0]
            train_errs[i] = train_errs[i]+train_model_errors[numiters-1]
            train_model_reconerrs = train_model_error[1]
            train_reconerrs[i] = train_reconerrs[i]+train_model_reconerrs[numiters-1]
            train_model_classerrs = train_model_error[2]
            train_classerrs[i] = train_classerrs[i]+train_model_classerrs[numiters-1]
            train_accs[i] = train_accs[i]+train_acc
            test_errs[i] = test_errs[i]+test_err
            test_reconerrs[i] = test_reconerrs[i]+test_reconerr
            test_classerrs[i] = test_classerrs[i]+test_classerr
            test_accs[i] = test_accs[i]+test_acc
    train_errs = [element/avgnum for element in train_errs]
    train_reconerrs = [element/avgnum for element in train_reconerrs]
    train_classerrs = [element/avgnum for element in train_classerrs]
    train_accs = [element/avgnum for element in train_accs]
    test_errs = [element/avgnum for element in test_errs]
    test_reconerrs = [element/avgnum for element in test_reconerrs]
    test_classerrs = [element/avgnum for element in test_classerrs]
    test_accs = [element/avgnum for element in test_accs]
    plot_util(numiterschoices,train_errs,test_errs,train_reconerrs,test_reconerrs,train_classerrs,test_classerrs,train_accs,test_accs,'n_KLSSNMF','n')
| 50.792763 | 150 | 0.709475 | 2,140 | 15,441 | 4.840654 | 0.066822 | 0.057921 | 0.074138 | 0.106574 | 0.889178 | 0.85182 | 0.83734 | 0.829038 | 0.820929 | 0.810117 | 0 | 0.012787 | 0.184574 | 15,441 | 303 | 151 | 50.960396 | 0.809944 | 0.012953 | 0 | 0.744186 | 0 | 0 | 0.033743 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.034884 | false | 0 | 0.031008 | 0 | 0.065891 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3f5566f485c3ea9a6fdd010f108fa5c7504585a4 | 13,573 | py | Python | polymath/srdfg/templates/gradient_defs.py | lite-david/polymath | cf1addc75e203fa606ebc6d32bc552fb3975ea99 | [
"Apache-2.0"
] | 15 | 2021-05-09T05:46:04.000Z | 2022-03-06T20:46:32.000Z | polymath/srdfg/templates/gradient_defs.py | lite-david/polymath | cf1addc75e203fa606ebc6d32bc552fb3975ea99 | [
"Apache-2.0"
] | null | null | null | polymath/srdfg/templates/gradient_defs.py | lite-david/polymath | cf1addc75e203fa606ebc6d32bc552fb3975ea99 | [
"Apache-2.0"
] | 4 | 2021-08-24T07:46:29.000Z | 2022-03-05T18:23:07.000Z | import polymath as pm
from .template_utils import _get_indices, _get_single_node_indices, _get_elem_indices
from polymath.srdfg.util import squeeze_shape
from numbers import Integral
import numpy as np
import functools
OPTIMIZERS = {'sgd': pm.sgd}
LOSS_FUNCS = {'cross_entropy': pm.cross_entropy_loss}


class batchnorm_grad(pm.Template):
    def define_graph(self, x, scale, b, mean, var, grad, x_grad,
                     scale_grad, b_grad, optimizer, optimizer_kwargs, eps=1e-5):
        indices = _get_single_node_indices(x, shape=x.shape)
        reduce_idx = (indices[0], indices[2], indices[3])
        N = np.prod((x.shape[0], x.shape[2], x.shape[3]))
        sum_grad = pm.sum([reduce_idx], grad[indices])
        mean_grad_y = sum_grad / N
        mean_x = pm.sum([reduce_idx], x[indices]) / N
        sqr_err = (x[indices] - mean_x[indices[1]])**2
        var_x = pm.sum([reduce_idx], sqr_err[indices]) / N
        grad_y_offset = (grad[indices] - mean_grad_y[indices[1]])
        x_offset = x[indices] - mean_x[indices[1]]
        var_eps = var_x[indices[1]] + eps
        offset_sum = pm.sum([reduce_idx], grad[indices]*x_offset[indices])
        new_mean = offset_sum[indices[1]] / N
        rsqrt_var = (pm.rsqrt(var_eps[indices[1]])).set_name(f"{x.name}_rsqrt_var")
        unsq_indices = _get_single_node_indices(rsqrt_var, shape=(1, x.shape[1], 1, 1))
        coeff = (scale[unsq_indices[1]] * rsqrt_var[unsq_indices])
        grad_sub = ((x_offset[indices] * new_mean[indices[1]]) / (var_eps[indices[1]]))
        x_grad[indices] = coeff[indices[1]] * (grad_y_offset[indices] - grad_sub[indices])
        scale_grad[indices[1]] = rsqrt_var[indices[1]] * offset_sum[indices[1]]
        b_grad[indices[1]] = sum_grad[indices[1]]

        with self.graph:
            OPTIMIZERS[optimizer](scale, scale_grad, **optimizer_kwargs)
            OPTIMIZERS[optimizer](b, b_grad, **optimizer_kwargs)
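
    # The updates above implement the standard batch-norm backward pass,
    # reduced over the (N, H, W) axes:
    #     d_beta  = sum(dy)
    #     d_gamma = sum(dy * (x - mu)) / sqrt(var + eps)
    #     dx      = gamma / sqrt(var + eps)
    #               * (dy - mean(dy) - (x - mu) * mean(dy * (x - mu)) / (var + eps))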

    @property
    def inputs(self):
        return (self.args[0], self.args[1], self.args[2], self.args[3], self.args[4], self.args[5])

    @property
    def outputs(self):
        return (self.args[6], self.args[7], self.args[8])


class global_average_pool_grad(pm.Template):
    def define_graph(self, data, grad, data_grad):
        pass

    @property
    def inputs(self):
        return (self.args[0], self.args[1])

    @property
    def outputs(self):
        return (self.args[2],)


class max_pool_grad(pm.Template):
    def define_graph(self, data, grad, data_grad, kh, kw, stride=(1, 1), pad=(0, 0)):
        data_grad.set_shape(data.shape)

    @property
    def inputs(self):
        return (self.args[0], self.args[1])

    @property
    def outputs(self):
        return (self.args[2],)

    @property
    def stride(self):
        return self.kwargs['stride']

    @property
    def kernel_size(self):
        return (self.args[3], self.args[4])

    @property
    def pad(self):
        return self.kwargs['pad']


class average_pool_grad(pm.Template):
    def define_graph(self, data, grad, data_grad, kh, kw, stride=(1, 1), pad=(0, 0)):
        data_grad.set_shape(data.shape)

    @property
    def inputs(self):
        return (self.args[0], self.args[1])

    @property
    def outputs(self):
        return (self.args[2],)

    @property
    def stride(self):
        return self.kwargs['stride']

    @property
    def kernel_size(self):
        return (self.args[3], self.args[4])

    @property
    def pad(self):
        return self.kwargs['pad']


class flatten_grad(pm.Template):
    def define_graph(self, inp, grad, inp_grad):
        inp_grad.set_shape(inp.shape)

    @property
    def inputs(self):
        return (self.args[0], self.args[1])

    @property
    def outputs(self):
        return (self.args[2],)


class elem_add_grad(pm.Template):
    def define_graph(self, a, b, grad, a_grad, b_grad):
        a_grad.set_shape(grad.shape)
        b_grad.set_shape(grad.shape)
        # a_idx, grad_idx, indices = _get_elem_indices(a, grad, a_grad)
        # pm.elem_add(a, grad, a_grad)
        # pm.elem_add(b, grad, b_grad)
        # a_grad[indices] = a[a_idx] + grad[grad_idx]
        # b_grad[indices] = b[a_idx] + grad[grad_idx]

    @property
    def inputs(self):
        return (self.args[0], self.args[1], self.args[2])

    @property
    def outputs(self):
        return (self.args[3], self.args[4])


class relu_grad(pm.Template):
    def define_graph(self, x, grad, x_grad):
        assert x.shape == grad.shape and grad.shape == x_grad.shape
        x_idx, grad_idx, x_grad_idx = _get_elem_indices(x, grad, x_grad)
        x_grad[x_grad_idx] = grad[grad_idx] * (x[x_idx] >= 0)

    @property
    def inputs(self):
        return (self.args[0], self.args[1])

    @property
    def outputs(self):
        return (self.args[2],)


class elem_tanh_grad(pm.Template):
    def define_graph(self, x, grad, x_grad):
        x_idx, grad_idx, x_grad_idx = _get_elem_indices(x, grad, x_grad)
        # The tanh derivative is 1 - tanh(x)^2; the corresponding update is
        # left disabled in the source:
        # x_grad[x_grad_idx] = grad[grad_idx] * (1 - pm.square(pm.tanh(x[x_idx])))

    @property
    def inputs(self):
        return (self.args[0], self.args[1])

    @property
    def outputs(self):
        return (self.args[2],)


class conv_grad_no_bias(pm.Template):
    def define_graph(self, inp, weight, grad, inp_grad, weight_grad, optimizer, optimizer_kwargs,
                     stride=1, pad=0, dilation=1):
        min_sizes = []
        k = len(grad.shape) - 2
        for d in range(k):
            min_sizes.append(
                (grad.shape[d + 2] - 1) * stride
                - 2 * pad
                + (weight.shape[-1] - 1) * dilation
                + 1
            )
        grad_input_padding = tuple(inp.shape[-k + d] - min_sizes[d] for d in range(k))
        assert grad_input_padding[0] == grad_input_padding[1]
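        # min_sizes follows the standard transposed-convolution output size,
        #     (out - 1) * stride - 2 * pad + (kernel - 1) * dilation + 1,
        # so grad_input_padding is the extra output padding the transposed
        # convolution below needs in order to reproduce the exact input shape.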
        pm.conv_transpose(grad, weight, inp_grad, stride=stride, pad=pad, out_pad=grad_input_padding[0])
        inp_indices = tuple(pm.index(0, s - 1) for s in inp.shape)
        grad_indices = tuple(pm.index(0, s - 1) for s in grad.shape)
        weight_indices = tuple(pm.index(0, s - 1) for s in weight.shape)
        inp_transposed = pm.temp(name=f"transposed_{inp.name}", shape=(inp.shape[1], inp.shape[0], inp.shape[2], inp.shape[3]))
        grad_transposed = pm.state(name=f"transposed_{grad.name}", shape=(grad.shape[1], grad.shape[0], grad.shape[2], grad.shape[3]))
        wgt_grad_transposed = pm.temp(name=f"transposed_{weight.name}",
                                      shape=(weight.shape[1], weight.shape[0], weight.shape[2], weight.shape[3]))
        pm.tensor_transpose(inp, inp_transposed, perm=(1, 0, 2, 3))
        pm.tensor_transpose(grad, grad_transposed, perm=(1, 0, 2, 3))
        pm.conv(inp_transposed, grad_transposed, wgt_grad_transposed, stride=dilation, pad=pad, dilation=stride)
        pm.tensor_transpose(wgt_grad_transposed, weight_grad, perm=(1, 0, 2, 3))

        # Weight update
        OPTIMIZERS[optimizer](weight, weight_grad, **optimizer_kwargs)

    @property
    def inputs(self):
        return (self.args[0], self.args[1], self.args[2])

    @property
    def outputs(self):
        return (self.args[3], self.args[4])


class conv_grad(pm.Template):
    def define_graph(self, inp, weight, bias, grad, inp_grad, weight_grad,
                     bias_grad, optimizer, optimizer_kwargs,
                     stride=1, pad=0, dilation=1):
        min_sizes = []
        k = len(grad.shape) - 2
        for d in range(k):
            min_sizes.append(
                (grad.shape[d + 2] - 1) * stride
                - 2 * pad
                + (weight.shape[-1] - 1) * dilation
                + 1
            )
        grad_input_padding = tuple(inp.shape[-k + d] - min_sizes[d] for d in range(k))
        assert grad_input_padding[0] == grad_input_padding[1]
        pm.conv_transpose_bias(grad, weight, bias, inp_grad, stride=stride, pad=pad, out_pad=grad_input_padding[0])
        inp_indices = tuple(pm.index(0, s - 1) for s in inp.shape)
        grad_indices = tuple(pm.index(0, s - 1) for s in grad.shape)
        weight_indices = tuple(pm.index(0, s - 1) for s in weight.shape)
        inp_transposed = pm.temp(name=f"transposed_{inp.name}", shape=(inp.shape[1], inp.shape[0], inp.shape[2], inp.shape[3]))
        grad_transposed = pm.state(name=f"transposed_{grad.name}", shape=(grad.shape[1], grad.shape[0], grad.shape[2], grad.shape[3]))
        wgt_grad_transposed = pm.temp(name=f"transposed_{weight.name}",
                                      shape=(weight.shape[1], weight.shape[0], weight.shape[2], weight.shape[3]))
        pm.tensor_transpose(inp, inp_transposed, perm=(1, 0, 2, 3))
        pm.tensor_transpose(grad, grad_transposed, perm=(1, 0, 2, 3))
        pm.conv(inp_transposed, grad_transposed, wgt_grad_transposed, stride=dilation, pad=pad, dilation=stride)
        pm.tensor_transpose(wgt_grad_transposed, weight_grad, perm=(1, 0, 2, 3))

        # Weight update
        OPTIMIZERS[optimizer](weight, weight_grad, **optimizer_kwargs)
        pm.reduce_sum(grad, bias_grad)
        OPTIMIZERS[optimizer](bias, bias_grad, **optimizer_kwargs)

    @property
    def inputs(self):
        return (self.args[0], self.args[1], self.args[2], self.args[3])

    @property
    def outputs(self):
        return (self.args[4], self.args[5], self.args[6])


class gemm_grad_no_bias(pm.Template):
    def define_graph(self, inp, weight, grad, inp_grad, weight_grad, optimizer, optimizer_kwargs):
        transA = False
        transB = False
        if grad.shape[1] != weight.shape[0]:
            indices = tuple([pm.index(0, s - 1) for s in weight.shape])
            weight_transposed = pm.state(name=f"{weight.name}_transposed", shape=(weight.shape[1], weight.shape[0]))
            weight_transposed[indices[1], indices[0]] = weight[indices]
            pm.gemm_no_bias(grad, weight_transposed, inp_grad, transA=transA, transB=transB, strict_shapes=True)
        else:
            pm.gemm_no_bias(grad, weight, inp_grad, transA=transA, transB=transB, strict_shapes=True)

        if grad.shape[0] != inp.shape[1]:
            indices = tuple([pm.index(0, s - 1) for s in inp.shape])
            # inp_transposed = pm.temp(name=f"{inp.name}_transposed", shape=(inp.shape[1], inp.shape[0]))
            inp_transposed = pm.state(name=f"{inp.name}_transposed", shape=(inp.shape[1], inp.shape[0]))
            inp_transposed[indices[1], indices[0]] = inp[indices]
            pm.gemm_no_bias(inp_transposed, grad, weight_grad, transA=transA, transB=transB, strict_shapes=True)
        else:
            pm.gemm_no_bias(inp, grad, weight_grad, transA=transA, transB=transB, strict_shapes=True)

        OPTIMIZERS[optimizer](weight, weight_grad, **optimizer_kwargs)
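
    # For Y = X @ W, the backward pass computes dX = dY @ W^T and dW = X^T @ dY;
    # the shape checks above decide whether an explicit transpose needs to be
    # materialized before the corresponding GEMM.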

    @property
    def inputs(self):
        return (self.args[0], self.args[1], self.args[2])

    @property
    def outputs(self):
        return (self.args[3], self.args[4])


class gemm_grad(pm.Template):
    def define_graph(self, inp, weight, bias, grad, inp_grad, weight_grad, bias_grad, optimizer, optimizer_kwargs):
        transA = False
        transB = False
        if grad.shape[1] != weight.shape[0]:
            indices = tuple([pm.index(0, s - 1) for s in weight.shape])
            # weight_transposed = pm.temp(name=f"{weight.name}_transposed", shape=(weight.shape[1], weight.shape[0]))
            weight_transposed = pm.state(name=f"{weight.name}_transposed", shape=(weight.shape[1], weight.shape[0]))
            weight_transposed[indices[1], indices[0]] = weight[indices]
            pm.gemm_no_bias(grad, weight_transposed, inp_grad, transA=transA, transB=transB, strict_shapes=True)
        else:
            pm.gemm_no_bias(grad, weight, inp_grad, transA=transA, transB=transB, strict_shapes=True)

        if grad.shape[0] != inp.shape[1]:
            indices = tuple([pm.index(0, s - 1) for s in inp.shape])
            # inp_transposed = pm.temp(name=f"{inp.name}_transposed", shape=(inp.shape[1], inp.shape[0]))
            inp_transposed = pm.state(name=f"{inp.name}_transposed", shape=(inp.shape[1], inp.shape[0]))
            inp_transposed[indices[1], indices[0]] = inp[indices]
            pm.gemm_no_bias(inp_transposed, grad, weight_grad, transA=transA, transB=transB, strict_shapes=True)
        else:
            pm.gemm_no_bias(inp, grad, weight_grad, transA=transA, transB=transB, strict_shapes=True)

        # Weight update
        assert weight_grad.shape == weight.shape
        OPTIMIZERS[optimizer](weight, weight_grad, **optimizer_kwargs)
        pm.reduce_sum(grad, bias_grad)
        OPTIMIZERS[optimizer](bias, bias_grad, **optimizer_kwargs)

    @property
    def inputs(self):
        return (self.args[0], self.args[1], self.args[2], self.args[3])

    @property
    def outputs(self):
        return (self.args[4], self.args[5], self.args[6])


class cross_entropy_loss_grad(pm.Template):
    def define_graph(self, z, y, grad, grad_inp, reduction="mean"):
        indices = [pm.index(0, s - 1, name=f"{z.name}[{i}]") for i, s in enumerate(z.shape)]
        grad_inp[indices] = grad * (z[indices] - y[indices[0]]) / z.shape[0]
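
    # With mean reduction over a batch of size N, this matches the fused
    # softmax + cross-entropy gradient dL/dz = (z - y) / N (z: predictions,
    # y: targets), scaled by the incoming grad.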

    @property
    def inputs(self):
        return (self.args[0], self.args[1], self.args[2])

    @property
    def outputs(self):
        return (self.args[3],)

AUTODIFF_OPS = ['cross_entropy_loss_grad', 'sgd', 'relu_grad', 'max_pool_grad', 'elem_tanh_grad',
'global_average_pool_grad', 'elem_add_grad', 'flatten_grad', 'batchnorm_grad',
'average_pool_grad'] | 39.920588 | 134 | 0.624475 | 1,980 | 13,573 | 4.09697 | 0.070202 | 0.063116 | 0.055227 | 0.06213 | 0.827539 | 0.789818 | 0.776257 | 0.759246 | 0.750863 | 0.750863 | 0 | 0.022105 | 0.230089 | 13,573 | 340 | 135 | 39.920588 | 0.754163 | 0.049657 | 0 | 0.714286 | 0 | 0 | 0.03376 | 0.021032 | 0 | 0 | 0 | 0 | 0.015444 | 1 | 0.173745 | false | 0.003861 | 0.023166 | 0.123552 | 0.370656 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 8 |